## Overview
Justina’s surgical simulation feature provides a realistic 3D environment for practicing surgical procedures. Built with Babylon.js, the simulator offers real-time 3D rendering, physics-based interactions, and intuitive surgical tool controls.
The simulation runs entirely in the browser using WebGL, requiring no additional plugins or downloads.
## Technical Architecture
The simulation engine is built on Babylon.js, a powerful WebGL-based 3D framework that enables:

- Real-time 3D rendering with lighting and materials
- Physics-based collision detection
- Camera controls for multiple viewing angles
- Advanced texture and shader support
### Core Components

- **3D Scene**: Babylon.js `Scene` with custom lighting, camera positioning, and organ models
- **Surgical Tools**: Interactive instruments with collision detection and event tracking
- **UI Overlay**: Real-time coordinate display and simulation controls using Babylon.js GUI
- **Event System**: Captures surgical events such as tumor touches and hemorrhages
## 3D Rendering Pipeline

The simulation initializes with the following setup:
```typescript
const engine = new BABYLON.Engine(canvasRef.current, true);
const scene = new BABYLON.Scene(engine);
scene.clearColor = new BABYLON.Color4(0.8, 0.9, 1, 1);

// Laparoscopic camera setup
const camera = new BABYLON.ArcRotateCamera(
  "camera",
  -Math.PI / 2,
  Math.PI / 2.5,
  10,
  BABYLON.Vector3.Zero(),
  scene
);
camera.lowerRadiusLimit = 6;
camera.upperRadiusLimit = 20;
camera.attachControl(canvasRef.current, true);

// Surgical lighting
new BABYLON.HemisphericLight("light", new BABYLON.Vector3(0, 1, 0), scene);
```
The camera limits simulate the restricted movement range of laparoscopic surgery, providing realistic training conditions.
## Scalpel Instrument
The primary surgical tool is a 3D scalpel model that follows the user’s cursor in 3D space:
```typescript
scene.onPointerMove = () => {
  if (!scalpelMesh) return;

  const ray = scene.createPickingRay(
    scene.pointerX,
    scene.pointerY,
    BABYLON.Matrix.Identity(),
    camera
  );
  const newPosition = ray.origin.add(ray.direction.scale(depth));
  const offset = new BABYLON.Vector3(0.1, -0.2, 0);

  scalpelMesh.position = newPosition.add(offset);
  cutter.position = scalpelMesh.position.clone();
  checkCollisions();
};
```
## User Controls

- **Move**: Position the scalpel in 3D space
- **Click & Drag**: Activate cutting mode
- **Scroll Wheel**: Adjust depth (distance from camera)
- **Right-Click Drag**: Rotate the camera view
```typescript
if (pointerInfo.type === BABYLON.PointerEventTypes.POINTERWHEEL) {
  const wheelEvent = pointerInfo.event as WheelEvent;
  depth += wheelEvent.deltaY * 0.01;
}
```
Scrolling adjusts `depth`, giving precise control of the instrument's position along the viewing axis.
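One detail the wheel handler leaves implicit is keeping `depth` within a sensible range. A minimal sketch of such a guard, assuming limits matching the camera radius bounds shown earlier (the helper name and defaults are illustrative, not part of the actual code):

```typescript
// Hypothetical helper: keep the scroll-adjusted depth between assumed
// bounds (6 and 20, mirroring the camera's radius limits) so the scalpel
// cannot move behind the camera or far past the organ.
function clampDepth(depth: number, min = 6, max = 20): number {
  return Math.min(max, Math.max(min, depth));
}

// After handling the wheel event, the handler could apply:
// depth = clampDepth(depth);
```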
## Surgical Environment

### Organ Models

The simulation loads 3D organ models in GLTF/GLB format:
```typescript
Promise.all([
  BABYLON.SceneLoader.ImportMeshAsync("", "/models/", "kidney.glb", scene),
  BABYLON.SceneLoader.ImportMeshAsync("", "/models/", "scalpel.glb", scene)
]).then(([kidneyResult, scalpelResult]) => {
  // Configure kidney positioning and scale
  const kidney = kidneyResult.meshes[0];
  kidney.scaling = new BABYLON.Vector3(0.5, 0.5, 0.5);

  // Center the organ in the viewport
  const boundingInfo = kidney.getHierarchyBoundingVectors();
  const center = boundingInfo.min.add(boundingInfo.max).scale(0.5);
  kidney.position = kidney.position.subtract(center);
});
```
### Tumor Representation

Tumors are rendered as clusters of spherical fragments:
```typescript
const tumorMaterial = new BABYLON.StandardMaterial("tumorMat", scene);
tumorMaterial.diffuseColor = new BABYLON.Color3(0.8, 0, 0.2);

const fragmentCount = 25;
const radius = 0.6;

for (let i = 0; i < fragmentCount; i++) {
  const fragment = BABYLON.MeshBuilder.CreateSphere(
    "tumorFragment",
    { diameter: 0.4, segments: 8 },
    scene
  );
  const randomOffset = new BABYLON.Vector3(
    (Math.random() - 0.5) * radius,
    (Math.random() - 0.5) * radius,
    (Math.random() - 0.5) * radius
  );
  fragment.position = new BABYLON.Vector3(1.2, 0.2, -0.7).add(randomOffset);
  fragment.material = tumorMaterial;
  tumorFragments.push(fragment);
}
```
## Collision Detection

The simulation uses mesh intersection to detect surgical events:
```typescript
function checkCollisions() {
  if (!simulationStarted || !sceneReady || !instrumentActive) return;

  // Hemorrhage detection
  if (!arteryCut && cutter.intersectsMesh(arteryMesh, true)) {
    arteryCut = true;
    const mat = new BABYLON.StandardMaterial("cut", scene);
    mat.diffuseColor = BABYLON.Color3.Red();
    arteryMesh.material = mat;
    enviarEvento(scalpelMesh.position.x, scalpelMesh.position.y, scalpelMesh.position.z, "HEMORRHAGE");
  }

  // Tumor removal detection. Iterate in reverse so splicing an element
  // out of the array does not skip the fragment that follows it.
  for (let i = tumorFragments.length - 1; i >= 0; i--) {
    const fragment = tumorFragments[i];
    if (cutter.intersectsMesh(fragment, true)) {
      fragment.dispose();
      tumorFragments.splice(i, 1);
      enviarEvento(scalpelMesh.position.x, scalpelMesh.position.y, scalpelMesh.position.z, "TUMOR_TOUCH");
    }
  }
}
```
All collision events are immediately transmitted via WebSocket for real-time telemetry analysis.
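The wire format used by `enviarEvento` is not shown in this section; the sketch below illustrates one plausible payload shape. The field names, the event-type union, and the `timestamp` field are all assumptions, not the real schema:

```typescript
// Assumed telemetry event shape; the actual schema may differ.
interface SurgicalEvent {
  x: number;
  y: number;
  z: number;
  type: "HEMORRHAGE" | "TUMOR_TOUCH" | "FINISH";
  timestamp: number;
}

// Build a plain event object from the scalpel position and event type.
function buildEvent(
  x: number,
  y: number,
  z: number,
  type: SurgicalEvent["type"]
): SurgicalEvent {
  return { x, y, z, type, timestamp: Date.now() };
}

// enviarEvento(x, y, z, type) would then serialize and send it:
// socket.send(JSON.stringify(buildEvent(x, y, z, type)));
```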
## Real-Time Coordinate Display

The UI overlay shows live instrument coordinates:
```typescript
const textX = new GUI.TextBlock();
textX.text = `Coordenada X: ${scalpelMesh.position.x.toFixed(2)}`;

const textY = new GUI.TextBlock();
textY.text = `Coordenada Y: ${scalpelMesh.position.y.toFixed(2)}`;

const textZ = new GUI.TextBlock();
textZ.text = `Coordenada Z: ${scalpelMesh.position.z.toFixed(2)}`;
```
This provides surgeons with precise spatial awareness during the procedure.
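Setting `text` once only captures the position at creation time; the labels need to be refreshed as the scalpel moves. A sketch of one way to wire that up, assuming a per-frame update via `scene.onBeforeRenderObservable` (the `formatCoord` helper is illustrative, not the actual code):

```typescript
// Illustrative formatter for one coordinate label, matching the
// label text shown above.
function formatCoord(axis: "X" | "Y" | "Z", value: number): string {
  return `Coordenada ${axis}: ${value.toFixed(2)}`;
}

// Refresh the labels once per frame, before each render:
// scene.onBeforeRenderObservable.add(() => {
//   textX.text = formatCoord("X", scalpelMesh.position.x);
//   textY.text = formatCoord("Y", scalpelMesh.position.y);
//   textZ.text = formatCoord("Z", scalpelMesh.position.z);
// });
```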
## Performance Considerations

- **Render Loop**: Babylon.js uses `requestAnimationFrame` for smooth 60 FPS rendering
- **Model Optimization**: GLB models are optimized for web delivery with texture compression
- **Collision Efficiency**: Collision checks run only while the instrument is active, minimizing CPU usage
- **Memory Management**: Meshes are disposed when removed to prevent memory leaks
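For the memory-management point, a dispose-and-filter pattern both frees each removed mesh and sidesteps the classic pitfall of splicing an array while iterating it forward. A minimal sketch with assumed names (this helper is not part of the actual codebase):

```typescript
// Anything with a dispose() method; Babylon meshes qualify.
interface Disposable {
  dispose(): void;
}

// Dispose every item the predicate matches and return the survivors.
// Building a new array avoids index-shifting bugs from in-place splices.
function removeIntersected<T extends Disposable>(
  items: T[],
  isHit: (item: T) => boolean
): T[] {
  const kept: T[] = [];
  for (const item of items) {
    if (isHit(item)) item.dispose();
    else kept.push(item);
  }
  return kept;
}

// Hypothetical usage in the collision check:
// tumorFragments = removeIntersected(tumorFragments, f => cutter.intersectsMesh(f, true));
```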
## Simulation Workflow

1. **Initialization**: Load 3D models and set up the scene
2. **Start Surgery**: Connect the WebSocket and begin telemetry streaming
3. **Active Phase**: Track tool movements and surgical events
4. **Completion**: Send the FINISH event and trigger AI analysis
5. **Results**: Display performance metrics and feedback
The simulation state is managed through React hooks and Babylon.js observables for seamless integration.
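The workflow phases above can be modeled as a small state machine that only permits forward transitions. The sketch below uses illustrative phase names and transitions, not the actual state shape held in the React hooks:

```typescript
type Phase = "INIT" | "READY" | "ACTIVE" | "FINISHED" | "RESULTS";

// Allowed forward transitions for the workflow described above
// (phase names are assumptions for illustration).
const transitions: Record<Phase, Phase[]> = {
  INIT: ["READY"],       // models loaded, scene set up
  READY: ["ACTIVE"],     // WebSocket connected, surgery started
  ACTIVE: ["FINISHED"],  // FINISH event sent
  FINISHED: ["RESULTS"], // AI analysis processed
  RESULTS: [],           // terminal state
};

function canTransition(from: Phase, to: Phase): boolean {
  return transitions[from].includes(to);
}
```

Encoding the phases this way makes illegal jumps (for example, sending telemetry before the WebSocket is connected) impossible to reach by construction.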
## Browser Compatibility

The simulation requires:

- WebGL 2.0 support
- A modern browser (Chrome 90+, Firefox 88+, Safari 14+)
- Minimum 4 GB RAM
- GPU acceleration recommended
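Since WebGL 2.0 is a hard requirement, it is worth probing for it before creating the engine and showing a readable error otherwise. A sketch of such a check, written against a minimal interface so it can be exercised outside a browser (the function name and interfaces are assumptions):

```typescript
// Minimal shape of the DOM pieces the check needs, so it is testable
// without a real browser environment.
interface CanvasLike {
  getContext(name: string): unknown;
}
interface DocumentLike {
  createElement(tag: string): CanvasLike;
}

// Returns true when a "webgl2" context can be created.
function supportsWebGL2(doc: DocumentLike): boolean {
  try {
    return doc.createElement("canvas").getContext("webgl2") != null;
  } catch {
    return false;
  }
}

// In the app, before constructing BABYLON.Engine:
// if (!supportsWebGL2(document)) { /* show an unsupported-browser message */ }
```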
## Next Steps

- **Real-Time Telemetry**: Learn how movement data is captured and transmitted
- **AI Analysis**: Explore the AI pipeline that evaluates performance