Virtual Neurosurgery: Training for the Future
Michael Vloeberghs1, Tony Glover2, Steve Benford2, Arthur Jones3, Peji Wang3, Adib Becker3.
1 Academic Division of Child Health, School of Human Development

2 School of Computer Science and Information Technology

3 School of Mechanical, Materials, Manufacturing Engineering and Management


Correspondence address:

Mr Michael Vloeberghs, MD, PhD

Clinical Associate Professor

Consultant Paediatric Neurosurgeon

Paediatric Neurosurgery

Nottingham University Hospital

Clifton Blvd

Nottingham

NG7 2UH

United Kingdom



e-mail: michael.vloeberghs@nottingham.ac.uk
Key words: Virtual reality, Neurosurgical training, Haptics, Boundary elements, Working time directives.
Abstract:
Virtual Reality (VR) simulators have been created for various surgical specialties. The common theme is extensive use of graphics, confined spaces, limited functionality and limited tactile feedback. A development team at the University of Nottingham, UK, consisting of computer scientists, mechanical engineers, graphic designers and a Neurosurgeon, set out to develop a haptic (i.e. tactile) simulator for Neurosurgery, making use of Boundary Element (BE) techniques. The relative homogeneity of the brain allows boundary elements, i.e. “surface only” rendering, to simulate the brain structure. The surface-only modelling feature of the BE formulation reduces the number of algebraic equations and saves computing time, because for a homogeneous body the behaviour of the interior can be expressed entirely in terms of quantities on the surface. A limited audit by Neurosurgical users confirmed the potential of the simulator as a training tool.

This paper focuses on the application of the computational method and outlines the underlying mathematical structure; full references regarding the mathematical methodology are included.
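For orientation, the direct BE formulation for a homogeneous elastic body can be written in its standard generic form (not the specific real-time formulation of this work, which is detailed in the cited references). It relates surface displacements and tractions through a boundary integral identity:

\[
c_{ij}(P)\,u_j(P) + \int_{S} T_{ij}(P,Q)\,u_j(Q)\,\mathrm{d}S(Q) = \int_{S} U_{ij}(P,Q)\,t_j(Q)\,\mathrm{d}S(Q)
\]

where \(U_{ij}\) and \(T_{ij}\) are the displacement and traction fundamental solutions and \(c_{ij}\) is a free term fixed by the surface geometry at the collocation point \(P\). Discretising the surface \(S\) into elements turns this identity into a dense algebraic system \( \mathbf{H}\mathbf{u} = \mathbf{G}\mathbf{t} \) in surface displacements and tractions only, which is what keeps the number of unknowns small enough for real-time use.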


Introduction
The need for surgical simulation is driven by the limits on training hours set by working time directives in Western countries and by increasing litigation following surgical incidents. The net result is fewer training opportunities and less chance for junior surgeons to acquire the necessary experience within regulated training.

Hands-on simulation is well established in aerospace training, and audits of simulation in medicine have shown a decrease in the number of adverse events in actual surgery (30).


Method
In order for surgery simulation to become established as a training tool, VR simulators need to provide a diverse range of capabilities close to reality (Table 1). This simulator focused on haptic capabilities and, to a lesser extent, on graphics. The development team considered graphics a secondary issue, easily implemented in comparison with the real-time BE computations.
Table 1: Surgical acts to be simulated in VR.

• Simulate the process of pushing and pulling

• Simulate the cutting or separation of tissue, including multiple cuts and incisions

• Simulate gravitational deformation of the cut tissue

• Allow for self-contact between the cut tissues

• Simulate post-cutting manipulation

• Allow for two-handed interaction

• Allow for continuous cutting and separation of tissue, e.g. to reach a tumour

• Simulate the complete removal of tissue, e.g. a separation of a tumour

• Provide a realistic position and posture for the surgeon

• Use 3D stereo vision

• Incorporate surgical tools and implements physically connected to force-feedback devices

• Show accurate visual 3D models with light projection and shadows

• Utilise patient-specific virtual models, e.g. from MRI data




A Boundary Element virtual surgery environment


The BE-based simulator was developed at the University of Nottingham as a collaborative research project between the School of Mechanical, Materials and Manufacturing Engineering, the School of Computer Science and Information Technology, and the School of Human Development, Division of Child Health (34, 35).
Neurosurgery simulation is particularly challenging because it involves interaction with the gelatinous structure of the brain and with lesions of varying consistency. This simulator allows the user to operate on an area of the brain surface, use diathermy (cutting), retract the brain substance and remove a mass from within it.
The simulator is based on real-time BE software created in-house, combined with advanced computer graphics and commercially available force-feedback haptic devices (Table 2); an illustrative sketch of the resulting update loop follows the table.

Table 2. Hardware set-up



• A PC (high specification, typically 3 GHz) fitted with a graphics card capable of rendering 3D images

• A monitor compatible with the stereo vision system, angled so that the image is reflected in a semi-silvered mirror, with a visual refresh rate of at least 25 Hz

• 3D Stereo vision goggles and interface

• Two haptic devices for position sensing and force feedback, with a haptic rendering rate of around 1000 Hz (currently the Sensable Technologies PHANToM Omni system is used) (27)
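To illustrate how hardware of this kind is typically driven, the sketch below shows a minimal two-rate update loop of the sort implied by Table 2: a haptic thread running near 1000 Hz that solves a toy, pre-inverted surrogate of the discretised BE system and returns a reaction force, and a visual thread refreshing at 25 Hz or more. All names, the surrogate matrices and the coupling constants are illustrative assumptions, not the simulator's actual code.

```python
# Illustrative sketch only: a two-rate loop (haptic ~1000 Hz, visual >= 25 Hz).
# The BE model is a toy surrogate; device and renderer calls are placeholders.
import threading
import time
import numpy as np

HAPTIC_DT = 1.0 / 1000.0   # ~1000 Hz force-feedback update
VISUAL_DT = 1.0 / 30.0     # >= 25 Hz graphics update


class BEModel:
    """Toy stand-in for a boundary element brain model: surface nodes only."""

    def __init__(self, n_nodes):
        rng = np.random.default_rng(0)
        # Dense, well-conditioned surrogate for the discretised BE system H u = G t.
        n = 3 * n_nodes
        self.H = np.eye(n) + 0.01 * rng.standard_normal((n, n))
        self.G = np.eye(n)
        self.H_inv = np.linalg.inv(self.H)   # pre-inverted off-line for real-time use
        self.u = np.zeros(n)                 # surface displacements

    def solve(self, t):
        """Surface displacements from applied surface tractions: u = H^-1 G t."""
        self.u = self.H_inv @ (self.G @ t)
        return self.u


def haptic_loop(model, state, stop):
    while not stop.is_set():
        tool_pos = state["tool_position"]    # would come from the haptic device driver
        t = np.zeros_like(model.u)
        t[:3] = -1000.0 * tool_pos           # toy contact traction applied at one node
        u = model.solve(t)
        state["force"] = -0.05 * u[:3]       # reaction force sent back to the device
        time.sleep(HAPTIC_DT)


def visual_loop(model, state, stop):
    while not stop.is_set():
        # A real implementation would push model.u to the 3D renderer here.
        print(f"render: max deflection {np.abs(model.u).max():.4f}, force {state['force']}")
        time.sleep(VISUAL_DT)


if __name__ == "__main__":
    model = BEModel(n_nodes=20)
    state = {"tool_position": np.array([0.001, 0.0, 0.0]), "force": np.zeros(3)}
    stop = threading.Event()
    threads = [threading.Thread(target=haptic_loop, args=(model, state, stop)),
               threading.Thread(target=visual_loop, args=(model, state, stop))]
    for th in threads:
        th.start()
    time.sleep(0.2)   # run briefly for demonstration, then shut down
    stop.set()
    for th in threads:
        th.join()
```

The design point this sketch is meant to convey is that the dense BE system can be factorised or inverted off-line, so that each 1 kHz haptic update reduces to matrix–vector products, while the slower visual thread simply reads the current surface displacements for rendering.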




Figure 1: The surgery simulation hardware. A custom-built rig accommodates the computer, the haptic devices, the monitor and the reflective semi-transparent mirror needed for stereoscopic vision.



