My thesis covers the research and development of new techniques for simulating surgically interactive virtual tissues. It includes haptic rendering (making virtual objects touchable), novel medical training simulation applications, and reviews of the related technology, hardware, and the broader research area.
Author: Greg S. Ruthenbeck. Download thesis as a PDF (from ResearchGate, 4MB).
Medical simulation has the potential to revolutionize the training of medical practitioners. Its advantages include reduced risk to patients, increased access to rare scenarios, and virtually unlimited repeatability. To fulfill this potential, however, medical simulators need techniques that provide realistic user interaction with the simulated patient: compelling real-time simulations that allow the trainee to interact with and modify tissues as if practicing on a real patient.
A key challenge when simulating interactive tissue is reducing the computational processing required to simulate the mechanical behavior. One successful method of increasing the visual fidelity of deformable models while limiting the complexity of the mechanical simulation is to bind a coarse mechanical simulation to a more detailed shell mesh. But even with reduced complexity, the processing required for real-time interactive mechanical simulation often limits the fidelity of the medical simulation overall. With recent advances in the programmability and processing power of massively parallel processors such as graphics processing units (GPUs), suitably designed algorithms can achieve significant improvements in performance.
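The coarse-to-fine binding described above is commonly implemented by expressing each detailed shell vertex in the barycentric coordinates of an element of the coarse mechanical mesh, precomputed at rest, then re-evaluating those coordinates against the deformed element each frame. The sketch below illustrates this idea for a single tetrahedral element; the function names and the NumPy-based formulation are illustrative assumptions, not the thesis's actual implementation.

```python
import numpy as np

def barycentric_coords(tet, p):
    """Barycentric coordinates of point p within tetrahedron tet (4x3).

    Solves [v1-v0, v2-v0, v3-v0] @ b123 = p - v0, then b0 = 1 - sum(b123).
    Precomputed once, at the rest pose of the coarse mechanical mesh.
    """
    edges = (tet[1:] - tet[0]).T          # 3x3 matrix of edge vectors
    b123 = np.linalg.solve(edges, p - tet[0])
    return np.concatenate(([1.0 - b123.sum()], b123))

def skin_shell(deformed_tet, bary):
    """Reposition shell vertices from a deformed coarse element.

    bary is (n, 4): one row of barycentric weights per shell vertex.
    Returns the (n, 3) updated shell vertex positions each frame.
    """
    return bary @ deformed_tet            # (n,4) @ (4,3) -> (n,3)

# Bind a shell vertex at the rest pose, then follow a deformation.
rest = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
p = np.array([0.25, 0.25, 0.25])
weights = barycentric_coords(rest, p)     # precomputed binding
deformed = rest + np.array([1.0, 0.0, 0.0])
moved = skin_shell(deformed, weights[None, :])
```

Because the per-frame step is a dense matrix product over independent vertices, it maps naturally onto the massively parallel GPU hardware discussed above: each shell vertex can be skinned by its own thread with no inter-thread dependencies.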
This thesis describes an ablatable soft-tissue simulation framework: a new approach to interactive mechanical simulation for virtual reality (VR) surgical training that makes efficient use of parallel hardware to deliver a realistic, versatile, real-time soft-tissue simulation for use in medical simulators.