This research is focused on the development of advanced visualization technology for the operating room and the analysis of its associated surgeon factors. The hypothesis of this work is that applying advanced technology to the visualization of real-time medical data will enhance the performance, comfort, and insight of the surgeon, and in turn reduce patient morbidity and mortality.

In the first study, we use a passive robot arm to track a calibrated video camera mounted on its end-effector. In real time, we superimpose the live video view with a synchronized graphical view of CT-derived segmented objects of interest within a phantom skull (Augmented Reality, AR). Using the same arm, we have also developed an Image Guided Surgery (IGS) system (Virtual Reality) that shows a tracked tool's trajectory on orthogonal image scans and 3D models. Both systems are designed with a client/server architecture for potential use in telepresence. A human factors study was conducted with 21 subjects (3 of them surgeons) to determine whether the two systems differ in task time, errors, and level of awareness of the patient's 3D anatomy. The study indicated that IGS took a statistically significantly longer time than AR. In addition, although only at the border of statistical significance (p = 0.068), IGS produced a greater average number of errors, indicating gaps in awareness of the phantom's anatomy.

In a second study, we compared display hardware for the video stream viewed from the remote surgical site. The main question was: does visualizing the remote video at the surgical site on a head-up display improve the performance of the test subject over viewing it on a monitor?
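Comparisons of task time and error counts between two groups of subjects, as reported above, typically reduce to a two-sample significance test. The following is a minimal sketch of Welch's t-test in Python; the `welch_t` helper and the sample timings are illustrative inventions, not data from the studies.

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic and degrees of freedom for two samples.

    Hypothetical helper illustrating the kind of two-sample test
    behind a reported p-value; `a` and `b` are lists of measurements
    (e.g. task completion times under two display conditions).
    """
    va, vb = variance(a) / len(a), variance(b) / len(b)
    t = (mean(a) - mean(b)) / (va + vb) ** 0.5
    df = (va + vb) ** 2 / (va ** 2 / (len(a) - 1) + vb ** 2 / (len(b) - 1))
    return t, df

# Made-up completion times (seconds) for two hypothetical conditions
cond_a = [41.0, 38.5, 44.2, 40.1, 39.8]
cond_b = [47.3, 45.9, 50.2, 46.8, 49.1]
t, df = welch_t(cond_a, cond_b)
print(round(t, 2), round(df, 1))
```

The resulting t statistic and degrees of freedom would then be looked up against the t distribution to obtain the p-value.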
In this second study we concluded (using 22 subjects) that a head-up display positively influences the surgeon's performance compared with a 45° angled monitor.

We believe, and have shown via subject testing, that Augmented Reality is a natural extension for the surgeon because it both performs the 2D-to-3D transformation and projects the views directly onto the surgeon's view of the patient. We conjecture that future medical robotic devices should use this technology to link these systems directly to patient data and provide the optimal visualization of that data for the surgical team. The design and methods of the AR prototype device can, we believe, be extrapolated to current medical robotics and IGS systems. Both AR and IGS systems have distinct advantages and disadvantages; hence, as future work we propose a hybrid on-demand AR/VR system for use in Robotic and Image Guided Surgery.
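At the core of the AR overlay described above is the projection of CT-derived geometry into the live video frame of the tracked, calibrated camera. A minimal sketch of that projection step, assuming a pinhole camera model; the function name, transform, and intrinsic values are illustrative assumptions, not the prototype's actual implementation:

```python
import numpy as np

def project_point(p_ct, T_cam_from_ct, K):
    """Map a 3D point from CT (phantom) space into pixel coordinates.

    p_ct          -- 3-vector, point in CT coordinates
    T_cam_from_ct -- 4x4 rigid transform, CT frame -> camera frame
                     (in an arm-tracked system this would be composed
                     from the arm's forward kinematics and the
                     patient-to-arm registration)
    K             -- 3x3 camera intrinsic matrix from calibration
    """
    p_h = np.append(p_ct, 1.0)          # homogeneous coordinates
    p_cam = (T_cam_from_ct @ p_h)[:3]   # point in camera frame
    uvw = K @ p_cam                     # perspective projection
    return uvw[:2] / uvw[2]             # pixel (u, v)

# Illustrative values: simple pinhole intrinsics, camera 0.5 m
# from the CT origin along its optical axis
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
T = np.eye(4)
T[2, 3] = 0.5
print(project_point(np.array([0.0, 0.0, 0.0]), T, K))  # -> [320. 240.]
```

Rendering each vertex of a segmented CT object this way, frame by frame as the arm reports the camera pose, yields the synchronized graphical overlay on the live video.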