Robotics from the bench – Research for ultrasound automation with augmented reality visualization

Type of publication: Conference report
Publication status: Published
Book title: Proceedings on MIC 2019
Series: Article 1901i32
Year: 2019
Month: January
Publisher: MIC 2019 - 17th Dreiländertreffen für Minimal-invasive Chirurgie
DOI: 10.18416/MIC.2019.1901i32
Abstract: Ultrasound imaging, commonly used for diagnostics, may also be used for radiation-free catheter and needle navigation. Because of the considerable skill and expertise required, however, ultrasound image acquisition and diagnostics are difficult to perform. Automated ultrasound image acquisition could potentially overcome this operator dependency. For the automation and standardization of the diagnostic ultrasound imaging process, as well as for operator-free automated catheter and needle navigation, we are currently developing a medical robotic device platform. A prototype of this robot-supported ultrasound platform was produced, on which various medical examination procedures can be developed. Three core technologies were combined for this purpose:
- A force-sensitive 7-DoF robot arm (KUKA LBR iiwa), which can be positioned both automatically and manually over a target area on the body surface. The arm provides collision avoidance and maintains dynamic, force-controlled contact of the transducer with the patient.
- 3D ultrasound, realized by a matrix probe and appropriate calculation algorithms on a modified ultrasound station (GE Vivid 7). It enables the recording of large areas at high frame rates, so the target, its surrounding areas, and nearby navigation points can be recorded simultaneously.
- 3D ultrasound data streaming from the ultrasound system to Microsoft HoloLens glasses, with visualization at a defined offset from the ultrasound probe by tracking augmented-reality markers.
The functionality was verified using an ultrasound phantom (BluePhantom FAST ultrasound training model). The result is a 4D volume data set for physician-assisted diagnostic evaluation. The process delivered reproducible real-time visualization results on a workstation, where the volumes were simultaneously visualized and stored.
Ultrasound volume data of the training model (matrix size 103×74×134) were streamed from the ultrasound system to the HoloLens for display, with a latency of 259 ± 86 ms. Automated ultrasound diagnostics and navigation should help to reduce the demand on clinical resources in the future, thereby enabling better reproducibility of imaging and reducing side effects from radiation exposure. The device platform will serve as the basis for further automated ultrasound diagnostics and therapy procedures.
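The reported streaming latency (259 ± 86 ms) is a mean with standard deviation over per-volume transfer times. A minimal sketch of how such statistics could be computed from matched send/display timestamps is shown below; the function name and the timestamp values are invented for illustration and are not the authors' measurement pipeline.

```python
# Hypothetical sketch: mean ± std latency from per-volume timestamps.
# The timestamp values below are illustrative, not measured data.
import statistics

def latency_stats(sent_ms, displayed_ms):
    """Return (mean, population std) of per-volume latencies in ms."""
    latencies = [d - s for s, d in zip(sent_ms, displayed_ms)]
    return statistics.mean(latencies), statistics.pstdev(latencies)

# Example: four volumes, send vs. display timestamps in milliseconds
sent = [0, 100, 200, 300]
shown = [250, 360, 440, 590]
mean, std = latency_stats(sent, shown)
print(f"latency: {mean:.0f} ms, std: {std:.0f} ms")
```

In practice the two clocks (ultrasound workstation and HoloLens) would need to be synchronized, or the round-trip time halved, before such per-volume differences are meaningful.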
Authors: Böttger, Sven
von Haxthausen, Felix
Kleemann, Markus
Ernst, Floris
Schweikard, Achim