TY - GEN
T1 - Surgical Robot with Environment Reconstruction and Force Feedback
AU - Li, Xiao
AU - Kesavadas, Thenkurussi
N1 - Publisher Copyright:
© 2018 IEEE.
PY - 2018/10/26
Y1 - 2018/10/26
N2 - We present a new surgical robot hardware-in-the-loop simulator with 3D surgical field reconstruction within RGB-D sensor range, which allows tool-tissue interactions to be presented as haptic feedback and thus provides situational awareness of unwanted collisions. First, a point cloud of the complete surgical environment is constructed from multiple frames of sensor data to avoid occlusion issues. The user then selects a region of interest that the robot's tool must avoid (also called a forbidden region). A real-time haptic force rendering algorithm computes the interaction force, which is communicated to a haptic device at 1 kHz to assist the surgeon in performing safe actions. The robot used is a RAVEN II system, an RGB-D sensor scans the environment, and two Omni haptic devices provide the 3-DoF haptic force. A registration pipeline is presented to complete the surgical environment point cloud mapping in the preoperative surgical planning phase, which improves the quality of haptic rendering in the presence of occlusion. Furthermore, we propose a feasible and fast algorithm that extends existing work on the proxy-based method for haptic rendering between a Haptic Interaction Point (HIP) and a point cloud. The proposed methodology has the potential to improve the safety of surgical robots.
AB - We present a new surgical robot hardware-in-the-loop simulator with 3D surgical field reconstruction within RGB-D sensor range, which allows tool-tissue interactions to be presented as haptic feedback and thus provides situational awareness of unwanted collisions. First, a point cloud of the complete surgical environment is constructed from multiple frames of sensor data to avoid occlusion issues. The user then selects a region of interest that the robot's tool must avoid (also called a forbidden region). A real-time haptic force rendering algorithm computes the interaction force, which is communicated to a haptic device at 1 kHz to assist the surgeon in performing safe actions. The robot used is a RAVEN II system, an RGB-D sensor scans the environment, and two Omni haptic devices provide the 3-DoF haptic force. A registration pipeline is presented to complete the surgical environment point cloud mapping in the preoperative surgical planning phase, which improves the quality of haptic rendering in the presence of occlusion. Furthermore, we propose a feasible and fast algorithm that extends existing work on the proxy-based method for haptic rendering between a Haptic Interaction Point (HIP) and a point cloud. The proposed methodology has the potential to improve the safety of surgical robots.
UR - http://www.scopus.com/inward/record.url?scp=85056610495&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85056610495&partnerID=8YFLogxK
U2 - 10.1109/EMBC.2018.8512695
DO - 10.1109/EMBC.2018.8512695
M3 - Conference contribution
C2 - 30440759
AN - SCOPUS:85056610495
T3 - Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS
SP - 1861
EP - 1866
BT - 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBC 2018
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBC 2018
Y2 - 18 July 2018 through 21 July 2018
ER -