TY - GEN
T1 - Toward Zero-Shot Sim-to-Real Transfer Learning for Pneumatic Soft Robot 3D Proprioceptive Sensing
AU - Yoo, Uksang
AU - Zhao, Hanwen
AU - Altamirano, Alvaro
AU - Yuan, Wenzhen
AU - Feng, Chen
N1 - Funding Information:
The work is supported by NSF grant 2024882 and NSF Graduate Research Fellowship under Grant No. DGE2140739.
Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - Pneumatic soft robots present many advantages in manipulation tasks. Notably, their inherent compliance makes them safe and reliable in unstructured and fragile environments. However, full-body shape sensing for pneumatic soft robots is challenging because of their high degrees of freedom and complex deformation behaviors. Vision-based proprioceptive sensing methods that rely on embedded cameras and deep learning address this challenge by extracting full-body shape information from high-dimensional sensing data. Yet the training data collection process that these methods currently require makes them impractical for many applications. To address this challenge, we propose and demonstrate a robust sim-to-real pipeline that allows the soft robot's shape information to be collected as high-fidelity point cloud representations. The model trained on simulated data was evaluated with real internal camera images. The results show that the model achieved an average Chamfer distance of 8.85 mm and a tip position error of 10.12 mm, even under external perturbation, for a pneumatic soft robot 100.0 mm in length. We also demonstrated the sim-to-real pipeline's potential for exploring different configurations of visual patterns to improve vision-based reconstruction results. The code and dataset are available at https://github.com/DeepSoRo/DeepSoRoSim2Real.
AB - Pneumatic soft robots present many advantages in manipulation tasks. Notably, their inherent compliance makes them safe and reliable in unstructured and fragile environments. However, full-body shape sensing for pneumatic soft robots is challenging because of their high degrees of freedom and complex deformation behaviors. Vision-based proprioceptive sensing methods that rely on embedded cameras and deep learning address this challenge by extracting full-body shape information from high-dimensional sensing data. Yet the training data collection process that these methods currently require makes them impractical for many applications. To address this challenge, we propose and demonstrate a robust sim-to-real pipeline that allows the soft robot's shape information to be collected as high-fidelity point cloud representations. The model trained on simulated data was evaluated with real internal camera images. The results show that the model achieved an average Chamfer distance of 8.85 mm and a tip position error of 10.12 mm, even under external perturbation, for a pneumatic soft robot 100.0 mm in length. We also demonstrated the sim-to-real pipeline's potential for exploring different configurations of visual patterns to improve vision-based reconstruction results. The code and dataset are available at https://github.com/DeepSoRo/DeepSoRoSim2Real.
UR - http://www.scopus.com/inward/record.url?scp=85168663180&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85168663180&partnerID=8YFLogxK
U2 - 10.1109/ICRA48891.2023.10160384
DO - 10.1109/ICRA48891.2023.10160384
M3 - Conference contribution
AN - SCOPUS:85168663180
T3 - Proceedings - IEEE International Conference on Robotics and Automation
SP - 544
EP - 551
BT - Proceedings - ICRA 2023
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2023 IEEE International Conference on Robotics and Automation, ICRA 2023
Y2 - 29 May 2023 through 2 June 2023
ER -