Hybrid Eye-in-Hand/Eye-to-Hand Image Based Visual Servoing for Soft Continuum Arms

Ali Albeladi, Evan Ripperger, Seth Andrew Hutchinson, Girish Krishnan

Research output: Contribution to journal › Article › peer-review


Soft continuum arms (SCAs) that are controlled by visual servoing (VS) present trade-offs between camera range and tracking accuracy. A camera placed at a distance (eye-to-hand) can observe a larger workspace area and the SCA tip, while a camera at the end effector (eye-in-hand) can more accurately survey the target. In this letter, we present a hybrid eye-to-hand and eye-in-hand VS scheme to track a desired object in the SCA's workspace. When the target is not in the field-of-view of the tip camera, eye-to-hand VS is implemented using a wide field-of-view camera on the soft robot's base to servo the soft robot's tip to a feasible region where the target is expected to be seen by the tip camera. This region is estimated by solving an optimization problem that finds the best region in which to place the SCA tip, assuming a constant curvature model for the SCA. When the target is seen by the tip camera, the system switches to an eye-in-hand controller that keeps the target at the desired image position in the tip camera. Experimental results on the popular BR2 SCA demonstrate the effectiveness of the hybrid VS scheme under practical settings that include external disturbances.
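The switching rule at the heart of the hybrid scheme can be illustrated with a minimal sketch. This is not the authors' implementation; the function names, the proportional IBVS update, and the gain are hypothetical, and the sketch only shows the mode selection and a generic image-feature error the abstract implies.

```python
import numpy as np

def select_mode(tip_sees_target: bool) -> str:
    """Hypothetical switching rule from the abstract: use the base
    (eye-to-hand) camera until the tip (eye-in-hand) camera acquires
    the target, then hand control to the eye-in-hand loop."""
    return "eye-in-hand" if tip_sees_target else "eye-to-hand"

def ibvs_error(features: np.ndarray, desired: np.ndarray) -> np.ndarray:
    """Standard image-based VS error e = s - s*, computed in whichever
    camera's image plane the active mode uses (illustrative helper)."""
    return features - desired

# Toy usage: a proportional update drives the feature toward the goal.
gain = 0.5  # assumed gain, not from the paper
s = np.array([120.0, 80.0])     # current image feature (pixels)
s_star = np.array([160.0, 120.0])  # desired image feature (pixels)
s = s - gain * ibvs_error(s, s_star)
```

In a full controller the error would be mapped to actuator commands through an interaction matrix and the SCA's constant-curvature kinematics; the sketch stops at the image-space error to stay faithful to what the abstract states.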

Original language: English (US)
Pages (from-to): 11298-11305
Number of pages: 8
Journal: IEEE Robotics and Automation Letters
Issue number: 4
State: Published - Oct 1 2022


Keywords

  • Soft robotics
  • Visual servoing

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Biomedical Engineering
  • Human-Computer Interaction
  • Mechanical Engineering
  • Computer Vision and Pattern Recognition
  • Computer Science Applications
  • Control and Optimization
  • Artificial Intelligence


