Self-Supervised Motion Retargeting with Safety Guarantee

Sungjoon Choi, Min Jae Song, Hyemin Ahn, Joohyung Kim

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


In this paper, we present self-supervised shared latent embedding (S3LE), a data-driven motion retargeting method that generates natural motions for humanoid robots from motion capture data or RGB videos. While it requires paired data consisting of human poses and their corresponding robot configurations, it significantly alleviates the need for time-consuming data collection via a novel paired-data generation process. Our self-supervised learning procedure consists of two steps: automatically generating paired data to bootstrap the motion retargeting, and learning a projection-invariant mapping to handle the differing expressivity of humans and humanoid robots. Furthermore, our method guarantees that the generated robot pose is collision-free and satisfies position limits by utilizing nonparametric regression in the shared latent space. We demonstrate that our method can generate expressive robotic motions from both the CMU motion capture database and YouTube videos.
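The abstract's safety mechanism, nonparametric regression in the shared latent space over a set of verified-safe robot poses, can be illustrated with a minimal sketch. The following is not the paper's implementation: it assumes a Gaussian-kernel Nadaraya-Watson estimator and hypothetical precomputed arrays `Z_safe` (latent codes of robot configurations already checked for collisions and position limits) and `Q_safe` (the corresponding joint configurations). Because the output is a convex combination of safe poses, each joint value stays within the range spanned by the safe set, which preserves box-type position limits; collision-freeness in general requires additional structure, as developed in the paper.

```python
import numpy as np

def retarget_safe(z_query, Z_safe, Q_safe, bandwidth=0.5):
    """Nadaraya-Watson kernel regression over verified-safe robot poses.

    z_query : (d,)  latent code of the human pose to retarget.
    Z_safe  : (N, d) latent codes of robot poses pre-checked for
              collisions and position limits (hypothetical safe set).
    Q_safe  : (N, m) the corresponding robot joint configurations.

    Returns a convex combination of the safe configurations, so every
    joint remains inside [Q_safe.min(0), Q_safe.max(0)].
    """
    # Squared latent-space distances to each safe anchor.
    d2 = np.sum((Z_safe - z_query) ** 2, axis=1)
    # Gaussian kernel weights, normalized to sum to one.
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    w = w / w.sum()
    # Weighted average of safe joint configurations.
    return w @ Q_safe

# Toy usage: three safe anchors in a 2-D latent space.
Z_safe = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
Q_safe = np.array([[0.1, 0.2], [0.5, -0.3], [0.9, 0.0]])
q = retarget_safe(np.array([0.2, 0.2]), Z_safe, Q_safe)
```

As the bandwidth shrinks, the estimate collapses onto the nearest safe pose, trading smoothness for fidelity to the verified set.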

Original language: English (US)
Title of host publication: 2021 IEEE International Conference on Robotics and Automation, ICRA 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 7
ISBN (Electronic): 9781728190778
State: Published - 2021
Event: 2021 IEEE International Conference on Robotics and Automation, ICRA 2021 - Xi'an, China
Duration: May 30, 2021 - Jun 5, 2021

Publication series

Name: Proceedings - IEEE International Conference on Robotics and Automation
ISSN (Print): 1050-4729


Conference: 2021 IEEE International Conference on Robotics and Automation, ICRA 2021

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
  • Electrical and Electronic Engineering
  • Control and Systems Engineering


