In-hand object scanning via RGB-D video segmentation

Fan Wang, Kris Hauser

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper proposes a technique for 3D object scanning via in-hand manipulation, in which an object is reoriented in front of a video camera with multiple grasps and regrasps. In-hand object tracking is a significant challenge under fast movement, rapid appearance changes, and occlusions. This paper proposes a novel video-segmentation-based object tracking algorithm that tracks arbitrary in-hand objects more effectively than existing techniques. It also describes a novel RGB-D in-hand object manipulation dataset consisting of several common household objects. Experiments show that the new method achieves a 6% increase in accuracy compared to top-performing video tracking algorithms and results in noticeably higher-quality reconstructed models. Moreover, testing with a novice user on a set of 200 objects demonstrates relatively rapid construction of complete 3D object models.

Original language: English (US)
Title of host publication: 2019 International Conference on Robotics and Automation, ICRA 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3296-3302
Number of pages: 7
ISBN (Electronic): 9781538660263
DOIs
State: Published - May 2019
Externally published: Yes
Event: 2019 International Conference on Robotics and Automation, ICRA 2019 - Montreal, Canada
Duration: May 20, 2019 – May 24, 2019

Publication series

Name: Proceedings - IEEE International Conference on Robotics and Automation
ISSN (Print): 1050-4729

Conference

Conference: 2019 International Conference on Robotics and Automation, ICRA 2019
Country/Territory: Canada
City: Montreal
Period: 5/20/19 – 5/24/19

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Electrical and Electronic Engineering
  • Artificial Intelligence
