Abstract
Modeling and manipulating elasto-plastic objects are essential capabilities for robots to perform complex industrial and household interaction tasks (e.g., stuffing dumplings, rolling sushi, and making pottery). However, due to the high degrees of freedom of elasto-plastic objects, significant challenges exist in virtually every aspect of the robotic manipulation pipeline, for example, representing the states, modeling the dynamics, and synthesizing the control signals. We propose to tackle these challenges by employing a particle-based representation for elasto-plastic objects in a model-based planning framework. Our system, RoboCraft, only assumes access to raw RGBD visual observations. It transforms the sensory data into particles and learns a particle-based dynamics model using graph neural networks (GNNs) to capture the structure of the underlying system. The learned model can then be coupled with model predictive control (MPC) algorithms to plan the robot's behavior. We show through experiments that with just 10 minutes of real-world robot interaction data, our robot can learn a dynamics model that can be used to synthesize control signals to deform elasto-plastic objects into various complex target shapes, including shapes that the robot has never encountered before. We perform systematic evaluations in both simulation and the real world to demonstrate the robot's manipulation capabilities.
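The abstract describes a concrete pipeline: particles as state, a GNN as the dynamics model, and MPC for planning. The sketch below illustrates how those two learned components could fit together in the simplest case: one message-passing step over a radius-connected particle graph, and a random-shooting MPC loop that scores rollouts by Chamfer distance to the target shape. This is a minimal sketch assuming NumPy; `build_edges`, `gnn_step`, `mpc_random_shooting`, and the random linear `params` are all hypothetical stand-ins, not the authors' trained model or API.

```python
import numpy as np

def build_edges(points, radius=0.05):
    """Connect particle pairs closer than `radius` (illustrative neighborhood graph)."""
    diff = points[:, None, :] - points[None, :, :]            # (N, N, 3) pairwise offsets
    dist = np.linalg.norm(diff, axis=-1)
    src, dst = np.nonzero((dist < radius) & (dist > 0))
    return src, dst

def gnn_step(points, action, params, radius=0.05):
    """One message-passing step of a stand-in particle dynamics model.

    In RoboCraft this update is a trained GNN; here `params` holds random
    linear maps so the sketch runs end to end.
    """
    src, dst = build_edges(points, radius)
    rel = points[src] - points[dst]                           # edge features: relative positions
    messages = np.tanh(rel @ params["W_edge"])                # per-edge message
    agg = np.zeros((len(points), messages.shape[1]))
    np.add.at(agg, dst, messages)                             # sum incoming messages per node
    node_in = np.concatenate(
        [agg, np.broadcast_to(action, (len(points), 3))], axis=1)
    delta = node_in @ params["W_node"]                        # predicted per-particle displacement
    return points + delta

def chamfer(a, b):
    """Symmetric Chamfer distance between two particle sets (shape cost)."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return d.min(axis=1).mean() + d.min(axis=0).mean()

def mpc_random_shooting(points, target, params, horizon=5, samples=64, rng=None):
    """Return the first action of the sampled sequence whose rollout under the
    learned dynamics ends closest (in Chamfer distance) to the target shape."""
    rng = rng or np.random.default_rng(0)
    best_cost, best_action = np.inf, None
    for _ in range(samples):
        seq = rng.uniform(-0.01, 0.01, size=(horizon, 3))     # candidate end-effector motions
        rollout = points
        for a in seq:
            rollout = gnn_step(rollout, a, params)
        cost = chamfer(rollout, target)
        if cost < best_cost:
            best_cost, best_action = cost, seq[0]
    return best_action, best_cost

# Toy usage: a random particle blob and a target shifted 2 cm along x.
rng = np.random.default_rng(0)
params = {"W_edge": rng.normal(0, 0.1, (3, 16)),
          "W_node": rng.normal(0, 0.01, (19, 3))}
points = rng.uniform(0, 0.1, (50, 3))
target = points + np.array([0.02, 0.0, 0.0])
action, cost = mpc_random_shooting(points, target, params)
print("first planned action:", action, "predicted cost:", cost)
```

Random shooting is only one simple way to couple a learned dynamics model with MPC; the paper's planner and training procedure may differ in detail.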
| Field | Value |
|---|---|
| Original language | English (US) |
| Pages (from-to) | 533-549 |
| Number of pages | 17 |
| Journal | International Journal of Robotics Research |
| Volume | 43 |
| Issue number | 4 |
| DOIs | |
| State | Published - Apr 2024 |
Keywords
- Robot learning
- deformable object manipulation
- visual perception and learning
ASJC Scopus subject areas
- Software
- Modeling and Simulation
- Mechanical Engineering
- Electrical and Electronic Engineering
- Artificial Intelligence
- Applied Mathematics