Generalizing motion edits with Gaussian processes

Leslie Ikemoto, Okan Arikan, David Forsyth

Research output: Contribution to journal › Article › peer-review

Abstract

One way that artists create compelling character animations is by manipulating details of a character's motion. This process is expensive and repetitive. We show that we can make such motion editing more efficient by generalizing the edits an animator makes on short sequences of motion to other sequences. Our method predicts frames for the motion using Gaussian process models of kinematics and dynamics. These estimates are combined with probabilistic inference. Our method can be used to propagate edits from examples to an entire sequence for an existing character, and it can also be used to map a motion from a control character to a very different target character. The technique shows good generalization. For example, we show that an estimator, learned from a few seconds of edited example animation using our methods, generalizes well enough to edit minutes of character animation in a high-quality fashion. Learning is interactive: An animator who wants to improve the output can provide small, correcting examples and the system will produce improved estimates of motion. We make this interactive learning process efficient and natural with a fast, full-body IK system with novel features. Finally, we present data from interviews with professional character animators that indicate that generalizing and propagating animator edits can save artists significant time and work.
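To make the abstract's core idea concrete, here is a minimal, illustrative sketch of Gaussian process regression: a model is fit to a few "edited" example frames and then predicts the edit for unseen frames. This is not the paper's system; the pose features, edit offsets, and kernel settings below are hypothetical stand-ins, and the paper additionally combines kinematic and dynamic GP models with probabilistic inference.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    # Squared-exponential (RBF) kernel between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_predict(X_train, Y_train, X_query, noise=1e-4, length_scale=1.0):
    # Posterior mean of a zero-mean GP: K_qt (K_tt + noise*I)^{-1} Y.
    K_tt = rbf_kernel(X_train, X_train, length_scale) + noise * np.eye(len(X_train))
    K_qt = rbf_kernel(X_query, X_train, length_scale)
    return K_qt @ np.linalg.solve(K_tt, Y_train)

# Toy example: learn a per-frame edit (an offset) from a handful of
# edited example frames, then generalize it to new frames.
rng = np.random.default_rng(0)
X_train = rng.uniform(-1, 1, size=(20, 3))        # hypothetical pose features
Y_train = np.sin(X_train).sum(1, keepdims=True)   # hypothetical edit offsets
X_query = rng.uniform(-1, 1, size=(5, 3))         # frames the animator never touched
pred = gp_predict(X_train, Y_train, X_query, length_scale=0.7)
```

Because the GP interpolates the training examples with low noise, predictions at edited frames reproduce the animator's edits, while nearby unedited frames receive smoothly generalized edits.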

Original language: English (US)
Article number: 1
Journal: ACM Transactions on Graphics
Volume: 28
Issue number: 1
DOIs
State: Published - Jan 1 2009

Keywords

  • Artist-guided content creation
  • Controllable motion editing

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design
