Generalizing motion edits with Gaussian processes

Leslie Ikemoto, Okan Arikan, David Forsyth

Research output: Contribution to journal › Article

Abstract

One way that artists create compelling character animations is by manipulating details of a character's motion. This process is expensive and repetitive. We show that we can make such motion editing more efficient by generalizing the edits an animator makes on short sequences of motion to other sequences. Our method predicts frames for the motion using Gaussian process models of kinematics and dynamics. These estimates are combined with probabilistic inference. Our method can be used to propagate edits from examples to an entire sequence for an existing character, and it can also be used to map a motion from a control character to a very different target character. The technique shows good generalization. For example, we show that an estimator, learned from a few seconds of edited example animation using our methods, generalizes well enough to edit minutes of character animation in a high-quality fashion. Learning is interactive: An animator who wants to improve the output can provide small, correcting examples and the system will produce improved estimates of motion. We make this interactive learning process efficient and natural with a fast, full-body IK system with novel features. Finally, we present data from interviews with professional character animators that indicate that generalizing and propagating animator edits can save artists significant time and work.
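To give a concrete picture of the core idea, the following is a minimal sketch, not the authors' implementation: it uses Gaussian process regression (via scikit-learn, an assumed stand-in library) to learn a mapping from original per-frame pose features to an animator's edited poses on a short example clip, then applies it to many new frames. The feature dimensions, frame counts, and synthetic data are purely illustrative assumptions; the paper's actual method combines kinematic and dynamic GP models with probabilistic inference and a full-body IK system.

# Illustrative sketch only -- not the authors' code. Pose features, data,
# and the scikit-learn GP are assumptions for demonstration.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Hypothetical training data: a few seconds of animation the artist has edited.
# X holds per-frame features of the original motion (e.g. joint angles),
# Y holds the corresponding edited poses supplied by the animator.
X_example = rng.standard_normal((120, 30))                 # 120 frames, 30-D pose features
Y_example = X_example + 0.1 * rng.standard_normal((120, 30))

# A smooth RBF kernel plus a noise term is a common default for GP regression.
kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-3)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(X_example, Y_example)

# Generalize the learned edit to a much longer, unedited sequence: predict an
# edited pose (and a predictive uncertainty) for every new frame.
X_new = rng.standard_normal((3600, 30))                    # e.g. 60 s at 60 fps
Y_pred, Y_std = gp.predict(X_new, return_std=True)

The predictive uncertainty is the part that makes a probabilistic treatment attractive here: frames far from the edited examples come back with high variance, which is exactly where an interactive system would ask the animator for another small correcting example.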

Original language: English (US)
Article number: 1
Journal: ACM Transactions on Graphics
Volume: 28
Issue number: 1
DOIs: 10.1145/1477926.1477927
State: Published - Jan 1 2009

Keywords

  • Artist-guided content creation
  • Controllable motion editing

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design

Cite this

Generalizing motion edits with Gaussian processes. / Ikemoto, Leslie; Arikan, Okan; Forsyth, David.

In: ACM Transactions on Graphics, Vol. 28, No. 1, 1, 01.01.2009.

Research output: Contribution to journal › Article

Ikemoto, Leslie ; Arikan, Okan ; Forsyth, David. / Generalizing motion edits with Gaussian processes. In: ACM Transactions on Graphics. 2009 ; Vol. 28, No. 1.
@article{4bbeaa4c4aaf4d7bb7d026e3d61af4cb,
title = "Generalizing motion edits with Gaussian processes",
abstract = "One way that artists create compelling character animations is by manipulating details of a character's motion. This process is expensive and repetitive. We show that we can make such motion editing more efficient by generalizing the edits an animator makes on short sequences of motion to other sequences. Our method predicts frames for the motion using Gaussian process models of kinematics and dynamics. These estimates are combined with probabilistic inference. Our method can be used to propagate edits from examples to an entire sequence for an existing character, and it can also be used to map a motion from a control character to a very different target character. The technique shows good generalization. For example, we show that an estimator, learned from a few seconds of edited example animation using our methods, generalizes well enough to edit minutes of character animation in a high-quality fashion. Learning is interactive: An animator who wants to improve the output can provide small, correcting examples and the system will produce improved estimates of motion. We make this interactive learning process efficient and natural with a fast, full-body IK system with novel features. Finally, we present data from interviews with professional character animators that indicate that generalizing and propagating animator edits can save artists significant time and work.",
keywords = "Artist-guided content creation, Controllable motion editing",
author = "Leslie Ikemoto and Okan Arikan and David Forsyth",
year = "2009",
month = "1",
day = "1",
doi = "10.1145/1477926.1477927",
language = "English (US)",
volume = "28",
journal = "ACM Transactions on Computer Systems",
issn = "0730-0301",
publisher = "Association for Computing Machinery (ACM)",
number = "1",

}

TY - JOUR

T1 - Generalizing motion edits with Gaussian processes

AU - Ikemoto, Leslie

AU - Arikan, Okan

AU - Forsyth, David

PY - 2009/1/1

Y1 - 2009/1/1

N2 - One way that artists create compelling character animations is by manipulating details of a character's motion. This process is expensive and repetitive. We show that we can make such motion editing more efficient by generalizing the edits an animator makes on short sequences of motion to other sequences. Our method predicts frames for the motion using Gaussian process models of kinematics and dynamics. These estimates are combined with probabilistic inference. Our method can be used to propagate edits from examples to an entire sequence for an existing character, and it can also be used to map a motion from a control character to a very different target character. The technique shows good generalization. For example, we show that an estimator, learned from a few seconds of edited example animation using our methods, generalizes well enough to edit minutes of character animation in a high-quality fashion. Learning is interactive: An animator who wants to improve the output can provide small, correcting examples and the system will produce improved estimates of motion. We make this interactive learning process efficient and natural with a fast, full-body IK system with novel features. Finally, we present data from interviews with professional character animators that indicate that generalizing and propagating animator edits can save artists significant time and work.

AB - One way that artists create compelling character animations is by manipulating details of a character's motion. This process is expensive and repetitive. We show that we can make such motion editing more efficient by generalizing the edits an animator makes on short sequences of motion to other sequences. Our method predicts frames for the motion using Gaussian process models of kinematics and dynamics. These estimates are combined with probabilistic inference. Our method can be used to propagate edits from examples to an entire sequence for an existing character, and it can also be used to map a motion from a control character to a very different target character. The technique shows good generalization. For example, we show that an estimator, learned from a few seconds of edited example animation using our methods, generalizes well enough to edit minutes of character animation in a high-quality fashion. Learning is interactive: An animator who wants to improve the output can provide small, correcting examples and the system will produce improved estimates of motion. We make this interactive learning process efficient and natural with a fast, full-body IK system with novel features. Finally, we present data from interviews with professional character animators that indicate that generalizing and propagating animator edits can save artists significant time and work.

KW - Artist-guided content creation

KW - Controllable motion editing

UR - http://www.scopus.com/inward/record.url?scp=60349110894&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=60349110894&partnerID=8YFLogxK

U2 - 10.1145/1477926.1477927

DO - 10.1145/1477926.1477927

M3 - Article

AN - SCOPUS:60349110894

VL - 28

JO - ACM Transactions on Graphics

JF - ACM Transactions on Graphics

SN - 0730-0301

IS - 1

M1 - 1

ER -