Motion synthesis from annotations

Okan Arikan, David Alexander Forsyth, James F. O'Brien

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper describes a framework that allows a user to synthesize human motion while retaining control of its qualitative properties. The user paints a timeline with annotations, like walk, run, or jump, from a vocabulary which is freely chosen by the user. The system then assembles frames from a motion database so that the final motion performs the specified actions at specified times. The motion can also be forced to pass through particular configurations at particular times, and to go to a particular position and orientation. Annotations can be painted positively (for example, must run), negatively (for example, may not run backwards) or as a don't-care. The system uses a novel search method, based around dynamic programming at several scales, to obtain a solution efficiently so that authoring is interactive. Our results demonstrate that the method can generate smooth, natural-looking motion. The annotation vocabulary can be chosen to fit the application, and allows specification of composite motions (run and jump simultaneously, for example). The process requires a collection of motion data that has been annotated with the chosen vocabulary. This paper also describes an effective tool, based around repeated use of support vector machines, that allows a user to annotate a large collection of motions quickly and easily so that they may be used with the synthesis algorithm.
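
The search described in the abstract assembles database frames so that the painted annotations are satisfied while cuts between frames stay cheap. The sketch below is a minimal, single-scale illustration of that objective as a Viterbi-style dynamic program; the function name, the +1/-1/0 timeline encoding, and both cost terms are assumptions made for illustration, not the authors' method, which runs dynamic programming at several scales and also handles position, orientation, and configuration constraints.

import numpy as np

def synthesize(frames, frame_annotations, timeline, w=1.0):
    """frames: (N, D) array of per-frame pose features.
    frame_annotations: (N, A) array; 1 if frame n carries annotation a, else 0.
    timeline: (T, A) painted constraints: +1 must have, -1 must not, 0 don't care.
    Returns T frame indices forming the cheapest annotation-respecting sequence."""
    N, T = len(frames), len(timeline)

    # Stand-in transition cost: squared feature distance between every pair of
    # frames. A real system would use pose/velocity continuity and blend at cuts.
    diff = frames[:, None, :] - frames[None, :, :]
    transition = (diff ** 2).sum(axis=2)                        # (N, N)

    # Annotation mismatch: penalize required annotations a frame lacks and
    # forbidden annotations it has; "don't care" entries contribute nothing.
    has = frame_annotations.astype(float)                       # (N, A)
    mismatch = ((timeline == 1).astype(float) @ (1.0 - has).T
                + (timeline == -1).astype(float) @ has.T)       # (T, N)

    # Dynamic programming over output time steps.
    cost = w * mismatch[0]                  # best cost of a path ending at each frame
    back = np.zeros((T, N), dtype=int)
    for t in range(1, T):
        total = cost[:, None] + transition  # total[i, j]: arrive at j by cutting from i
        back[t] = np.argmin(total, axis=0)
        cost = total[back[t], np.arange(N)] + w * mismatch[t]

    # Trace the cheapest path back through the table.
    path = [int(np.argmin(cost))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Tiny synthetic check: two annotations (walk, run), a timeline that asks for
# walking for 30 steps and then running for 30 steps.
rng = np.random.default_rng(0)
frames = rng.normal(size=(200, 30))
labels = np.zeros((200, 2)); labels[:100, 0] = 1; labels[100:, 1] = 1
paint = np.zeros((60, 2), dtype=int); paint[:30, 0] = 1; paint[30:, 1] = 1
print(synthesize(frames, labels, paint)[:5])

Even this toy version exhibits the trade-off the system balances: the annotation term pulls the path toward frames with the required labels, while the transition term keeps consecutive frames compatible.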

Original language: English (US)
Title of host publication: ACM SIGGRAPH 2003 Papers, SIGGRAPH '03
Pages: 402-408
Number of pages: 7
DOIs: 10.1145/1201775.882284
State: Published - Dec 1 2003
Externally published: Yes
Event: ACM SIGGRAPH 2003 Papers, SIGGRAPH '03 - San Diego, CA, United States
Duration: Jul 27 2003 - Jul 31 2003

Publication series

Name: ACM SIGGRAPH 2003 Papers, SIGGRAPH '03

Other

Other: ACM SIGGRAPH 2003 Papers, SIGGRAPH '03
Country: United States
City: San Diego, CA
Period: 7/27/03 - 7/31/03

Keywords

  • animation with constraints
  • clustering
  • human motion
  • motion synthesis
  • optimization

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design
  • Computer Vision and Pattern Recognition
  • Software

Cite this

Arikan, O., Forsyth, D. A., & O'Brien, J. F. (2003). Motion synthesis from annotations. In ACM SIGGRAPH 2003 Papers, SIGGRAPH '03 (pp. 402-408). (ACM SIGGRAPH 2003 Papers, SIGGRAPH '03). https://doi.org/10.1145/1201775.882284
