TY - GEN
T1 - Motion synthesis from annotations
AU - Arikan, Okan
AU - Forsyth, David A.
AU - O'Brien, James F.
PY - 2003
Y1 - 2003
AB - This paper describes a framework that allows a user to synthesize human motion while retaining control of its qualitative properties. The user paints a timeline with annotations (like walk, run, or jump) from a vocabulary which is freely chosen by the user. The system then assembles frames from a motion database so that the final motion performs the specified actions at specified times. The motion can also be forced to pass through particular configurations at particular times, and to go to a particular position and orientation. Annotations can be painted positively (for example, must run), negatively (for example, may not run backwards), or as a don't-care. The system uses a novel search method, based around dynamic programming at several scales, to obtain a solution efficiently so that authoring is interactive. Our results demonstrate that the method can generate smooth, natural-looking motion. The annotation vocabulary can be chosen to fit the application, and allows specification of composite motions (run and jump simultaneously, for example). The process requires a collection of motion data that has been annotated with the chosen vocabulary. This paper also describes an effective tool, based around repeated use of support vector machines, that allows a user to annotate a large collection of motions quickly and easily so that they may be used with the synthesis algorithm.
KW - animation with constraints
KW - clustering
KW - human motion
KW - motion synthesis
KW - optimization
UR - http://www.scopus.com/inward/record.url?scp=33644491096&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=33644491096&partnerID=8YFLogxK
U2 - 10.1145/1201775.882284
DO - 10.1145/1201775.882284
M3 - Conference contribution
AN - SCOPUS:33644491096
SN - 1581137095
SN - 9781581137095
T3 - ACM SIGGRAPH 2003 Papers, SIGGRAPH '03
SP - 402
EP - 408
BT - ACM SIGGRAPH 2003 Papers, SIGGRAPH '03
T2 - ACM SIGGRAPH 2003 Papers, SIGGRAPH '03
Y2 - 27 July 2003 through 31 July 2003
ER -