Style-based abstractions for human motion classification

Amy LaViers, Magnus Egerstedt

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper presents an approach to motion analysis for robotics in which a quantitative definition of "style of motion" is used to classify movements. In particular, we present a method for generating a "best match" signal for empirical data via a two-stage optimal control formulation. The first stage consists of the generation of trajectories that mimic empirical data. In the second stage, an inverse problem is solved in order to obtain the "stylistic parameters" that best recreate the empirical data. This method is amenable to human motion analysis in that it not only produces a matching trajectory but, in doing so, classifies its quality. This classification allows for the production of additional trajectories, between any two endpoints, in the same style as the empirical reference data. The method not only enables robotic mimicry of human style but can also provide insights into genres of stylized movement, equipping cyberphysical systems with a deeper interpretation of human movement.
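The two-stage formulation described in the abstract can be pictured as a bilevel fit: an inner problem generates a trajectory under a style-weighted cost, and an outer inverse problem searches for the style parameters that make that trajectory best recreate the empirical data. The sketch below illustrates this idea in Python; the cost terms, the weights (w_smooth, w_goal), and all function names are assumptions introduced for illustration, not the paper's actual formulation.

```python
# Illustrative sketch (not the authors' formulation): a two-stage "style fitting"
# loop. Stage 1 generates a trajectory that trades off smoothness against
# endpoint accuracy; Stage 2 searches over hypothetical style parameters so
# that the generated trajectory best matches the empirical data.
import numpy as np
from scipy.optimize import minimize

T = 50                                   # number of time steps
data = np.sin(np.linspace(0, np.pi, T))  # stand-in for empirical motion data

def generate_trajectory(style, start, goal):
    """Stage 1: pick the trajectory minimizing a style-weighted cost.

    `style` = (w_smooth, w_goal) weights an acceleration (smoothness) penalty
    against terminal accuracy; different weights yield different "styles".
    """
    w_smooth, w_goal = style

    def cost(x):
        accel = np.diff(x, 2)            # discrete second differences
        return (w_smooth * np.sum(accel ** 2)
                + w_goal * (x[-1] - goal) ** 2
                + (x[0] - start) ** 2)

    x0 = np.linspace(start, goal, T)     # straight-line initial guess
    return minimize(cost, x0, method="L-BFGS-B").x

def style_mismatch(style):
    """Stage 2 objective: distance between generated trajectory and the data."""
    traj = generate_trajectory(style, data[0], data[-1])
    return np.sum((traj - data) ** 2)

# Inverse problem: find the style parameters that best recreate the data.
best = minimize(style_mismatch, x0=[1.0, 1.0], method="L-BFGS-B",
                bounds=[(1e-3, None), (1e-3, None)])
print("recovered style parameters:", best.x)

# The recovered parameters can then be reused to generate a new trajectory
# between arbitrary endpoints "in the same style" as the reference data.
new_motion = generate_trajectory(best.x, start=0.2, goal=0.9)
```

Once fitted, the recovered weights can be reused to synthesize motion between any two endpoints in the same style, which mirrors the reuse of stylistic parameters described in the abstract.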

Original language: English (US)
Title of host publication: 2014 ACM/IEEE International Conference on Cyber-Physical Systems, ICCPS 2014
Publisher: IEEE Computer Society
Pages: 84-91
Number of pages: 8
ISBN (Print): 9781479949311
DOIs
State: Published - 2014
Externally published: Yes
Event: 5th IEEE/ACM International Conference on Cyber-Physical Systems, ICCPS 2014 - Berlin, Germany
Duration: Apr 14, 2014 - Apr 17, 2014

Publication series

Name: 2014 ACM/IEEE International Conference on Cyber-Physical Systems, ICCPS 2014

Other

Other: 5th IEEE/ACM International Conference on Cyber-Physical Systems, ICCPS 2014
Country/Territory: Germany
City: Berlin
Period: 4/14/14 - 4/17/14

ASJC Scopus subject areas

  • Control and Systems Engineering
