Discriminative estimation of 3D human pose using Gaussian processes

Xu Zhao, Huazhong Ning, Yuncai Liu, Thomas Huang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this paper, we present an efficient discriminative method for human pose estimation. The method learns a direct mapping from visual observations to human body configurations. The framework requires that the visual features be powerful enough to discriminate the subtle differences between similar human poses. We propose to describe the image features using salient interest points represented by SIFT-like descriptors. The descriptor encodes position, appearance, and local structural information simultaneously. A bag-of-words representation is used to model the distribution of the feature space. The descriptor can tolerate a range of illumination and position variations because it is computed on overlapping patches. We use Gaussian process regression to model the mapping from visual observations to human poses. This probabilistic regression algorithm is effective and robust for the pose estimation problem. We test our approach on the HumanEva data set. Experimental results demonstrate that our approach achieves state-of-the-art performance.
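As a minimal illustration of the regression stage described in the abstract (not the authors' implementation), the sketch below assumes that each image has already been converted to a bag-of-words histogram of SIFT-like descriptors, and that the target is a flattened 3D pose vector. It fits a Gaussian process regressor with an RBF-plus-noise kernel using scikit-learn; the data shapes, kernel choice, and hyperparameters are all assumptions made for the example.

    # Sketch: bag-of-words image features -> 3D pose via Gaussian process regression.
    # Illustrative only; shapes, kernel, and hyperparameters are assumed.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    # Assumed data: X holds bag-of-words histograms of SIFT-like descriptors
    # (n_frames x n_visual_words); Y holds pose vectors (n_frames x n_pose_dims).
    rng = np.random.default_rng(0)
    X_train = rng.random((200, 300))   # placeholder feature histograms
    Y_train = rng.random((200, 45))    # placeholder poses (e.g., 15 joints x 3D)

    # RBF kernel plus observation noise; scikit-learn fits one GP per output
    # dimension when Y has multiple columns.
    kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-3)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(X_train, Y_train)

    X_test = rng.random((10, 300))
    Y_mean, Y_std = gp.predict(X_test, return_std=True)  # predictive mean and uncertainty
    print(Y_mean.shape, Y_std.shape)

The predictive variance returned by the GP is what makes the regression probabilistic: poses far from the training distribution come with larger uncertainty, which is one reason the approach is robust for pose estimation.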

Original language: English (US)
Title of host publication: 2008 19th International Conference on Pattern Recognition, ICPR 2008
State: Published - 2008
Event: 2008 19th International Conference on Pattern Recognition, ICPR 2008 - Tampa, FL, United States
Duration: Dec 8 2008 - Dec 11 2008

Publication series

Name: Proceedings - International Conference on Pattern Recognition
ISSN (Print): 1051-4651

Other

Other: 2008 19th International Conference on Pattern Recognition, ICPR 2008
Country/Territory: United States
City: Tampa, FL
Period: 12/8/08 - 12/11/08

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
