Gait-based gender classification in unconstrained environments

Jiwen Lu, Gang Wang, Thomas S. Huang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper investigates the problem of gait-based gender classification in unconstrained environments. Unlike existing human gait analysis and recognition methods, which assume that humans walk in controlled environments, we aim to recognize human gender from uncontrolled gaits, in which people walk freely and the walking direction may vary over time within a single video clip. Given a gait sequence collected in an uncontrolled manner, we first obtain human silhouettes using background subtraction and cluster them into several groups. For each group, we compute the averaged gait image (AGI) as the feature representation. We then learn a distance metric under which intraclass variations are minimized and interclass variations are maximized simultaneously, so that more discriminative information can be exploited for gender classification. Experimental results on our dataset demonstrate the efficacy of the proposed method.
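The pipeline in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes silhouettes are already extracted and size-normalized, and it substitutes a simple LDA-style Mahalanobis metric (whitening the within-class scatter) for the paper's learned metric, which is a deliberate simplification.

```python
import numpy as np

def averaged_gait_image(silhouettes):
    """Average a group of aligned binary silhouette frames into one AGI.

    silhouettes: array of shape (T, H, W) with values in {0, 1}.
    Returns an (H, W) float map; each pixel is the fraction of frames
    in which it belonged to the silhouette.
    """
    return np.mean(np.asarray(silhouettes, dtype=np.float64), axis=0)

def learn_metric(X, y, eps=1e-6):
    """Toy stand-in for the paper's metric learning (assumption).

    Builds within-class scatter Sw from flattened AGI features X with
    labels y, and returns M = (Sw + eps*I)^-1, which shrinks intraclass
    directions so interclass differences dominate the distance.
    """
    X = np.asarray(X, dtype=np.float64)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        Dc = Xc - Xc.mean(axis=0)
        Sw += Dc.T @ Dc
    return np.linalg.inv(Sw + eps * np.eye(d))

def metric_distance(x1, x2, M):
    """Mahalanobis-style distance under the learned metric M."""
    diff = np.asarray(x1, dtype=np.float64) - np.asarray(x2, dtype=np.float64)
    return float(np.sqrt(diff @ M @ diff))
```

In use, one would compute an AGI per silhouette cluster, flatten the AGIs into feature vectors, learn the metric on gender-labeled training sequences, and classify a test sequence by nearest neighbors under `metric_distance`.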

Original language: English (US)
Title of host publication: ICPR 2012 - 21st International Conference on Pattern Recognition
Pages: 3284-3287
Number of pages: 4
State: Published - Dec 1 2012
Event: 21st International Conference on Pattern Recognition, ICPR 2012 - Tsukuba, Japan
Duration: Nov 11 2012 - Nov 15 2012

Publication series

Name: Proceedings - International Conference on Pattern Recognition
ISSN (Print): 1051-4651

Other

Other: 21st International Conference on Pattern Recognition, ICPR 2012
Country/Territory: Japan
City: Tsukuba
Period: 11/11/12 - 11/15/12

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
