TY - GEN
T1 - Modeling the constraints of human hand motion
AU - Lin, John
AU - Wu, Ying
AU - Huang, T. S.
N1 - Publisher Copyright:
© 2000 IEEE.
PY - 2000
Y1 - 2000
N2 - Hand motion capture is one of the most important parts of gesture interfaces. Many current approaches to this task involve a formidable nonlinear optimization problem in a large search space. Motion capture can be achieved more efficiently by taking the motion constraints of the hand into account. Although some constraints can be represented as equalities or inequalities, many constraints cannot be represented explicitly. In this paper, we propose a learning approach that models the hand configuration space directly. The redundancy of the configuration space can be eliminated by finding a lower-dimensional subspace of the original space. Finger motion is modeled in this subspace based on the linear behavior observed in real motion data collected with a CyberGlove. Employing the constrained motion model, we are able to capture finger motion efficiently from video inputs. Several experiments show that our proposed model is helpful for capturing articulated motion.
AB - Hand motion capture is one of the most important parts of gesture interfaces. Many current approaches to this task involve a formidable nonlinear optimization problem in a large search space. Motion capture can be achieved more efficiently by taking the motion constraints of the hand into account. Although some constraints can be represented as equalities or inequalities, many constraints cannot be represented explicitly. In this paper, we propose a learning approach that models the hand configuration space directly. The redundancy of the configuration space can be eliminated by finding a lower-dimensional subspace of the original space. Finger motion is modeled in this subspace based on the linear behavior observed in real motion data collected with a CyberGlove. Employing the constrained motion model, we are able to capture finger motion efficiently from video inputs. Several experiments show that our proposed model is helpful for capturing articulated motion.
UR - http://www.scopus.com/inward/record.url?scp=84962185123&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84962185123&partnerID=8YFLogxK
U2 - 10.1109/HUMO.2000.897381
DO - 10.1109/HUMO.2000.897381
M3 - Conference contribution
AN - SCOPUS:84962185123
T3 - Proceedings - Workshop on Human Motion, HUMO 2000
SP - 121
EP - 126
BT - Proceedings - Workshop on Human Motion, HUMO 2000
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - Workshop on Human Motion, HUMO 2000
Y2 - 7 December 2000 through 8 December 2000
ER -