TY - GEN
T1 - Face as mouse through visual face tracking
AU - Tu, Jilin
AU - Huang, T.
AU - Tao, Hai
N1 - Publisher Copyright:
© 2005 IEEE.
PY - 2005
Y1 - 2005
N2 - This paper introduces a novel camera mouse driven by a 3D-model-based visual face tracking technique. As cameras become standard equipment on personal computers (PCs) and computing speed continues to increase, achieving human-machine interaction through visual face tracking becomes a feasible solution for hands-free control. Human facial movement can be decomposed into rigid movement, e.g. rotation and translation, and non-rigid movement, such as the opening and closing of the mouth and eyes, facial expressions, etc. We introduce our visual face tracking system, which can robustly and accurately retrieve these motion parameters from video in real time. After calibration, the retrieved head orientation and translation can be employed to navigate the mouse cursor, and the detection of mouth movement can be used to trigger mouse events. Three mouse control modes are investigated and compared. Experiments in the Windows XP environment verify the convenience of navigation and operation using our face mouse. This technique can serve as an alternative input device for people with hand and speech disabilities and for future vision-based games and interfaces.
AB - This paper introduces a novel camera mouse driven by a 3D-model-based visual face tracking technique. As cameras become standard equipment on personal computers (PCs) and computing speed continues to increase, achieving human-machine interaction through visual face tracking becomes a feasible solution for hands-free control. Human facial movement can be decomposed into rigid movement, e.g. rotation and translation, and non-rigid movement, such as the opening and closing of the mouth and eyes, facial expressions, etc. We introduce our visual face tracking system, which can robustly and accurately retrieve these motion parameters from video in real time. After calibration, the retrieved head orientation and translation can be employed to navigate the mouse cursor, and the detection of mouth movement can be used to trigger mouse events. Three mouse control modes are investigated and compared. Experiments in the Windows XP environment verify the convenience of navigation and operation using our face mouse. This technique can serve as an alternative input device for people with hand and speech disabilities and for future vision-based games and interfaces.
UR - http://www.scopus.com/inward/record.url?scp=34547189376&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=34547189376&partnerID=8YFLogxK
U2 - 10.1109/CRV.2005.39
DO - 10.1109/CRV.2005.39
M3 - Conference contribution
AN - SCOPUS:34547189376
T3 - Proceedings - 2nd Canadian Conference on Computer and Robot Vision, CRV 2005
SP - 339
EP - 346
BT - Proceedings - 2nd Canadian Conference on Computer and Robot Vision, CRV 2005
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2nd Canadian Conference on Computer and Robot Vision, CRV 2005
Y2 - 9 May 2005 through 11 May 2005
ER -