Abstract
In this paper, we present methods for tracking complex, articulated objects. We assume that an appearance model and the kinematic structure of the object to be tracked are given, leading to what is termed a model-based object tracker. At each time step, the tracker observes a new monocular grayscale image of the scene and combines information extracted from this image with knowledge of the object's previous configuration to estimate its configuration at the time the image was acquired. Each degree of freedom in the model has an associated uncertainty, indicating the confidence in the current estimate for that degree of freedom. These uncertainty estimates are updated after each observation; an extended Kalman filter with appropriate observation and system models implements this updating process. The methods we describe are potentially beneficial to automated visual tracking in general, visual servo control, and human-computer interaction.
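The abstract's estimate-and-uncertainty update cycle follows the standard extended Kalman filter predict/update structure. The sketch below is not the paper's implementation; it is a generic EKF step under assumed interfaces, where `f`/`h` are the (hypothetical) system and observation models and `F`/`H` return their Jacobians at a given configuration:

```python
import numpy as np

def ekf_update(x, P, z, f, F, h, H, Q, R):
    """One generic EKF step: predict the configuration, then fuse a measurement.

    x : current configuration estimate (one entry per degree of freedom)
    P : covariance over the configuration (the per-DOF uncertainty)
    z : new measurement extracted from the image
    f, F : system model and its Jacobian (assumed interfaces, not from the paper)
    h, H : observation model and its Jacobian (assumed interfaces)
    Q, R : process and measurement noise covariances
    """
    # Predict: propagate the estimate and grow the uncertainty by process noise
    x_pred = f(x)
    F_k = F(x)
    P_pred = F_k @ P @ F_k.T + Q

    # Update: fuse the image-derived measurement via the Kalman gain
    H_k = H(x_pred)
    y = z - h(x_pred)                              # innovation
    S = H_k @ P_pred @ H_k.T + R                   # innovation covariance
    K = P_pred @ H_k.T @ np.linalg.inv(S)          # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H_k) @ P_pred    # shrunk uncertainty
    return x_new, P_new
```

After each observation the posterior covariance `P_new` is smaller than the predicted covariance along the observed directions, which is exactly the per-degree-of-freedom confidence update the abstract describes.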
| Original language | English (US) |
|---|---|
| Pages (from-to) | 28-36 |
| Number of pages | 9 |
| Journal | IEEE Transactions on Robotics and Automation |
| Volume | 17 |
| Issue number | 1 |
| DOIs | |
| State | Published - Feb 2001 |
| Externally published | Yes |
Keywords
- Kalman filtering
- Object tracking
ASJC Scopus subject areas
- Control and Systems Engineering
- Electrical and Electronic Engineering