Model-based tracking of complex articulated objects

Kevin Nickels, Seth Hutchinson

Research output: Contribution to journal › Article › peer-review


In this paper, we present methods for tracking complex, articulated objects. We assume that an appearance model and the kinematic structure of the object to be tracked are given, leading to what is termed a model-based object tracker. At each time step, this tracker observes a new monocular grayscale image of the scene and combines information gathered from this image with knowledge of the previous configuration of the object to estimate the configuration of the object at the time the image was acquired. Each degree of freedom in the model has an associated uncertainty, indicating the confidence in the current estimate for that degree of freedom. These uncertainty estimates are updated after each observation. An extended Kalman filter with appropriate observation and system models is used to implement this updating process. The methods that we describe are potentially beneficial to areas such as automated visual tracking in general, visual servo control, and human-computer interaction.
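The abstract's core loop — predict the object configuration, then correct it with each new image observation via an extended Kalman filter — can be sketched as below. This is a minimal illustrative sketch, not the paper's implementation: the function names, the static system model, and the noise parameters are all assumptions for the example.

```python
import numpy as np

def ekf_step(x, P, z, h, H_jac, Q, R):
    """One predict-update cycle of an extended Kalman filter.

    x     : state estimate (e.g. joint angles of the articulated model)
    P     : state covariance (the per-degree-of-freedom uncertainty)
    z     : new observation extracted from the image
    h     : nonlinear observation function, h(x) -> predicted measurement
    H_jac : function returning the Jacobian of h at x
    Q, R  : process and measurement noise covariances
    (All names here are illustrative, not taken from the paper.)
    """
    # Predict: with a static system model the configuration estimate is
    # carried forward unchanged, and only the uncertainty grows.
    x_pred = x
    P_pred = P + Q

    # Update: fold the new image measurement into the estimate.
    H = H_jac(x_pred)
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ (z - h(x_pred))    # corrected configuration
    P_new = (np.eye(len(x)) - K @ H) @ P_pred  # reduced uncertainty
    return x_new, P_new
```

In practice the observation function `h` would render the appearance model at the hypothesized configuration and compare it against the image, so its Jacobian couples image measurements to the kinematic degrees of freedom.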

Original language: English (US)
Pages (from-to): 28-36
Number of pages: 9
Journal: IEEE Transactions on Robotics and Automation
Issue number: 1
State: Published - Feb 2001
Externally published: Yes


Keywords

  • Kalman filtering
  • Object tracking

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Electrical and Electronic Engineering


