Model-based tracking of complex articulated objects

Kevin Nickels, Seth Hutchinson

Research output: Contribution to journal › Article › peer-review

Abstract

In this paper, we present methods for tracking complex, articulated objects. We assume that an appearance model and the kinematic structure of the object to be tracked are given, leading to what is termed a model-based object tracker. At each time step, this tracker observes a new monocular grayscale image of the scene and combines information gathered from this image with knowledge of the previous configuration of the object to estimate the configuration of the object at the time the image was acquired. Each degree of freedom in the model has an uncertainty associated with it, indicating the confidence in the current estimate for that degree of freedom. These uncertainty estimates are updated after each observation. An extended Kalman filter with appropriate observation and system models is used to implement this updating process. The methods that we describe are potentially beneficial to areas such as automated visual tracking in general, visual servo control, and human-computer interaction.
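The abstract describes an extended-Kalman-filter formulation in which the object configuration and its per-degree-of-freedom uncertainty are propagated through a system model and corrected by each new image observation. The sketch below illustrates one such predict/update cycle in generic form; the callables f, F, h, H and the covariances Q, R are assumptions standing in for the paper's actual kinematic, appearance, and noise models, which are not reproduced here.

```python
import numpy as np

def ekf_step(x_est, P, z, f, F, h, H, Q, R):
    """One extended Kalman filter cycle for an articulated-object configuration.

    x_est : current configuration estimate (e.g., joint angles and pose)
    P     : covariance encoding per-degree-of-freedom uncertainty
    z     : measurement extracted from the new grayscale image
    f, F  : system model and its Jacobian (motion between frames)
    h, H  : observation model and its Jacobian (predicted image features)
    Q, R  : process and measurement noise covariances (assumed known)
    """
    # Predict: propagate the previous configuration through the system model.
    x_pred = f(x_est)
    F_k = F(x_est)
    P_pred = F_k @ P @ F_k.T + Q

    # Update: correct the prediction using the new image observation.
    H_k = H(x_pred)
    S = H_k @ P_pred @ H_k.T + R                     # innovation covariance
    K = P_pred @ H_k.T @ np.linalg.inv(S)            # Kalman gain
    x_new = x_pred + K @ (z - h(x_pred))             # corrected configuration
    P_new = (np.eye(len(x_est)) - K @ H_k) @ P_pred  # reduced uncertainty
    return x_new, P_new
```

This is only a generic EKF template; the contribution of the paper lies in the observation model, which predicts image appearance from the articulated kinematic structure.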

Original language: English (US)
Pages (from-to): 28-36
Number of pages: 9
Journal: IEEE Transactions on Robotics and Automation
Volume: 17
Issue number: 1
DOIs
State: Published - Feb 2001
Externally published: Yes

Keywords

  • Kalman filtering
  • Object tracking

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Electrical and Electronic Engineering
