Object detection by estimating and combining high-level features

Geoffrey Levine, Gerald DeJong

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Many successful object detection systems characterize object classes with a statistical profile over a large number of local features. We present an enhancement to this method that learns to assemble local features into features that capture more global properties, such as body shape and color distribution. The system then learns to combine these estimated global features to improve object detection accuracy. In our approach, each candidate object detection from an off-the-shelf gradient-based detection system is transformed into a conditional random field. This CRF is used to extract a most likely object silhouette, which is then processed into features based on color and shape. Finally, we show that on the difficult Pascal VOC 2007 data set, detection rates can be improved by combining these global features with the local features from a state-of-the-art gradient-based approach.
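The pipeline the abstract describes (candidate detection, CRF-based silhouette extraction, global feature computation, learned combination with the local detector score) can be illustrated with a toy sketch. Everything below is an assumption for illustration only: the function names are hypothetical, iterated conditional modes (ICM) on a grid MRF stands in for the paper's CRF inference, and the fill-ratio/centroid features are placeholders for the actual shape and color features.

```python
import math

def extract_silhouette(unary_fg, beta=0.5, iters=5):
    """Approximate MAP silhouette on a 4-connected grid MRF via ICM.
    unary_fg: 2D list of per-pixel foreground log-odds (a stand-in for
    the CRF unaries derived from a candidate detection window).
    beta: smoothness weight favoring agreement between neighbors."""
    h, w = len(unary_fg), len(unary_fg[0])
    mask = [[1 if unary_fg[i][j] > 0 else 0 for j in range(w)] for i in range(h)]
    for _ in range(iters):
        for i in range(h):
            for j in range(w):
                nbrs = [(i + di, j + dj) for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1))
                        if 0 <= i + di < h and 0 <= j + dj < w]
                fg_nbrs = sum(mask[a][b] for a, b in nbrs)
                # Foreground wins if unary plus smoothness margin is positive.
                score = unary_fg[i][j] + beta * (2 * fg_nbrs - len(nbrs))
                mask[i][j] = 1 if score > 0 else 0
    return mask

def silhouette_features(mask):
    """Toy global features from a binary silhouette: foreground fill
    ratio (shape proxy) and normalized vertical centroid."""
    h, w = len(mask), len(mask[0])
    area = sum(sum(row) for row in mask)
    if area == 0:
        return [0.0, 0.5]
    cy = sum(i for i, row in enumerate(mask) for v in row if v)
    return [area / (h * w), cy / (area * max(h - 1, 1))]

def combine_scores(local_score, global_feats, weights, bias=0.0):
    """Logistic combination of the local detector score with the
    estimated global features; in the paper the weights are learned."""
    z = bias + sum(wt * f for wt, f in zip(weights, [local_score] + global_feats))
    return 1.0 / (1.0 + math.exp(-z))
```

Run on a synthetic 5x5 window whose center 3x3 block has positive foreground log-odds: ICM recovers the block as the silhouette, and the combined score rises monotonically with the local detector score while staying in (0, 1).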

Original language: English (US)
Title of host publication: Image Analysis and Processing - ICIAP 2009 - 15th International Conference, Proceedings
Pages: 161-169
Number of pages: 9
DOIs
State: Published - 2009
Event: 15th International Conference on Image Analysis and Processing - ICIAP 2009 - Vietri sul Mare, Italy
Duration: Sep 8 2009 – Sep 11 2009

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 5716 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 15th International Conference on Image Analysis and Processing - ICIAP 2009, Proceedings
Country: Italy
City: Vietri sul Mare
Period: 9/8/09 – 9/11/09

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science(all)

