A maximum entropy framework for part-based texture and object recognition

Svetlana Lazebnik, Cordelia Schmid, Jean Ponce

Research output: Conference contribution (chapter in book/report/conference proceeding)

Abstract

This paper presents a probabilistic part-based approach for texture and object recognition. Textures are represented using a part dictionary found by quantizing the appearance of scale- or affine-invariant keypoints. Object classes are represented using a dictionary of composite semi-local parts, or groups of neighboring keypoints with stable and distinctive appearance and geometric layout. A discriminative maximum entropy framework is used to learn the posterior distribution of the class label given the occurrences of parts from the dictionary in the training set. Experiments on two texture and two object databases demonstrate the effectiveness of this framework for visual classification.
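The discriminative maximum entropy framework described above amounts to multinomial logistic regression: the posterior P(class | image) is a softmax over linear functions of part-occurrence features. The sketch below is a minimal, self-contained illustration of that idea in numpy; the toy data, dictionary size, and hyperparameters are invented for demonstration and are not taken from the paper.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def train_maxent(X, y, n_classes, lr=0.5, epochs=200, l2=1e-3):
    """Fit P(class | part occurrences) by gradient ascent on the
    L2-regularized conditional log-likelihood (multinomial logistic
    regression, the standard discriminative maximum entropy model)."""
    n, d = X.shape
    W = np.zeros((d, n_classes))
    Y = np.eye(n_classes)[y]          # one-hot class labels
    for _ in range(epochs):
        P = softmax(X @ W)            # current posterior estimates
        grad = X.T @ (Y - P) / n - l2 * W
        W += lr * grad
    return W

# Toy data: 6 images, each described by occurrence counts of a
# hypothetical 4-part dictionary (e.g. quantized keypoint appearances).
X = np.array([[3, 0, 1, 0],
              [4, 1, 0, 0],
              [2, 0, 2, 1],
              [0, 3, 0, 2],
              [1, 4, 0, 1],
              [0, 2, 1, 3]], dtype=float)
y = np.array([0, 0, 0, 1, 1, 1])      # two texture classes

W = train_maxent(X, y, n_classes=2)
pred = softmax(X @ W).argmax(axis=1)  # predicted class per image
```

In the paper the features are occurrences of dictionary parts (quantized keypoints for textures, semi-local parts for objects) rather than the raw counts used here, but the classifier structure is the same.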

Original language: English (US)
Title of host publication: Proceedings - 10th IEEE International Conference on Computer Vision, ICCV 2005
Pages: 832-838
Number of pages: 7
State: Published - 2005
Event: 10th IEEE International Conference on Computer Vision, ICCV 2005 - Beijing, China
Duration: Oct 17, 2005 - Oct 20, 2005

Publication series

Name: Proceedings of the IEEE International Conference on Computer Vision
Volume: I


ASJC Scopus subject areas

  • Software
  • Computer Vision and Pattern Recognition

