Texel-based texture segmentation

Sinisa Todorovic, Narendra Ahuja

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Given an arbitrary image, our goal is to segment all distinct texture subimages. This is done by discovering distinct, cohesive groups of spatially repeating patterns, called texels, in the image, where each group defines the corresponding texture. Texels occupy image regions whose photometric, geometric, structural, and spatial-layout properties are samples from an unknown pdf. If the image contains texture then, by definition, it also contains a large number of statistically similar texels. This, in turn, gives rise to modes in the pdf of region properties. Texture segmentation can thus be formulated as identifying modes of this pdf. To this end, we first use a low-level, multiscale segmentation to extract image regions at all scales present. Then, we use the mean-shift with a new, variable-bandwidth, hierarchical kernel to identify modes of the pdf defined over the extracted hierarchy of image regions. The hierarchical kernel is aimed at capturing texel substructure. Experiments demonstrate that accounting for the structural properties of texels is critical for texture segmentation, leading to competitive performance versus the state of the art.
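The paper's variable-bandwidth hierarchical kernel is defined over the extracted region hierarchy and is not reproduced here. As a minimal sketch of the underlying mode-seeking idea, the following illustrates plain fixed-bandwidth Gaussian mean-shift over region feature vectors: each point is iteratively shifted toward the locally weighted mean, and points that converge to the same location belong to the same pdf mode (i.e., the same texture). The feature vectors, bandwidth, and cluster locations below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def mean_shift(points, bandwidth=1.0, n_iter=50, tol=1e-5):
    """Plain mean-shift: repeatedly move each point to the Gaussian-weighted
    mean of all points; converged locations are modes of the underlying pdf."""
    shifted = points.astype(float).copy()
    for _ in range(n_iter):
        new = np.empty_like(shifted)
        for i, x in enumerate(shifted):
            d2 = np.sum((points - x) ** 2, axis=1)          # squared distances
            w = np.exp(-d2 / (2.0 * bandwidth ** 2))        # Gaussian kernel weights
            new[i] = w @ points / w.sum()                   # weighted mean shift
        if np.max(np.abs(new - shifted)) < tol:
            shifted = new
            break
        shifted = new
    return shifted

# Two synthetic "texel" feature clusters; each should collapse to its own mode.
rng = np.random.default_rng(0)
pts = np.vstack([rng.normal(0.0, 0.1, (30, 2)),
                 rng.normal(5.0, 0.1, (30, 2))])
modes = mean_shift(pts, bandwidth=0.5)
labels = (modes[:, 0] > 2.5).astype(int)   # assign points by converged mode
```

The paper replaces the fixed Euclidean kernel above with a hierarchical one that compares a region together with its embedded subregions, so that texel substructure influences which regions are grouped into the same mode.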

Original language: English (US)
Title of host publication: 2009 IEEE 12th International Conference on Computer Vision, ICCV 2009
Pages: 841-848
Number of pages: 8
DOIs
State: Published - 2009
Event: 12th International Conference on Computer Vision, ICCV 2009 - Kyoto, Japan
Duration: Sep 29, 2009 - Oct 2, 2009

Publication series

Name: Proceedings of the IEEE International Conference on Computer Vision

Other

Other: 12th International Conference on Computer Vision, ICCV 2009
Country: Japan
City: Kyoto
Period: 9/29/09 - 10/2/09

ASJC Scopus subject areas

  • Software
  • Computer Vision and Pattern Recognition

