Extracting texels in 2.1D natural textures

Narendra Ahuja, Sinisa Todorovic

Research output: Contribution to conference › Paper › peer-review


This paper poses the problem of unsupervised extraction of texture elements, called texels, which repeatedly occur in the image of a frontally viewed, homogeneous, 2.1D, planar texture, and presents a solution. Here, 2.1D texture means that the physical texels are thin objects lying along a surface, which may partially occlude one another. The image texture is represented by a segmentation tree, whose structure captures the recursive embedding of regions obtained from a multiscale image segmentation. In the segmentation tree, the texels appear as subtrees with similar structure, whose nodes have similar photometric and geometric properties. A new learning algorithm is proposed for fusing these similar subtrees into a tree-union, which registers all visible texel parts and thus represents a statistical, generative model of the complete (unoccluded) texel. The learning algorithm involves concurrent estimation of the texel tree structure and the probability distributions of its node properties. Texel detection and segmentation are achieved simultaneously by matching the segmentation tree of a new image against the texel model. Experiments conducted on a newly compiled dataset of 2.1D natural textures demonstrate the validity of our approach.
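The fusion step described above can be illustrated with a minimal sketch. The code below is not the authors' algorithm: it assumes each segmentation-tree node carries two example properties (brightness and area), aligns children by position rather than solving a real correspondence problem, and models each union node's property distribution by its sample mean. It only shows the core idea that a tree-union accumulates observations from several texel occurrences, so that parts occluded in some occurrences are still registered from the others.

```python
# Hypothetical sketch of tree-union fusion over texel subtrees.
# Node property names (brightness, area) and the child alignment
# strategy are illustrative assumptions, not the paper's method.
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Node:
    brightness: float            # photometric property of the region
    area: float                  # geometric property of the region
    children: list = field(default_factory=list)

@dataclass
class UnionNode:
    brightness: list = field(default_factory=list)  # registered samples
    area: list = field(default_factory=list)
    children: list = field(default_factory=list)

def fuse(union: UnionNode, node: Node) -> None:
    """Register one observed (possibly occluded) texel subtree into the
    tree-union; children are aligned by index for simplicity."""
    union.brightness.append(node.brightness)
    union.area.append(node.area)
    for i, child in enumerate(node.children):
        if i >= len(union.children):   # part unseen so far: extend the union
            union.children.append(UnionNode())
        fuse(union.children[i], child)

def model(union: UnionNode) -> dict:
    """Collapse the registered samples into per-node statistics."""
    return {
        "brightness": mean(union.brightness),
        "area": mean(union.area),
        "children": [model(c) for c in union.children],
    }

# Two occurrences of the same texel; the second is occluded (one part missing).
t1 = Node(0.8, 100, [Node(0.4, 30), Node(0.5, 20)])
t2 = Node(0.6, 110, [Node(0.4, 32)])

u = UnionNode()
fuse(u, t1)
fuse(u, t2)
m = model(u)   # generative model of the complete (unoccluded) texel
```

Note that the second child of the union is estimated from a single occurrence: this is how the union "registers all visible texel parts" even when no single image shows the whole texel.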

Original language: English (US)
State: Published - 2007
Event: 2007 IEEE 11th International Conference on Computer Vision, ICCV - Rio de Janeiro, Brazil
Duration: Oct 14 2007 – Oct 21 2007

ASJC Scopus subject areas

  • Software
  • Computer Vision and Pattern Recognition
