Upsampling and denoising of depth maps via joint-segmentation

Miguel Tallón, S. Derin Babacan, Javier Mateos, Minh N. Do, Rafael Molina, Aggelos K. Katsaggelos

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The recent development of low-cost, fast time-of-flight cameras has made it possible to measure depth information at video frame rates. Although these cameras provide invaluable information for many 3D applications, their imaging capabilities are very limited in terms of both resolution and noise level. In this paper, we present a novel method for obtaining a high-resolution depth map from a low-resolution depth map and a corresponding high-resolution color image. The proposed method exploits the correlation between the objects present in the color and depth images via joint segmentation, which is then used to increase resolution and remove noise by estimating conditional modes. For increased robustness, regions with inconsistent color and depth information are detected and corrected by our algorithm. Experimental results in terms of image quality and running time demonstrate the high performance of the method.
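To make the pipeline concrete, below is a minimal Python sketch of the general color-guided, segmentation-based upsampling idea. It is not the authors' implementation: SLIC superpixels stand in for the joint segmentation, a per-segment median stands in for the conditional-mode estimation, and the paper's detection and correction of color/depth-inconsistent regions is omitted entirely.

```python
# Hedged sketch of segmentation-guided depth upsampling (NOT the paper's
# algorithm): SLIC superpixels approximate the joint segmentation, and the
# per-segment median approximates the conditional-mode depth estimate.
import numpy as np
from scipy.ndimage import zoom
from skimage.segmentation import slic

def upsample_depth(depth_lr, color_hr, n_segments=600):
    """Upsample a noisy low-resolution depth map guided by a high-res color image."""
    sh = color_hr.shape[0] / depth_lr.shape[0]
    sw = color_hr.shape[1] / depth_lr.shape[1]
    # Initial high-resolution estimate from naive bicubic interpolation.
    depth_init = zoom(depth_lr.astype(float), (sh, sw), order=3)
    # Segment the color image; object boundaries guide the depth estimate.
    labels = slic(color_hr, n_segments=n_segments, compactness=10)
    depth_hr = np.empty_like(depth_init)
    for lab in np.unique(labels):
        mask = labels == lab
        # Robust per-segment depth: the median suppresses sensor noise and
        # forces depth discontinuities onto color-segment boundaries.
        depth_hr[mask] = np.median(depth_init[mask])
    return depth_hr
```

In this sketch the color image alone supplies the segment boundaries, so depth edges snap to object contours, which is the core intuition behind the joint-segmentation approach; the actual paper additionally handles segments where color and depth disagree.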

Original language: English (US)
Title of host publication: Proceedings of the 20th European Signal Processing Conference, EUSIPCO 2012
Pages: 245-249
Number of pages: 5
State: Published - 2012
Event: 20th European Signal Processing Conference, EUSIPCO 2012 - Bucharest, Romania
Duration: Aug 27, 2012 - Aug 31, 2012

Publication series

Name: European Signal Processing Conference
ISSN (Print): 2219-5491

Conference

Conference: 20th European Signal Processing Conference, EUSIPCO 2012
Country/Territory: Romania
City: Bucharest
Period: 8/27/12 - 8/31/12

Keywords

  • color segmentation
  • depth enhancement
  • multisensor image fusion
  • time-of-flight cameras

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering
