Learning flipping and rotation invariant sparsifying transforms

Bihan Wen, Saiprasad Ravishankar, Yoram Bresler

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Adaptive sparse representation has been widely exploited in signal processing and computer vision. Recently, sparsifying transform learning has received interest for its low computational cost and optimal updates in its alternating minimization algorithms. In this work, we develop a methodology for learning a Flipping and Rotation Invariant Sparsifying Transform, dubbed FRIST, to better represent natural images that contain textures with various geometrical directions. The proposed alternating learning algorithm involves efficient optimal updates. We demonstrate the empirical convergence behavior of the proposed learning algorithm. Preliminary experiments show the usefulness of FRIST for image sparse representation, segmentation, robust inpainting, and MRI reconstruction, with promising performance.
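As an illustration of the idea summarized in the abstract, below is a minimal, hypothetical Python sketch (not the authors' released code) of a FRIST-style sparse-coding step: each image patch is matched to the flipped/rotated version that a single learned "parent" transform W sparsifies best, with sparsity enforced by hard thresholding. The operator construction, the placeholder orthonormal transform, and all function names are assumptions made for illustration only.

import numpy as np

def fr_operators(n):
    """Return the 8 flip/rotation operators for n x n patches as index permutations."""
    idx = np.arange(n * n).reshape(n, n)
    ops = []
    for k in range(4):                      # 0, 90, 180, 270 degree rotations
        r = np.rot90(idx, k)
        ops.append(r.ravel())               # rotation only
        ops.append(np.fliplr(r).ravel())    # rotation + horizontal flip
    return ops

def frist_sparse_code(patches, W, n, threshold):
    """For each vectorized patch, pick the flip/rotation whose transform code has the
    smallest sparsification residual; return the thresholded codes and operator labels."""
    ops = fr_operators(n)
    codes, labels = [], []
    for x in patches:
        best = None
        for j, perm in enumerate(ops):
            z = W @ x[perm]                              # transform the permuted patch
            z_thr = z * (np.abs(z) >= threshold)         # hard thresholding
            err = np.linalg.norm(z - z_thr) ** 2         # sparsification residual
            if best is None or err < best[0]:
                best = (err, z_thr, j)
        codes.append(best[1])
        labels.append(best[2])
    return np.array(codes), np.array(labels)

# Tiny usage example with a stand-in orthonormal transform on 8x8 patches.
if __name__ == "__main__":
    n = 8
    rng = np.random.default_rng(0)
    W = np.linalg.qr(rng.standard_normal((n * n, n * n)))[0]   # placeholder parent transform
    patches = rng.standard_normal((5, n * n))
    codes, labels = frist_sparse_code(patches, W, n, threshold=1.0)
    print(labels)

In the alternating learning algorithm described in the paper, the parent transform and the per-patch operator assignments would also be updated; this sketch covers only the operator-selection and thresholding step under the stated assumptions.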

Original language: English (US)
Title of host publication: 2016 IEEE International Conference on Image Processing, ICIP 2016 - Proceedings
Publisher: IEEE Computer Society
Pages: 3857-3861
Number of pages: 5
ISBN (Electronic): 9781467399616
DOIs
State: Published - Aug 3 2016
Event: 23rd IEEE International Conference on Image Processing, ICIP 2016 - Phoenix, United States
Duration: Sep 25 2016 - Sep 28 2016

Publication series

Name: Proceedings - International Conference on Image Processing, ICIP
Volume: 2016-August
ISSN (Print): 1522-4880

Other

Other: 23rd IEEE International Conference on Image Processing, ICIP 2016
Country/Territory: United States
City: Phoenix
Period: 9/25/16 - 9/28/16

Keywords

  • Clustering
  • Inpainting
  • Magnetic resonance imaging
  • Sparse representation
  • Sparsifying transform

ASJC Scopus subject areas

  • Software
  • Computer Vision and Pattern Recognition
  • Signal Processing
