Simultaneous discriminative projection and dictionary learning for sparse representation based classification

Haichao Zhang, Yanning Zhang, Thomas S. Huang

Research output: Contribution to journal › Article › peer-review

Abstract

Sparsity-driven classification methods have recently become popular due to their effectiveness in various classification tasks. They are based on the assumption that samples of the same class lie in the same subspace, so that a test sample can be well represented by the training samples of its own class. Previous methods model the subspace for each class either with the training samples directly or with dictionaries trained separately for each class. Although such models have strong reconstructive ability, they may lack discriminative ability, especially when samples of different classes are highly correlated. In this paper, we propose to simultaneously learn a discriminative projection and a dictionary that are jointly optimized for the sparse representation based classifier, extracting discriminative information from the raw data while respecting the sparse representation assumption. By formulating projection and dictionary learning as a single optimization problem, both can be learned effectively. Extensive experiments on various datasets verify the efficacy of the proposed method.
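To make the sparse representation assumption concrete, below is a minimal sketch of the baseline SRC decision rule the paper builds on: code a test sample over a dictionary of training samples, then assign the class whose atoms yield the smallest reconstruction residual. This is not the authors' joint projection/dictionary learning algorithm; the ISTA solver and all function names are illustrative choices for this sketch.

```python
import numpy as np

def ista_lasso(D, y, lam=0.1, n_iter=200):
    """Solve min_x 0.5*||y - D x||^2 + lam*||x||_1 via ISTA
    (a generic l1 solver; the paper does not prescribe one)."""
    L = np.linalg.norm(D, 2) ** 2  # Lipschitz constant of the gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        g = x + D.T @ (y - D @ x) / L          # gradient step
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft-threshold
    return x

def src_classify(D, labels, y, lam=0.1):
    """SRC decision rule: sparse-code y over the whole dictionary D
    (columns = training samples), then pick the class whose atoms
    alone reconstruct y with the smallest residual."""
    x = ista_lasso(D, y, lam)
    best, best_r = None, np.inf
    for c in np.unique(labels):
        xc = np.where(labels == c, x, 0.0)     # keep only class-c coefficients
        r = np.linalg.norm(y - D @ xc)
        if r < best_r:
            best, best_r = c, r
    return best
```

The paper's point is that when columns of `D` belonging to different classes are highly correlated, these residuals become uninformative; learning a projection and a compact dictionary jointly is meant to restore the discriminability of this rule.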

Original language: English (US)
Pages (from-to): 346-354
Number of pages: 9
Journal: Pattern Recognition
Volume: 46
Issue number: 1
State: Published - Jan 2013

Keywords

  • Dictionary learning
  • Discriminative projection learning
  • Metric learning
  • Sparse representation based classification

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Artificial Intelligence
