Joint-structured-sparsity-based classification for multiple-measurement transient acoustic signals

Haichao Zhang, Yanning Zhang, Nasser M. Nasrabadi, Thomas S. Huang

Research output: Contribution to journal › Article › peer-review

Abstract

This paper investigates joint-structured-sparsity-based methods for transient acoustic signal classification with multiple measurements. By joint structured sparsity, we not only use the sparsity prior for each measurement but also exploit the structural information shared across the sparse representation vectors of the multiple measurements. Several sparse prior models are investigated to exploit the correlations among the multiple measurements, using the notion of joint structured sparsity to improve classification accuracy. Specifically, we propose models with joint structured sparsity under different assumptions: the same sparse code model, the common sparse pattern model, and a newly proposed joint dynamic sparse model. For the joint dynamic sparse model, we also develop an efficient greedy algorithm to solve it. Extensive experiments are carried out on real acoustic data sets, and the results are compared with conventional discriminative classifiers to verify the effectiveness of the proposed method.
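The common sparse pattern model named in the abstract admits a standard greedy solver, so a minimal sketch may help fix ideas. The sketch below implements simultaneous orthogonal matching pursuit (SOMP) over a dictionary of training samples, followed by an SRC-style class-residual decision rule. This is not the paper's newly proposed joint dynamic sparse algorithm (which relaxes the shared atom-level support assumption); the function names somp and classify, the residual-based decision rule, and all parameter conventions here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def somp(D, Y, k):
    """Simultaneous OMP (sketch): recover codes X with a shared support
    (common sparse pattern model) so that Y is approximately D @ X.

    D : (m, n) dictionary of training samples as columns (unit-normalized)
    Y : (m, T) stack of T measurements of the same acoustic event
    k : joint sparsity level (number of shared atoms to select)
    """
    n = D.shape[1]
    residual = Y.copy()
    support = []
    for _ in range(k):
        # Score every atom by its aggregate correlation with all
        # residuals (sum of |<d_j, r_t>| over measurements t).
        scores = np.abs(D.T @ residual).sum(axis=1)
        scores[support] = -np.inf          # never reselect an atom
        support.append(int(np.argmax(scores)))
        # Jointly re-fit all measurements on the enlarged support.
        Ds = D[:, support]
        X_s, *_ = np.linalg.lstsq(Ds, Y, rcond=None)
        residual = Y - Ds @ X_s
    X = np.zeros((n, Y.shape[1]))
    X[support, :] = X_s
    return X, support

def classify(D, labels, Y, k):
    """SRC-style decision (assumed, not from the paper): pick the class
    whose atoms alone best reconstruct the measurement set."""
    X, _ = somp(D, Y, k)
    best, best_err = None, np.inf
    for c in np.unique(labels):
        Xc = np.where((labels == c)[:, None], X, 0.0)  # zero out other classes
        err = np.linalg.norm(Y - D @ Xc)
        if err < best_err:
            best, best_err = c, err
    return best
```

Under the same sparse code model, X would collapse to a single coefficient vector shared by all T measurements; the common sparse pattern recovery sketched here shares only the support, letting each measurement keep its own coefficient values on the selected atoms.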

Original language: English (US)
Article number: 6200352
Pages (from-to): 1586-1598
Number of pages: 13
Journal: IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Volume: 42
Issue number: 6
DOIs
State: Published - 2012

Keywords

  • Joint sparse representation
  • joint structured sparsity
  • multiple-measurement transient acoustic signal classification

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Information Systems
  • Human-Computer Interaction
  • Computer Science Applications
  • Electrical and Electronic Engineering
