Automatic Group Sparse Coding

Fei Wang, Noah Lee, Jimeng Sun, Jianying Hu, Shahram Ebadollahi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Sparse Coding (SC), which models data vectors as sparse linear combinations of basis vectors (i.e., a dictionary), has been widely applied in machine learning, signal processing, and neuroscience. Recently, one specific SC technique, Group Sparse Coding (GSC), has been proposed to learn a common dictionary over multiple groups of data, where the data groups are assumed to be pre-defined. In practice, this may not always be the case. In this paper, we propose Automatic Group Sparse Coding (AutoGSC), which can (1) discover the hidden data groups; (2) learn a common dictionary shared across the data groups; and (3) learn an individual dictionary for each data group. Finally, we conduct experiments on both synthetic and real-world data sets to demonstrate the effectiveness of AutoGSC, and compare it with traditional sparse coding and Nonnegative Matrix Factorization (NMF) methods.
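To make the sparse coding model in the abstract concrete, the following is a minimal, illustrative sketch (not the AutoGSC algorithm itself): given a fixed dictionary D, it encodes data columns as sparse coefficient vectors by solving the standard lasso objective with ISTA (iterative soft-thresholding). All names, the step size, and the regularization weight are assumptions for illustration.

```python
import numpy as np

def sparse_code(X, D, lam=0.05, n_iter=200):
    """Encode columns of X as sparse combinations of dictionary atoms (columns of D)
    by minimizing 0.5*||X - D A||_F^2 + lam*||A||_1 with ISTA.
    Illustrative sketch only; AutoGSC additionally learns group structure
    and per-group dictionaries, which is not shown here."""
    # Step size 1/L, where L is the Lipschitz constant of the smooth term's gradient
    L = np.linalg.norm(D, 2) ** 2
    A = np.zeros((D.shape[1], X.shape[1]))
    for _ in range(n_iter):
        grad = D.T @ (D @ A - X)                 # gradient of 0.5*||X - D A||_F^2
        A = A - grad / L                         # gradient step
        A = np.sign(A) * np.maximum(np.abs(A) - lam / L, 0.0)  # soft-threshold
    return A

# Tiny synthetic check: data generated from 3 of 50 unit-norm atoms
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)                   # normalize atoms
true_A = np.zeros((50, 5))
true_A[rng.choice(50, 3, replace=False), :] = 1.0
X = D @ true_A
A = sparse_code(X, D)
```

The recovered codes `A` are sparse (most entries shrink exactly to zero) while reconstructing `X` closely; the GSC family of methods extends this idea with group-level regularization so that groups of data vectors share active dictionary atoms.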

Original language: English (US)
Title of host publication: Proceedings of the 25th AAAI Conference on Artificial Intelligence, AAAI 2011
Publisher: American Association for Artificial Intelligence (AAAI) Press
Pages: 495-500
Number of pages: 6
ISBN (Electronic): 9781577355083
State: Published - Aug 11, 2011
Externally published: Yes
Event: 25th AAAI Conference on Artificial Intelligence, AAAI 2011 - San Francisco, United States
Duration: Aug 7, 2011 to Aug 11, 2011

Publication series

Name: Proceedings of the 25th AAAI Conference on Artificial Intelligence, AAAI 2011

Conference

Conference: 25th AAAI Conference on Artificial Intelligence, AAAI 2011
Country/Territory: United States
City: San Francisco
Period: 8/7/11 to 8/11/11

ASJC Scopus subject areas

  • Artificial Intelligence

