Sparse submodular probabilistic PCA

Rajiv Khanna, Joydeep Ghosh, Russell A. Poldrack, Oluwasanmi Koyejo

Research output: Contribution to journal › Conference article › peer-review

Abstract

We propose a novel approach to sparse probabilistic principal component analysis that combines a low-rank representation of the latent factors and loadings with a sparse variational inference approach for estimating distributions of latent variables subject to sparse support constraints. Inference and parameter estimation for the resulting model are achieved via expectation maximization, with a novel variational inference method for the E-step that induces sparsity. We show that this inference problem reduces to discrete optimal support selection. The discrete optimization is submodular; hence, greedy selection is guaranteed to achieve a (1 - 1/e) fraction of the optimal value. Empirical studies indicate the effectiveness of the proposed approach for recovering a parsimonious decomposition, as compared to established baseline methods. We also evaluate our method against state-of-the-art methods on high-dimensional fMRI data and show that it performs as well as or better than these methods.
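The abstract's key computational step is greedy maximization of a monotone submodular set function under a cardinality constraint, which carries the classical (1 - 1/e) approximation guarantee. The sketch below illustrates that generic greedy procedure only; the `coverage` objective and all names are hypothetical stand-ins for illustration, not the paper's actual variational objective.

```python
# Sketch: greedy maximization of a monotone submodular set function
# under a cardinality constraint. The paper reduces its sparse variational
# E-step to such a support-selection problem; a toy weighted-coverage
# objective stands in for the real objective here.

def greedy_select(ground_set, objective, k):
    """Pick k elements greedily by marginal gain. For monotone submodular
    objectives this attains at least (1 - 1/e) of the optimal value."""
    selected = set()
    for _ in range(k):
        best, best_gain = None, float("-inf")
        for e in ground_set:
            if e in selected:
                continue
            gain = objective(selected | {e}) - objective(selected)
            if gain > best_gain:
                best, best_gain = e, gain
        selected.add(best)
    return selected

# Toy submodular objective: number of "features" covered by chosen "items".
covers = {
    "a": {1, 2, 3},
    "b": {3, 4},
    "c": {4, 5, 6},
    "d": {1, 6},
}

def coverage(S):
    covered = set()
    for e in S:
        covered |= covers[e]
    return len(covered)

chosen = greedy_select(covers.keys(), coverage, k=2)  # picks "a" then "c"
```

Coverage functions like this one are a standard monotone submodular example: each element's marginal gain can only shrink as the selected set grows, which is exactly the diminishing-returns property the greedy guarantee relies on.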

Original language: English (US)
Pages (from-to): 453-461
Number of pages: 9
Journal: Journal of Machine Learning Research
Volume: 38
State: Published - Jan 1 2015
Externally published: Yes
Event: 18th International Conference on Artificial Intelligence and Statistics, AISTATS 2015 - San Diego, United States
Duration: May 9 2015 - May 12 2015

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence
