Sparse submodular probabilistic PCA

Rajiv Khanna, Joydeep Ghosh, Russell A. Poldrack, Oluwasanmi Oluseye Koyejo

Research output: Contribution to journal › Conference article

Abstract

We propose a novel approach to sparse probabilistic principal component analysis that combines a low-rank representation of the latent factors and loadings with a sparse variational inference approach for estimating distributions of latent variables subject to sparse support constraints. Inference and parameter estimation for the resulting model are achieved via expectation maximization, with a novel variational inference method for the E-step that induces sparsity. We show that this inference problem can be reduced to discrete optimal support selection. The discrete optimization is submodular; hence, greedy selection is guaranteed to achieve a (1 - 1/e) fraction of the optimum. Empirical studies indicate the effectiveness of the proposed approach for recovering a parsimonious decomposition compared to established baseline methods. We also evaluate our method against state-of-the-art methods on high-dimensional fMRI data and show that it performs as well as or better than these methods.
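The greedy guarantee mentioned in the abstract can be illustrated with a minimal sketch of greedy monotone submodular maximization. The set function below, f(S) = log det(I + C[S, S]) over a covariance C, is a standard monotone submodular surrogate for the variance captured by a support set; it is a hypothetical stand-in for the paper's actual E-step objective, not the authors' implementation.

```python
import numpy as np

def greedy_support_selection(cov, k):
    """Select a support set of size k by greedy submodular maximization.

    The objective f(S) = log det(I + C[S, S]) is monotone and submodular,
    so the greedy solution attains at least a (1 - 1/e) fraction of the
    optimal value (Nemhauser-Wolsey-Fisher guarantee).
    """
    d = cov.shape[0]

    def f(S):
        if not S:
            return 0.0
        sub = cov[np.ix_(S, S)]
        _, logdet = np.linalg.slogdet(np.eye(len(S)) + sub)
        return logdet

    support = []
    for _ in range(k):
        current = f(support)
        # Pick the index with the largest marginal gain.
        best_gain, best_j = -np.inf, None
        for j in range(d):
            if j in support:
                continue
            gain = f(support + [j]) - current
            if gain > best_gain:
                best_gain, best_j = gain, j
        support.append(best_j)
    return sorted(support)

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
C = A @ A.T  # rank-5 PSD covariance over 20 variables
print(greedy_support_selection(C, 3))
```

Each greedy step scans all remaining indices and adds the one with the largest marginal gain; submodularity (diminishing returns) is what makes this myopic rule near-optimal.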

Original language: English (US)
Pages (from-to): 453-461
Number of pages: 9
Journal: Journal of Machine Learning Research
Volume: 38
State: Published - Jan 1, 2015
Externally published: Yes
Event: 18th International Conference on Artificial Intelligence and Statistics, AISTATS 2015 - San Diego, United States
Duration: May 9, 2015 - May 12, 2015

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence

Cite this

Khanna, R., Ghosh, J., Poldrack, R. A., & Koyejo, O. O. (2015). Sparse submodular probabilistic PCA. Journal of Machine Learning Research, 38, 453-461.
