Abstract
We propose an approach to sparse probabilistic principal component analysis that combines a low-rank representation of the latent factors and loadings with a novel sparse variational inference method for estimating distributions of latent variables subject to sparse support constraints. Inference and parameter estimation for the resulting model are carried out via expectation maximization, with a sparsity-inducing variational inference method in the E-step. We show that this inference problem reduces to discrete optimal support selection. The resulting discrete optimization is submodular, so greedy selection is guaranteed to achieve at least a (1 - 1/e) fraction of the optimal objective value. Empirical studies demonstrate that the proposed approach recovers a more parsimonious decomposition than established baseline methods. We also evaluate our method against state-of-the-art methods on high-dimensional fMRI data, where it performs as well as or better than the alternatives.
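To make the greedy guarantee concrete, the following is a minimal, generic sketch of greedy support selection for a monotone submodular set function; it is not the paper's algorithm. The names `greedy_support_selection`, `score`, `d`, and `k` are illustrative placeholders, and the modular toy score stands in for the variational objective, which is not reproduced here.

```python
import numpy as np

def greedy_support_selection(score, d, k):
    """Greedily build a support set of size k over d coordinates.

    At each step, add the coordinate with the largest marginal gain
    score(S + [j]) - score(S). For a monotone submodular set function,
    the greedy solution attains at least a (1 - 1/e) fraction of the
    optimal value (Nemhauser, Wolsey & Fisher, 1978).
    """
    S, current = [], score([])
    for _ in range(k):
        # Marginal gain of each coordinate not yet in the support.
        gains = [(score(S + [j]) - current, j) for j in range(d) if j not in S]
        best_gain, best_j = max(gains)
        S.append(best_j)
        current += best_gain
    return S

# Toy example with a modular (hence submodular) score: greedy simply
# picks the k coordinates with the largest weights.
weights = np.array([0.1, 2.0, 0.5, 1.5, 0.2])
score = lambda S: float(weights[S].sum())
print(greedy_support_selection(score, d=5, k=2))  # [1, 3]
```

In the paper's setting, the score would be the variational objective evaluated on a candidate support; the sketch only illustrates why submodularity makes the greedy E-step tractable with a constant-factor guarantee.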
| Original language | English (US) |
|---|---|
| Pages (from-to) | 453-461 |
| Number of pages | 9 |
| Journal | Journal of Machine Learning Research |
| Volume | 38 |
| State | Published - 2015 |
| Externally published | Yes |
| Event | 18th International Conference on Artificial Intelligence and Statistics, AISTATS 2015 - San Diego, United States |
| Duration | May 9, 2015 → May 12, 2015 |
ASJC Scopus subject areas
- Software
- Control and Systems Engineering
- Statistics and Probability
- Artificial Intelligence