Abstract
Sparse Principal Component Analysis (PCA) is a prevalent tool across many sub-fields of applied statistics. While several results have characterized the recovery error of the principal eigenvectors, these are typically stated in the spectral or Frobenius norm. In this paper, we provide entrywise ℓ2,∞ bounds for Sparse PCA under a general high-dimensional subgaussian design. In particular, our results hold for any algorithm that selects the correct support with high probability, i.e., any algorithm that is sparsistent. Our bound improves upon known results by providing a finer characterization of the estimation error, and our proof uses techniques recently developed for entrywise subspace perturbation theory.
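The ℓ2,∞ (two-to-infinity) norm referred to in the abstract is the largest row-wise ℓ2 norm, so an ℓ2,∞ bound controls the worst-case error over individual coordinates (rows) of the estimated eigenvector matrix after an orthogonal alignment. The sketch below is only a toy illustration of this metric, not the paper's estimator or guarantee: the spiked-covariance parameters, the `l2inf_error` helper, and the oracle support-restriction step (standing in for any sparsistent support-selection algorithm) are all assumptions made for the example.

```python
import numpy as np

def l2inf_error(U_hat, U):
    """l_{2,inf} subspace error: the largest row-wise l2 norm of U_hat O - U,
    with O the orthogonal Procrustes alignment of U_hat to U."""
    W, _, Vt = np.linalg.svd(U_hat.T @ U)
    O = W @ Vt
    return np.max(np.linalg.norm(U_hat @ O - U, axis=1))

# Toy spiked-covariance model with a sparse leading eigenvector (illustrative only).
rng = np.random.default_rng(0)
n, p, s = 2000, 100, 5
v = np.zeros(p)
v[:s] = 1 / np.sqrt(s)                                  # sparse ground truth
X = rng.standard_normal((n, p)) + 3.0 * rng.standard_normal((n, 1)) * v
Sigma_hat = X.T @ X / n

# Oracle sparse PCA: restrict to the true support (a stand-in for any
# sparsistent support-selection step), then take the leading eigenvector.
support = np.arange(s)
w = np.linalg.eigh(Sigma_hat[np.ix_(support, support)])[1][:, -1]
v_hat = np.zeros(p)
v_hat[support] = w

print(l2inf_error(v_hat[:, None], v[:, None]))          # max per-coordinate error
```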
Original language | English (US) |
---|---|
Pages (from-to) | 6591-6629 |
Number of pages | 39 |
Journal | Proceedings of Machine Learning Research |
Volume | 151 |
State | Published - 2022 |
Externally published | Yes |
Event | 25th International Conference on Artificial Intelligence and Statistics (AISTATS 2022), Virtual/Online, Spain. Duration: Mar 28 2022 → Mar 30 2022 |
ASJC Scopus subject areas
- Artificial Intelligence
- Software
- Control and Systems Engineering
- Statistics and Probability