Abstract
We investigate the sample complexity of recovering tensors with low symmetric rank from symmetric rank-one measurements, a setting particularly motivated by the study of higher-order interactions in statistics and the analysis of two-layer polynomial neural networks. Using a covering number argument, we analyze the performance of the symmetric rank minimization program and establish near-optimal sample complexity bounds when the underlying distribution is log-concave. Our measurement model involves random symmetric rank-one tensors, leading to involved probability calculations. To address these challenges, we employ the Carbery-Wright inequality, a powerful tool for studying anti-concentration properties of random polynomials, and leverage orthogonal polynomial expansions. Additionally, we provide a sample complexity lower bound via Fano's inequality, and discuss broader implications of our results for two-layer polynomial networks.
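The measurement model described above can be sketched numerically. The snippet below is an illustrative toy (dimensions `n`, symmetric rank `R`, sample count `m`, and the Gaussian choice of measurement vectors are all assumptions for demonstration; the paper's log-concave setting is more general): a symmetric rank-`R` order-3 tensor is measured against random symmetric rank-one tensors, and each measurement reduces to a sum of cubed inner products.

```python
import numpy as np

rng = np.random.default_rng(0)
n, R, m = 5, 2, 20  # dimension, symmetric rank, number of measurements (illustrative)

# Ground-truth tensor T = sum_r v_r ⊗ v_r ⊗ v_r, so its symmetric rank is at most R.
V = rng.standard_normal((R, n))
T = sum(np.einsum("i,j,k->ijk", v, v, v) for v in V)

# Symmetric rank-one measurements: y_i = <T, a_i ⊗ a_i ⊗ a_i>.
# Gaussian a_i are one example of a log-concave measurement distribution.
A = rng.standard_normal((m, n))
y = np.array([np.einsum("ijk,i,j,k->", T, a, a, a) for a in A])

# Equivalent closed form: y_i = sum_r <v_r, a_i>^3 — each measurement is a
# degree-3 polynomial in a_i, which is why anti-concentration tools such as
# the Carbery-Wright inequality enter the analysis.
y_closed = ((A @ V.T) ** 3).sum(axis=1)
```

Recovery then seeks the tensor of minimal symmetric rank consistent with the pairs `(a_i, y_i)`; the program itself is not spelled out here.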
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 649-652 |
| Number of pages | 4 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 272 |
| State | Published - 2025 |
| Event | 36th International Conference on Algorithmic Learning Theory, ALT 2025, Milan, Italy, Feb 24-27 2025 |
Keywords
- covering numbers
- log-concave distributions
- low-rank
- rank minimization
- symmetric tensors
- tensor recovery
ASJC Scopus subject areas
- Artificial Intelligence
- Software
- Control and Systems Engineering
- Statistics and Probability