Information-Theoretic Guarantees for Recovering Low-Rank Tensors from Symmetric Rank-One Measurements

Research output: Contribution to journal › Conference article › peer-review

Abstract

We investigate the sample complexity of recovering tensors with low symmetric rank from symmetric rank-one measurements, a setting particularly motivated by the study of higher-order interactions in statistics and the analysis of two-layer polynomial neural networks. Using a covering number argument, we analyze the performance of the symmetric rank minimization program and establish near-optimal sample complexity bounds when the underlying distribution is log-concave. Our measurement model involves random symmetric rank-one tensors, leading to involved probability calculations. To address these challenges, we employ the Carbery-Wright inequality, a powerful tool for studying anti-concentration properties of random polynomials, and leverage orthogonal polynomial expansions. Additionally, we provide a sample complexity lower bound via Fano's inequality, and discuss broader implications of our results for two-layer polynomial networks.
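To make the measurement model concrete, a typical formalization is sketched below; the notation (the tensor \(\mathcal{T}\), the measurement vectors \(x_i\), the count \(m\)) is an assumption for illustration and is not drawn from the abstract itself.

```latex
% Sketch of the symmetric rank-one measurement model (illustrative notation).
% Unknown symmetric tensor T of order d with low symmetric rank; each
% measurement contracts T against the d-fold outer power of a random vector
% x_i drawn from a log-concave distribution:
%   y_i = <T, x_i^{\otimes d}>,  i = 1, ..., m.
% Recovery via the symmetric rank minimization program:
\begin{equation*}
\widehat{\mathcal{T}} \in \arg\min_{\mathcal{Z}} \ \operatorname{rank}_{\mathrm{sym}}(\mathcal{Z})
\quad \text{subject to} \quad
\langle \mathcal{Z},\, x_i^{\otimes d} \rangle = y_i, \quad i = 1, \dots, m.
\end{equation*}
```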

Original language: English (US)
Pages (from-to): 649-652
Number of pages: 4
Journal: Proceedings of Machine Learning Research
Volume: 272
State: Published - 2025
Event: 36th International Conference on Algorithmic Learning Theory, ALT 2025 - Milan, Italy
Duration: Feb 24 2025 - Feb 27 2025

Keywords

  • covering numbers
  • log-concave distributions
  • low-rank
  • rank minimization
  • symmetric tensors
  • tensor recovery

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability

