Tail inequalities for sums of random matrices that depend on the intrinsic dimension

Daniel Hsu, Sham M. Kakade, Tong Zhang

Research output: Contribution to journal › Article › peer-review

Abstract

This work provides exponential tail inequalities for sums of random matrices that depend only on intrinsic dimensions rather than explicit matrix dimensions. These tail inequalities are similar to the matrix versions of the Chernoff bound and Bernstein inequality except with the explicit matrix dimensions replaced by a trace quantity that can be small even when the explicit dimensions are large or infinite. Some applications to covariance estimation and approximate matrix multiplication are given to illustrate the utility of the new bounds.
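For context, the trace quantity referred to in the abstract is the intrinsic dimension (sometimes called the effective rank) of a positive semidefinite matrix. The sketch below is illustrative only; the notation (A, n, intdim) is assumed here rather than quoted from the paper, and the paper's actual bounds involve further conditions and constants.

\[
\mathrm{intdim}(A) \;=\; \frac{\operatorname{tr}(A)}{\lVert A \rVert},
\qquad A \ \text{symmetric positive semidefinite},\ A \neq 0,
\]
\[
1 \;\le\; \mathrm{intdim}(A) \;\le\; \operatorname{rank}(A) \;\le\; n,
\]

so a tail bound governed by intdim(A) can be much smaller than one governed by the ambient dimension n, and it stays finite (when the trace is finite) even if n is infinite.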

Original language: English (US)
Pages (from-to): 1-13
Number of pages: 13
Journal: Electronic Communications in Probability
Volume: 17
DOIs
State: Published - Jan 1 2012
Externally published: Yes

Keywords

  • Intrinsic dimension
  • Large deviation
  • Random matrix

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
