Fast low-rank approximation for covariance matrices

Mohamed Ali Belabbas, Patrick J. Wolfe

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Computing an efficient low-rank approximation of a given positive definite matrix is a ubiquitous task in statistical signal processing and numerical linear algebra. The optimal solution is well known and is given by the singular value decomposition; however, its complexity scales as the cube of the matrix dimension. Here we introduce a low-complexity alternative which approximates this optimal low-rank solution, together with a bound on its worst-case error. Our methodology also reveals a connection between the approximation of matrix products and Schur complements. We present simulation results that verify performance improvements relative to contemporary randomized algorithms for low-rank approximation.
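
The abstract does not spell out the algorithm, but the setting it describes (replacing the O(n^3) SVD-based optimal rank-k approximation of a positive definite matrix with a cheaper approximation whose error involves a Schur complement) can be illustrated with a Nyström-style column-sampling sketch. The Python snippet below is a minimal illustration under that assumption; it is not necessarily the authors' method, and the sampling scheme, error bound, and comparisons from the paper are not reproduced here.

import numpy as np

# Illustrative sketch only: Nystrom-style column sampling for a PSD matrix,
# compared against the optimal rank-k truncation from the eigendecomposition.
# The exact algorithm in the paper may differ.

rng = np.random.default_rng(0)

# Build a synthetic covariance matrix (n x n, positive semidefinite).
n, k = 200, 10
X = rng.standard_normal((n, 3 * n))
C = X @ X.T / (3 * n)

# Optimal rank-k approximation via the eigendecomposition (O(n^3) cost).
w, V = np.linalg.eigh(C)
idx = np.argsort(w)[::-1][:k]
C_opt = V[:, idx] @ np.diag(w[idx]) @ V[:, idx].T

# Nystrom-style approximation from k sampled columns:
#   C_hat = C[:, S] @ pinv(C[S, S]) @ C[S, :].
# On the unsampled block, the residual C - C_hat is the Schur complement
# of C[S, S] in C, which is the kind of link to Schur complements the
# abstract alludes to.
S = rng.choice(n, size=k, replace=False)
C_S = C[:, S]                # n x k block of sampled columns
W = C[np.ix_(S, S)]          # k x k principal submatrix
C_nys = C_S @ np.linalg.pinv(W) @ C_S.T

print("optimal rank-k error (Frobenius):", np.linalg.norm(C - C_opt, "fro"))
print("Nystrom rank-k error (Frobenius):", np.linalg.norm(C - C_nys, "fro"))

How close the sampled approximation gets to the optimal error depends on how well the chosen columns capture the dominant eigenspace; analyzing and bounding that gap is the kind of question the paper addresses.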

Original language: English (US)
Title of host publication: 2007 2nd IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, CAMSAP
Pages: 293-296
Number of pages: 4
DOIs
State: Published - 2007
Externally published: Yes
Event: 2007 2nd IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, CAMSAP - St. Thomas, Virgin Islands, U.S.
Duration: Dec 12, 2007 - Dec 14, 2007

Publication series

Name: 2007 2nd IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, CAMSAP

Other

Other: 2007 2nd IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, CAMSAP
Country/Territory: Virgin Islands, U.S.
City: St. Thomas
Period: 12/12/07 - 12/14/07

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Networks and Communications
  • Control and Systems Engineering
  • Electrical and Electronic Engineering
