Decentralized sketching of low-rank matrices

Rakshith S. Srinivasa, Kiryung Lee, Marius Junge, Justin Romberg

Research output: Contribution to journal › Conference article › peer-review

Abstract

We address a low-rank matrix recovery problem where each column of a rank-r matrix X ∈ ℝ^(d1×d2) is compressed beyond the point of individual recovery to ℝ^L with L ≪ d1. Leveraging the joint structure among the columns, we propose a method to recover the matrix to within an ε relative error in the Frobenius norm from a total of O(r(d1 + d2) log^6(d1 + d2)/ε^2) observations. This guarantee holds uniformly for all incoherent matrices of rank r. In our method, we propose to use a novel matrix norm called the mixed-norm along with the maximum ℓ2-norm of the columns to design a new convex relaxation for low-rank recovery that is tailored to our observation model. We also show that the proposed mixed-norm, the standard nuclear norm, and the max-norm are particular instances of convex regularization of low-rankness via tensor norms. Finally, we provide a scalable ADMM algorithm for the mixed-norm-based method and demonstrate its empirical performance via large-scale simulations.
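A minimal sketch of the column-wise compression model described in the abstract, not the authors' code: each column of a rank-r matrix X ∈ ℝ^(d1×d2) is observed only through its own short random sketch in ℝ^L with L ≪ d1. The specific dimensions and the Gaussian sketching matrices are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d1, d2, r, L = 200, 150, 5, 20   # L << d1: no column is individually recoverable

# Rank-r ground-truth matrix X = U V^T.
U = rng.standard_normal((d1, r))
V = rng.standard_normal((d2, r))
X = U @ V.T

# Decentralized sketching: column j is compressed by its own matrix A_j in R^{L x d1}.
A = rng.standard_normal((d2, L, d1)) / np.sqrt(L)
Y = np.stack([A[j] @ X[:, j] for j in range(d2)], axis=1)   # Y has shape (L, d2)

print(Y.shape)   # (20, 150): a total of L * d2 scalar observations of X
```

Recovery would then proceed by solving a convex program over all rank-r candidates consistent with Y, regularized by the mixed-norm described in the paper; the snippet above only illustrates the observation model.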

Original language: English (US)
Journal: Advances in Neural Information Processing Systems
Volume: 32
State: Published - 2019
Event: 33rd Annual Conference on Neural Information Processing Systems, NeurIPS 2019 - Vancouver, Canada
Duration: Dec 8 2019 – Dec 14 2019

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
