Abstract
Matrix factorization is a popular approach to large-scale matrix completion. In practice, the resulting optimization problem, even at very large scale, can be solved efficiently by standard optimization algorithms. However, because the factorization model makes the problem non-convex, there is limited theoretical understanding of whether these algorithms produce a good solution. In this paper, we establish a theoretical guarantee that the factorization-based formulation correctly recovers the underlying low-rank matrix. In particular, we show that, under conditions similar to those in previous works, many standard optimization algorithms converge to the global optima of a factorization-based formulation and recover the true low-rank matrix. We study the local geometry of a properly regularized objective and prove that any stationary point in a certain local region is globally optimal. A major difference from existing results is that we do not require resampling (i.e., using independent samples at each iteration) in either the algorithm or its analysis.
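To make the setting concrete, below is a minimal sketch of gradient descent on a plain factorization-based completion objective f(U, V) = 0.5 * ||P_Omega(U V^T - M)||_F^2, where P_Omega keeps only the observed entries. This is not the paper's algorithm or its regularized objective; the matrix sizes, rank, sampling rate, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n1, n2, r = 100, 80, 5

# Ground-truth low-rank matrix M and a random observation mask (the set Omega).
M = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))
mask = rng.random((n1, n2)) < 0.3      # True on observed entries (about 30% sampled)

# Small random initialization of the two factors; M is approximated by U @ V.T.
U = 0.1 * rng.standard_normal((n1, r))
V = 0.1 * rng.standard_normal((n2, r))

step = 5e-3
for _ in range(5000):
    R = mask * (U @ V.T - M)           # residual restricted to the observed entries
    gU, gV = R @ V, R.T @ U            # gradients of f with respect to U and V
    U -= step * gU
    V -= step * gV

rel_err = np.linalg.norm(U @ V.T - M) / np.linalg.norm(M)
print(f"relative recovery error: {rel_err:.2e}")
```

The same factorized objective can also be attacked with SGD (sampling one observed entry per update) or alternating minimization (solving a least-squares problem for U with V fixed, and vice versa); the paper's analysis concerns such standard algorithms applied to a suitably regularized version of this formulation.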
| Original language | English (US) |
| --- | --- |
| Article number | 7536166 |
| Pages (from-to) | 6535-6579 |
| Number of pages | 45 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 62 |
| Issue number | 11 |
| DOIs | |
| State | Published - Nov 2016 |
| Externally published | Yes |
Keywords
- Matrix completion
- Perturbation analysis
- SGD
- alternating minimization
- matrix factorization
- nonconvex optimization
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Library and Information Sciences