TY - JOUR
T1 - Guaranteed Matrix Completion via Non-Convex Factorization
AU - Sun, Ruoyu
AU - Luo, Zhi-Quan
N1 - Funding Information:
This work was supported in part by NSF under Grant CCF-1526434, in part by NSFC under Grant 61571384, and in part by a Doctoral Dissertation Fellowship from the Graduate School of the University of Minnesota. This paper was presented in part at the 2015 IEEE Symposium on Foundations of Computer Science (FOCS).
Publisher Copyright:
© 2016 IEEE.
PY - 2016/11
Y1 - 2016/11
N2 - Matrix factorization is a popular approach to large-scale matrix completion. In practice, the factorization-based optimization formulation can be solved very efficiently by standard optimization algorithms, even for problems of huge size. However, because the factorization model is non-convex, there is limited theoretical understanding of whether these algorithms produce a good solution. In this paper, we establish a theoretical guarantee that the factorization-based formulation correctly recovers the underlying low-rank matrix. In particular, we show that, under conditions similar to those in previous works, many standard optimization algorithms converge to the global optima of a factorization-based formulation and recover the true low-rank matrix. We study the local geometry of a properly regularized objective and prove that any stationary point in a certain local region is globally optimal. A major difference from existing results is that we do not require resampling (i.e., using independent samples at each iteration) in either the algorithm or its analysis.
AB - Matrix factorization is a popular approach to large-scale matrix completion. In practice, the factorization-based optimization formulation can be solved very efficiently by standard optimization algorithms, even for problems of huge size. However, because the factorization model is non-convex, there is limited theoretical understanding of whether these algorithms produce a good solution. In this paper, we establish a theoretical guarantee that the factorization-based formulation correctly recovers the underlying low-rank matrix. In particular, we show that, under conditions similar to those in previous works, many standard optimization algorithms converge to the global optima of a factorization-based formulation and recover the true low-rank matrix. We study the local geometry of a properly regularized objective and prove that any stationary point in a certain local region is globally optimal. A major difference from existing results is that we do not require resampling (i.e., using independent samples at each iteration) in either the algorithm or its analysis.
KW - Matrix completion
KW - Perturbation analysis
KW - SGD
KW - alternating minimization
KW - matrix factorization
KW - nonconvex optimization
UR - http://www.scopus.com/inward/record.url?scp=84994056035&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84994056035&partnerID=8YFLogxK
U2 - 10.1109/TIT.2016.2598574
DO - 10.1109/TIT.2016.2598574
M3 - Article
AN - SCOPUS:84994056035
VL - 62
SP - 6535
EP - 6579
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
SN - 0018-9448
IS - 11
M1 - 7536166
ER -