TY - JOUR
T1 - Sharp restricted isometry bounds for the inexistence of spurious local minima in nonconvex matrix recovery
AU - Zhang, Richard Y.
AU - Sojoudi, Somayeh
AU - Lavaei, Javad
N1 - Funding Information:
We are grateful to Salar Fattahi for a meticulous reading and detailed comments, and to Salar Fattahi and Cédric Josz for fruitful discussions. We thank two anonymous reviewers for helpful comments and for pointing out typos. This work was supported by grants from ONR, AFOSR, ARO, and NSF.
Publisher Copyright:
© 2019 Richard Y. Zhang, Somayeh Sojoudi, and Javad Lavaei.
PY - 2019/6/1
Y1 - 2019/6/1
N2 - Nonconvex matrix recovery is known to contain no spurious local minima under a restricted isometry property (RIP) with a sufficiently small RIP constant δ. If δ is too large, however, then counterexamples containing spurious local minima are known to exist. In this paper, we introduce a proof technique that is capable of establishing sharp thresholds on δ to guarantee the inexistence of spurious local minima. Using the technique, we prove that in the case of a rank-1 ground truth, an RIP constant of δ < 1/2 is both necessary and sufficient for exact recovery from any arbitrary initial point (such as a random point). We also prove a local recovery result: Given an initial point x0 satisfying f(x0) ≤ (1 − δ)²f(0), any descent algorithm that converges to second-order optimality guarantees exact recovery.
AB - Nonconvex matrix recovery is known to contain no spurious local minima under a restricted isometry property (RIP) with a sufficiently small RIP constant δ. If δ is too large, however, then counterexamples containing spurious local minima are known to exist. In this paper, we introduce a proof technique that is capable of establishing sharp thresholds on δ to guarantee the inexistence of spurious local minima. Using the technique, we prove that in the case of a rank-1 ground truth, an RIP constant of δ < 1/2 is both necessary and sufficient for exact recovery from any arbitrary initial point (such as a random point). We also prove a local recovery result: Given an initial point x0 satisfying f(x0) ≤ (1 − δ)²f(0), any descent algorithm that converges to second-order optimality guarantees exact recovery.
KW - Matrix factorization
KW - Matrix sensing
KW - Nonconvex optimization
KW - Restricted isometry property
KW - Spurious local minima
UR - http://www.scopus.com/inward/record.url?scp=85072612921&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85072612921&partnerID=8YFLogxK
M3 - Article
AN - SCOPUS:85072612921
SN - 1532-4435
VL - 20
JO - Journal of Machine Learning Research
JF - Journal of Machine Learning Research
ER -