TY - JOUR
T1 - A theory on the absence of spurious solutions for nonconvex and nonsmooth optimization
AU - Josz, C.
AU - Ouyang, Y.
AU - Zhang, R. Y.
AU - Lavaei, J.
AU - Sojoudi, S.
N1 - Funding Information:
This work was supported by the ONR Awards N00014-17-1-2933 and N00014-18-1-2526, NSF Award 1808859, DARPA Award D16AP00002, and AFOSR Award FA9550-17-1-0163. We wish to thank the anonymous reviewers for their valuable feedback, as well as Chris Dock for fruitful discussions.
Publisher Copyright:
© 2018 Curran Associates Inc. All rights reserved.
PY - 2018/1/1
Y1 - 2018/1/1
N2 - We study the set of continuous functions that admit no spurious local optima (i.e. local minima that are not global minima), which we term global functions. They satisfy various powerful properties for analyzing nonconvex and nonsmooth optimization problems. For instance, they satisfy a theorem akin to the fundamental uniform limit theorem in analysis regarding continuous functions. Global functions are also endowed with useful properties regarding the composition of functions and change of variables. Using these new results, we show that a class of nonconvex and nonsmooth optimization problems arising in tensor decomposition applications are global functions. This is the first result concerning nonconvex methods for nonsmooth objective functions. Our result provides a theoretical guarantee for the widely-used ℓ1 norm to avoid outliers in nonconvex optimization.
AB - We study the set of continuous functions that admit no spurious local optima (i.e. local minima that are not global minima), which we term global functions. They satisfy various powerful properties for analyzing nonconvex and nonsmooth optimization problems. For instance, they satisfy a theorem akin to the fundamental uniform limit theorem in analysis regarding continuous functions. Global functions are also endowed with useful properties regarding the composition of functions and change of variables. Using these new results, we show that a class of nonconvex and nonsmooth optimization problems arising in tensor decomposition applications are global functions. This is the first result concerning nonconvex methods for nonsmooth objective functions. Our result provides a theoretical guarantee for the widely-used ℓ1 norm to avoid outliers in nonconvex optimization.
UR - http://www.scopus.com/inward/record.url?scp=85064821954&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85064821954&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85064821954
VL - 2018-December
SP - 2441
EP - 2449
JO - Advances in Neural Information Processing Systems
JF - Advances in Neural Information Processing Systems
SN - 1049-5258
T2 - 32nd Conference on Neural Information Processing Systems, NeurIPS 2018
Y2 - 2 December 2018 through 8 December 2018
ER -