Suboptimal Local Minima Exist for Wide Neural Networks with Smooth Activations

Tian Ding, Dawei Li, Ruoyu Sun

Research output: Contribution to journal › Article › peer-review


Does a large width eliminate all suboptimal local minima for neural nets? An affirmative answer was given by a classic result published in 1995 for one-hidden-layer wide neural nets with a sigmoid activation function, but this result has not been extended to the multilayer case. Recently, it was shown that, with piecewise linear activations, suboptimal local minima exist even for wide nets. Given the classic positive result on smooth activations and the negative result on nonsmooth activations, an interesting open question is: Does a large width eliminate all suboptimal local minima for deep neural nets with smooth activations? In this paper, we give a largely negative answer to this question. Specifically, we prove that, for neural networks with generic input data and smooth nonlinear activation functions, suboptimal local minima can exist no matter how wide the network is (as long as the last hidden layer has at least two neurons). Therefore, the classic result that wide one-hidden-layer networks have no suboptimal local minima does not hold in general. Whereas this classic result assumes a sigmoid activation, our counterexample covers a large class of activation functions (dense in the set of continuous functions), indicating that the limitation is not a consequence of the specific activation. Together with recent progress on piecewise linear activations, our result indicates that suboptimal local minima are common for wide neural nets.
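To make the setting concrete, here is a minimal sketch (not the paper's construction) of the model family the abstract discusses: a one-hidden-layer network with a smooth activation, a hidden layer of width m much larger than the number of samples n, and the empirical squared loss whose landscape is in question. All names and the random data below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    # A smooth nonlinear activation, as in the classic 1995 one-hidden-layer result.
    return 1.0 / (1.0 + np.exp(-z))

def one_hidden_layer(x, W, b, v):
    # f(x) = v^T sigmoid(W x + b): a width-m, one-hidden-layer network.
    return v @ sigmoid(W @ x + b)

def empirical_loss(X, y, W, b, v):
    # Mean squared error over the n training pairs; this is the loss landscape
    # whose suboptimal local minima the paper studies.
    preds = np.array([one_hidden_layer(x, W, b, v) for x in X])
    return np.mean((preds - y) ** 2)

# Generic data: n samples in R^d, and a wide hidden layer with m >> n neurons
# (the paper's counterexample only needs the last hidden layer to have >= 2 neurons).
n, d, m = 5, 3, 50
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)
W = rng.standard_normal((m, d))
b = rng.standard_normal(m)
v = 0.1 * rng.standard_normal(m)

print(empirical_loss(X, y, W, b, v))
```

Whether gradient descent on this loss can get stuck at a suboptimal point, despite the large width, is exactly the question the paper answers in the affirmative for generic smooth activations.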

Original language: English (US)
Pages (from-to): 2784-2814
Number of pages: 31
Journal: Mathematics of Operations Research
Issue number: 4
State: Published - Nov 2022


Keywords

  • deep learning
  • landscape
  • local minimum
  • neural network

ASJC Scopus subject areas

  • General Mathematics
  • Computer Science Applications
  • Management Science and Operations Research


