Adding one neuron can eliminate all bad local minima

Shiyu Liang, Ruoyu Sun, Jason D. Lee, Rayadurgam Srikant

Research output: Contribution to journal › Conference article

Abstract

One of the main difficulties in analyzing neural networks is the non-convexity of the loss function which may have many bad local minima. In this paper, we study the landscape of neural networks for binary classification tasks. Under mild assumptions, we prove that after adding one special neuron with a skip connection to the output, or one special neuron per layer, every local minimum is a global minimum.
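As a minimal sketch of the construction described in the abstract: the network output is augmented with a single exponential neuron connected directly (via a skip connection) to the output, and the loss gains a quadratic regularizer on that neuron's outgoing weight. The toy one-hidden-layer ReLU network, the logistic loss, and names such as `augmented_output` are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-classification data with labels in {-1, +1}.
X = rng.normal(size=(64, 5))
y = np.sign(X @ rng.normal(size=5) + 0.1 * rng.normal(size=64))

def base_net(X, W, v):
    """A toy one-hidden-layer ReLU network f(x; theta)."""
    return np.maximum(X @ W, 0.0) @ v

def augmented_output(X, W, v, a, w, b):
    """f(x) plus one exponential neuron a * exp(w^T x + b)
    skip-connected straight to the output."""
    return base_net(X, W, v) + a * np.exp(X @ w + b)

def loss(X, y, W, v, a, w, b, lam=0.1):
    """Logistic loss on margins y * f~(x), plus the quadratic
    regularizer lam * a**2 on the extra neuron's weight."""
    margins = y * augmented_output(X, W, v, a, w, b)
    return np.mean(np.log1p(np.exp(-margins))) + lam * a**2

W = rng.normal(size=(5, 8))
v = rng.normal(size=8)
# With a = 0 the extra neuron is inactive and the loss reduces
# to the plain logistic loss of the base network.
print(loss(X, y, W, v, a=0.0, w=np.zeros(5), b=0.0))
```

At any global minimum of the regularized loss, the paper shows the extra neuron's weight is driven to zero, so the augmented landscape has no bad local minima while the global minima coincide with those of the original network.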

Original language: English (US)
Pages (from-to): 4350-4360
Number of pages: 11
Journal: Advances in Neural Information Processing Systems
Volume: 2018-December
State: Published - Jan 1 2018
Event: 32nd Conference on Neural Information Processing Systems, NeurIPS 2018 - Montreal, Canada
Duration: Dec 2 2018 - Dec 8 2018


ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

Cite this

Adding one neuron can eliminate all bad local minima. / Liang, Shiyu; Sun, Ruoyu; Lee, Jason D.; Srikant, Rayadurgam.

In: Advances in Neural Information Processing Systems, Vol. 2018-December, 01.01.2018, p. 4350-4360.

Research output: Contribution to journal › Conference article

@article{48d46cda0609408d8a5fea82d8fd3c38,
title = "Adding one neuron can eliminate all bad local minima",
abstract = "One of the main difficulties in analyzing neural networks is the non-convexity of the loss function which may have many bad local minima. In this paper, we study the landscape of neural networks for binary classification tasks. Under mild assumptions, we prove that after adding one special neuron with a skip connection to the output, or one special neuron per layer, every local minimum is a global minimum.",
author = "Liang, Shiyu and Sun, Ruoyu and Lee, {Jason D.} and Srikant, Rayadurgam",
year = "2018",
month = jan,
day = "1",
language = "English (US)",
volume = "2018-December",
pages = "4350--4360",
journal = "Advances in Neural Information Processing Systems",
issn = "1049-5258",

}

TY - JOUR

T1 - Adding one neuron can eliminate all bad local minima

AU - Liang, Shiyu

AU - Sun, Ruoyu

AU - Lee, Jason D.

AU - Srikant, Rayadurgam

PY - 2018/1/1

Y1 - 2018/1/1

N2 - One of the main difficulties in analyzing neural networks is the non-convexity of the loss function which may have many bad local minima. In this paper, we study the landscape of neural networks for binary classification tasks. Under mild assumptions, we prove that after adding one special neuron with a skip connection to the output, or one special neuron per layer, every local minimum is a global minimum.

AB - One of the main difficulties in analyzing neural networks is the non-convexity of the loss function which may have many bad local minima. In this paper, we study the landscape of neural networks for binary classification tasks. Under mild assumptions, we prove that after adding one special neuron with a skip connection to the output, or one special neuron per layer, every local minimum is a global minimum.

UR - http://www.scopus.com/inward/record.url?scp=85064842273&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85064842273&partnerID=8YFLogxK

M3 - Conference article

AN - SCOPUS:85064842273

VL - 2018-December

SP - 4350

EP - 4360

JO - Advances in Neural Information Processing Systems

JF - Advances in Neural Information Processing Systems

SN - 1049-5258

ER -