Guided Learning of Nonconvex Models through Successive Functional Gradient Optimization

Research output: Contribution to journal › Conference article › peer-review

Abstract

This paper presents a framework of successive functional gradient optimization for training nonconvex models such as neural networks, in which training is driven by mirror descent in a function space. We provide a theoretical analysis and an empirical study of the training method derived from this framework, and show that it achieves better performance than standard training techniques.
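The abstract's core idea, training driven by a gradient step in function space followed by fitting the parametric model toward the result, can be illustrated with a minimal sketch. This is a hypothetical toy version on a linear least-squares problem, not the paper's exact algorithm: in each round, the current function values take a functional gradient step toward lower loss, producing "guide" values, and an inner loop then moves the parameters so the model tracks those guide values.

```python
import numpy as np

# Hypothetical sketch of successive functional gradient optimization
# (assumed simplification, not the paper's algorithm). Squared loss,
# linear model f(x) = x @ w, synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = X @ w_true + 0.1 * rng.normal(size=200)

w = np.zeros(5)                       # model parameters
eta_func, eta_param, inner_steps = 0.5, 0.1, 50

for _round in range(20):
    f = X @ w                         # current function values on the data
    # Functional gradient of L = 0.5*||f - y||^2 w.r.t. f is (f - y);
    # step in function space to obtain guide values.
    guide = f - eta_func * (f - y)
    # Inner loop: fit the parametric model to the guide values.
    for _ in range(inner_steps):
        grad_w = X.T @ (X @ w - guide) / len(y)
        w -= eta_param * grad_w

# After the guided rounds, w approximates the least-squares solution.
```

Each outer round shrinks the function-space distance to the target, while the inner loop keeps the model within its parametric class; the paper's mirror-descent view generalizes the plain Euclidean step used here.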

Original language: English (US)
Journal: Proceedings of Machine Learning Research
Volume: 119
State: Published - 2020
Externally published: Yes
Event: 37th International Conference on Machine Learning, ICML 2020 - Virtual, Online
Duration: Jul 13 2020 - Jul 18 2020

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence
