Accelerating nonconvex learning via replica exchange Langevin diffusion

Yi Chen, Jinglin Chen, Jing Dong, Jian Peng, Zhaoran Wang

Research output: Contribution to conference › Paper

Abstract

Langevin diffusion is a powerful method for nonconvex optimization, which enables the escape from local minima by injecting noise into the gradient. In particular, the temperature parameter controlling the noise level gives rise to a tradeoff between "global exploration" and "local exploitation", which correspond to high and low temperatures, respectively. To attain the advantages of both regimes, we propose to use replica exchange, which swaps between two Langevin diffusions with different temperatures. We theoretically analyze the acceleration effect of replica exchange from two perspectives: (i) the convergence in χ²-divergence, and (ii) the large deviation principle. Such an acceleration effect allows us to approach the global minima faster. Furthermore, by discretizing the replica exchange Langevin diffusion, we obtain a discrete-time algorithm. For this algorithm, we quantify its discretization error in theory and demonstrate its acceleration effect in practice.
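To make the swap mechanism concrete, below is a minimal NumPy sketch of the discretized dynamics described in the abstract. The function name replica_exchange_langevin, the parameters eta, tau_low, tau_high, and n_steps, and the double-well example are all illustrative assumptions rather than the paper's implementation; the swap uses the standard Metropolis-style replica exchange acceptance probability, whereas the paper's discrete-time algorithm and its error analysis may differ in details.

```python
import numpy as np

def replica_exchange_langevin(f, grad_f, x_low, x_high, tau_low=0.05,
                              tau_high=1.0, eta=1e-3, n_steps=20_000,
                              rng=None):
    """Sketch of discretized replica exchange Langevin dynamics.

    The low-temperature replica exploits local structure while the
    high-temperature replica explores; after every Euler-Maruyama step
    the two positions are swapped with probability
    min(1, exp[(1/tau_low - 1/tau_high) * (f(x_low) - f(x_high))]),
    so lower-energy states tend to migrate to the cold chain.
    """
    rng = np.random.default_rng() if rng is None else rng
    for _ in range(n_steps):
        # Euler-Maruyama discretization of each Langevin diffusion:
        # a gradient step plus Gaussian noise scaled by sqrt(2 * eta * tau).
        x_low = (x_low - eta * grad_f(x_low)
                 + np.sqrt(2 * eta * tau_low) * rng.standard_normal(x_low.shape))
        x_high = (x_high - eta * grad_f(x_high)
                  + np.sqrt(2 * eta * tau_high) * rng.standard_normal(x_high.shape))
        # Metropolis-style swap: accept with probability min(1, exp(log_s)).
        log_s = float((1.0 / tau_low - 1.0 / tau_high) * (f(x_low) - f(x_high)))
        if rng.uniform() < np.exp(min(0.0, log_s)):
            x_low, x_high = x_high, x_low
    return x_low  # the cold replica approximates a global minimizer

# Example: a double well whose global minimum is near x = -1; the cold
# replica starts in the basin of the spurious local minimum near x = +1.
f = lambda x: (x**2 - 1.0)**2 + 0.3 * x
grad_f = lambda x: 4.0 * x * (x**2 - 1.0) + 0.3
x_star = replica_exchange_langevin(f, grad_f, np.array([2.0]), np.array([-2.0]))
```

Without the swap step this reduces to two independent Langevin diffusions; the exchange is what lets the cold chain inherit the hot chain's exploration, which is the acceleration effect the paper quantifies via the χ²-divergence and large deviation analyses.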

Original language: English (US)
State: Published - Jan 1, 2019
Event: 7th International Conference on Learning Representations, ICLR 2019 - New Orleans, United States
Duration: May 6, 2019 – May 9, 2019

Conference

Conference: 7th International Conference on Learning Representations, ICLR 2019
Country: United States
City: New Orleans
Period: 5/6/19 – 5/9/19


ASJC Scopus subject areas

  • Education
  • Computer Science Applications
  • Linguistics and Language
  • Language and Linguistics

Cite this

Chen, Y., Chen, J., Dong, J., Peng, J., & Wang, Z. (2019). Accelerating nonconvex learning via replica exchange Langevin diffusion. Paper presented at 7th International Conference on Learning Representations, ICLR 2019, New Orleans, United States.
