Illinois Experts
On the convergence of a class of Adam-type algorithms for non-convex optimization
Xiangyi Chen, Sijia Liu, Ruoyu Sun, Mingyi Hong
Industrial and Enterprise Systems Engineering
Coordinated Science Lab
Electrical and Computer Engineering
Research output: Contribution to conference › Paper › peer-review
Fingerprint
Dive into the research topics of 'On the convergence of a class of Adam-type algorithms for non-convex optimization'. Together they form a unique fingerprint.
Keyphrases
Adagrad (100%), Adams Method (100%), Nonconvex Optimization (100%), Adaptive Gradient (50%), AMSGrad (50%), Convergence Analysis (50%), Convergence Behavior (50%), Convergence Rate (50%), First-order (50%), Gradient-based (50%), Learning Rate (50%), Momentum Algorithm (50%), Nonconvex (50%), Nonconvex Problem (50%), Nonconvex Stochastic Optimization (50%), Popular (50%), Search Direction (50%), Training Deep Neural Networks (50%)
Computer Science
Analysis Framework (100%), Convergence Behavior (100%), Convergence Rate (100%), Convex Optimization (100%), Deep Neural Network (100%), Learning Rate (100%), Mathematical Convergence (100%), Search Direction (100%), Stochastic Optimization (100%), Sufficient Condition (100%)
Mathematics
Type Method (100%), Convergence Analysis (50%), Convergence Behavior (50%), Convergence Rate (50%), Deep Neural Network (50%), Nonconvex Problem (50%), Open Question (50%), Search Direction (50%), Stochastics (50%), Sufficient Condition (50%)
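The fingerprint above refers to Adam-type methods such as Adagrad and AMSGrad, which combine a momentum-based search direction with a coordinate-wise adaptive learning rate. As context, here is a minimal sketch of one step of the standard Adam update (the textbook form with bias correction — an illustration only, not the paper's exact class of algorithms or its analysis):

```python
import math

def adam_type_step(grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One step of a generic Adam-type update for a scalar parameter.

    grad : current stochastic gradient
    m, v : running first- and second-moment estimates
    t    : 1-based step counter (used for bias correction)
    Returns (step, m, v); the parameter update is x <- x - step.
    """
    m = beta1 * m + (1 - beta1) * grad          # momentum term (search direction)
    v = beta2 * v + (1 - beta2) * grad * grad   # second moment (adaptive scaling)
    m_hat = m / (1 - beta1 ** t)                # bias-corrected estimates
    v_hat = v / (1 - beta2 ** t)
    step = lr * m_hat / (math.sqrt(v_hat) + eps)
    return step, m, v
```

Variants in this family differ mainly in how `v` is maintained: Adagrad accumulates all past squared gradients, while AMSGrad keeps a running maximum of `v` so the effective learning rate is non-increasing.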