Illinois Experts
The Dynamics of Gradient Descent for Overparametrized Neural Networks
Siddhartha Satpathi, R. Srikant
Electrical and Computer Engineering
Siebel School of Computing and Data Science
Coordinated Science Lab
Office of the Vice Chancellor for Research and Innovation
Research output: Contribution to journal › Conference article › peer-review
Fingerprint
Dive into the research topics of 'The Dynamics of Gradient Descent for Overparametrized Neural Networks'. Together they form a unique fingerprint.
Keyphrases
Alternative Proof (20%)
Gradient Descent (100%)
Linear Approximation (20%)
Loss Function (20%)
Lyapunov Analysis (20%)
Minimum Norm Solution (20%)
Neural Network (20%)
Neural Network Weights (20%)
Overparametrized Neural Network (100%)
Parameter Values (20%)
Prediction Function (20%)
Single Hidden Layer Neural Network (20%)
Squared Loss (20%)
Training Error (40%)
Zero-training (20%)
Mathematics
Function Prediction (25%)
Initial Condition (25%)
Linear Approximation (25%)
Loss Function (25%)
Network Weight (25%)
Neural Network (100%)
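The keyphrases above (gradient descent, squared loss, zero training error, minimum norm solution) point to a well-known phenomenon in overparametrized models: gradient descent on the squared loss, started from zero, drives the training error to zero and converges to the minimum-norm interpolating solution. A minimal NumPy sketch for the linear (overparametrized least-squares) case, illustrative only and not the paper's own experiment:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 20            # overparametrized: more parameters (d) than samples (n)
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

# Gradient descent on the squared loss L(w) = 0.5 * ||X w - y||^2,
# initialized at w = 0 so iterates stay in the row space of X.
w = np.zeros(d)
lr = 0.01               # small enough for lr < 2 / lambda_max(X^T X) here
for _ in range(20000):
    w -= lr * X.T @ (X @ w - y)

# The minimum-norm solution of the underdetermined system X w = y.
w_min_norm = np.linalg.pinv(X) @ y

print(np.allclose(X @ w, y, atol=1e-6))        # training error reaches zero
print(np.allclose(w, w_min_norm, atol=1e-6))   # GD finds the min-norm solution
```

Because the initialization is zero and each gradient step lies in the row space of X, the limit point is the interpolating solution of smallest Euclidean norm, matching the pseudoinverse solution.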