Illinois Experts
Generalization Bounds via Distillation
Daniel Hsu, Ziwei Ji, Matus Telgarsky, Lan Wang
Siebel School of Computing and Data Science
Electrical and Computer Engineering
Research output: Contribution to conference › Paper › peer-review
Fingerprint
Dive into the research topics of 'Generalization Bounds via Distillation'. Together they form a unique fingerprint.
Keyphrases
Generalization Bounds (100%), Distillation (100%), Well-behaved (25%), High Complexity (25%), Reduction Method (25%), Generalization Performance (25%), Complex Networks (25%), Uniform Convergence (25%), Convergence Analysis (25%), Data Augmentation (25%), Empirical Phenomena (25%), MNIST (25%), Poor Generalization (25%), Convolutional Layer (25%), CIFAR-10 (25%), Skip Connection (25%), Fully Connected Layer (25%), Computational Graph (25%)
Computer Science
Mathematical Convergence (100%), Data Augmentation (100%), Generalization Performance (100%), Convolutional Layer (100%), Computation Graph (100%), Fully Connected Layer (100%)
Engineering
Original Network (100%), Convolutional Layer (50%), Concrete Form (50%)