Information-theoretic analysis of generalization capability of learning algorithms

Aolin Xu, Maxim Raginsky

Research output: Contribution to journal › Conference article

Abstract

We derive upper bounds on the generalization error of a learning algorithm in terms of the mutual information between its input and output. The bounds provide an information-theoretic understanding of generalization in learning problems, and give theoretical guidelines for striking the right balance between data fit and generalization by controlling the input-output mutual information. We propose a number of methods for this purpose, among which are algorithms that regularize the ERM algorithm with relative entropy or with random noise. Our work extends and leads to nontrivial improvements on the recent results of Russo and Zou.
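For context, a brief sketch of the result summarized in the abstract may be useful. Stated under the paper's standing assumption that the loss ℓ(w, Z) is σ-subgaussian under the data distribution μ for every hypothesis w, the headline bound controls the expected generalization error of a learning algorithm P_{W|S} (mapping an n-sample S to a hypothesis W) by the input-output mutual information I(S; W); the second display records the relative-entropy-regularized ERM (empirical risk minimization) mentioned in the abstract, which takes the form of a Gibbs algorithm. The notation below is condensed from the paper.

% Headline bound: if \ell(w, Z) is \sigma-subgaussian under \mu for every w,
% the expected gap between the population risk L_\mu(W) and the empirical
% risk L_S(W) is controlled by the mutual information between the input
% sample S and the output hypothesis W:
\[
  \bigl|\,\mathbb{E}\bigl[L_\mu(W) - L_S(W)\bigr]\,\bigr|
  \;\le\; \sqrt{\frac{2\sigma^2}{n}\, I(S;W)}.
\]
% Relative-entropy-regularized ERM: penalizing the empirical risk by
% (1/\beta)\, D(P_{W|S} \,\|\, Q) for a prior distribution Q on hypotheses
% yields the Gibbs algorithm, whose output distribution is
\[
  P^{*}_{W|S}(\mathrm{d}w) \;=\;
  \frac{e^{-\beta L_S(w)}\, Q(\mathrm{d}w)}
       {\mathbb{E}_{Q}\bigl[e^{-\beta L_S(W)}\bigr]},
\]
% where larger \beta favors data fit and smaller \beta caps I(S;W),
% trading empirical risk against generalization.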

Original language: English (US)
Pages (from-to): 2525-2534
Number of pages: 10
Journal: Advances in Neural Information Processing Systems
Volume: 2017-December
State: Published - Jan 1 2017
Event: 31st Annual Conference on Neural Information Processing Systems, NIPS 2017 - Long Beach, United States
Duration: Dec 4 2017 - Dec 9 2017

Fingerprint

  • Information analysis
  • Learning algorithms
  • Entropy

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

Cite this

Information-theoretic analysis of generalization capability of learning algorithms. / Xu, Aolin; Raginsky, Maxim.

In: Advances in Neural Information Processing Systems, Vol. 2017-December, 01.01.2017, p. 2525-2534.

Research output: Contribution to journal › Conference article

@article{81d8cccbbe8946029a15938a0adf6b88,
title = "Information-theoretic analysis of generalization capability of learning algorithms",
abstract = "We derive upper bounds on the generalization error of a learning algorithm in terms of the mutual information between its input and output. The bounds provide an information-theoretic understanding of generalization in learning problems, and give theoretical guidelines for striking the right balance between data fit and generalization by controlling the input-output mutual information. We propose a number of methods for this purpose, among which are algorithms that regularize the ERM algorithm with relative entropy or with random noise. Our work extends and leads to nontrivial improvements on the recent results of Russo and Zou.",
author = "Aolin Xu and Maxim Raginsky",
year = "2017",
month = "1",
day = "1",
language = "English (US)",
volume = "2017-December",
pages = "2525--2534",
journal = "Advances in Neural Information Processing Systems",
issn = "1049-5258",

}

TY - JOUR

T1 - Information-theoretic analysis of generalization capability of learning algorithms

AU - Xu, Aolin

AU - Raginsky, Maxim

PY - 2017/1/1

Y1 - 2017/1/1

N2 - We derive upper bounds on the generalization error of a learning algorithm in terms of the mutual information between its input and output. The bounds provide an information-theoretic understanding of generalization in learning problems, and give theoretical guidelines for striking the right balance between data fit and generalization by controlling the input-output mutual information. We propose a number of methods for this purpose, among which are algorithms that regularize the ERM algorithm with relative entropy or with random noise. Our work extends and leads to nontrivial improvements on the recent results of Russo and Zou.

AB - We derive upper bounds on the generalization error of a learning algorithm in terms of the mutual information between its input and output. The bounds provide an information-theoretic understanding of generalization in learning problems, and give theoretical guidelines for striking the right balance between data fit and generalization by controlling the input-output mutual information. We propose a number of methods for this purpose, among which are algorithms that regularize the ERM algorithm with relative entropy or with random noise. Our work extends and leads to nontrivial improvements on the recent results of Russo and Zou.

UR - http://www.scopus.com/inward/record.url?scp=85047009011&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85047009011&partnerID=8YFLogxK

M3 - Conference article

AN - SCOPUS:85047009011

VL - 2017-December

SP - 2525

EP - 2534

JO - Advances in Neural Information Processing Systems

JF - Advances in Neural Information Processing Systems

SN - 1049-5258

ER -