Semi-supervised training of Gaussian mixture models by conditional entropy minimization

Research output: Contribution to conference › Paper

Abstract

In this paper, we propose a new semi-supervised training method for Gaussian mixture models. We add a conditional entropy minimizer to the maximum mutual information criterion, which makes it possible to incorporate unlabeled data into discriminative training. The method is simple but surprisingly effective. The preconditioned conjugate gradient method provides a reasonable convergence rate for the parameter updates. Phonetic classification experiments on the TIMIT corpus demonstrate significant improvements due to unlabeled data under the proposed criterion.
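
For readers of this record, the criterion the abstract describes can be sketched as follows. This reconstruction is based on the abstract alone, so the trade-off weight \lambda and the set symbols \mathcal{L} (labeled) and \mathcal{U} (unlabeled) are assumed notation, not the paper's own:

J(\theta) = \sum_{(x,c) \in \mathcal{L}} \log p_\theta(c \mid x) \;-\; \lambda \sum_{x \in \mathcal{U}} H_\theta(c \mid x), \qquad H_\theta(c \mid x) = -\sum_{c'} p_\theta(c' \mid x) \log p_\theta(c' \mid x).

The first term is the maximum mutual information (conditional likelihood) objective on labeled frames; the second pushes the class posterior on unlabeled frames toward low entropy, i.e., confident decisions. A minimal NumPy sketch of this objective, with a single diagonal-covariance Gaussian per class standing in for the paper's GMMs, might look like:

import numpy as np

def log_gauss(X, mean, var):
    # Log density of a diagonal-covariance Gaussian, evaluated row-wise.
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (X - mean) ** 2 / var, axis=-1)

def posteriors(X, means, variances, priors):
    # Class posteriors p(c | x) via Bayes' rule, computed in the log domain
    # and shifted by the row-wise max for numerical stability.
    log_joint = np.stack(
        [np.log(priors[c]) + log_gauss(X, means[c], variances[c])
         for c in range(len(priors))], axis=1)
    log_joint -= log_joint.max(axis=1, keepdims=True)
    p = np.exp(log_joint)
    return p / p.sum(axis=1, keepdims=True)

def objective(X_lab, y_lab, X_unlab, means, variances, priors, lam=0.5):
    # MMI term on labeled data minus lam times the conditional entropy of
    # the posteriors on unlabeled data; the result is to be maximized.
    # lam is an assumed trade-off weight, not a value from the paper.
    p_lab = posteriors(X_lab, means, variances, priors)
    mmi = np.sum(np.log(p_lab[np.arange(len(y_lab)), y_lab] + 1e-12))
    p_un = posteriors(X_unlab, means, variances, priors)
    cond_entropy = -np.sum(p_un * np.log(p_un + 1e-12))
    return mmi - lam * cond_entropy

The abstract's preconditioned conjugate gradient step would then ascend this objective in the model parameters; any gradient-based optimizer over (means, variances, priors) illustrates the same idea.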

Original language: English (US)
Pages: 1353-1356
Number of pages: 4
State: Published - Dec 1, 2010
Event: 11th Annual Conference of the International Speech Communication Association: Spoken Language Processing for All, INTERSPEECH 2010 - Makuhari, Chiba, Japan
Duration: Sep 26, 2010 - Sep 30, 2010

Keywords

  • Conditional entropy
  • Gaussian Mixture Models
  • Phonetic classification
  • Semi-supervised learning

ASJC Scopus subject areas

  • Language and Linguistics
  • Speech and Hearing

Cite this

Huang, J. T., & Hasegawa-Johnson, M. A. (2010). Semi-supervised training of Gaussian mixture models by conditional entropy minimization. Paper presented at the 11th Annual Conference of the International Speech Communication Association: Spoken Language Processing for All, INTERSPEECH 2010, Makuhari, Chiba, Japan, pp. 1353-1356.

Scopus record: http://www.scopus.com/inward/record.url?scp=79959857219&partnerID=8YFLogxK
Cited-by listing: http://www.scopus.com/inward/citedby.url?scp=79959857219&partnerID=8YFLogxK