Manifold preserving hierarchical topic models for quantization and approximation

Minje Kim, Paris Smaragdis

Research output: Contribution to conference > Paper

Abstract

We present two complementary topic models to address the analysis of mixture data lying on manifolds. First, we propose a quantization method with an additional mid-layer latent variable, which selects only data points that best preserve the manifold structure of the input data. In order to address the case of modeling all the in-between parts of that manifold using this reduced representation of the input, we introduce a new model that provides a manifold-aware interpolation method. We demonstrate the advantages of these models with experiments on the hand-written digit recognition and the speech source separation tasks.
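As a rough, generic illustration of the quantization idea described in the abstract (keeping a subset of actual data points as codewords so that the shape of the data manifold survives), the sketch below uses greedy farthest-point sampling on toy data lying near a circle. This is not the paper's model, which selects points probabilistically through a mid-layer latent variable in a topic-model framework; the function name, the sampling heuristic, and the toy data are illustrative assumptions only.

```python
import numpy as np

def farthest_point_quantize(X, k):
    """Greedily pick k rows of X that spread over the data.

    Illustrative stand-in only: codewords are actual data points,
    so they stay on the manifold, unlike e.g. k-means centroids,
    which can fall off a curved manifold.
    """
    chosen = [0]
    # Distance from every point to the nearest chosen point so far.
    d = np.linalg.norm(X - X[0], axis=1)
    for _ in range(1, k):
        nxt = int(np.argmax(d))          # farthest point from current set
        chosen.append(nxt)
        d = np.minimum(d, np.linalg.norm(X - X[nxt], axis=1))
    return np.array(chosen)

# Toy data lying near a 1-D manifold (a noisy circle) embedded in 2-D.
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 2.0 * np.pi, 500)
X = np.c_[np.cos(t), np.sin(t)] + 0.01 * rng.normal(size=(500, 2))

idx = farthest_point_quantize(X, 16)
print(len(set(idx.tolist())))  # 16 distinct landmark indices
```

Because each codeword is a data point, any downstream interpolation can be restricted to moves between neighboring landmarks rather than straight lines through ambient space, which is the spirit of the manifold-aware interpolation the abstract mentions.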

Original language: English (US)
Pages: 2410-2418
Number of pages: 9
State: Published - Jan 1 2013
Event: 30th International Conference on Machine Learning, ICML 2013 - Atlanta, GA, United States
Duration: Jun 16 2013 - Jun 21 2013




Cite this

Kim, M., & Smaragdis, P. (2013). Manifold preserving hierarchical topic models for quantization and approximation. Paper presented at the 30th International Conference on Machine Learning, ICML 2013, Atlanta, GA, United States, pp. 2410-2418.

@conference{3563d199492f431f984644aa59b7c42f,
title = "Manifold preserving hierarchical topic models for quantization and approximation",
abstract = "We present two complementary topic models to address the analysis of mixture data lying on manifolds. First, we propose a quantization method with an additional mid-layer latent variable, which selects only data points that best preserve the manifold structure of the input data. In order to address the case of modeling all the in-between parts of that manifold using this reduced representation of the input, we introduce a new model that provides a manifold-aware interpolation method. We demonstrate the advantages of these models with experiments on the hand-written digit recognition and the speech source separation tasks.",
author = "Minje Kim and Paris Smaragdis",
year = "2013",
month = "1",
day = "1",
language = "English (US)",
pages = "2410--2418",
note = "30th International Conference on Machine Learning, ICML 2013 ; Conference date: 16-06-2013 Through 21-06-2013",

}
