Laplacian regularized Gaussian mixture model for data clustering

Xiaofei He, Deng Cai, Yuanlong Shao, Hujun Bao, Jiawei Han

Research output: Contribution to journal › Article

Abstract

Gaussian Mixture Models (GMMs) are among the most statistically mature methods for clustering: each cluster is represented by a Gaussian distribution, and clustering thereby reduces to estimating the parameters of the mixture, usually via the Expectation-Maximization (EM) algorithm. In this paper, we consider the case where the probability distribution that generates the data is supported on a submanifold of the ambient space. It is natural to assume that if two points are close in the intrinsic geometry of this distribution, then their conditional probability distributions are similar. Specifically, we introduce a regularized probabilistic model for data clustering based on the manifold structure, called the Laplacian regularized Gaussian Mixture Model (LapGMM). The data manifold is modeled by a nearest-neighbor graph, and the graph structure is incorporated into the maximum-likelihood objective function. As a result, the estimated conditional probability distributions vary smoothly along the geodesics of the data manifold. Experimental results on real data sets demonstrate the effectiveness of the proposed approach.
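To make the idea in the abstract concrete, the sketch below shows one way a GMM can be regularized by a nearest-neighbor graph. It is not the paper's exact algorithm (the paper incorporates the graph regularizer into the likelihood and derives a generalized EM procedure); instead, as an approximation, it smooths the E-step responsibilities over the graph so that neighboring points receive similar cluster posteriors. The function and parameter names (`lap_gmm`, `n_neighbors`, `lam`) are illustrative, not from the paper.

```python
# Minimal sketch of a Laplacian-regularized GMM in the spirit of LapGMM.
# Assumption: the manifold regularization is approximated by smoothing the
# E-step responsibilities over a k-nearest-neighbor graph, rather than by
# the paper's generalized EM update.
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.neighbors import kneighbors_graph


def lap_gmm(X, n_components=3, n_neighbors=5, lam=0.5, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape

    # Nearest-neighbor graph W and a row-stochastic smoothing operator S.
    W = kneighbors_graph(X, n_neighbors, mode="connectivity",
                         include_self=False).toarray()
    W = np.maximum(W, W.T)                       # symmetrize the graph
    S = W / W.sum(axis=1, keepdims=True)         # row-normalized adjacency

    # Random initialization of the mixture parameters.
    means = X[rng.choice(n, n_components, replace=False)]
    covs = np.array([np.cov(X.T) + 1e-6 * np.eye(d)] * n_components)
    weights = np.full(n_components, 1.0 / n_components)

    for _ in range(n_iter):
        # E-step: standard GMM responsibilities.
        dens = np.column_stack([
            weights[k] * multivariate_normal.pdf(X, means[k], covs[k])
            for k in range(n_components)
        ])
        R = dens / (dens.sum(axis=1, keepdims=True) + 1e-300)

        # Manifold regularization (approximation): pull each point's posterior
        # toward the average posterior of its graph neighbors, so the
        # conditional distribution varies smoothly along the data manifold.
        R = (1 - lam) * R + lam * (S @ R)
        R /= R.sum(axis=1, keepdims=True)

        # M-step: weighted maximum-likelihood updates of the parameters.
        Nk = R.sum(axis=0)
        weights = Nk / n
        means = (R.T @ X) / Nk[:, None]
        for k in range(n_components):
            diff = X - means[k]
            covs[k] = (R[:, k, None] * diff).T @ diff / Nk[k] \
                      + 1e-6 * np.eye(d)

    return R.argmax(axis=1), means
```

In the paper the graph regularizer enters the objective function directly and the parameters are fitted with a generalized EM procedure; smoothing the responsibilities as above is only a convenient stand-in that produces the same qualitative effect, namely that points close on the nearest-neighbor graph obtain similar cluster posteriors.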

Original language: English (US)
Article number: 5677520
Pages (from-to): 1406-1418
Number of pages: 13
Journal: IEEE Transactions on Knowledge and Data Engineering
Volume: 23
Issue number: 9
DOIs
State: Published - Jun 21 2011

Keywords

  • Gaussian mixture model
  • clustering
  • graph Laplacian
  • manifold structure

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Computational Theory and Mathematics
