A spectral algorithm for learning Hidden Markov Models

Daniel Hsu, Sham M. Kakade, Tong Zhang

Research output: Contribution to journal › Article › peer-review

Abstract

Hidden Markov Models (HMMs) are one of the most fundamental and widely used statistical tools for modeling discrete time series. In general, learning HMMs from data is computationally hard (under cryptographic assumptions), and practitioners typically resort to search heuristics that suffer from the usual local-optima issues. We prove that under a natural separation condition (bounds on the smallest singular value of the HMM parameters), there is an efficient and provably correct algorithm for learning HMMs. The sample complexity of the algorithm does not explicitly depend on the number of distinct (discrete) observations; it depends on this quantity only implicitly, through spectral properties of the underlying HMM. This makes the algorithm particularly applicable to settings with a large number of observations, such as natural language processing, where the observation space is sometimes the set of words in a language. The algorithm is also simple, employing only a singular value decomposition and matrix multiplications.
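To make the abstract's claim concrete, the sketch below shows the observable-operator construction the paper is known for, reduced to its linear-algebra core. It assumes empirical estimates of low-order observation statistics (a unigram vector, a pair matrix, and per-symbol triple slices) have already been computed from data; the variable names and the helper functions spectral_hmm and joint_prob are illustrative, not taken from the paper.

```python
import numpy as np

def spectral_hmm(P1, P21, P3x1, m):
    """Recover observable operators from moment estimates (illustrative sketch).

    P1   : length-n vector,  P1[i]         ~ Pr[x_1 = i]
    P21  : n-by-n matrix,    P21[j, i]     ~ Pr[x_2 = j, x_1 = i]
    P3x1 : dict of matrices, P3x1[x][k, i] ~ Pr[x_3 = k, x_2 = x, x_1 = i]
    m    : number of hidden states
    """
    # U spans the top-m left singular subspace of P21 (the single SVD step).
    U, _, _ = np.linalg.svd(P21)
    U = U[:, :m]

    UP21_pinv = np.linalg.pinv(U.T @ P21)   # (U^T P21)^+
    b1 = U.T @ P1                           # initial vector in the observable basis
    binf = np.linalg.pinv(P21.T @ U) @ P1   # normalization vector
    # One m-by-m operator per observation symbol: B_x = U^T P3x1[x] (U^T P21)^+.
    B = {x: U.T @ P3x1[x] @ UP21_pinv for x in P3x1}
    return b1, binf, B

def joint_prob(seq, b1, binf, B):
    """Estimate Pr[x_1, ..., x_t] using only matrix-vector products."""
    b = b1
    for x in seq:
        b = B[x] @ b
    return float(binf @ b)
```

Given sampled triples of consecutive observations, the three moment estimates are plain empirical frequencies, so the whole procedure amounts to one SVD plus matrix multiplications, as the abstract notes.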

Original language: English (US)
Pages (from-to): 1460-1480
Number of pages: 21
Journal: Journal of Computer and System Sciences
Volume: 78
Issue number: 5
DOIs
State: Published - Sep 2012
Externally published: Yes

Keywords

  • Hidden Markov Models
  • Latent variable models
  • Learning probability distributions
  • Observable operator models
  • Singular value decomposition
  • Spectral algorithm
  • Time series
  • Unsupervised learning

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Networks and Communications
  • Computational Theory and Mathematics
  • Applied Mathematics
