High-dimensional classification via nonparametric empirical Bayes and maximum likelihood inference

Lee H. Dicker, Sihai D. Zhao

Research output: Contribution to journal › Article › peer-review


We propose new nonparametric empirical Bayes methods for high-dimensional classification. Our classifiers are designed to approximate the Bayes classifier in a hypothesized hierarchical model, where the prior distributions for the model parameters are estimated nonparametrically from the training data. As is common with nonparametric empirical Bayes, the proposed classifiers are effective in high-dimensional settings even when the underlying model parameters are in fact nonrandom. We use nonparametric maximum likelihood estimates of the prior distributions, following the elegant approach studied by Kiefer & Wolfowitz in the 1950s. However, our implementation is based on a recent convex optimization framework for approximating these estimates that is well-suited for large-scale problems. We derive new theoretical results on the accuracy of the approximate estimator, which help control the misclassification rate of one of our classifiers. We show that our methods outperform several existing methods in simulations and perform well when gene expression microarray data is used to classify cancer patients.
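To illustrate the approach described above, here is a minimal sketch (not the authors' implementation) of the key idea: approximate the Kiefer-Wolfowitz nonparametric maximum likelihood estimate of the prior by restricting its support to a fixed grid, which turns the problem into a finite-dimensional convex one, and then plug the estimated priors into a Bayes-rule classifier. For simplicity the sketch solves the grid problem with EM-style fixed-point updates rather than the interior-point convex solver used for large-scale problems, and assumes a one-dimensional Gaussian location model `x_i ~ N(mu_i, 1)`; all function and variable names are illustrative.

```python
import numpy as np

def npmle_prior_on_grid(x, grid, n_iter=500):
    """Approximate Kiefer-Wolfowitz NPMLE of the prior G in the mixture
    x_i ~ N(mu_i, 1), mu_i ~ G, with G supported on a fixed grid.
    Fixing the grid makes the likelihood concave in the weights; here
    we use multiplicative EM updates as a simple stand-in for a
    large-scale convex solver."""
    # Likelihood matrix: A[i, j] = normal density of x_i at mean grid[j]
    A = np.exp(-0.5 * (x[:, None] - grid[None, :]) ** 2) / np.sqrt(2 * np.pi)
    w = np.full(grid.size, 1.0 / grid.size)      # uniform starting weights
    for _ in range(n_iter):
        dens = A @ w                             # fitted marginal density at each x_i
        w *= (A / dens[:, None]).mean(axis=0)    # EM fixed-point update; keeps sum(w) = 1
    return w

def eb_classify(x_new, grid, w0, w1, pi0=0.5):
    """Plug-in empirical Bayes classifier: label 1 when the estimated
    class-1 marginal density (weighted by class probability) exceeds
    the class-0 one, mimicking the Bayes rule under the fitted priors."""
    A = np.exp(-0.5 * (x_new[:, None] - grid[None, :]) ** 2) / np.sqrt(2 * np.pi)
    return ((1 - pi0) * (A @ w1) > pi0 * (A @ w0)).astype(int)
```

In this sketch each class's prior is estimated separately from its training observations, and classification compares the two fitted marginal densities; the grid approximation is what makes the estimator tractable at scale.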

Original language: English (US)
Pages (from-to): 21-34
Number of pages: 14
Issue number: 1
State: Published - Jan 1 2015


Keywords

  • Classification
  • Convex optimization
  • Empirical Bayes estimation
  • Kiefer-Wolfowitz estimator
  • Mixture models
  • Nonparametric maximum likelihood estimation

ASJC Scopus subject areas

  • Statistics and Probability
  • Mathematics (all)
  • Agricultural and Biological Sciences (miscellaneous)
  • Agricultural and Biological Sciences (all)
  • Statistics, Probability and Uncertainty
  • Applied Mathematics
