Bayesian Regularization for Graphical Models With Unequal Shrinkage

Research output: Contribution to journal › Article › peer-review


We consider a Bayesian framework for estimating a high-dimensional sparse precision matrix, in which adaptive shrinkage and sparsity are induced by a mixture of Laplace priors. Besides discussing our formulation from the Bayesian standpoint, we investigate the MAP (maximum a posteriori) estimator from a penalized likelihood perspective that gives rise to a new nonconvex penalty approximating the ℓ0 penalty. Optimal error rates for estimation consistency in terms of various matrix norms, along with selection consistency for sparse structure recovery, are shown for the unique MAP estimator under mild conditions. For fast and efficient computation, an EM algorithm is proposed to compute the MAP estimator of the precision matrix and (approximate) posterior probabilities on the edges of the underlying sparse structure. Through extensive simulation studies and a real application to call center data, we demonstrate the strong performance of our method compared with existing alternatives. Supplementary materials for this article are available online.
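To give a feel for the penalty the abstract describes, the sketch below plots-free evaluates the negative log of a two-component Laplace mixture (a "spike" component with large rate λ0 and a "slab" component with small rate λ1). This is an illustrative reconstruction, not the paper's exact formulation: the function names and the parameter values `lam0=20`, `lam1=1`, `eta=0.5` are hypothetical choices for demonstration. The resulting penalty is steep near zero (lasso-like, encouraging exact sparsity) but nearly linear with a small slope in the tails, which is the sense in which such a nonconvex penalty approximates the ℓ0 penalty.

```python
import numpy as np

def laplace_pdf(theta, lam):
    # Laplace density with rate parameter lam: (lam/2) * exp(-lam * |theta|)
    return 0.5 * lam * np.exp(-lam * np.abs(theta))

def ssl_penalty(theta, lam0=20.0, lam1=1.0, eta=0.5):
    # Penalty induced by a spike-and-slab Laplace mixture prior:
    # pen(theta) = -log[ (1 - eta) * Laplace(theta; lam0) + eta * Laplace(theta; lam1) ]
    # (lam0 >> lam1: the spike shrinks small entries, the slab leaves large ones nearly unpenalized)
    mix = (1 - eta) * laplace_pdf(theta, lam0) + eta * laplace_pdf(theta, lam1)
    return -np.log(mix)

# Marginal slope of the penalty near zero vs. far from zero
near_slope = (ssl_penalty(0.1) - ssl_penalty(0.0)) / 0.1   # steep: dominated by the spike (~lam0)
far_slope = (ssl_penalty(2.1) - ssl_penalty(2.0)) / 0.1    # gentle: dominated by the slab (~lam1)
```

Because `far_slope` is close to λ1 rather than growing with θ, large precision-matrix entries incur almost constant additional penalty, mimicking ℓ0 behavior while keeping the objective continuous.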

Original language: English (US)
Pages (from-to): 1218-1231
Number of pages: 14
Journal: Journal of the American Statistical Association
Issue number: 527
State: Published - Jul 3 2019


Keywords

  • Bayesian regularization
  • Precision matrix estimation
  • Sparse Gaussian graphical model
  • Spike-and-slab priors

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty


