Estimation of KL Divergence: Optimal Minimax Rate

Yuheng Bu, Shaofeng Zou, Yingbin Liang, Venugopal V. Veeravalli

Research output: Contribution to journal › Article › peer-review


The problem of estimating the Kullback-Leibler divergence D(P||Q) between two unknown distributions P and Q is studied, under the assumption that the alphabet size k of the distributions can scale to infinity. The estimation is based on m independent samples drawn from P and n independent samples drawn from Q. It is first shown that no consistent estimator exists that guarantees asymptotically small worst-case quadratic risk over the set of all pairs of distributions. A restricted set is therefore considered, containing pairs of distributions whose density ratio is bounded by a function f(k). An augmented plug-in estimator is proposed, and its worst-case quadratic risk is shown to be within a constant factor of ((k/m) + (k f(k)/n))^2 + (log^2 f(k))/m + f(k)/n, provided that m and n exceed a constant factor of k and k f(k), respectively. Moreover, the minimax quadratic risk is characterized to be within a constant factor of ((k/(m log k)) + (k f(k)/(n log k)))^2 + (log^2 f(k))/m + f(k)/n, provided that m and n exceed a constant factor of k/log k and k f(k)/log k, respectively. The lower bound on the minimax quadratic risk is established by employing a generalized Le Cam's method. A minimax optimal estimator is then constructed by combining the polynomial approximation and plug-in approaches.
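The plug-in idea described in the abstract can be illustrated with a minimal sketch. The code below computes empirical frequencies for P and a smoothed ("augmented") estimate for Q, so that no estimated q_i is zero and every log-ratio is finite; the add-one smoothing used here is an illustrative choice, not necessarily the exact augmentation analyzed in the paper.

```python
import numpy as np

def augmented_plug_in_kl(samples_p, samples_q, k):
    """Sketch of an augmented plug-in estimator of D(P||Q).

    samples_p, samples_q: integer arrays of symbols in {0, ..., k-1}.
    P is estimated by raw empirical frequencies; the estimate of Q is
    augmented with a pseudo-count per symbol (add-one here) so that
    log(p_i / q_i) is always finite.
    """
    m, n = len(samples_p), len(samples_q)
    p_hat = np.bincount(samples_p, minlength=k) / m
    # Augmentation: add a pseudo-count to every symbol of Q.
    q_hat = (np.bincount(samples_q, minlength=k) + 1) / (n + k)
    mask = p_hat > 0  # 0 * log 0 = 0 convention
    return float(np.sum(p_hat[mask] * np.log(p_hat[mask] / q_hat[mask])))
```

For example, if P is concentrated on one symbol while Q is uniform over k = 4 symbols, the estimate is close to log 4, as expected from D(P||Q) for those distributions.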

Original language: English (US)
Pages (from-to): 2648-2674
Number of pages: 27
Journal: IEEE Transactions on Information Theory
Issue number: 4
State: Published - Apr 2018


Keywords

  • Functional approximation
  • Mean squared error
  • Minimax lower bound
  • Plug-in estimator
  • Polynomial approximation

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

