Adaptive risk bounds in univariate total variation denoising and trend filtering

Adityanand Guntuboyina, Donovan Lieu, Sabyasachi Chatterjee, Bodhisattva Sen

Research output: Contribution to journal › Article › peer-review


We study trend filtering, a relatively recent method for univariate nonparametric regression. For a given integer r ≥ 1, the rth order trend filtering estimator is defined as the minimizer of the sum of squared errors when we constrain (or penalize) the sum of the absolute rth order discrete derivatives of the fitted function at the design points. For r = 1, the estimator reduces to total variation regularization, which has received much attention in the statistics and image processing literature. In this paper, we study the performance of the trend filtering estimator for every r ≥ 1, both in the constrained and penalized forms. Our main results show that in the strong sparsity setting, when the underlying function is a (discrete) spline with few “knots,” the risk (under the global squared error loss) of the trend filtering estimator (with an appropriate choice of the tuning parameter) achieves the parametric n^{-1} rate, up to a logarithmic (multiplicative) factor. Our results therefore provide support for the use of trend filtering, for every r ≥ 1, in the strong sparsity setting.
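The estimator described above minimizes a least-squares criterion plus an ℓ1 penalty on the rth order discrete derivatives of the fitted values. A minimal sketch of that objective in NumPy is below; the function names (`diff_matrix`, `trend_filter_objective`) are illustrative and not from the paper, and a convex solver would still be needed to actually compute the minimizer.

```python
import numpy as np

def diff_matrix(n, r):
    """r-th order discrete difference operator D^(r), of shape (n - r, n).

    Built by applying first-order row differencing to the identity r times,
    so (D @ theta)[i] is the r-th order discrete derivative of theta at i.
    """
    D = np.eye(n)
    for _ in range(r):
        D = np.diff(D, axis=0)  # each pass: D[i+1] - D[i]
    return D

def trend_filter_objective(theta, y, r, lam):
    """Penalized trend filtering criterion:
    0.5 * ||y - theta||^2 + lam * sum of |r-th order differences of theta|.
    (The constrained form would instead bound the penalty term by a constant.)
    """
    D = diff_matrix(len(y), r)
    return 0.5 * np.sum((y - theta) ** 2) + lam * np.sum(np.abs(D @ theta))
```

As a sanity check on the penalty: a linear sequence has zero second-order discrete derivatives, so for r = 2 the penalty term vanishes at such a fit, consistent with discrete splines with no knots incurring no penalty.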

Original language: English (US)
Pages (from-to): 205-229
Number of pages: 25
Journal: Annals of Statistics
Issue number: 1
State: Published - 2020


Keywords

  • Adaptive splines
  • Discrete splines
  • Fat shattering
  • Higher order total variation regularization
  • Metric entropy bounds
  • Nonparametric function estimation
  • Risk bounds
  • Subdifferential
  • Tangent cone

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty


