TY - JOUR
T1 - Adaptive risk bounds in univariate total variation denoising and trend filtering
AU - Guntuboyina, Adityanand
AU - Lieu, Donovan
AU - Chatterjee, Sabyasachi
AU - Sen, Bodhisattva
N1 - Funding Information:
Acknowledgments. We thank Ryan Tibshirani for informing us about the reference Steidl, Didas and Neumann [34] and for many other helpful comments. We are also extremely thankful to the Associate Editor and the anonymous referees for very detailed comments on an earlier version of the paper. Their feedback greatly improved the quality of the paper. The first author was supported by NSF CAREER Grant DMS-1654589. The fourth author was supported by NSF Grant DMS-1150435.
Publisher Copyright:
© Institute of Mathematical Statistics, 2020
PY - 2020
Y1 - 2020
AB - We study trend filtering, a relatively recent method for univariate nonparametric regression. For a given integer r ≥ 1, the rth order trend filtering estimator is defined as the minimizer of the sum of squared errors when we constrain (or penalize) the sum of the absolute rth order discrete derivatives of the fitted function at the design points. For r = 1, the estimator reduces to total variation regularization, which has received much attention in the statistics and image processing literature. In this paper, we study the performance of the trend filtering estimator for every r ≥ 1, both in the constrained and penalized forms. Our main results show that in the strong sparsity setting, when the underlying function is a (discrete) spline with few “knots,” the risk (under the global squared error loss) of the trend filtering estimator (with an appropriate choice of the tuning parameter) achieves the parametric n^{-1}-rate, up to a logarithmic (multiplicative) factor. Our results therefore provide support for the use of trend filtering, for every r ≥ 1, in the strong sparsity setting.
KW - Adaptive splines
KW - Discrete splines
KW - Fat shattering
KW - Higher order total variation regularization
KW - Metric entropy bounds
KW - Nonparametric function estimation
KW - Risk bounds
KW - Subdifferential
KW - Tangent cone
UR - http://www.scopus.com/inward/record.url?scp=85081200129&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85081200129&partnerID=8YFLogxK
U2 - 10.1214/18-AOS1799
DO - 10.1214/18-AOS1799
M3 - Article
AN - SCOPUS:85081200129
SN - 0090-5364
VL - 48
SP - 205
EP - 229
JO - Annals of Statistics
JF - Annals of Statistics
IS - 1
ER -
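
Note: for readers who want to experiment with the penalized estimator described in the abstract, the following is a minimal sketch in Python, assuming the cvxpy convex-optimization package is available. The function name trend_filter, the tuning parameter lam, and the example signal are illustrative only; the paper's exact normalization of the rth order discrete derivative may differ from the plain differences used here.

import numpy as np
import cvxpy as cp

def trend_filter(y, r=1, lam=1.0):
    """Sketch of rth order penalized trend filtering:
    minimize 0.5 * ||y - theta||^2 + lam * ||D^(r) theta||_1,
    where D^(r) takes rth order discrete differences of theta."""
    n = len(y)
    theta = cp.Variable(n)
    # l1 penalty on the rth order discrete differences of the fit
    penalty = cp.norm1(cp.diff(theta, k=r))
    objective = cp.Minimize(0.5 * cp.sum_squares(y - theta) + lam * penalty)
    cp.Problem(objective).solve()
    return theta.value

# Illustrative usage: a piecewise linear truth with one knot, recovered
# with r = 2 (second differences), which favors piecewise linear fits.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)
truth = np.abs(x - 0.5)
y = truth + 0.1 * rng.standard_normal(100)
fit = trend_filter(y, r=2, lam=5.0)

Setting r = 1 in this sketch gives total variation denoising (piecewise constant fits), matching the abstract's remark that the r = 1 case reduces to total variation regularization.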