Abstract
The main result of this article is an elementwise error bound for the Fused Lasso estimator with a general convex loss function ρ. We then focus on the special cases where ρ is the square loss (mean regression) or the quantile loss (quantile regression), for which we derive new pointwise error bounds. Although error bounds for the usual Fused Lasso estimator and its quantile version have been studied before, our bounds appear to be new, because all previous works bound a global loss, such as the sum of squared errors or, in the case of quantile regression, a sum of Huber losses as in Padilla and Chatterjee (Biometrika 109 (2022) 751–768). Elementwise bounds are stronger than global error bounds, since they reveal how the loss behaves locally at each point. Our elementwise error bound also has a clean and explicit dependence on the tuning parameter λ, which informs the user of a good choice of λ. In addition, our bound is nonasymptotic with explicit constants and recovers almost all known results for the Fused Lasso (both mean and quantile regression), with improvements in some cases.
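As a rough illustration of the estimator the abstract refers to (not the authors' code), the sketch below solves the Fused Lasso problem min over θ of Σ_i ρ(y_i − θ_i) + λ Σ_i |θ_{i+1} − θ_i| with the convex-optimization library cvxpy. The function name `fused_lasso`, the example data, and the choice of default solver are illustrative assumptions; ρ is either the square loss or the quantile (pinball) loss, matching the two special cases studied in the paper.

```python
# Minimal sketch of the Fused Lasso / total-variation-penalized estimator
# with a convex loss rho, using cvxpy. Not the authors' implementation.
import numpy as np
import cvxpy as cp

def fused_lasso(y, lam, loss="square", tau=0.5):
    """Solve  min_theta  sum_i rho(y_i - theta_i) + lam * sum_i |theta_{i+1} - theta_i|."""
    theta = cp.Variable(len(y))
    r = y - theta
    if loss == "square":
        # rho(u) = u^2  (mean regression)
        data_fit = cp.sum_squares(r)
    elif loss == "quantile":
        # rho_tau(u) = max(tau*u, (tau-1)*u)  (pinball loss, quantile regression)
        data_fit = cp.sum(cp.maximum(tau * r, (tau - 1) * r))
    else:
        raise ValueError("unknown loss")
    # lambda times the total variation (sum of absolute first differences)
    penalty = lam * cp.norm1(cp.diff(theta))
    cp.Problem(cp.Minimize(data_fit + penalty)).solve()
    return theta.value

# Usage: piecewise-constant signal plus noise (hypothetical example data).
rng = np.random.default_rng(0)
y = np.concatenate([np.zeros(50), 2 * np.ones(50)]) + rng.normal(0, 0.3, 100)
theta_hat = fused_lasso(y, lam=1.0)
```

The elementwise bounds discussed in the abstract then control |θ̂_i − θ_i| at each individual index i, rather than only an aggregate quantity such as the sum of squared errors over all i.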
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 2691-2718 |
| Number of pages | 28 |
| Journal | Bernoulli |
| Volume | 29 |
| Issue number | 4 |
| DOIs | |
| State | Published - Nov 2023 |
Keywords
- Adaptive risk bounds
- generalized Fused Lasso
- law of iterated logarithm
- nonasymptotic risk bounds
- nonparametric quantile regression
- pointwise risk bounds
- total variation denoising
ASJC Scopus subject areas
- Statistics and Probability