Abstract
We consider the problem of estimating an unknown θ ∈ ℝⁿ from noisy observations under the constraint that θ belongs to certain convex polyhedral cones in ℝⁿ. In this setting, we prove bounds for the risk of the least squares estimator (LSE). The resulting risk bound behaves differently depending on the true sequence θ, which highlights the adaptive behavior of the LSE. As special cases of our general result, we derive risk bounds for the LSE in univariate isotonic and convex regression. We study the risk bound in isotonic regression in greater detail: we show that the isotonic LSE converges at a whole range of rates, from (log n)/n (when θ is constant) to n^{-2/3} (when θ is uniformly increasing in a certain sense). We argue that this bound serves as a benchmark for the risk of any estimator in isotonic regression by proving nonasymptotic local minimax lower bounds. We also prove an analogue of our bound under model misspecification, where the true θ is not necessarily nondecreasing.
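To make the isotonic-regression setting concrete, the following minimal sketch (not part of the paper) computes the isotonic LSE with scikit-learn's IsotonicRegression, a pool-adjacent-violators solver, and compares its empirical squared-error risk for a constant θ versus a strictly increasing θ. The sample size, noise level, and choice of scikit-learn are assumptions made purely for illustration; the (log n)/n and n^{-2/3} rates are only suggested empirically here, not verified.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(0)
n = 1000
x = np.arange(n)

# Two true sequences: a constant theta (where the risk bound gives the
# adaptive (log n)/n behavior) and a uniformly increasing theta (where
# the worst-case n^{-2/3} rate is expected).
true_sequences = {
    "constant": np.zeros(n),
    "increasing": np.linspace(0.0, 1.0, n),
}

for name, theta in true_sequences.items():
    y = theta + rng.normal(scale=1.0, size=n)              # noisy observations of theta
    theta_hat = IsotonicRegression().fit_transform(x, y)   # isotonic LSE (projection onto the monotone cone)
    risk = np.mean((theta_hat - theta) ** 2)               # empirical squared-error risk (1/n)||theta_hat - theta||^2
    print(f"{name:>10}: empirical risk ≈ {risk:.5f}")
```

Averaging this risk over repeated noise draws and several values of n would trace out the different convergence rates the abstract describes for the two regimes.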
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 1774-1800 |
| Number of pages | 27 |
| Journal | Annals of Statistics |
| Volume | 43 |
| Issue number | 4 |
| DOIs | |
| State | Published - Aug 1 2015 |
| Externally published | Yes |
Keywords
- Adaptation
- Convex polyhedral cones
- Global risk bounds
- Local minimax bounds
- Model misspecification
- Statistical dimension
ASJC Scopus subject areas
- Statistics and Probability
- Statistics, Probability and Uncertainty