Model selection for regression on a random design
ESAIM: Probability and Statistics (2002)
- Volume: 6, pages 127-146
- ISSN: 1292-8100
Abstract
We consider the problem of estimating an unknown regression function when the design is random with values in $\mathbb{R}^k$. Our estimation procedure is based on model selection and does not rely on any prior information on the target function. We start with a collection of linear functional spaces and build, on a data-selected space among this collection, the least-squares estimator. We study the performance of an estimator which is obtained by modifying this least-squares estimator on a set of small probability. For the so-defined estimator, we establish nonasymptotic risk bounds that can be related to oracle inequalities. As a consequence of these, we show that our estimator possesses adaptive properties in the minimax sense over large families of Besov balls $\mathcal{B}_{\alpha,l,\infty}(R)$ with $R>0$, $l\ge 1$ and $\alpha>\alpha_l$, where $\alpha_l$ is a positive number satisfying $1/l-1/2\le \alpha_l<1/l$. We also study the particular case where the regression function is additive and then obtain an additive estimator which converges at the same rate as it does when $k=1$.
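The risk bounds mentioned in the abstract are of oracle type. As a rough illustration only (this is the generic shape of such inequalities, with unspecified constants, not the paper's exact statement), they compare the risk of the selected estimator $\tilde f$ with the best trade-off, over the collection of spaces $\{S_m,\ m\in\mathcal{M}\}$, between approximation error and dimension:

$$\mathbb{E}\left[\Vert \tilde f - f\Vert^2\right] \le C\, \inf_{m\in\mathcal{M}}\left\{ \inf_{g\in S_m}\Vert f-g\Vert^2 + \frac{\sigma^2 D_m}{n}\right\} + \frac{C'}{n},$$

where $D_m$ is the dimension of $S_m$ and $n$ the sample size.

The selection step itself (fit least squares on each space of the collection, keep the space minimizing a penalized empirical criterion) can be sketched numerically. The Python snippet below is a minimal, hypothetical illustration, not the paper's estimator: the piecewise-constant spaces on $[0,1]$, the Mallows-type penalty with an arbitrary constant `c_pen`, and the crude first-difference noise estimate are all assumptions made for the sketch, and the modification of the least-squares estimator on a set of small probability required by the paper is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_piecewise_constant(x, y, dim):
    """Least-squares fit on the space of functions constant on `dim` equal bins of [0, 1]."""
    bins = np.clip((x * dim).astype(int), 0, dim - 1)
    coef = np.zeros(dim)
    for j in range(dim):
        mask = bins == j
        coef[j] = y[mask].mean() if mask.any() else 0.0  # empty bins get coefficient 0
    return coef, coef[bins]

def select_model(x, y, dims, c_pen=2.0):
    """Pick the dimension minimizing empirical risk + c_pen * sigma^2 * dim / n."""
    n = len(y)
    # Crude noise-variance proxy from first differences of y sorted by x (assumption).
    order = np.argsort(x)
    sigma2 = np.mean(np.diff(y[order]) ** 2) / 2.0
    best_dim, best_crit = None, np.inf
    for d in dims:
        _, fitted = fit_piecewise_constant(x, y, d)
        crit = np.mean((y - fitted) ** 2) + c_pen * sigma2 * d / n
        if crit < best_crit:
            best_dim, best_crit = d, crit
    return best_dim

# Toy example: uniform random design, smooth regression function plus noise.
n = 500
x = rng.uniform(size=n)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(n)
print("selected dimension:", select_model(x, y, dims=range(1, 41)))
```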
How to cite
Baraud, Yannick. "Model selection for regression on a random design." ESAIM: Probability and Statistics 6 (2002): 127-146. <http://eudml.org/doc/244664>.
@article{Baraud2002,
abstract = {We consider the problem of estimating an unknown regression function when the design is random with values in $\mathbb {R}^k$. Our estimation procedure is based on model selection and does not rely on any prior information on the target function. We start with a collection of linear functional spaces and build, on a data selected space among this collection, the least-squares estimator. We study the performance of an estimator which is obtained by modifying this least-squares estimator on a set of small probability. For the so-defined estimator, we establish nonasymptotic risk bounds that can be related to oracle inequalities. As a consequence of these, we show that our estimator possesses adaptive properties in the minimax sense over large families of Besov balls ${\mathcal {B}}_{\alpha ,l,\infty }(R)$ with $R>0$, $l\ge 1$ and $\alpha >\alpha _l$ where $\alpha _l$ is a positive number satisfying $1/l-1/2\le \alpha _l<1/l$. We also study the particular case where the regression function is additive and then obtain an additive estimator which converges at the same rate as it does when $k=1$.},
author = {Baraud, Yannick},
journal = {ESAIM: Probability and Statistics},
keywords = {nonparametric regression; least-squares estimators; penalized criteria; minimax rates; Besov spaces; model selection; adaptive estimation},
language = {eng},
pages = {127-146},
publisher = {EDP-Sciences},
title = {Model selection for regression on a random design},
url = {http://eudml.org/doc/244664},
volume = {6},
year = {2002},
}
TY - JOUR
AU - Baraud, Yannick
TI - Model selection for regression on a random design
JO - ESAIM: Probability and Statistics
PY - 2002
PB - EDP-Sciences
VL - 6
SP - 127
EP - 146
AB - We consider the problem of estimating an unknown regression function when the design is random with values in $\mathbb {R}^k$. Our estimation procedure is based on model selection and does not rely on any prior information on the target function. We start with a collection of linear functional spaces and build, on a data selected space among this collection, the least-squares estimator. We study the performance of an estimator which is obtained by modifying this least-squares estimator on a set of small probability. For the so-defined estimator, we establish nonasymptotic risk bounds that can be related to oracle inequalities. As a consequence of these, we show that our estimator possesses adaptive properties in the minimax sense over large families of Besov balls ${\mathcal {B}}_{\alpha ,l,\infty }(R)$ with $R>0$, $l\ge 1$ and $\alpha >\alpha _l$ where $\alpha _l$ is a positive number satisfying $1/l-1/2\le \alpha _l<1/l$. We also study the particular case where the regression function is additive and then obtain an additive estimator which converges at the same rate as it does when $k=1$.
LA - eng
KW - nonparametric regression; least-squares estimators; penalized criteria; minimax rates; Besov spaces; model selection; adaptive estimation
UR - http://eudml.org/doc/244664
ER -
References
- [1] Y. Baraud, Model selection for regression on a fixed design. Probab. Theory Related Fields 117 (2000) 467-493. Zbl0997.62027MR1777129
- [2] A. Barron, L. Birgé and P. Massart, Risk bounds for model selection via penalization. Probab. Theory Related Fields 113 (1999) 301-413. Zbl0946.62036MR1679028
- [3] A.R. Barron and T.M. Cover, Minimum complexity density estimation. IEEE Trans. Inform. Theory 37 (1991) 1034-1054. Zbl0743.62003MR1111806
- [4] L. Birgé and P. Massart, An adaptive compression algorithm in Besov spaces. Constr. Approx. 16 (2000) 1-36. Zbl1004.41006MR1848840
- [5] L. Birgé and P. Massart, Minimum contrast estimators on sieves: Exponential bounds and rates of convergence. Bernoulli 4 (1998) 329-375. Zbl0954.62033MR1653272
- [6] L. Birgé and P. Massart, Gaussian model selection. J. Eur. Math. Soc. (JEMS) 3 (2001) 203-268. Zbl1037.62001MR1848946
- [7] L. Birgé and P. Massart, A generalized criterion for Gaussian model selection, Technical Report. University Paris 6, PMA-647 (2001).
- [8] L. Birgé and Y. Rozenholc, How many bins should be put in a regular histogram, Technical Report. University Paris 6, PMA-721 (2002). Zbl1136.62329
- [9] O. Catoni, Statistical learning theory and stochastic optimization, in École d’été de probabilités de Saint-Flour. Springer (2001). Zbl1076.93002
- [10] A. Cohen, I. Daubechies and P. Vial, Wavelet and fast wavelet transform on an interval. Appl. Comp. Harmon. Anal. 1 (1993) 54-81. Zbl0795.42018MR1256527
- [11] I. Daubechies, Ten lectures on wavelets. SIAM: Philadelphia (1992). Zbl0776.42018MR1162107
- [12] R.A. DeVore and G.G. Lorentz, Constructive approximation. Springer-Verlag, Berlin (1993). Zbl0797.41016MR1261635
- [13] D.L. Donoho and I.M. Johnstone, Ideal spatial adaptation via wavelet shrinkage. Biometrika 81 (1994) 425-455. Zbl0815.62019MR1311089
- [14] D.L. Donoho and I.M. Johnstone, Minimax estimation via wavelet shrinkage. Ann. Statist. 26 (1998) 879-921. Zbl0935.62041MR1635414
- [15] M. Kohler, Inequalities for uniform deviations of averages from expectations with applications to nonparametric regression. J. Statist. Plann. Inference 89 (2000) 1-23. Zbl0982.62035MR1794410
- [16] M. Kohler, Nonparametric regression function estimation using interaction least square splines and complexity regularization. Metrika 47 (1998) 147-163. Zbl1093.62528MR1622144
- [17] A.P. Korostelev and A.B. Tsybakov, Minimax theory of image reconstruction. Springer-Verlag, New York, Lecture Notes in Statist. (1993). Zbl0833.62039MR1226450
- [18] C.J. Stone, Additive regression and other nonparametric models. Ann. Statist. 13 (1985) 689-705. Zbl0605.62065MR790566
- [19] M. Wegkamp, Model selection in non-parametric regression, Preprint. Yale University (2000). Zbl1019.62037
- [20] Y. Yang, Model selection for nonparametric regression. Statist. Sinica 9 (1999) 475-499. Zbl0921.62051MR1707850
- [21] Y. Yang, Combining different procedures for adaptive regression. J. Multivariate Anal. 74 (2000) 135-161. Zbl0964.62032MR1790617
- [22] Y. Yang and A. Barron, Information-theoretic determination of minimax rates of convergence. Ann. Statist. 27 (1999) 1564-1599. Zbl0978.62008MR1742500
Citations in EuDML Documents
- Stéphane Gaïffas, On pointwise adaptive curve estimation based on inhomogeneous data
- Gaëlle Chagny, Penalization versus Goldenshluger-Lepski strategies in warped bases regression
- Xavier Gendre, Model selection and estimation of a component in additive regression
- Lucien Birgé, Model selection via testing: an alternative to (penalized) maximum likelihood estimators