Model selection for (auto-)regression with dependent data
Yannick Baraud; F. Comte; G. Viennet
ESAIM: Probability and Statistics (2001)
- Volume: 5, page 33-49
- ISSN: 1292-8100
How to cite
Baraud, Yannick, Comte, F., and Viennet, G. "Model selection for (auto-)regression with dependent data." ESAIM: Probability and Statistics 5 (2001): 33-49. <http://eudml.org/doc/104278>.
@article{Baraud2001,
abstract = {In this paper, we study the problem of nonparametric estimation of an unknown regression function from dependent data with sub-Gaussian errors. As a particular case, we handle the autoregressive framework. For this purpose, we consider a collection of finite-dimensional linear spaces (e.g. linear spaces spanned by wavelets or piecewise polynomials on a possibly irregular grid) and we estimate the regression function by a least-squares estimator built on a data-driven selected linear space among the collection. This data-driven choice is performed via the minimization of a penalized criterion akin to Mallows’ $C_p$. We state non-asymptotic risk bounds for our estimator in some $\mathbb \{L\}_2$-norm and we show that it is adaptive in the minimax sense over a large class of Besov balls of the form $\{\mathcal \{B\}\}_\{\alpha ,p,\infty \}(R)$ with $p\ge 1$.},
author = {Baraud, Yannick and Comte, F. and Viennet, G.},
journal = {ESAIM: Probability and Statistics},
keywords = {nonparametric regression; least-squares estimator; adaptive estimation; autoregression; mixing processes},
language = {eng},
pages = {33-49},
publisher = {EDP-Sciences},
title = {Model selection for (auto-)regression with dependent data},
url = {http://eudml.org/doc/104278},
volume = {5},
year = {2001},
}
TY - JOUR
AU - Baraud, Yannick
AU - Comte, F.
AU - Viennet, G.
TI - Model selection for (auto-)regression with dependent data
JO - ESAIM: Probability and Statistics
PY - 2001
PB - EDP-Sciences
VL - 5
SP - 33
EP - 49
AB - In this paper, we study the problem of nonparametric estimation of an unknown regression function from dependent data with sub-Gaussian errors. As a particular case, we handle the autoregressive framework. For this purpose, we consider a collection of finite-dimensional linear spaces (e.g. linear spaces spanned by wavelets or piecewise polynomials on a possibly irregular grid) and we estimate the regression function by a least-squares estimator built on a data-driven selected linear space among the collection. This data-driven choice is performed via the minimization of a penalized criterion akin to Mallows’ $C_p$. We state non-asymptotic risk bounds for our estimator in some $\mathbb {L}_2$-norm and we show that it is adaptive in the minimax sense over a large class of Besov balls of the form ${\mathcal {B}}_{\alpha ,p,\infty }(R)$ with $p\ge 1$.
LA - eng
KW - nonparametric regression; least-squares estimator; adaptive estimation; autoregression; mixing processes
UR - http://eudml.org/doc/104278
ER -
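The abstract describes model selection by minimizing a penalized least-squares criterion, akin to Mallows’ $C_p$, over a collection of finite-dimensional linear spaces. A minimal sketch of that idea, not the authors’ exact procedure: here the collection is reduced to regular histogram (piecewise-constant) spaces on [0, 1], the noise level `sigma` is assumed known, and all names and constants are illustrative.

```python
# Illustrative sketch: penalized least-squares model selection over
# histogram models, with a Mallows-C_p-style penalty 2 * sigma^2 * D / n.
# Assumptions (not from the paper): known sigma, regular grids, i.i.d. design.
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0.0, 1.0, n)
f = lambda t: np.sin(2 * np.pi * t)              # "unknown" regression function
sigma = 0.3
y = f(x) + sigma * rng.normal(size=n)

def rss_histogram(x, y, D):
    """Residual sum of squares of the least-squares fit on the D-dimensional
    space of piecewise-constant functions over D regular bins of [0, 1]."""
    bins = np.minimum((x * D).astype(int), D - 1)  # bin index of each x_i
    rss = 0.0
    for b in range(D):
        mask = bins == b
        if mask.any():                             # empty bins contribute nothing
            rss += np.sum((y[mask] - y[mask].mean()) ** 2)
    return rss

# Penalized criterion: crit(D) = RSS(D)/n + pen(D), pen(D) = 2 * sigma^2 * D / n.
dims = np.arange(1, 51)
crit = np.array([rss_histogram(x, y, D) / n + 2 * sigma**2 * D / n for D in dims])
D_hat = int(dims[np.argmin(crit)])                 # data-driven dimension
print("selected dimension:", D_hat)
```

The selected dimension balances the residual fit against the penalty, mimicking the bias-variance trade-off that the paper's risk bounds quantify; the paper's actual penalty constants depend on the dependence structure of the data.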
References
- [1] H. Akaike, Information theory and an extension of the maximum likelihood principle, in Proc. 2nd International Symposium on Information Theory, edited by B.N. Petrov and F. Csáki. Akadémiai Kiadó, Budapest (1973) 267-281. Zbl0283.62006MR483125
- [2] H. Akaike, A new look at the statistical model identification. IEEE Trans. Automat. Control 19 (1974) 716-723. Zbl0314.62039MR423716
- [3] P. Ango Nze, Geometric and subgeometric rates for Markovian processes in the neighbourhood of linearity. C. R. Acad. Sci. Paris 326 (1998) 371-376. Zbl0918.60052MR1648493
- [4] Y. Baraud, Model selection for regression on a fixed design. Probab. Theory Related Fields 117 (2000) 467-493. Zbl0997.62027MR1777129
- [5] Y. Baraud, Model selection for regression on a random design, Preprint 01-10. DMA, École Normale Supérieure (2001). MR1918295
- [6] Y. Baraud, F. Comte and G. Viennet, Adaptive estimation in autoregression or β-mixing regression via model selection. Ann. Statist. (to appear). Zbl1012.62034MR1865343
- [7] A. Barron, L. Birgé and P. Massart, Risk bounds for model selection via penalization. Probab. Theory Related Fields 113 (1999) 301-413. Zbl0946.62036MR1679028
- [8] L. Birgé and P. Massart, An adaptive compression algorithm in Besov spaces. Constr. Approx. 16 (2000) 1-36. Zbl1004.41006MR1848840
- [9] L. Birgé and Y. Rozenholc, How many bins must be put in a regular histogram. Working paper (2001). Zbl1136.62329
- [10] A. Cohen, I. Daubechies and P. Vial, Wavelets on the interval and fast wavelet transforms. Appl. Comput. Harmon. Anal. 1 (1993) 54-81. Zbl0795.42018MR1256527
- [11] I. Daubechies, Ten lectures on wavelets. SIAM: Philadelphia (1992). Zbl0776.42018MR1162107
- [12] R.A. Devore and C.G. Lorentz, Constructive Approximation. Springer-Verlag (1993). Zbl0797.41016MR1261635
- [13] D.L. Donoho and I.M. Johnstone, Minimax estimation via wavelet shrinkage. Ann. Statist. 26 (1998) 879-921. Zbl0935.62041MR1635414
- [14] P. Doukhan, Mixing properties and examples. Springer-Verlag (1994). Zbl0801.60027MR1312160
- [15] M. Duflo, Random Iterative Models. Springer, Berlin, New-York (1997). Zbl0868.62069MR1485774
- [16] M. Hoffmann, On nonparametric estimation in nonlinear AR(1)-models. Statist. Probab. Lett. 44 (1999) 29-45. Zbl0954.62049MR1706307
- [17] I.A. Ibragimov, On the spectrum of stationary Gaussian sequences satisfying the strong mixing condition I: Necessary conditions. Theory Probab. Appl. 10 (1965) 85-106. Zbl0131.18101MR174091
- [18] M. Kohler, On optimal rates of convergence for nonparametric regression with random design, Working Paper. Stuttgart University (1997).
- [19] A.N. Kolmogorov and Y.A. Rozanov, On the strong mixing conditions for stationary Gaussian sequences. Theory Probab. Appl. 5 (1960) 204-207. Zbl0106.12005
- [20] K.C. Li, Asymptotic optimality for $C_p$, $C_L$, cross-validation and generalized cross-validation: Discrete index set. Ann. Statist. 15 (1987) 958-975. Zbl0653.62037MR902239
- [21] G.G. Lorentz, M. von Golitschek and Y. Makovoz, Constructive Approximation, Advanced Problems. Springer, Berlin (1996). Zbl0910.41001MR1393437
- [22] C.L. Mallows, Some comments on $C_p$. Technometrics 15 (1973) 661-675. Zbl0269.62061
- [23] A. Meyer, Quelques inégalités sur les martingales d’après Dubins et Freedman, Séminaire de Probabilités de l’Université de Strasbourg. Vols. 68/69 (1969) 162-169. Zbl0211.21802
- [24] D.S. Modha and E. Masry, Minimum complexity regression estimation with weakly dependent observations. IEEE Trans. Inform. Theory 42 (1996) 2133-2145. Zbl0868.62015MR1447519
- [25] D.S. Modha and E. Masry, Memory-universal prediction of stationary random processes. IEEE Trans. Inform. Theory 44 (1998) 117-133. Zbl0938.62106MR1486652
- [26] M. Neumann and J.-P. Kreiss, Regression-type inference in nonparametric autoregression. Ann. Statist. 26 (1998) 1570-1613. Zbl0935.62049MR1647701
- [27] B.T. Polyak and A. Tsybakov, A family of asymptotically optimal methods for choosing the order of a projective regression estimate. Theory Probab. Appl. 37 (1992) 471-481. Zbl0806.62031MR1214355
- [28] R. Shibata, Selection of the order of an autoregressive model by Akaike’s information criterion. Biometrika 63 (1976) 117-126. Zbl0358.62048
- [29] R. Shibata, An optimal selection of regression variables. Biometrika 68 (1981) 45-54. Zbl0464.62054MR614940
- [30] S. Van de Geer, Exponential inequalities for martingales, with application to maximum likelihood estimation for counting processes. Ann. Statist. 23 (1995) 1779-1801. Zbl0852.60019MR1370307
- [31] V.A. Volkonskii and Y.A. Rozanov, Some limit theorems for random functions. I. Theory Probab. Appl. 4 (1959) 179-197. Zbl0092.33502MR105741
Citations in EuDML Documents
- Eva Löcherbach, Dasha Loukianova, Oleg Loukianov, Penalized nonparametric drift estimation for a continuously observed one-dimensional diffusion process
- Pascal Massart, Sélection de modèle : de la théorie à la pratique
- Marie Sauvé, Histogram selection in non Gaussian regression
- Yannick Baraud, Lucien Birgé, Estimating composite functions by model selection