Model selection for (auto-)regression with dependent data
Yannick Baraud; F. Comte; G. Viennet
ESAIM: Probability and Statistics (2001)
- Volume: 5, pages 33-49
- ISSN: 1292-8100
How to cite
Baraud, Yannick, Comte, F., and Viennet, G. "Model selection for (auto-)regression with dependent data." ESAIM: Probability and Statistics 5 (2001): 33-49. <http://eudml.org/doc/116584>.
@article{Baraud2001,
abstract = {
In this paper, we study the problem of nonparametric estimation
of an unknown regression function from dependent data with
sub-Gaussian errors. As a particular case, we handle the
autoregressive framework. For this purpose, we consider a
collection of finite-dimensional linear spaces (e.g. linear spaces
spanned by wavelets or piecewise polynomials on a possibly
irregular grid) and we estimate the regression function by a
least-squares estimator built on a data-driven selected linear
space among the collection. This data-driven choice is performed
via the minimization of a penalized criterion akin to Mallows'
$C_p$. We state non-asymptotic risk bounds for our estimator in
some ${\mathbb{L}}_2$-norm and we show that it is adaptive in the minimax
sense over a large class of Besov balls of the form $B_{\alpha,p,\infty}(R)$ with $p \ge 1$.
},
author = {Baraud, Yannick and Comte, F. and Viennet, G.},
journal = {ESAIM: Probability and Statistics},
keywords = {Nonparametric regression; least-squares estimator; adaptive estimation; autoregression; mixing processes},
language = {eng},
month = {3},
pages = {33-49},
publisher = {EDP Sciences},
title = {Model selection for (auto-)regression with dependent data},
url = {http://eudml.org/doc/116584},
volume = {5},
year = {2001},
}
TY - JOUR
AU - Baraud, Yannick
AU - Comte, F.
AU - Viennet, G.
TI - Model selection for (auto-)regression with dependent data
JO - ESAIM: Probability and Statistics
DA - 2001/3//
PB - EDP Sciences
VL - 5
SP - 33
EP - 49
AB -
In this paper, we study the problem of nonparametric estimation
of an unknown regression function from dependent data with
sub-Gaussian errors. As a particular case, we handle the
autoregressive framework. For this purpose, we consider a
collection of finite-dimensional linear spaces (e.g. linear spaces
spanned by wavelets or piecewise polynomials on a possibly
irregular grid) and we estimate the regression function by a
least-squares estimator built on a data-driven selected linear
space among the collection. This data-driven choice is performed
via the minimization of a penalized criterion akin to Mallows'
$C_p$. We state non-asymptotic risk bounds for our estimator in
some ${\mathbb{L}}_2$-norm and we show that it is adaptive in the minimax
sense over a large class of Besov balls of the form $B_{\alpha,p,\infty}(R)$ with $p \ge 1$.
LA - eng
KW - Nonparametric regression; least-squares estimator; adaptive estimation; autoregression; mixing processes
UR - http://eudml.org/doc/116584
ER -
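The selection procedure described in the abstract — least-squares fits on a collection of finite-dimensional linear spaces, with the space chosen by minimizing a penalized criterion akin to Mallows' $C_p$ — can be illustrated with a small sketch. The example below is illustrative only: it uses regular histogram (piecewise-constant) spaces, independent data, a known noise variance, and the classical penalty $2\sigma^2 D/n$, whereas the paper itself handles dependent/autoregressive data and calibrated penalties.

```python
import numpy as np

def fit_piecewise_constant(x, y, D):
    """Least-squares projection of y onto piecewise-constant
    functions on a regular grid of D bins over [0, 1)."""
    bins = np.minimum((x * D).astype(int), D - 1)
    yhat = np.zeros_like(y)
    for b in range(D):
        mask = bins == b
        if mask.any():
            yhat[mask] = y[mask].mean()
    return yhat

def select_model(x, y, dims, sigma2):
    """Pick the dimension minimizing a Mallows'-Cp-style criterion:
    crit(D) = ||y - yhat_D||^2 / n + 2 * sigma2 * D / n."""
    n = len(y)
    best_crit, best_D, best_fit = np.inf, None, None
    for D in dims:
        yhat = fit_piecewise_constant(x, y, D)
        crit = np.mean((y - yhat) ** 2) + 2.0 * sigma2 * D / n
        if crit < best_crit:
            best_crit, best_D, best_fit = crit, D, yhat
    return best_D, best_fit

# Toy i.i.d. data (the paper's setting is more general: mixing/autoregressive).
rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0.0, 1.0, n)
f = np.sin(2 * np.pi * x)          # unknown regression function
y = f + 0.3 * rng.standard_normal(n)

D_star, yhat = select_model(x, y, range(1, 51), sigma2=0.3 ** 2)
```

The penalty trades the decrease of the residual sum of squares against the dimension `D`: too-small spaces leave bias, too-large ones accumulate noise, and the selected `D_star` balances the two without knowing the smoothness of `f`.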
References
- H. Akaike, Information theory and an extension of the maximum likelihood principle, in Proc. 2nd International Symposium on Information Theory, edited by B.N. Petrov and F. Csáki. Akadémiai Kiadó, Budapest (1973) 267-281.
- H. Akaike, A new look at the statistical model identification. IEEE Trans. Automat. Control 19 (1974) 716-723.
- P. Ango Nze, Geometric and subgeometric rates for Markovian processes in the neighbourhood of linearity. C. R. Acad. Sci. Paris 326 (1998) 371-376.
- Y. Baraud, Model selection for regression on a fixed design. Probab. Theory Related Fields 117 (2000) 467-493.
- Y. Baraud, Model selection for regression on a random design, Preprint 01-10. DMA, École Normale Supérieure (2001).
- Y. Baraud, F. Comte and G. Viennet, Adaptive estimation in autoregression or β-mixing regression via model selection. Ann. Statist. (to appear).
- A. Barron, L. Birgé and P. Massart, Risk bounds for model selection via penalization. Probab. Theory Related Fields 113 (1999) 301-413.
- L. Birgé and P. Massart, An adaptive compression algorithm in Besov spaces. Constr. Approx. 16 (2000) 1-36.
- L. Birgé and Y. Rozenholc, How many bins should be put in a regular histogram. Working paper (2001).
- A. Cohen, I. Daubechies and P. Vial, Wavelets on the interval and fast wavelet transforms. Appl. Comput. Harmon. Anal. 1 (1993) 54-81.
- I. Daubechies, Ten lectures on wavelets. SIAM, Philadelphia (1992).
- R.A. DeVore and G.G. Lorentz, Constructive Approximation. Springer-Verlag (1993).
- D.L. Donoho and I.M. Johnstone, Minimax estimation via wavelet shrinkage. Ann. Statist. 26 (1998) 879-921.
- P. Doukhan, Mixing: Properties and examples. Springer-Verlag (1994).
- M. Duflo, Random Iterative Models. Springer, Berlin, New York (1997).
- M. Hoffmann, On nonparametric estimation in nonlinear AR(1)-models. Statist. Probab. Lett. 44 (1999) 29-45.
- I.A. Ibragimov, On the spectrum of stationary Gaussian sequences satisfying the strong mixing condition I: Necessary conditions. Theory Probab. Appl. 10 (1965) 85-106.
- M. Kohler, On optimal rates of convergence for nonparametric regression with random design, Working Paper. Stuttgart University (1997).
- A.N. Kolmogorov and Y.A. Rozanov, On the strong mixing conditions for stationary Gaussian sequences. Theory Probab. Appl. 5 (1960) 204-207.
- K.C. Li, Asymptotic optimality for Cp, CL, cross-validation and generalized cross-validation: Discrete index set. Ann. Statist. 15 (1987) 958-975.
- G.G. Lorentz, M. von Golitschek and Y. Makovoz, Constructive Approximation, Advanced Problems. Springer, Berlin (1996).
- C.L. Mallows, Some comments on Cp. Technometrics 15 (1973) 661-675.
- A. Meyer, Quelques inégalités sur les martingales d'après Dubins et Freedman, Séminaire de Probabilités de l'Université de Strasbourg. Vols. 68/69 (1969) 162-169.
- D.S. Modha and E. Masry, Minimum complexity regression estimation with weakly dependent observations. IEEE Trans. Inform. Theory 42 (1996) 2133-2145.
- D.S. Modha and E. Masry, Memory-universal prediction of stationary random processes. IEEE Trans. Inform. Theory 44 (1998) 117-133.
- M. Neumann and J.-P. Kreiss, Regression-type inference in nonparametric autoregression. Ann. Statist. 26 (1998) 1570-1613.
- B.T. Polyak and A. Tsybakov, A family of asymptotically optimal methods for choosing the order of a projective regression estimate. Theory Probab. Appl. 37 (1992) 471-481.
- R. Shibata, Selection of the order of an autoregressive model by Akaike's information criterion. Biometrika 63 (1976) 117-126.
- R. Shibata, An optimal selection of regression variables. Biometrika 68 (1981) 45-54.
- S. van de Geer, Exponential inequalities for martingales, with application to maximum likelihood estimation for counting processes. Ann. Statist. 23 (1995) 1779-1801.
- V.A. Volkonskii and Y.A. Rozanov, Some limit theorems for random functions. I. Theory Probab. Appl. 4 (1959) 179-197.