Displaying similar documents to “Model selection for regression on a random design”

Model selection for (auto-)regression with dependent data

Yannick Baraud, F. Comte, G. Viennet (2001)

ESAIM: Probability and Statistics

Similarity:

In this paper, we study the problem of nonparametric estimation of an unknown regression function from dependent data with sub-Gaussian errors. As a particular case, we handle the autoregressive framework. For this purpose, we consider a collection of finite-dimensional linear spaces (e.g. linear spaces spanned by wavelets or piecewise polynomials on a possibly irregular grid) and we estimate the regression function by a least-squares estimator built on a data-driven selected linear...
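As an illustration of the penalized least-squares model-selection idea sketched in this abstract, here is a minimal Python example. It assumes independent observations, piecewise-constant (regressogram) spaces on a regular grid of [0, 1], and a Mallows-type penalty with an arbitrary constant; none of these are the paper's exact collection of models or penalty, and the dependent-data refinements are not shown.

```python
# Minimal sketch (not the paper's procedure): penalized least-squares model
# selection over regressogram (piecewise-constant) spaces of growing dimension.
import numpy as np

def regressogram_fit(x, y, dim):
    """Least-squares fit on the space of piecewise-constant functions
    over `dim` equal-width bins of [0, 1)."""
    bins = np.clip((x * dim).astype(int), 0, dim - 1)
    coef = np.array([y[bins == j].mean() if np.any(bins == j) else 0.0
                     for j in range(dim)])
    return coef[bins]  # fitted values at the design points

def select_dimension(x, y, dims, sigma2, pen_const=2.0):
    """Choose the dimension minimizing empirical risk plus a Mallows-type
    penalty pen(D) = pen_const * sigma2 * D / n (constant is illustrative)."""
    n = len(y)
    crits = []
    for D in dims:
        fitted = regressogram_fit(x, y, D)
        crits.append(np.mean((y - fitted) ** 2) + pen_const * sigma2 * D / n)
    return dims[int(np.argmin(crits))]

rng = np.random.default_rng(0)
x = rng.uniform(size=500)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(500)
print(select_dimension(x, y, dims=list(range(1, 41)), sigma2=0.09))
```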

Smoothing and preservation of irregularities using local linear fitting

Irène Gijbels (2008)

Applications of Mathematics

Similarity:

For nonparametric estimation of a smooth regression function, local linear fitting is a widely used method. The goal of this paper is to briefly review how to use this method when the unknown curve possibly has some irregularities, such as jumps or peaks, at unknown locations. It is then explained how the same basic method can be used when estimating unsmooth probability densities and conditional variance functions.
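A minimal sketch of plain local linear fitting at a single point, assuming a Gaussian kernel and a hand-picked bandwidth; the jump- and peak-preserving modifications reviewed in the paper are not shown.

```python
# Sketch of local linear fitting at a point x0 (Gaussian kernel, fixed bandwidth;
# both choices are illustrative, not the paper's).
import numpy as np

def local_linear(x, y, x0, h):
    """Weighted least-squares line fit around x0; returns m_hat(x0)."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)          # kernel weights
    X = np.column_stack([np.ones_like(x), x - x0])  # intercept + local slope
    Xw = X * w[:, None]
    beta = np.linalg.solve(X.T @ Xw, Xw.T @ y)
    return beta[0]                                  # intercept estimates m(x0)

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(size=300))
y = np.abs(x - 0.5) + 0.05 * rng.standard_normal(300)  # curve with a peak at 0.5
print(local_linear(x, y, x0=0.5, h=0.05))
```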

Using randomization to improve performance of a variance estimator of strongly dependent errors

Artur Bryk (2012)

Applicationes Mathematicae

Similarity:

We consider a fixed-design regression model with long-range dependent errors which form a moving average or Gaussian process. We introduce an artificial randomization of grid points at which observations are taken in order to diminish the impact of strong dependence. We estimate the variance of the errors using the Rice estimator. The estimator is shown to exhibit weak (i.e. in probability) consistency. Simulation results confirm this property for moderate and large sample sizes when...
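For reference, a small sketch of the Rice difference-based variance estimator, sigma^2_hat = sum_i (Y_{i+1} - Y_i)^2 / (2(n-1)); the dependent-error process and the subsampling used as "randomization" below are simplified stand-ins for illustration, not the paper's construction.

```python
# Sketch of the Rice difference-based estimator of the error variance on a
# fixed design; the dependent errors and the subsampling-style randomization
# below are simplified stand-ins for the paper's setting.
import numpy as np

def rice_variance(y):
    """Rice estimator: sum of squared first differences over 2(n-1)."""
    d = np.diff(y)
    return np.sum(d ** 2) / (2.0 * (len(y) - 1))

rng = np.random.default_rng(2)
n = 1000
t = np.arange(n) / n
# strongly dependent errors from a slowly decaying moving average (illustrative)
weights = np.arange(1, 51, dtype=float) ** -0.7
eps = np.convolve(rng.standard_normal(n + len(weights) - 1), weights, mode="valid")
eps /= eps.std()
y = np.sin(2 * np.pi * t) + eps

print("Rice estimate, full grid:    ", rice_variance(y))
# observe only a random half of the grid points, kept in time order
idx = np.sort(rng.choice(n, size=n // 2, replace=False))
print("Rice estimate, random design:", rice_variance(y[idx]))
```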

A note on the rate of convergence of local polynomial estimators in regression models

Friedrich Liese, Ingo Steinke (2001)

Kybernetika

Similarity:

Local polynomials are used to construct estimators for the value $m(x_0)$ of the regression function $m$ and the values of the derivatives $D^{\gamma} m(x_0)$ in a general class of nonparametric regression models. The covariables are allowed to be random or non-random. Only asymptotic conditions on the average distribution of the covariables are used as a smoothness condition on the experimental design. This smoothness condition is discussed in detail. The optimal stochastic rate of convergence of the estimators is established...
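To make the construction concrete, here is a minimal sketch of a local polynomial fit at a point $x_0$ that returns estimates of $m(x_0)$ and its derivatives ($g!$ times the $g$-th fitted coefficient); the Gaussian kernel, bandwidth, and degree are arbitrary illustrative choices, not conditions from the paper.

```python
# Sketch of local polynomial estimation of m(x0) and its derivatives:
# fit a kernel-weighted polynomial of degree p in (x - x0); g! times the
# coefficient of (x - x0)^g estimates D^g m(x0). Kernel/bandwidth are illustrative.
import math
import numpy as np

def local_poly(x, y, x0, h, degree=2):
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    X = np.vander(x - x0, degree + 1, increasing=True)  # columns 1, (x-x0), (x-x0)^2, ...
    Xw = X * w[:, None]
    beta = np.linalg.solve(X.T @ Xw, Xw.T @ y)
    return [math.factorial(g) * beta[g] for g in range(degree + 1)]

rng = np.random.default_rng(3)
x = rng.uniform(-1, 1, size=400)
y = np.exp(x) + 0.05 * rng.standard_normal(400)
print(local_poly(x, y, x0=0.0, h=0.2, degree=2))  # roughly [1, 1, 1] for m = exp
```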

Exponential regression

Lubomír Kubáček, Ludmila Kubáčková, Eva Tesaříková (2001)

Acta Universitatis Palackianae Olomucensis. Facultas Rerum Naturalium. Mathematica

Similarity: