Displaying similar documents to “Consistency of trigonometric and polynomial regression estimators”

$L_1$-penalization in functional linear regression with subgaussian design

Vladimir Koltchinskii, Stanislav Minsker (2014)

Journal de l’École polytechnique — Mathématiques

Similarity:

We study functional regression with random subgaussian design and real-valued response. The focus is on the problems in which the regression function can be well approximated by a functional linear model with the slope function being “sparse” in the sense that it can be represented as a sum of a small number of well separated “spikes”. This can be viewed as an extension of now classical sparse estimation problems to the case of infinite dictionaries. We study an estimator of the regression...
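As a finite-dimensional illustration of $L_1$-penalized estimation of a sparse slope, here is a minimal lasso sketch over a finite dictionary, solved with plain ISTA (iterative soft thresholding). The Gaussian design, penalty level, and spike positions are illustrative assumptions, not the paper's infinite-dictionary setting:

```python
import numpy as np

def ista_lasso(X, y, lam, n_iter=500):
    """L1-penalized least squares (lasso) via ISTA:
    min_b (1/2n)||y - X b||^2 + lam * ||b||_1."""
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2 / n          # Lipschitz constant of the gradient
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n
        z = b - grad / L
        # Soft-thresholding step induced by the L1 penalty.
        b = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return b

rng = np.random.default_rng(3)
n, p = 200, 50
X = rng.normal(size=(n, p))                    # subgaussian (here Gaussian) design
b_true = np.zeros(p)
b_true[[3, 17]] = [1.5, -2.0]                  # two well-separated "spikes"
y = X @ b_true + 0.1 * rng.normal(size=n)
b_hat = ista_lasso(X, y, lam=0.05)
print(np.flatnonzero(np.abs(b_hat) > 0.5))     # recovered support
```

With a penalty above the noise level, the soft-thresholded solution typically recovers the spike positions while shrinking the remaining coefficients to (near) zero.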

Sensitivity analysis of M-estimators of non-linear regression models

Asunción Rubio, Francisco Quintana, Jan Ámos Víšek (1994)

Commentationes Mathematicae Universitatis Carolinae

Similarity:

An asymptotic formula for the difference between the M-estimates of the regression coefficients of the non-linear model computed from all n observations and from n − 1 observations is presented, under conditions covering twice absolutely continuous ϱ-functions. The implications for M-estimation of the regression model are then discussed.

Instrumental weighted variables under heteroscedasticity Part I – Consistency

Jan Ámos Víšek (2017)

Kybernetika

Similarity:

A proof of consistency of the instrumental weighted variables estimator, the robust version of the classical instrumental variables estimator, is given. It is proved that all solutions of the corresponding normal equations are contained, with high probability, in a ball whose radius can be made, asymptotically, arbitrarily small. Then $\sqrt{n}$-consistency is also proved. An extended numerical study (Part II of the paper) offers a picture of the behavior of the estimator for finite samples under various...

M-estimators of structural parameters in pseudolinear models

Friedrich Liese, Igor Vajda (1999)

Applications of Mathematics

Similarity:

Real-valued M-estimators $\hat\theta_n := \arg\min_\theta \frac{1}{n}\sum_{i=1}^n \rho(Y_i - \tau(\theta))$ in a statistical model with observations $Y_i \sim F_{\theta_0}$ are replaced by $\mathbb{R}^p$-valued M-estimators $\hat\beta_n := \arg\min_\beta \frac{1}{n}\sum_{i=1}^n \rho(Y_i - \tau(u(z_i^T\beta)))$ in a new model with observations $Y_i \sim F_{u(z_i^T\beta_0)}$, where $z_i \in \mathbb{R}^p$ are regressors, $\beta_0 \in \mathbb{R}^p$ is a structural parameter and $u$ a structural function of the new model. Sufficient conditions for the consistency of $\hat\beta_n$ are derived, motivated by the sufficiency conditions for the simpler “parent estimator” $\hat\theta_n$. The result is a general method of consistent estimation in a class of nonlinear (pseudolinear) statistical...

Model selection for regression on a random design

Yannick Baraud (2002)

ESAIM: Probability and Statistics

Similarity:

We consider the problem of estimating an unknown regression function when the design is random with values in $\mathbb{R}^k$. Our estimation procedure is based on model selection and does not rely on any prior information on the target function. We start with a collection of linear functional spaces and build, on a space selected from this collection by the data, the least-squares estimator. We study the performance of an estimator which is obtained by modifying this least-squares estimator on a set of...
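The select-then-fit idea can be sketched in a toy form: fit least squares on each space in a collection of nested polynomial spaces and pick the dimension minimizing a penalized criterion. The Mallows-$C_p$-type penalty, the known noise variance, and the polynomial collection are illustrative assumptions, not the paper's penalty:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 300
x = rng.uniform(-1, 1, n)                      # random design
f = np.abs(x)                                  # unknown regression function
y = f + 0.2 * rng.normal(size=n)
sigma2 = 0.04                                  # noise variance, assumed known here

best = None
for d in range(1, 15):                         # collection of polynomial spaces
    X = np.vander(x, d, increasing=True)       # basis 1, x, ..., x^(d-1)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ coef) ** 2)
    crit = rss / n + 2 * sigma2 * d / n        # Mallows-Cp-type penalized criterion
    if best is None or crit < best[0]:
        best = (crit, d, X @ coef)
print(best[1])                                 # dimension selected from the data
```

The penalty term trades the residual fit against the model dimension, so the selected space adapts to the unknown smoothness of $f$ without prior information.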

Sufficient conditions for the strong consistency of least squares estimator with α-stable errors

João Tiago Mexia, João Lita da Silva (2007)

Discussiones Mathematicae Probability and Statistics

Similarity:

Let $Y_i = x_i^T\beta + e_i$, $1 \le i \le n$, $n \ge 1$, be a linear regression model and suppose that the random errors $e_1, e_2, \dots$ are independent and α-stable. In this paper, we obtain sufficient conditions for the strong consistency of the least squares estimator $\tilde\beta$ of $\beta$ under additional assumptions on the non-random sequence $x_1, x_2, \dots$ of real vectors.
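A small numerical sketch of least-squares consistency under heavy-tailed errors. Student-t errors with 3 degrees of freedom stand in for an α-stable law with finite mean (a simplification; α-stable errors with α < 2 have infinite variance), and the design, coefficients, and sample sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def ls_estimate(X, y):
    # Ordinary least squares: beta_tilde = argmin ||y - X beta||^2,
    # computed via np.linalg.lstsq (numerically safer than normal equations).
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

beta_true = np.array([2.0, -1.0])
errs = []
for n in (100, 10_000):
    X = np.column_stack([np.ones(n), rng.uniform(-1, 1, n)])
    # Heavy-tailed errors: Student-t with df=3 as a stand-in for
    # an alpha-stable law with finite mean.
    e = rng.standard_t(df=3, size=n)
    y = X @ beta_true + e
    errs.append(np.linalg.norm(ls_estimate(X, y) - beta_true))
print(errs)   # estimation error at each sample size
```

As the sample grows, the estimation error shrinks toward zero, which is what the strong-consistency statement formalizes for suitable designs and error laws.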

A note on the rate of convergence of local polynomial estimators in regression models

Friedrich Liese, Ingo Steinke (2001)

Kybernetika

Similarity:

Local polynomials are used to construct estimators for the value $m(x_0)$ of the regression function $m$ and the values of the derivatives $D^\gamma m(x_0)$ in a general class of nonparametric regression models. The covariables are allowed to be random or non-random. Only asymptotic conditions on the average distribution of the covariables are used, serving as the smoothness condition on the experimental design. This smoothness condition is discussed in detail. The optimal stochastic rate of convergence of the estimators is established....
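A minimal sketch of a local linear estimator at a single point: fit a kernel-weighted polynomial in centred powers, so the fitted coefficients estimate $m(x_0)$ and $m'(x_0)$. The Epanechnikov kernel, bandwidth, and target function are illustrative assumptions:

```python
import numpy as np

def local_poly(x, y, x0, h, degree=1):
    """Local polynomial fit at x0 with bandwidth h.

    Fits a kernel-weighted polynomial in (x - x0); coef[j] then estimates
    the j-th Taylor coefficient of m at x0, i.e. D^j m(x0) / j!.
    """
    u = (x - x0) / h
    w = np.where(np.abs(u) <= 1, 0.75 * (1 - u**2), 0.0)  # Epanechnikov kernel
    X = np.vander(x - x0, degree + 1, increasing=True)     # 1, (x-x0), ...
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return coef

rng = np.random.default_rng(1)
n = 2000
x = rng.uniform(0, 1, n)                       # random covariables
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=n)
c = local_poly(x, y, x0=0.25, h=0.1)
print(np.round(c, 2))   # c[0] estimates m(0.25) = 1, c[1] estimates m'(0.25) = 0
```

Shrinking the bandwidth $h$ with $n$ at the right rate is what yields the optimal stochastic rate of convergence discussed in the abstract.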

Least-squares trigonometric regression estimation

Waldemar Popiński (1999)

Applicationes Mathematicae

Similarity:

The problem of nonparametric function fitting using the complete orthogonal system of trigonometric functions $e_k$, $k=0,1,2,\dots$, for the observation model $y_i = f(x_{in}) + \eta_i$, $i=1,\dots,n$, is considered, where the $\eta_i$ are uncorrelated random variables with zero mean value and finite variance, and the observation points $x_{in} \in [0,2\pi]$, $i=1,\dots,n$, are equidistant. Conditions for convergence of the mean-square prediction error $(1/n)\sum_{i=1}^n E(f(x_{in}) - \hat f_{N(n)}(x_{in}))^2$, the integrated mean-square error $E\|f - \hat f_{N(n)}\|^2$ and the pointwise mean-square error $E(f(x) - \hat f_{N(n)}(x))^2$ of the estimator $\hat f_{N(n)}(x) = \sum_{k=0}^{N(n)} \hat c_k e_k(x)$ for $f \in$...
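A least-squares trigonometric fit of this kind can be sketched directly: build the trigonometric basis at equidistant design points, solve for the coefficients $\hat c_k$ by least squares, and measure the empirical prediction error. The normalization of the basis, the target function, and the truncation level $N$ are illustrative assumptions:

```python
import numpy as np

def trig_basis(x, N):
    """First N+1 functions of an orthonormal trigonometric system on [0, 2*pi]:
    e_0 = 1/sqrt(2*pi), then sin(j*x)/sqrt(pi) and cos(j*x)/sqrt(pi) in turn."""
    cols = [np.full_like(x, 1 / np.sqrt(2 * np.pi))]
    j = 1
    while len(cols) <= N:
        cols.append(np.sin(j * x) / np.sqrt(np.pi))
        if len(cols) <= N:
            cols.append(np.cos(j * x) / np.sqrt(np.pi))
        j += 1
    return np.column_stack(cols)

rng = np.random.default_rng(2)
n, N = 512, 8
x = 2 * np.pi * np.arange(n) / n          # equidistant observation points
f = np.sin(x) + 0.5 * np.cos(2 * x)       # target regression function
y = f + 0.2 * rng.normal(size=n)          # uncorrelated zero-mean noise
B = trig_basis(x, N)
c_hat, *_ = np.linalg.lstsq(B, y, rcond=None)   # least-squares coefficients
f_hat = B @ c_hat
mse = np.mean((f - f_hat) ** 2)           # empirical mean-square prediction error
print(mse)
```

The convergence results in the abstract concern letting the truncation level $N(n)$ grow with $n$ at a suitable rate; here $N$ is simply fixed large enough to contain the target.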