Displaying similar documents to “Orthogonal series regression estimators for an irregularly spaced design”

Convergence rates of orthogonal series regression estimators

Waldemar Popiński (2000)

Applicationes Mathematicae

Similarity:

General conditions for convergence rates of nonparametric orthogonal series estimators of the regression function f(x) = E(Y | X = x) are considered. The estimators are obtained by the least squares method on the basis of a random observation sample (Yᵢ, Xᵢ), i=1,...,n, where Xᵢ ∈ A ⊂ ℝᵈ has marginal distribution with density ϱ ∈ L¹(A) and Var(Y | X = x) is bounded on A. Convergence rates of the errors E_X(f(X) − f̂_N(X))² and ‖f − f̂_N‖ for the estimator f̂_N(x) = ∑_{k=1}^N ĉ_k e_k(x), constructed using an orthonormal system e_k, k=1,2,..., in L²(A), are obtained. ...

A note on orthogonal series regression function estimators

Waldemar Popiński (1999)

Applicationes Mathematicae

Similarity:

The problem of nonparametric estimation of the regression function f(x) = E(Y | X=x) using the orthonormal system of trigonometric functions or Legendre polynomials e_k, k=0,1,2,..., is considered in the case where a sample of i.i.d. copies (Xᵢ, Yᵢ), i=1,...,n, of the random variable (X,Y) is available and the marginal distribution of X has density ϱ ∈ L¹[a,b]. The constructed estimators are of the form f̂_n(x) = ∑_{k=0}^{N(n)} ĉ_k e_k(x), where the coefficients ĉ₀, ĉ₁, ..., ĉ_N are determined by minimizing the empirical risk n⁻¹ ∑_{i=1}^n (Yᵢ − ∑_{k=0}^N c_k e_k(Xᵢ))². Sufficient conditions...
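The empirical-risk minimization described in this abstract is ordinary least squares on the basis-function design matrix. A minimal numerical sketch (not the paper's code; the trigonometric normalization, the test function f = sin, the noise level, and the sample size are illustrative assumptions):

```python
import numpy as np

# Hypothetical illustration: least-squares fit of the trigonometric orthonormal
# system e_0 = 1/sqrt(2*pi), e_{2j-1} = cos(jx)/sqrt(pi), e_{2j} = sin(jx)/sqrt(pi)
# on [0, 2*pi], minimizing the empirical risk
#   (1/n) * sum_i (Y_i - sum_{k=0}^N c_k e_k(X_i))^2
# over the coefficients c_0, ..., c_N.

def trig_basis(x, N):
    """Columns e_0(x), ..., e_N(x) of the design matrix at the points x."""
    cols = [np.full_like(x, 1.0 / np.sqrt(2.0 * np.pi))]
    j = 1
    while len(cols) <= N:
        cols.append(np.cos(j * x) / np.sqrt(np.pi))
        if len(cols) <= N:
            cols.append(np.sin(j * x) / np.sqrt(np.pi))
        j += 1
    return np.column_stack(cols)

def fit_orthogonal_series(X, Y, N):
    """Coefficients c_hat minimizing the empirical risk (ordinary least squares)."""
    E = trig_basis(X, N)
    c_hat, *_ = np.linalg.lstsq(E, Y, rcond=None)
    return c_hat

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0 * np.pi, size=500)       # X_i with density rho on [0, 2*pi]
Y = np.sin(X) + 0.1 * rng.standard_normal(500)    # Y_i = f(X_i) + noise, with f = sin
c_hat = fit_orthogonal_series(X, Y, N=6)
f_hat = lambda x: trig_basis(np.asarray(x), 6) @ c_hat
```

Since sin(x) = √π · e₂(x) in this normalization, the fitted ĉ₂ should be close to √π while the remaining coefficients stay near zero.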

Least-squares trigonometric regression estimation

Waldemar Popiński (1999)

Applicationes Mathematicae

Similarity:

The problem of nonparametric function fitting using the complete orthogonal system of trigonometric functions e_k, k=0,1,2,..., for the observation model yᵢ = f(x_in) + ηᵢ, i=1,...,n, is considered, where the ηᵢ are uncorrelated random variables with zero mean value and finite variance, and the observation points x_in ∈ [0,2π], i=1,...,n, are equidistant. Conditions for convergence of the mean-square prediction error (1/n) ∑_{i=1}^n E(f(x_in) − f̂_{N(n)}(x_in))², the integrated mean-square error E‖f − f̂_{N(n)}‖², and the pointwise mean-square error E(f(x) − f̂_{N(n)}(x))² of the estimator f̂_{N(n)}(x) = ∑_{k=0}^{N(n)} ĉ_k e_k(x) for f ∈...
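For the equidistant design, the trigonometric system is also discretely orthogonal, so the least-squares coefficients have a closed form, ĉ_k = (2π/n) ∑ᵢ yᵢ e_k(x_in). A small sketch under assumed values (f = cos, noise level, n, and N are all illustrative, not from the paper):

```python
import numpy as np

# Hypothetical sketch: equidistant design x_in = 2*pi*(i-1)/n. The trigonometric
# system is discretely orthogonal on these points, so the least-squares solution
# reduces to c_hat_k = (2*pi/n) * sum_i y_i e_k(x_in).

n, N = 200, 4
x = 2.0 * np.pi * np.arange(n) / n                # equidistant points in [0, 2*pi)
rng = np.random.default_rng(1)
f = np.cos(x)                                     # assumed regression function f
y = f + 0.1 * rng.standard_normal(n)              # y_i = f(x_in) + eta_i

# design matrix of e_0, ..., e_N (same normalization as the L^2[0, 2*pi] system)
E = np.column_stack(
    [np.full(n, 1.0 / np.sqrt(2.0 * np.pi))]
    + [g(j * x) / np.sqrt(np.pi)
       for j in range(1, N // 2 + 1) for g in (np.cos, np.sin)]
)
c_hat = (2.0 * np.pi / n) * E.T @ y               # closed-form least-squares solution
f_hat = E @ c_hat
mspe = np.mean((f - f_hat) ** 2)                  # empirical mean-square prediction error
```

Here cos(x) = √π · e₁(x), so ĉ₁ should be near √π, and with f inside the span of the basis the prediction error reflects only the noise projected onto the N+1 basis directions.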

Consistency of linear and quadratic least squares estimators in regression models with covariance stationary errors

František Štulajter (1991)

Applications of Mathematics

Similarity:

The least squares invariant quadratic estimator of an unknown covariance function of a stochastic process is defined and a sufficient condition for the consistency of this estimator is derived. The mean value of the observed process is assumed to follow a linear regression model. A sufficient condition for consistency of the least squares estimator of the regression parameters is also derived.

Redescending M-estimators in regression analysis, cluster analysis and image analysis

Christine H. Müller (2004)

Discussiones Mathematicae Probability and Statistics

Similarity:

We give a review on the properties and applications of M-estimators with redescending score function. For regression analysis, some of these redescending M-estimators can attain the maximum breakdown point which is possible in this setup. Moreover, some of them are the solutions of the problem of maximizing the efficiency under bounded influence function when the regression coefficient and the scale parameter are estimated simultaneously. Hence redescending M-estimators satisfy several...