Least-squares trigonometric regression estimation

Waldemar Popiński

Applicationes Mathematicae (1999)

  • Volume: 26, Issue: 2, page 121-131
  • ISSN: 1233-7234

Abstract

The problem of nonparametric function fitting using the complete orthogonal system of trigonometric functions $e_k$, $k=0,1,2,\ldots$, for the observation model $y_i = f(x_{in}) + \eta_i$, $i=1,\ldots,n$, is considered, where $\eta_i$ are uncorrelated random variables with zero mean value and finite variance, and the observation points $x_{in} \in [0,2\pi]$, $i=1,\ldots,n$, are equidistant. Conditions for convergence of the mean-square prediction error $(1/n)\sum_{i=1}^n E(f(x_{in})-\widehat{f}_{N(n)}(x_{in}))^2$, the integrated mean-square error $E\|f-\widehat{f}_{N(n)}\|^2$ and the pointwise mean-square error $E(f(x)-\widehat{f}_{N(n)}(x))^2$ of the estimator $\widehat{f}_{N(n)}(x) = \sum_{k=0}^{N(n)} \widehat{c}_k e_k(x)$ for $f \in C[0,2\pi]$ and $\widehat{c}_0, \widehat{c}_1, \ldots, \widehat{c}_{N(n)}$ obtained by the least squares method are studied.
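The setting described in the abstract can be sketched numerically. The following is a minimal illustration (not the paper's code, and using the real cosine/sine form of the trigonometric system rather than whatever normalization the paper adopts): noisy observations are taken at equidistant points in $[0,2\pi)$, a trigonometric polynomial of order N is fitted by least squares, and the empirical analogue of the mean-square prediction error is computed. The test function and all parameter values below are illustrative choices, not from the paper.

```python
import numpy as np

def trig_design_matrix(x, N):
    """Columns 1, cos(x), sin(x), ..., cos(Nx), sin(Nx) at the points x."""
    cols = [np.ones_like(x)]
    for j in range(1, N + 1):
        cols.append(np.cos(j * x))
        cols.append(np.sin(j * x))
    return np.column_stack(cols)

def fit_trig_ls(x, y, N):
    """Least-squares coefficient estimates for an order-N trigonometric polynomial."""
    A = trig_design_matrix(x, N)
    c_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
    return c_hat

def eval_trig(c_hat, x, N):
    """Evaluate the fitted estimator f_hat_N at the points x."""
    return trig_design_matrix(x, N) @ c_hat

# Illustrative example: recover f(x) = sin(x) + 0.5*cos(2x) from noisy samples
# y_i = f(x_in) + eta_i with uncorrelated zero-mean noise, equidistant design.
rng = np.random.default_rng(0)
n = 200
x = 2 * np.pi * np.arange(n) / n            # equidistant points in [0, 2*pi)
f = np.sin(x) + 0.5 * np.cos(2 * x)
y = f + rng.normal(scale=0.3, size=n)

N = 4
c_hat = fit_trig_ls(x, y, N)
# Empirical version of the mean-square prediction error studied in the paper:
mse = np.mean((f - eval_trig(c_hat, x, N)) ** 2)
```

Because the design points are equidistant, the trigonometric basis functions are (discretely) orthogonal, so the least-squares coefficients here coincide with discrete Fourier coefficients of the data; for non-equidistant designs the normal equations would no longer decouple.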

How to cite


Popiński, Waldemar. "Least-squares trigonometric regression estimation." Applicationes Mathematicae 26.2 (1999): 121-131. <http://eudml.org/doc/219229>.

@article{Popiński1999,
abstract = {The problem of nonparametric function fitting using the complete orthogonal system of trigonometric functions $e_k$, k=0,1,2,..., for the observation model $y_i = f(x_\{in\}) + η_i$, i=1,...,n, is considered, where $η_i$ are uncorrelated random variables with zero mean value and finite variance, and the observation points $x_\{in\} ∈ [0,2π]$, i=1,...,n, are equidistant. Conditions for convergence of the mean-square prediction error $(1/n)\sum _\{i=1\}^n E(f(x_\{in\})-\widehat\{f\}_\{N(n)\}(x_\{in\}))^2$, the integrated mean-square error $E ‖f-\widehat\{f\}_\{N(n)\}‖^2$ and the pointwise mean-square error $E(f(x)-\widehat\{f\}_\{N(n)\}(x))^2$ of the estimator $\widehat\{f\}_\{N(n)\}(x) = \sum _\{k=0\}^\{N(n)\} \widehat\{c\}_k e_k(x)$ for f ∈ C[0,2π] and $\widehat\{c\}_0,\widehat\{c\}_1,...,\widehat\{c\}_\{N(n)\}$ obtained by the least squares method are studied.},
author = {Popiński, Waldemar},
journal = {Applicationes Mathematicae},
keywords = {consistent estimator; least squares method; Fourier coefficients; trigonometric polynomial; regression function; trigonometric polynomials; least squares; consistent estimates},
language = {eng},
number = {2},
pages = {121-131},
title = {Least-squares trigonometric regression estimation},
url = {http://eudml.org/doc/219229},
volume = {26},
year = {1999},
}

TY - JOUR
AU - Popiński, Waldemar
TI - Least-squares trigonometric regression estimation
JO - Applicationes Mathematicae
PY - 1999
VL - 26
IS - 2
SP - 121
EP - 131
AB - The problem of nonparametric function fitting using the complete orthogonal system of trigonometric functions $e_k$, k=0,1,2,..., for the observation model $y_i = f(x_{in}) + η_i$, i=1,...,n, is considered, where $η_i$ are uncorrelated random variables with zero mean value and finite variance, and the observation points $x_{in} ∈ [0,2π]$, i=1,...,n, are equidistant. Conditions for convergence of the mean-square prediction error $(1/n)\sum _{i=1}^n E(f(x_{in})-\widehat{f}_{N(n)}(x_{in}))^2$, the integrated mean-square error $E ‖f-\widehat{f}_{N(n)}‖^2$ and the pointwise mean-square error $E(f(x)-\widehat{f}_{N(n)}(x))^2$ of the estimator $\widehat{f}_{N(n)}(x) = \sum _{k=0}^{N(n)} \widehat{c}_k e_k(x)$ for f ∈ C[0,2π] and $\widehat{c}_0,\widehat{c}_1,...,\widehat{c}_{N(n)}$ obtained by the least squares method are studied.
LA - eng
KW - consistent estimator; least squares method; Fourier coefficients; trigonometric polynomial; regression function; trigonometric polynomials; least squares; consistent estimates
UR - http://eudml.org/doc/219229
ER -

References

  1. B. Droge, On finite-sample properties of adaptive least-squares regression estimates, Statistics 24 (1993), 181-203. Zbl0808.62035
  2. R. L. Eubank and P. Speckman, Convergence rates for trigonometric and polynomial-trigonometric regression estimators, Statist. Probab. Lett. 11 (1991), 119-124. Zbl0712.62037
  3. T. Gasser, L. Sroka and C. Jennen-Steinmetz, Residual variance and residual pattern in nonlinear regression, Biometrika 73 (1986), 625-633. Zbl0649.62035
  4. P. Hall, J. W. Kay and D. M. Titterington, Asymptotically optimal difference-based estimation of variance in nonparametric regression, Biometrika 77 (1990), 521-528.
  5. P. Hall and P. Patil, On wavelet methods for estimating smooth functions, J. Bernoulli Soc. 1 (1995), 41-58. Zbl0830.62037
  6. G. G. Lorentz, Approximation of Functions, Holt, Rinehart & Winston, New York, 1966. Zbl0153.38901
  7. C. L. Mallows, Some comments on $C_p$, Technometrics 15 (1973), 661-675.
  8. E. Nadaraya, Limit distribution of the integrated squared error of trigonometric series regression estimator, Proc. Georgian Acad. Sci. Math. 1 (1993), 221-237. Zbl0796.62039
  9. B. T. Polyak and A. B. Tsybakov, Asymptotic optimality of the $C_p$ criterion in projection type estimation of regression functions, Teor. Veroyatnost. Primenen. 35 (1990), 305-317 (in Russian).
  10. E. Rafajłowicz, Nonparametric least-squares estimation of a regression function, Statistics 19 (1988), 349-358. Zbl0649.62034
  11. A. Zygmund, Trigonometrical Series, Dover, 1955. Zbl0065.05604
