The linear model with variance-covariance components and jackknife estimation

Jaromír Kudeláš

Applications of Mathematics (1994)

  • Volume: 39, Issue: 2, pages 111-125
  • ISSN: 0862-7940

Abstract

Let $\theta^*$ be a biased estimate of the parameter $\vartheta$ based on all observations $x_1, \dots, x_n$ and let $\theta_{-i}^*$ ($i = 1, 2, \dots, n$) be the same estimate of the parameter $\vartheta$ obtained after deletion of the $i$-th observation. If the expectations of the estimators $\theta^*$ and $\theta_{-i}^*$ can be expressed as
\begin{align*}
\mathrm{E}(\theta^*) &= \vartheta + a(n)\,b(\vartheta), \\
\mathrm{E}(\theta_{-i}^*) &= \vartheta + a(n-1)\,b(\vartheta), \qquad i = 1, 2, \dots, n,
\end{align*}
where $a(n)$ is a known sequence of real numbers and $b(\vartheta)$ is a function of $\vartheta$, then this system of equations can be regarded as a linear model. The least squares method gives the generalized jackknife estimator. Using this method, it is possible to obtain an unbiased estimator of the parameter $\vartheta$.
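
To see the least-squares step concretely, here is a minimal ordinary-least-squares sketch of the two-point design described above; the paper itself works with a variance-covariance structure on the errors, which this sketch ignores, and the shorthand $\bar{\theta}_{-\cdot}^* = n^{-1}\sum_{i=1}^n \theta_{-i}^*$ for the mean of the delete-one estimates is introduced here for convenience. Stacking the $n+1$ expectation equations gives the linear model
\[
\begin{pmatrix} \theta^* \\ \theta_{-1}^* \\ \vdots \\ \theta_{-n}^* \end{pmatrix}
=
\begin{pmatrix} 1 & a(n) \\ 1 & a(n-1) \\ \vdots & \vdots \\ 1 & a(n-1) \end{pmatrix}
\begin{pmatrix} \vartheta \\ b(\vartheta) \end{pmatrix}
+ \varepsilon,
\]
and solving the normal equations for $\vartheta$ yields
\[
\hat{\vartheta} = \frac{a(n-1)\,\theta^* - a(n)\,\bar{\theta}_{-\cdot}^*}{a(n-1) - a(n)},
\]
which is unbiased because $\mathrm{E}(\hat{\vartheta}) = \bigl[a(n-1)(\vartheta + a(n)b(\vartheta)) - a(n)(\vartheta + a(n-1)b(\vartheta))\bigr] / \bigl[a(n-1) - a(n)\bigr] = \vartheta$. For $a(n) = 1/n$ this reduces to Quenouille's classical jackknife $\hat{\vartheta} = n\theta^* - (n-1)\bar{\theta}_{-\cdot}^*$.

This identity can be checked numerically on the maximum-likelihood variance estimator, whose bias has exactly this form ($a(n) = 1/n$ and $b(\sigma^2) = -\sigma^2$, since $\mathrm{E}(\hat{\sigma}^2_{\mathrm{MLE}}) = \sigma^2 - \sigma^2/n$). The Python sketch below, whose function names are illustrative rather than taken from the paper, recovers the unbiased sample variance exactly:

import numpy as np

def generalized_jackknife(theta_full, theta_minus, a_n, a_n1):
    # OLS solution of the two-point model above:
    # (a(n-1)*theta* - a(n)*mean(theta*_{-i})) / (a(n-1) - a(n))
    return (a_n1 * theta_full - a_n * np.mean(theta_minus)) / (a_n1 - a_n)

def mle_var(y):
    # Biased MLE of the variance: E = sigma^2 - sigma^2/n,
    # i.e. a(n) = 1/n and b(sigma^2) = -sigma^2.
    return np.mean((y - np.mean(y)) ** 2)

rng = np.random.default_rng(0)
x = rng.normal(size=50)
n = len(x)

theta_full = mle_var(x)
theta_minus = [mle_var(np.delete(x, i)) for i in range(n)]

est = generalized_jackknife(theta_full, theta_minus, 1 / n, 1 / (n - 1))
print(est, np.var(x, ddof=1))  # the two values agree: jackknifing the MLE gives s^2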

How to cite

Kudeláš, Jaromír. "The linear model with variance-covariance components and jackknife estimation." Applications of Mathematics 39.2 (1994): 111-125. <http://eudml.org/doc/32874>.

@article{Kudeláš1994,
abstract = {Let $\theta^*$ be a biased estimate of the parameter $\vartheta$ based on all observations $x_1, \dots, x_n$ and let $\theta_{-i}^*$ ($i=1,2,\dots,n$) be the same estimate of the parameter $\vartheta$ obtained after deletion of the $i$-th observation. If the expectations of the estimators $\theta^*$ and $\theta_{-i}^*$ can be expressed as \begin{align*} \mathrm{E}(\theta^*)&=\vartheta +a(n)b(\vartheta) \\ \mathrm{E}(\theta_{-i}^*)&=\vartheta +a(n-1)b(\vartheta)\qquad i=1,2,\dots,n, \end{align*} where $a(n)$ is a known sequence of real numbers and $b(\vartheta)$ is a function of $\vartheta$, then this system of equations can be regarded as a linear model. The least squares method gives the generalized jackknife estimator. Using this method, it is possible to obtain an unbiased estimator of the parameter $\vartheta$.},
author = {Kudeláš, Jaromír},
journal = {Applications of Mathematics},
keywords = {Jackknife estimator; generalized jackknife estimator; least squares estimator; least squares method; linear model; estimator of variance-covariance components; consistency; deletion of observations; Gauss-Markov estimator; biased estimate; unbiased estimator},
language = {eng},
number = {2},
pages = {111-125},
publisher = {Institute of Mathematics, Academy of Sciences of the Czech Republic},
title = {The linear model with variance-covariance components and jackknife estimation},
url = {http://eudml.org/doc/32874},
volume = {39},
year = {1994},
}

TY - JOUR
AU - Kudeláš, Jaromír
TI - The linear model with variance-covariance components and jackknife estimation
JO - Applications of Mathematics
PY - 1994
PB - Institute of Mathematics, Academy of Sciences of the Czech Republic
VL - 39
IS - 2
SP - 111
EP - 125
AB - Let $\theta^*$ be a biased estimate of the parameter $\vartheta$ based on all observations $x_1, \dots, x_n$ and let $\theta_{-i}^*$ ($i=1,2,\dots,n$) be the same estimate of the parameter $\vartheta$ obtained after deletion of the $i$-th observation. If the expectations of the estimators $\theta^*$ and $\theta_{-i}^*$ can be expressed as \begin{align*} \mathrm{E}(\theta^*)&=\vartheta +a(n)b(\vartheta) \\ \mathrm{E}(\theta_{-i}^*)&=\vartheta +a(n-1)b(\vartheta)\qquad i=1,2,\dots,n, \end{align*} where $a(n)$ is a known sequence of real numbers and $b(\vartheta)$ is a function of $\vartheta$, then this system of equations can be regarded as a linear model. The least squares method gives the generalized jackknife estimator. Using this method, it is possible to obtain an unbiased estimator of the parameter $\vartheta$.
LA - eng
KW - Jackknife estimator; generalized jackknife estimator; least squares estimator; least squares method; linear model; estimator of variance-covariance components; consistency; deletion of observations; Gauss-Markov estimator; biased estimate; unbiased estimator
UR - http://eudml.org/doc/32874
ER -

References

  1. Ann. Math. Statist. 39 (1968), 70–75. MR0222998. DOI: 10.1214/aoms/1177698505.
  2. The jackknife method and the Gauss-Markov estimation, Probability and Math. Statistics 8 (1987), 111–116. MR0928124.
  3. Ann. Statist. 2 (1974), 880–891. Zbl 0289.62042. MR0356353. DOI: 10.1214/aos/1176342811.
  4. Approximate tests of correlation in time-series, J. Roy. Statist. Soc. Ser. B 11 (1949), 68–84. MR0032176.
  5. Biometrika 43 (1956), 353–360. Zbl 0074.14003. MR0081040. DOI: 10.1093/biomet/43.3-4.353.
  6. Linear statistical inference and its applications, J. Wiley, 1973. Zbl 0256.62002. MR0346957.
  7. Consistency of linear and quadratic least squares estimators in regression models with covariance stationary errors, Applications of Mathematics 36(2) (1991), 149–155. MR1097699.
  8. Ann. Math. Statist. 28 (1957), 43–56. MR0084974. DOI: 10.1214/aoms/1177707036.
