Displaying 1 – 8 of 8

The linear model with variance-covariance components and jackknife estimation

Jaromír Kudeláš (1994)

Applications of Mathematics

Let θ* be a biased estimator of the parameter ϑ based on all observations x₁, …, xₙ, and let θ*₋ᵢ (i = 1, 2, …, n) be the same estimator obtained after deleting the i-th observation. If the expectations of the estimators θ* and θ*₋ᵢ can be written as

E(θ*) = ϑ + a(n) b(ϑ),
E(θ*₋ᵢ) = ϑ + a(n − 1) b(ϑ),  i = 1, 2, …, n,

where a(n) is a known sequence of real numbers and b(ϑ) is a function of ϑ, then this system of equations can be regarded as a linear model. The least squares method then yields the generalized jackknife estimator. Using this method, it is possible to obtain the unbiased...
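As a concrete illustration (not taken from the paper), in the classical case a(n) = 1/n the generalized jackknife reduces to the familiar bias correction θ_J = n·θ* − (n − 1)·mean(θ*₋ᵢ). The sketch below applies it to the divide-by-n variance estimator, whose bias is exactly of this form with b(ϑ) = −σ², so the jackknife removes it completely:

```python
import numpy as np

def jackknife(estimator, x):
    """Jackknife bias correction for the case a(n) = 1/n:
    theta_J = n * theta - (n - 1) * mean of leave-one-out estimates."""
    n = len(x)
    theta = estimator(x)
    loo = np.array([estimator(np.delete(x, i)) for i in range(n)])
    return n * theta - (n - 1) * loo.mean()

rng = np.random.default_rng(0)
x = rng.normal(size=50)

# Plug-in variance (divides by n); its bias is exactly -sigma^2 / n.
biased_var = lambda v: np.var(v)
theta_J = jackknife(biased_var, x)
```

For this particular estimator the jackknifed value coincides (up to rounding) with the usual unbiased sample variance that divides by n − 1.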

The two-dimensional linear relation in the errors-in-variables model with replication of one variable

Anna Czapkiewicz, Antoni Dawidowicz (2000)

Applicationes Mathematicae

We present a two-dimensional linear regression model in which both variables are subject to error. We discuss a model where one variable of each pair of observables is measured repeatedly. We suggest two methods for constructing consistent estimators: the maximum likelihood method and a method based on variance components theory. We study the asymptotic properties of these estimators and prove that the asymptotic variances of the slope estimators obtained by the two methods are comparable.
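To see why replication helps here (a sketch under assumed notation, not the authors' exact ML estimator): if the x-variable is measured twice per unit, the measurement-error variance can be estimated from within-pair replicate differences, and the slope can then be corrected for the attenuation that a naive regression on the error-contaminated x would suffer. A moment-based version of this idea:

```python
import numpy as np

# Simulated errors-in-variables data: eta = alpha + beta * xi, with both
# variables observed with error and two replicate measurements of x per unit.
rng = np.random.default_rng(1)
m, alpha, beta = 5000, 1.0, 2.0
xi = rng.normal(0.0, 2.0, size=m)            # latent true x-values
x = xi[:, None] + rng.normal(0.0, 0.5, size=(m, 2))   # two noisy replicates of x
y = alpha + beta * xi + rng.normal(0.0, 0.5, size=m)  # noisy y

# Estimate the x measurement-error variance from replicate differences:
# Var(x1 - x2) = 2 * sigma_eps^2.
sigma2_eps = np.mean((x[:, 0] - x[:, 1]) ** 2) / 2.0

# The replicate mean carries error variance sigma_eps^2 / 2; subtract it
# from the sample variance before forming the slope (moment correction).
xbar = x.mean(axis=1)
Sxx = np.var(xbar, ddof=1)
Sxy = np.cov(xbar, y, ddof=1)[0, 1]
beta_hat = Sxy / (Sxx - sigma2_eps / 2.0)    # consistent for beta
beta_naive = Sxy / Sxx                       # attenuated toward zero
```

The naive slope Sxy/Sxx is biased toward zero; the corrected estimator is consistent because the replicates identify the error variance that inflates Sxx.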
