The Bayes estimator of the variance components and its admissibility
Let $\hat{\vartheta}_n$ be a biased estimate of the parameter $\vartheta$ based on all observations $x_1, \dots, x_n$, and let $\hat{\vartheta}_{n-1}^{(i)}$ ($i = 1, \dots, n$) be the same estimate of the parameter obtained after deletion of the $i$-th observation. If the expectations of the estimators $\hat{\vartheta}_n$ and $\hat{\vartheta}_{n-1}^{(i)}$ are expressed as $E\,\hat{\vartheta}_m = \vartheta + \sum_k a_k(m)\, b_k(\vartheta)$, where $\{a_k(m)\}$ is a known sequence of real numbers and $b_k(\vartheta)$ is a function of $\vartheta$, then this system of equations can be regarded as a linear model. Applying the least squares method to this model gives the generalized jackknife estimator. Using this method, it is possible to obtain the unbiased...
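The construction above can be illustrated in its simplest special case, the delete-one (Quenouille) jackknife. The sketch below applies it to the biased maximum-likelihood variance estimator, for which the jackknife happens to remove the bias exactly; the estimator names and data are illustrative, not from the paper, and the general least-squares formulation of the abstract is not reproduced here.

```python
# Delete-one (Quenouille) jackknife bias correction, sketched for the
# biased variance estimator theta_n = (1/n) * sum((x - mean)^2).
# Illustrative only: the abstract treats the *generalized* jackknife via
# a linear-model / least-squares formulation; this is the simplest case.

def biased_var(xs):
    """MLE variance estimate: divides by n, so E[theta] = (n-1)/n * sigma^2."""
    n = len(xs)
    m = sum(xs) / n
    return sum((x - m) ** 2 for x in xs) / n

def jackknife(estimator, xs):
    """Bias-corrected jackknife estimate: n*theta - (n-1)*mean(theta_(i))."""
    n = len(xs)
    theta = estimator(xs)
    # theta_(i): the same estimator applied after deleting the i-th observation
    leave_one_out = [estimator(xs[:i] + xs[i + 1:]) for i in range(n)]
    return n * theta - (n - 1) * sum(leave_one_out) / n

data = [2.1, 3.4, 1.7, 4.0, 2.8, 3.3]
corrected = jackknife(biased_var, data)

# For the variance the jackknife removes the bias exactly: the corrected
# value equals the unbiased estimator with divisor n - 1.
n = len(data)
unbiased = biased_var(data) * n / (n - 1)
print(corrected, unbiased)
```

Because the bias of the MLE variance is exactly $-\sigma^2/n$, a term linear in $1/n$, the single-step jackknife eliminates it completely; for estimators with higher-order bias terms the generalized (least-squares) version of the abstract is needed.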
We present a two-dimensional linear regression model in which both variables are subject to error. We discuss a model where one variable of each pair of observables is observed repeatedly. We suggest two methods for constructing consistent estimators: the maximum likelihood method and a method based on the theory of variance components. We study the asymptotic properties of these estimators and prove that the asymptotic variances of the regression-slope estimators obtained by the two methods are comparable.
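A minimal sketch of why the repeated observation matters, assuming the classical errors-in-variables model $x_{ij} = \xi_i + \delta_{ij}$, $y_i = \alpha + \beta \xi_i + \varepsilon_i$ (this model form, and the moment-based correction below, are illustrative assumptions, not the paper's exact variance-components estimator). The replicate pair lets us estimate the measurement-error variance and undo the attenuation of the naive least-squares slope.

```python
# Sketch: consistent slope estimation when the regressor is measured with
# error and is observed twice per unit. Assumed model (not from the paper):
#   x_ij = xi_i + delta_ij  (j = 1, 2),   y_i = a + b * xi_i + eps_i.
# The replicate differences estimate Var(delta), which corrects the
# attenuated least-squares slope (a moment-based illustration).
import random

random.seed(1)
n, a, b = 2000, 1.0, 2.5
xi = [random.gauss(0.0, 1.0) for _ in range(n)]       # true regressor
x1 = [v + random.gauss(0.0, 0.5) for v in xi]         # first measurement
x2 = [v + random.gauss(0.0, 0.5) for v in xi]         # repeated measurement
y = [a + b * v + random.gauss(0.0, 0.3) for v in xi]  # response

def mean(v):
    return sum(v) / len(v)

def cov(u, v):
    mu, mv = mean(u), mean(v)
    return sum((p - mu) * (q - mv) for p, q in zip(u, v)) / len(u)

xbar = [(p + q) / 2 for p, q in zip(x1, x2)]          # averaged measurement
# Var(delta) estimated from replicate differences: E[(x1 - x2)^2] = 2 Var(delta);
# the error variance of the average is Var(delta) / 2.
s_delta2 = mean([(p - q) ** 2 / 2 for p, q in zip(x1, x2)]) / 2
naive = cov(xbar, y) / cov(xbar, xbar)                # attenuated toward 0
corrected = cov(xbar, y) / (cov(xbar, xbar) - s_delta2)
print(naive, corrected)
```

Running this shows the naive slope pulled below the true value $b$ by attenuation, while the corrected slope is close to $b$; the abstract's two estimators (maximum likelihood and variance components) achieve consistency by the same underlying idea of separating measurement-error variance from the variance of the true regressor.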