Displaying similar documents to “A Gram-Schmidt orthogonalizing process of design matrices in linear models as an estimating procedure of covariance components.”

On some alternative forms equivalent to Kruskal's condition for OLSE to be BLUE.

Gabriela Beganu (2007)

RACSAM

Similarity:

The necessary and sufficient condition for the ordinary least squares estimators (OLSE) to be the best linear unbiased estimators (BLUE) of the expected mean in the general univariate linear regression model was given by Kruskal (1968) using a coordinate-free approach. The purpose of this article is to present some alternative forms of this condition in the same manner and to prove two of Haberman's equivalent conditions in a different and simpler way. The results obtained in the...
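For orientation, Kruskal's coordinate-free condition has a familiar matrix restatement (our restatement, not quoted from the abstract): in the model y = Xβ + e with Cov(e) = V, the OLSE of Xβ is BLUE exactly when the column space of X is invariant under V.

```latex
% Standard matrix form of Kruskal's (1968) condition (restated here, not
% quoted from the abstract): in the linear model
%   y = X\beta + e, \qquad \operatorname{Cov}(e) = V,
% the OLSE of X\beta coincides with the BLUE if and only if
\[
  \mathcal{C}(VX) \subseteq \mathcal{C}(X),
\]
% i.e. the column space of X is an invariant subspace of the covariance
% matrix V. Zyskind's form \Omega X = XQ (see the next abstract) expresses
% the same inclusion.
```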

On the equality of the ordinary least squares estimators and the best linear unbiased estimators in multivariate growth-curve models.

Gabriela Beganu (2007)

RACSAM

Similarity:

It is well known that several necessary and sufficient conditions have been proved for the ordinary least squares estimators (OLSE) to be the best linear unbiased estimators (BLUE) of the fixed effects in general linear models. The purpose of this article is to verify one of these conditions, given by Zyskind [39, 40]: there exists a matrix Q such that ΩX = XQ, where X and Ω are the design matrix and the covariance matrix, respectively. The accessibility of this condition will be shown...
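Since the abstract states Zyskind's condition explicitly, here is a minimal numerical sketch of how it can be checked for given X and Ω (assuming NumPy; the example matrices and tolerance are illustrative, not taken from the paper). ΩX = XQ is solvable for Q exactly when every column of ΩX lies in the column space of X, which the orthogonal projector X X⁺ detects:

```python
import numpy as np

def satisfies_zyskind(X, Omega, tol=1e-10):
    """Check whether Omega @ X = X @ Q has a solution Q, i.e. whether the
    columns of Omega @ X lie in the column space of X (Zyskind's condition
    for OLSE = BLUE)."""
    # Orthogonal projector onto the column space of X; pinv also handles
    # rank-deficient design matrices.
    P = X @ np.linalg.pinv(X)
    OX = Omega @ X
    # Omega X is unchanged by projection onto C(X) iff the condition holds.
    return np.allclose(P @ OX, OX, atol=tol)

# Illustrative example (not from the paper): an intercept plus one regressor,
# with an equicorrelated covariance matrix Omega.
n = 5
X = np.column_stack([np.ones(n), np.arange(n, dtype=float)])
Omega = 0.5 * np.eye(n) + 0.5 * np.ones((n, n))
print(satisfies_zyskind(X, Omega))  # True: the intercept absorbs the common correlation
```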

On estimation of parameters in the bivariate linear errors-in-variables model

Anna Czapkiewicz (1999)

Applicationes Mathematicae

Similarity:

We discuss some methods of estimation in bivariate errors-in-variables linear models. We also suggest a method of constructing consistent estimators when the error disturbances are normally distributed with unknown parameters. It is based on the theory of estimating variance components in linear models. A simulation study is presented which compares this estimator with the maximum likelihood one.

On a class of estimators in a multivariate RCA(1) model

Zuzana Prášková, Pavel Vaněček (2011)

Kybernetika

Similarity:

This work deals with a multivariate random coefficient autoregressive model (RCA) of the first order. A class of modified least-squares estimators of the model parameters, originally proposed by Schick for univariate first-order RCA models, is studied under more general conditions. The asymptotic behavior of such estimators is explored, and a lower bound for the asymptotic variance matrix of the estimator of the mean of the random coefficient is established. Finite sample properties are...
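The abstract does not spell out the model, but a standard univariate first-order RCA specification, simulated here purely for illustration (assuming NumPy; this is plain least squares, not Schick's modified estimator), is y_t = (b + β_t) y_{t−1} + ε_t with i.i.d. zero-mean random coefficients β_t:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative univariate RCA(1): y_t = (b + beta_t) * y_{t-1} + eps_t,
# where beta_t ~ N(0, sigma_b^2) is the random coefficient and
# eps_t ~ N(0, sigma_e^2) is the noise. Parameter values are made up.
b, sigma_b, sigma_e, T = 0.4, 0.2, 1.0, 5000
y = np.zeros(T)
for t in range(1, T):
    y[t] = (b + sigma_b * rng.standard_normal()) * y[t - 1] \
           + sigma_e * rng.standard_normal()

# Ordinary least-squares estimate of the mean coefficient b, shown only to
# illustrate the setting; Schick's modified estimator reweights the terms
# to gain efficiency.
b_ols = np.sum(y[1:] * y[:-1]) / np.sum(y[:-1] ** 2)
print(b_ols)  # close to 0.4 for a long stationary series
```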

Modified minimax quadratic estimation of variance components

Viktor Witkovský (1998)

Kybernetika

Similarity:

The paper deals with modified minimax quadratic estimation of variance and covariance components under full ellipsoidal restrictions. Based on the so-called linear approach to estimating variance components, i.e., considering a useful local transformation of the original model, we can directly adopt the results from the linear theory. Under the normality assumption we can derive the explicit form of the estimator, which is formally found to be of the Kuks–Olman type.