Two estimates of the regression coefficient in a bivariate normal distribution are considered: the usual one based on a paired sample, and a new one that makes use of additional observations of one of the variables. The two are compared with respect to variance, and the same comparison is made for the corresponding regression lines. The conclusion is that the additional observations are worth using only when the sample is very small.
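A rough Monte Carlo sketch of the kind of comparison described (the estimator that uses the extra observations below is only one plausible candidate of our own choosing, not necessarily the one studied in the paper; the sample sizes and correlation are arbitrary):

```python
import numpy as np

# Hypothetical illustration: compare, by simulation, the variance of the usual
# sample regression coefficient with one plausible estimator that re-uses m
# extra observations of x to estimate the x-variance.
rng = np.random.default_rng(0)
rho, n, m, reps = 0.6, 10, 40, 20_000
b_usual, b_extra = [], []
for _ in range(reps):
    # paired sample from a standard bivariate normal with correlation rho
    xy = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
    x, y = xy[:, 0], xy[:, 1]
    x_extra = rng.standard_normal(m)            # additional observations of x only
    s_xy = np.cov(x, y)[0, 1]
    b_usual.append(s_xy / np.var(x, ddof=1))
    # pooled x-variance from all n + m observations of x
    s_xx_pooled = np.var(np.concatenate([x, x_extra]), ddof=1)
    b_extra.append(s_xy / s_xx_pooled)
print("var(usual) =", np.var(b_usual), " var(with extra x) =", np.var(b_extra))
```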
Matrix mathematics provides a powerful tool set for addressing statistical problems. In particular, the theory of matrix ranks and inertias has been developed as an effective methodology for simplifying complicated matrix expressions and for establishing equalities and inequalities that occur in statistical analysis. This paper describes how to establish exact formulas for calculating the ranks and inertias of covariances of predictors and estimators of parameter spaces in general linear models (GLMs),...
Least-Squares Solution (LSS) of a linear matrix equation and Ordinary Least-Squares Estimator (OLSE) of unknown parameters in a general linear model are two standard algebraic methods in computational mathematics and regression analysis. Assume that a symmetric quadratic matrix-valued function Φ(Z) = Q − ZPZ′ is given, where Z is taken as an LSS of the linear matrix equation AZ = B. In this paper, we establish a group of formulas for calculating the maximum and minimum ranks and inertias of Φ(Z)...
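A minimal numerical sketch of the objects involved, with toy data of our own choosing (the paper derives closed-form extremal ranks and inertias; here we merely evaluate the inertia of Φ(Z) at one least-squares solution):

```python
import numpy as np

# Take a least-squares solution Z of AZ = B and report the inertia (numbers of
# positive, negative and zero eigenvalues) of Phi(Z) = Q - Z P Z'.
rng = np.random.default_rng(1)
A = rng.standard_normal((6, 4))
B = rng.standard_normal((6, 3))
Z, *_ = np.linalg.lstsq(A, B, rcond=None)       # an LSS of AZ = B, shape (4, 3)

M = rng.standard_normal((3, 3))
P = M @ M.T                                     # symmetric positive semidefinite
Q = np.diag([2.0, 1.0, 0.0, -1.0])              # symmetric, indefinite

Phi = Q - Z @ P @ Z.T
eig = np.linalg.eigvalsh(Phi)                   # eigenvalues of the symmetric Phi
tol = 1e-10
inertia = (int(np.sum(eig > tol)), int(np.sum(eig < -tol)), int(np.sum(np.abs(eig) <= tol)))
print("inertia (i+, i-, i0) of Phi(Z):", inertia)
```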
In many cases the regression parameters can be considered as realizations of a random variable. In these situations the minimum mean square error estimator appears useful and important. The explicit form of this estimator is given for the case in which both the covariance matrix of the random parameters and that of the error vector are singular.
The Minimum Norm Quadratic Unbiased Invariant Estimator of an estimable linear function of the unknown variance-covariance component parameter θ in the linear model with given linear restrictions of the type Rθ = c is derived for two special structures: the replicated model and the growth-curve model.
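For orientation, a generic form of this setup in our own notation (the specific replicated and growth-curve structures are detailed in the paper):

\[
y = X\beta + \varepsilon, \qquad
\operatorname{cov}(\varepsilon) = \Sigma(\theta) = \sum_{i=1}^{p} \theta_i V_i, \qquad
R\theta = c,
\]

where the target of estimation is a linear function \(g'\theta\) that is estimable under the restriction.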
A topic that has attracted the interest of researchers in longitudinal data over the last two decades has been the development and use of explicit parametric models for the covariance structure of the data. However, the analysis of non-stationary covariance structures in the longitudinal-data setting has not been carried out in detail, mainly because the various applications did not require it. Many models have been proposed recently,...
Necessary and sufficient conditions are given under which the best linear unbiased estimator (BLUE) based on a subvector of the observations is identical with the BLUE based on the full random vector in a general regression model with a vector of unknown parameters; the design matrix, which has a special so-called multistage structure, and the covariance matrix are given.
In multivariate linear statistical models with a normally distributed observation matrix, the structure of the covariance matrix plays an important role when confidence regions must be determined. In the paper it is assumed that the covariance matrix is a linear combination of known symmetric positive semidefinite matrices with unknown coefficients (variance components) which are unbiasedly estimable. Insensitivity regions are then found for them, which enables us to decide whether the plug-in approach can...
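A minimal sketch of the assumed covariance structure and of the plug-in idea, with illustrative matrices and values of our own (not taken from the paper):

```python
import numpy as np

# Sigma(theta) = theta_1 * V_1 + ... + theta_p * V_p with known symmetric PSD V_i;
# a plug-in covariance is obtained by inserting estimated variance components.
V1 = np.eye(4)                                  # e.g. a measurement-error part
V2 = np.ones((4, 4))                            # e.g. a common random-effect part
theta_true = np.array([1.0, 0.5])
theta_hat = np.array([1.1, 0.4])                # some estimate (illustrative values)

def sigma(theta, Vs=(V1, V2)):
    """Covariance matrix as a linear combination of known PSD matrices."""
    return sum(t * V for t, V in zip(theta, Vs))

Sigma_true = sigma(theta_true)
Sigma_plugin = sigma(theta_hat)                 # the "plug-in" covariance
print(np.linalg.norm(Sigma_true - Sigma_plugin))
```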
Multivariate models frequently used in many branches of science admit a relatively large number of different structures. Sometimes the regularity conditions which enable us to solve statistical problems are not satisfied, and it is reasonable to recognize this in advance. In the paper only the model without constraints on the parameters is analyzed, since the class of such problems in general is too large for the scope of the paper.