
Matrix rank/inertia formulas for least-squares solutions with statistical applications

Yongge Tian, Bo Jiang (2016)

Special Matrices

Least-Squares Solution (LSS) of a linear matrix equation and Ordinary Least-Squares Estimator (OLSE) of unknown parameters in a general linear model are two standard algebraic methods in computational mathematics and regression analysis. Assume that a symmetric quadratic matrix-valued function Φ(Z) = Q − ZPZ′ is given, where Z is taken as the LSS of the linear matrix equation AZ = B. In this paper, we establish a group of formulas for calculating the maximum and minimum ranks and inertias of Φ(Z)...
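As a hedged illustration of the objects in this abstract (not the paper's rank/inertia formulas themselves), the sketch below computes a least-squares solution Z of AZ = B via the Moore–Penrose pseudoinverse and then the inertia (counts of positive, negative and zero eigenvalues) of Φ(Z) = Q − ZPZ′ for example symmetric Q and P; all matrix names and sizes are illustrative assumptions.

```python
# Minimal sketch (not from the paper): evaluate the inertia of
# Phi(Z) = Q - Z P Z' at a least-squares solution Z of A Z = B.
import numpy as np

rng = np.random.default_rng(0)

m, n, k = 5, 3, 4
A = rng.standard_normal((m, n))
B = rng.standard_normal((m, k))

# Minimum-norm least-squares solution of A Z = B via the pseudoinverse.
Z = np.linalg.pinv(A) @ B                      # shape (n, k)

# Symmetric Q and P, so that Phi(Z) = Q - Z P Z' is symmetric.
Q = rng.standard_normal((n, n)); Q = (Q + Q.T) / 2
P = rng.standard_normal((k, k)); P = (P + P.T) / 2

Phi = Q - Z @ P @ Z.T

# Inertia = (number of positive, negative, zero eigenvalues).
eig = np.linalg.eigvalsh(Phi)
tol = 1e-10 * max(1.0, np.abs(eig).max())
inertia = (int((eig > tol).sum()), int((eig < -tol).sum()), int((np.abs(eig) <= tol).sum()))
print("rank =", inertia[0] + inertia[1], "inertia =", inertia)
```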

Modified minimax quadratic estimation of variance components

Viktor Witkovský (1998)

Kybernetika

The paper deals with modified minimax quadratic estimation of variance and covariance components under full ellipsoidal restrictions. Based on the so-called linear approach to estimation of variance components, i.e., considering a useful local transformation of the original model, we can directly adopt the results from the linear theory. Under the normality assumption we derive the explicit form of the estimator, which is formally found to be of the Kuks–Olman type.
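The sketch below is a hedged illustration of the "linear approach" mentioned in the abstract, not the paper's modified minimax Kuks–Olman-type estimator: after projecting out the mean, the expectation of the residual outer product is linear in the variance components, so they can be estimated by an ordinary least-squares fit on that transformed model. All names and the example design are assumptions.

```python
# Minimal sketch of the linear approach to variance components (illustrative
# plain quadratic estimator; not the modified minimax estimator of the paper).
import numpy as np

rng = np.random.default_rng(1)
n = 60
X = np.column_stack([np.ones(n), rng.standard_normal(n)])   # fixed-effects design
V1 = np.eye(n)                                              # error component
g = rng.integers(0, 6, size=n)                              # grouping factor
V2 = (g[:, None] == g[None, :]).astype(float)               # group component

theta_true = np.array([1.0, 2.0])                           # (sigma1^2, sigma2^2)
Sigma = theta_true[0] * V1 + theta_true[1] * V2
y = X @ np.array([1.0, -0.5]) + rng.multivariate_normal(np.zeros(n), Sigma)

# Project out the mean: r = M y has covariance M (th1*V1 + th2*V2) M.
M = np.eye(n) - X @ np.linalg.pinv(X)
r = M @ y

# E[vec(r r')] = th1 * vec(M V1 M) + th2 * vec(M V2 M): a linear model in the
# variance components, solved here by ordinary least squares.
D = np.column_stack([(M @ V1 @ M).ravel(), (M @ V2 @ M).ravel()])
theta_hat, *_ = np.linalg.lstsq(D, np.outer(r, r).ravel(), rcond=None)
print("estimated variance components:", theta_hat)
```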

Multivariate models with constraints – confidence regions

Lubomír Kubáček (2008)

Acta Universitatis Palackianae Olomucensis. Facultas Rerum Naturalium. Mathematica

In multivariate linear statistical models with a normally distributed observation matrix, the structure of the covariance matrix plays an important role when confidence regions must be determined. In the paper it is assumed that the covariance matrix is a linear combination of known symmetric positive semidefinite matrices with unknown coefficients (variance components) which are unbiasedly estimable. Then insensitivity regions are found for them, which enable us to decide whether the plug-in approach can...
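To make the setting concrete, the sketch below (assumed names and toy data, not the paper's insensitivity regions) uses a covariance structure Σ(θ) = θ₁V₁ + θ₂V₂ with known symmetric positive semidefinite V₁, V₂ and builds the standard plug-in confidence ellipsoid for the mean parameters, with an assumed estimate of the variance components inserted in place of the true values.

```python
# Minimal sketch (illustrative only): plug-in confidence ellipsoid for the mean
# parameters beta when Cov(y) = theta1*V1 + theta2*V2 with estimated components.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(2)
n, k = 40, 2
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
V1, V2 = np.eye(n), np.diag(rng.uniform(0.5, 2.0, size=n))  # known PSD matrices

theta_hat = np.array([1.2, 0.8])          # assumed plug-in variance component estimates
Sigma_hat = theta_hat[0] * V1 + theta_hat[1] * V2
W = np.linalg.inv(Sigma_hat)

y = X @ np.array([2.0, -1.0]) + rng.multivariate_normal(np.zeros(n), Sigma_hat)

# Generalized least-squares estimator of beta with the plug-in covariance.
F = X.T @ W @ X                            # information matrix
beta_hat = np.linalg.solve(F, X.T @ W @ y)

# Approximate (1 - alpha) confidence ellipsoid:
# { beta : (beta - beta_hat)' F (beta - beta_hat) <= chi2_{k, 1-alpha} }.
alpha = 0.05
radius2 = chi2.ppf(1 - alpha, df=k)
print("beta_hat =", beta_hat, " ellipsoid radius^2 =", radius2)
```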
