Displaying similar documents to “Matrix rank/inertia formulas for least-squares solutions with statistical applications”

Matrix rank and inertia formulas in the analysis of general linear models

Yongge Tian (2017)

Open Mathematics

Similarity:

Matrix mathematics provides a powerful tool set for addressing statistical problems. In particular, the theory of matrix ranks and inertias has been developed as an effective methodology for simplifying complicated matrix expressions and for establishing equalities and inequalities that occur in statistical analysis. This paper describes how to establish exact formulas for calculating ranks and inertias of covariances of predictors and estimators of parameter spaces in general linear...
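As a quick, hypothetical illustration of the inertia notion used above (not taken from the paper): the inertia of a symmetric matrix is the triple (i+, i−, i0) counting its positive, negative, and zero eigenvalues, which can be computed directly.

```python
import numpy as np

# Hypothetical sketch: the inertia of a symmetric matrix S is the triple
# (i+, i-, i0) of positive, negative, and zero eigenvalues (up to a tolerance).
def inertia(S, tol=1e-10):
    w = np.linalg.eigvalsh(S)  # eigenvalues of a symmetric matrix
    return (int(np.sum(w > tol)),
            int(np.sum(w < -tol)),
            int(np.sum(np.abs(w) <= tol)))

S = np.diag([3.0, -1.0, 0.0])
print(inertia(S))  # (1, 1, 1)
```

Rank follows as i+ + i−, which is why rank and inertia formulas are often developed together.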

All about the ⊥ with its applications in the linear statistical models

Augustyn Markiewicz, Simo Puntanen (2015)

Open Mathematics

Similarity:

For an n × m real matrix A, the matrix A⊥ is defined as a matrix spanning the orthocomplement of the column space of A, when orthogonality is defined with respect to the standard inner product ⟨x, y⟩ = x'y. In this paper we collect together various properties of the ⊥ operation and its applications in linear statistical models. Results covering more general inner products are also considered. We also provide a rather extensive list of references.
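A minimal numerical sketch of the ⊥ operation (an illustration, not code from the paper): the orthocomplement of col(A) is the null space of A', so an orthonormal basis for it can be read off the SVD of A.

```python
import numpy as np

# Hypothetical sketch: build A_perp, a matrix whose columns span the
# orthocomplement of the column space of A (standard inner product x'y).
def perp(A):
    # Orthocomplement of col(A) = null space of A'; the trailing left
    # singular vectors beyond the numerical rank form an orthonormal basis.
    U, s, Vt = np.linalg.svd(A, full_matrices=True)
    r = int(np.sum(s > 1e-12))  # numerical rank of A
    return U[:, r:]

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [0.0, 1.0]])  # 3 x 2, rank 2, so A_perp has one column
A_perp = perp(A)
print(np.allclose(A.T @ A_perp, 0))  # True: A' A_perp = 0
```

The defining property A'A⊥ = 0 is exactly what the final check verifies.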

Testing hypotheses in universal models

Eva Fišerová (2006)

Discussiones Mathematicae Probability and Statistics

Similarity:

A linear regression model in which the design matrix does not have full column rank and the covariance matrix is singular is considered. The problem of testing hypotheses on mean value parameters is studied. Conditions under which a hypothesis can be tested, or need not be tested, are given. Explicit forms of test statistics based on residual sums of squares are presented.

On the equality of the ordinary least squares estimators and the best linear unbiased estimators in multivariate growth-curve models.

Gabriela Beganu (2007)

RACSAM

Similarity:

It is well known that several necessary and sufficient conditions have been proved for the ordinary least squares estimator (OLSE) to be the best linear unbiased estimator (BLUE) of the fixed effects in general linear models. The purpose of this article is to verify one of these conditions, given by Zyskind [39, 40]: there exists a matrix Q such that ΩX = XQ, where X and Ω are the design matrix and the covariance matrix, respectively. The accessibility of this condition will be shown...
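Zyskind's condition can be checked numerically (a hypothetical sketch, not the article's method): ΩX = XQ holds for some Q exactly when col(ΩX) ⊆ col(X), so one can solve XQ = ΩX by least squares and verify the residual vanishes.

```python
import numpy as np

# Hypothetical sketch: verify Zyskind's condition Omega @ X = X @ Q for a
# covariance matrix constructed so that the condition holds by design:
# Omega = 2 I + X B X' gives Omega X = X (2 I + B X'X) = X Q.
rng = np.random.default_rng(0)
n, p = 6, 2
X = rng.standard_normal((n, p))
Omega = 2.0 * np.eye(n) + X @ np.eye(p) @ X.T

# Solve X Q = Omega X in the least-squares sense, then confirm exactness.
Q, *_ = np.linalg.lstsq(X, Omega @ X, rcond=None)
print(np.allclose(X @ Q, Omega @ X))  # True: OLSE coincides with BLUE here
```

For a covariance matrix violating the condition (e.g. a generic diagonal Ω), the same check would return False, signalling that OLSE and BLUE differ.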