
Test for Independence of the Variables with Missing Elements in One and the Same Column of the Empirical Correlation Matrix

Veleva, Evelina (2008)

Serdica Mathematical Journal

2000 Mathematics Subject Classification: 62H15, 62H12.

We consider variables with a joint multivariate normal distribution and suppose that the sample correlation matrix has missing elements, located in one and the same column. Under these assumptions we derive the maximum likelihood ratio test for independence of the variables. We also obtain the maximum likelihood estimates of the missing values.
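
The paper derives the likelihood ratio test when some correlations are missing; for orientation, here is a minimal Python sketch of the classical complete-data likelihood ratio test for independence (Bartlett's chi-square approximation) that the paper generalizes. The function name and the use of NumPy/SciPy are illustrative, not taken from the paper.

```python
import numpy as np
from scipy import stats

def independence_lrt(X):
    """Classical LRT that the components of a multivariate normal
    sample are mutually independent, assuming complete data.

    Under H0, -(n - 1 - (2p + 5)/6) * ln(det(R)) is approximately
    chi-squared with p(p - 1)/2 degrees of freedom, where R is the
    p x p sample correlation matrix (Bartlett's approximation).
    """
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)      # sample correlation matrix
    stat = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) // 2
    return stat, stats.chi2.sf(stat, df)

# Example: 200 observations of 4 independent normal variables.
rng = np.random.default_rng(0)
print(independence_lrt(rng.normal(size=(200, 4))))
```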

Testing hypotheses in universal models

Eva Fišerová (2006)

Discussiones Mathematicae Probability and Statistics

A linear regression model in which the design matrix does not have full column rank and the covariance matrix is singular is considered. The problem of testing hypotheses on the mean value parameters is studied. Conditions under which a hypothesis can be tested, or need not be tested, are given. Explicit forms of test statistics based on residual sums of squares are presented.
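
As a rough illustration of a test statistic built from residual sums of squares, here is a Python sketch of the classical F-test for a nested mean-value hypothesis, using pseudoinverses so that a rank-deficient design matrix is tolerated. It assumes a nonsingular covariance matrix σ²I; the singular-covariance case treated in the paper requires a more delicate construction, and all names here are illustrative.

```python
import numpy as np
from scipy import stats

def f_test_nested(y, X_full, X_reduced):
    """F-test of a nested hypothesis on the mean value parameters,
    based on residual sums of squares.  Pseudoinverses keep the fit
    well defined when the design matrix lacks full column rank; the
    columns of X_reduced must span a subspace of those of X_full.
    Assumes Cov(y) = sigma^2 * I (nonsingular), unlike the paper.
    """
    n = len(y)

    def rss_and_rank(X):
        beta = np.linalg.pinv(X) @ y      # minimum-norm LS solution
        resid = y - X @ beta
        return resid @ resid, np.linalg.matrix_rank(X)

    rss1, r1 = rss_and_rank(X_full)
    rss0, r0 = rss_and_rank(X_reduced)
    F = ((rss0 - rss1) / (r1 - r0)) / (rss1 / (n - r1))
    return F, stats.f.sf(F, r1 - r0, n - r1)
```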

Tests of independence of normal random variables with known and unknown variance ratio

Edward Gąsiorek, Andrzej Michalski, Roman Zmyślony (2000)

Discussiones Mathematicae Probability and Statistics

In the paper, a new approach to the construction of a test for independence of two-dimensional normally distributed random vectors is given, under the assumption that the ratio of the variances is known. This test is uniformly better than the Student t-test. A comparison of the power of these two tests is given. The behaviour of this test under ε-contamination of the original model is also shown. In the general case, when the variance ratio is unknown, an adaptive test is presented. The equivalence between...
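
For reference, the baseline the authors improve on is the classical Student t-test of ρ = 0 for a bivariate normal sample; a minimal Python sketch follows. The paper's uniformly better test for a known variance ratio is not reconstructed here, and the function name is illustrative.

```python
import numpy as np
from scipy import stats

def correlation_t_test(x, y):
    """Classical Student t-test of independence (rho = 0) for a
    bivariate normal sample -- the baseline test that the paper's
    known-variance-ratio test is compared against.
    """
    n = len(x)
    r = np.corrcoef(x, y)[0, 1]           # sample correlation
    t = r * np.sqrt(n - 2) / np.sqrt(1 - r**2)
    return t, 2 * stats.t.sf(abs(t), df=n - 2)
```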

The linear model with variance-covariance components and jackknife estimation

Jaromír Kudeláš (1994)

Applications of Mathematics

Let θ* be a biased estimate of the parameter ϑ based on all observations x₁, …, xₙ and let θ₋ᵢ* (i = 1, 2, …, n) be the same estimate of the parameter ϑ obtained after deletion of the i-th observation. If the expectations of the estimators θ* and θ₋ᵢ* can be expressed as E(θ*) = ϑ + a(n)b(ϑ) and E(θ₋ᵢ*) = ϑ + a(n−1)b(ϑ), i = 1, 2, …, n, where a(n) is a known sequence of real numbers and b(ϑ) is a function of ϑ, then this system of equations can be regarded as a linear model. The least squares method gives the generalized jackknife estimator. Using this method, it is possible to obtain the unbiased...
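
The construction in the abstract can be tried directly: stack the full-sample and leave-one-out estimates, regress them on (1, a(·)) by least squares, and read off the intercept as the bias-corrected estimate. A Python sketch under the standard choice a(n) = 1/n follows; with that choice the intercept reduces to the classical jackknife estimator nθ* − (n−1)·mean(θ₋ᵢ*). Names are illustrative.

```python
import numpy as np

def generalized_jackknife(theta_full, theta_loo, a):
    """Least squares fit of the linear model in the abstract:
    E(theta*) = vartheta + a(n) b, E(theta*_{-i}) = vartheta + a(n-1) b.
    The intercept estimate vartheta_hat is the generalized jackknife
    estimator; the bias term a(.) * b is regressed away.
    """
    n = len(theta_loo)
    y = np.concatenate(([theta_full], theta_loo))
    Z = np.column_stack((
        np.ones(n + 1),
        np.concatenate(([a(n)], np.full(n, a(n - 1)))),
    ))
    (vartheta_hat, b_hat), *_ = np.linalg.lstsq(Z, y, rcond=None)
    return vartheta_hat

# Example: the ML variance estimator divides by n and has bias
# -sigma^2 / n, i.e. a(n) = 1/n; here the jackknife recovers the
# unbiased (divide by n - 1) estimator exactly.
rng = np.random.default_rng(1)
x = rng.normal(size=30)
theta_loo = np.array([np.var(np.delete(x, i)) for i in range(x.size)])
print(generalized_jackknife(np.var(x), theta_loo, lambda n: 1 / n))
print(np.var(x, ddof=1))                  # same value
```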
