
Multivariate statistical models; solvability of basic problems

Lubomír Kubáček — 2010

Acta Universitatis Palackianae Olomucensis. Facultas Rerum Naturalium. Mathematica

Multivariate models frequently used in many branches of science have a relatively large number of different structures. Sometimes the regularity conditions which enable us to solve statistical problems are not satisfied, and it is reasonable to recognize this in advance. Only the model without constraints on the parameters is analyzed in the paper, since the class of such problems in full generality is too large for the scope of the paper.

Variance components and an additional experiment

Lubomír Kubáček — 2012

Applications of Mathematics

Estimators of the parameters of an investigated object may, after some time, be considered insufficiently precise. Therefore, an additional measurement must be realized. A model of the measurement that takes into account both the original results and the new ones has a somewhat more complicated covariance matrix, since variance components occur in it. How to deal with them is the aim of the paper.

Additional Experiment and Linear Statistical Models

Lubomír Kubáček — 2012

Acta Universitatis Palackianae Olomucensis. Facultas Rerum Naturalium. Mathematica

The accuracy of parameter estimates need not be sufficient for their unforeseen utilization. Therefore, some additional measurement is necessary in order to attain the required precision. The problem is to express the correction to the original estimates in an explicit form.
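One textbook explicit form of such a correction, shown here only as an illustrative sketch (a plain recursive least-squares update with equal weights, not the paper's own formula), combines the original estimate with the new measurements:

```python
import numpy as np

# Illustrative setup: original experiment (X1, y1), additional
# experiment (X2, y2), both following the same linear model.
rng = np.random.default_rng(1)
beta_true = np.array([2.0, -1.0])
X1 = rng.normal(size=(30, 2)); y1 = X1 @ beta_true + 0.1 * rng.normal(size=30)
X2 = rng.normal(size=(20, 2)); y2 = X2 @ beta_true + 0.1 * rng.normal(size=20)

beta_old = np.linalg.solve(X1.T @ X1, X1.T @ y1)   # original OLS estimate
S = X1.T @ X1 + X2.T @ X2                          # combined information matrix
# Explicit correction: gain times the residual of the new data
# at the old estimate.
correction = np.linalg.solve(S, X2.T @ (y2 - X2 @ beta_old))
beta_new = beta_old + correction

# Sanity check: the update reproduces the OLS fit on the pooled data.
X = np.vstack([X1, X2]); y = np.concatenate([y1, y2])
beta_pooled = np.linalg.solve(X.T @ X, X.T @ y)
```

The identity beta_new = beta_pooled holds exactly here because both experiments carry the same (known) weight; the paper's setting, with unknown variance components, requires more care.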

Seemingly unrelated regression models

Lubomír Kubáček — 2013

Applications of Mathematics

The cross-covariance matrix of the observation vectors in two linear statistical models need not be a zero matrix. In such a case the problem is to find explicit expressions for the best linear unbiased estimators of the parameters of both models and for estimators of the variance components under the simplest structure of the covariance matrix. Both univariate and multivariate forms of the linear models are dealt with.
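A minimal sketch of the seemingly-unrelated-regressions idea, assuming the simplest covariance structure Sigma ⊗ I (the standard textbook GLS form on the stacked system, not the paper's explicit expressions, and with Sigma taken as known for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
X1 = np.column_stack([np.ones(n), rng.normal(size=n)])
X2 = np.column_stack([np.ones(n), rng.normal(size=n)])
Sigma = np.array([[1.0, 0.6], [0.6, 1.0]])   # cross-equation error covariance
E = rng.multivariate_normal([0.0, 0.0], Sigma, size=n)
y1 = X1 @ np.array([1.0, 2.0]) + E[:, 0]
y2 = X2 @ np.array([-1.0, 0.5]) + E[:, 1]

# Stacked system: y = diag(X1, X2) @ beta, with cov(y) = Sigma kron I_n.
X = np.block([[X1, np.zeros_like(X2)], [np.zeros_like(X1), X2]])
y = np.concatenate([y1, y2])
Om_inv = np.kron(np.linalg.inv(Sigma), np.eye(n))

# GLS / BLUE on the stacked system exploits the cross-covariance.
beta_gls = np.linalg.solve(X.T @ Om_inv @ X, X.T @ Om_inv @ y)
```

When Sigma is unknown, its entries become variance components to be estimated, which is the harder problem the abstract refers to.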

Ridge Estimator Revisited

Lubomír Kubáček — 2012

Acta Universitatis Palackianae Olomucensis. Facultas Rerum Naturalium. Mathematica

An ill-conditioned matrix of the normal equations, in connection with small values of the model parameters, is a source of problems in parameter estimation. One solution is given by the ridge estimator. A modification of it is the aim of the paper. Its behaviour in models with constraints is investigated as well.
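For context, a sketch of the classical ridge estimator the paper starts from (the standard form with a ridge parameter k chosen by the analyst, not the paper's modification):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.normal(size=(n, p))
X[:, 2] = X[:, 1] + 1e-6 * rng.normal(size=n)   # nearly collinear columns
beta_true = np.array([1.0, 2.0, 2.0])
y = X @ beta_true + 0.1 * rng.normal(size=n)

def ridge(X, y, k):
    """Ridge estimator: (X'X + k I)^{-1} X'y; k = 0 gives OLS."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

beta_ols = ridge(X, y, 0.0)   # unstable: X'X is ill-conditioned here
beta_r = ridge(X, y, 1.0)     # ridge shrinks toward zero, stabilizing it
```

The shrinkage trades a small bias for a large reduction in variance, which is exactly what helps when the normal-equations matrix is badly conditioned.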

Multivariate models with constraints: confidence regions

Lubomír Kubáček — 2008

Acta Universitatis Palackianae Olomucensis. Facultas Rerum Naturalium. Mathematica

In multivariate linear statistical models with a normally distributed observation matrix, the structure of the covariance matrix plays an important role when confidence regions must be determined. In the paper it is assumed that the covariance matrix is a linear combination of known symmetric positive semidefinite matrices with unknown coefficients (variance components) which are unbiasedly estimable. Insensitivity regions are then found for them, which enables us to decide whether the plug-in approach can...

On a linearization of regression models

Lubomír Kubáček — 1995

Applications of Mathematics

An approximate value of a parameter in a nonlinear regression model is known in many cases. In such a situation a linearization of the model is possible; however, it is important to recognize whether the difference between the actual value of the parameter and the approximate value causes significant changes, e.g., in the bias of the estimator or in its variance. Some rules suitable for solving this problem are given in the paper.
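The linearization in question can be sketched as follows: around the approximate value beta0, the nonlinear model f(beta) is replaced by its first-order Taylor expansion, and the correction is found by linear least squares (one Gauss-Newton step). The model f and the data here are illustrative assumptions, not from the paper:

```python
import numpy as np

def f(x, beta):
    # Example nonlinear regression function (assumed for illustration).
    return beta[0] * np.exp(beta[1] * x)

def jacobian(x, beta):
    # Partial derivatives of f with respect to beta[0] and beta[1].
    return np.column_stack([np.exp(beta[1] * x),
                            beta[0] * x * np.exp(beta[1] * x)])

rng = np.random.default_rng(3)
x = np.linspace(0.0, 1.0, 40)
beta_true = np.array([2.0, 1.5])
y = f(x, beta_true) + 0.01 * rng.normal(size=x.size)

beta0 = np.array([1.8, 1.4])        # approximate value, assumed known
F = jacobian(x, beta0)              # design matrix of the linearized model
# Linear least squares for the correction delta in
# y - f(x, beta0) ~ F @ delta.
delta = np.linalg.lstsq(F, y - f(x, beta0), rcond=None)[0]
beta_lin = beta0 + delta
```

Whether the estimate from this linearized model is acceptable depends on how far beta0 lies from the actual parameter, which is exactly what the paper's rules are meant to assess.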

Linear model with inaccurate variance components

Lubomír Kubáček — 1996

Applications of Mathematics

A linear model with approximate variance components is considered. Differences between the approximate and actual values of the variance components influence the position and shape of confidence ellipsoids, the level of statistical tests and their power function. A procedure for recognizing whether these differences can be neglected is given in the paper.

Linear versus quadratic estimators in linearized models

Lubomír Kubáček — 2004

Applications of Mathematics

In nonlinear regression models an approximate value of an unknown parameter is frequently at our disposal. Then a linearization of the model is used and a linear estimate of the parameter can be calculated. Some criteria for recognizing whether a linearization is admissible are developed. In the case that they are not satisfied, it is necessary either to take into account some quadratic corrections or to use the nonlinear least squares method. The aim of the paper is to find some criteria for an...

Linearized regression model with constraints of type II

Lubomír Kubáček — 2003

Applications of Mathematics

A linearization of a nonlinear regression model causes a bias in the estimators of the model parameters. It can be eliminated, e.g., either by a proper choice of the point at which the model is expanded into a Taylor series or by quadratic corrections of the linear estimators. The aim of the paper is to obtain formulae for the biases and variances of the estimators in linearized models and also for the corrected estimators.

Linearized models with constraints of type I

Lubomír Kubáček — 2003

Applications of Mathematics

In nonlinear regression models with constraints, a linearization of the model leads to a bias in the estimators of the parameters of the mean value of the observation vector. Some criteria for recognizing whether a linearization is admissible are developed. In the case that they are not satisfied, it is necessary to decide whether some quadratic corrections can improve the estimator. The aim of the paper is to contribute to the solution of this problem.

Multistage regression model

Lubomír Kubáček — 1986

Aplikace matematiky

Necessary and sufficient conditions are given under which the best linear unbiased estimator (BLUE) $\hat{\beta}_i(Y_1,\dots,Y_i)$ is identical with the BLUE $\hat{\beta}_i(\hat{\beta}_1,\dots,\hat{\beta}_{i-1},Y_i)$; here $Y_1,\dots,Y_i$ are subvectors of the random vector $Y$ in a general regression model $(Y, X\beta, \Sigma)$, $(\beta_1',\dots,\beta_i')' = \beta$ is the vector of unknown parameters, the design matrix $X$ has a special, so-called multistage structure, and the covariance matrix $\Sigma$ is given.
