ABSTRACT - Models with a Low Nonlinearity.
Multivariate models frequently used in many branches of science have a relatively large number of different structures. Sometimes the regularity conditions which enable us to solve statistical problems are not satisfied, and it is reasonable to recognize this in advance. In this paper only the model without constraints on the parameters is analyzed, since the full class of such problems is beyond the scope of the paper.
Estimators of the parameters of an investigated object can, after some time, be considered insufficiently precise. Therefore, an additional measurement must be realized. A model of the measurement, taking into account both the original results and the new ones, has a slightly more complicated covariance matrix, since variance components occur in it. How to deal with them is the aim of the paper.
The accuracy of parameter estimates need not be sufficient for some unforeseen utilization. Therefore an additional measurement is necessary in order to attain the required precision. The problem is to express the correction to the original estimates in an explicit form.
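The explicit correction has the form familiar from sequential least squares: the updated estimate equals the old one plus a term built from the new data and the old normal-equation matrix. A minimal numerical sketch (the model and all numbers are hypothetical, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(3)
beta_true = np.array([1.0, -1.0])

# original measurement (X1, y1) and additional measurement (X2, y2)
X1 = rng.normal(size=(30, 2))
y1 = X1 @ beta_true + 0.1 * rng.normal(size=30)
X2 = rng.normal(size=(10, 2))
y2 = X2 @ beta_true + 0.1 * rng.normal(size=10)

A1 = X1.T @ X1
b_old = np.linalg.solve(A1, X1.T @ y1)      # original LS estimate

# explicit correction: only the old estimate, the old normal-equation
# matrix and the new data are needed
A12 = A1 + X2.T @ X2
b_new = b_old + np.linalg.solve(A12, X2.T @ (y2 - X2 @ b_old))

# identical to refitting on the pooled data
b_all = np.linalg.solve(A12, X1.T @ y1 + X2.T @ y2)
```

The identity follows from `X1.T @ y1 = A1 @ b_old`, so the pooled solution decomposes exactly into the old estimate plus the correction term.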
The cross-covariance matrix of the observation vectors in two linear statistical models need not be a zero matrix. In such a case the problem is to find explicit expressions for the best linear unbiased estimators of the parameters of both models and for estimators of the variance components in the simplest structure of the covariance matrix. Both univariate and multivariate forms of linear models are dealt with.
In mixed linear statistical models the best linear unbiased estimators require a known covariance matrix. In practice, however, the variance components must usually be estimated. Thus the problem arises of what the covariance matrix of the plug-in estimators is.
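The effect can be seen in the simplest setting. A minimal sketch (the two-group common-mean setup and all numbers are mine, not the paper's): with known group variances the BLUE of the common mean is the weighted mean; plugging in estimated variance components inflates its variance beyond the known-variance formula.

```python
import numpy as np

rng = np.random.default_rng(2)

def plug_in_blue(y1, y2):
    """Weighted mean with estimated (plug-in) variance components."""
    v1, v2 = y1.var(ddof=1), y2.var(ddof=1)
    w1, w2 = y1.size / v1, y2.size / v2
    return (w1 * y1.mean() + w2 * y2.mean()) / (w1 + w2)

n1, n2, s1, s2, mu = 10, 10, 1.0, 2.0, 0.0
est = [plug_in_blue(mu + s1 * rng.normal(size=n1),
                    mu + s2 * rng.normal(size=n2))
       for _ in range(5000)]

var_plug = np.var(est)                         # Monte Carlo variance
var_known = 1.0 / (n1 / s1**2 + n2 / s2**2)    # variance of the true BLUE
# var_plug exceeds var_known: estimating the weights costs precision
```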
An ill-conditioned matrix of normal equations, in connection with small values of the model parameters, is a source of problems in parameter estimation. One solution is given by the ridge estimator. A modification of it is the aim of the paper. Its behaviour in models with constraints is investigated as well.
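The standard ridge estimator (the starting point of such modifications, not the paper's own variant) can be sketched as follows; the data and the ridge parameter k are illustrative choices of mine:

```python
import numpy as np

# the ridge estimator beta = (X'X + k I)^{-1} X'y regularizes an
# ill-conditioned normal-equation matrix X'X
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = x1 + 1e-4 * rng.normal(size=n)        # nearly collinear column
X = np.column_stack([x1, x2])
y = X @ np.array([1.0, 1.0]) + 0.1 * rng.normal(size=n)

XtX = X.T @ X                              # huge condition number

beta_ols = np.linalg.solve(XtX, X.T @ y)   # unstable OLS solution
k = 0.1                                    # ridge parameter
beta_ridge = np.linalg.solve(XtX + k * np.eye(2), X.T @ y)
```

The ridge solution is always shorter in Euclidean norm than the OLS solution for k > 0, which is exactly the shrinkage that tames the ill-conditioning at the price of a small bias.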
The error propagation law is investigated in the case of a nonlinear function of measured data with non-negligible uncertainty.
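In its first-order form the error propagation law states that for y = f(x) with Cov(x) = S, Var(y) ≈ J S Jᵀ, where J is the Jacobian of f. A minimal sketch (the helper `propagate` and the rectangle example are mine, for illustration only):

```python
import numpy as np

def propagate(f, x, cov, eps=1e-6):
    """First-order error propagation via a central-difference Jacobian."""
    x = np.asarray(x, dtype=float)
    J = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
                  for e in np.eye(len(x))])
    return J @ cov @ J

# example: area of a rectangle from two uncertain side lengths a, b
f = lambda x: x[0] * x[1]
var = propagate(f, [2.0, 3.0], np.diag([0.01, 0.04]))
# analytically: b^2 Var(a) + a^2 Var(b) = 9*0.01 + 4*0.04 = 0.25
```

For a product the central difference is exact, so the numerical result matches the analytic value; for a genuinely nonlinear f with non-negligible uncertainty, higher-order terms matter, which is the case the abstract addresses.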
In multivariate linear statistical models with a normally distributed observation matrix, the structure of the covariance matrix plays an important role when confidence regions must be determined. In the paper it is assumed that the covariance matrix is a linear combination of known symmetric positive semidefinite matrices with unknown coefficients (variance components) which are unbiasedly estimable. Insensitivity regions are then found for them, which enable us to decide whether the plug-in approach can...
An approximate value of the parameter in a nonlinear regression model is known in many cases. In such a situation a linearization of the model is possible; however, it is important to recognize whether the difference between the actual value of the parameter and the approximate value causes significant changes, e.g., in the bias of the estimator, in its variance, etc. Some rules suitable for the solution of this problem are given in the paper.
A linear model with approximate variance components is considered. Differences between the approximate and the actual values of the variance components influence the position and shape of confidence ellipsoids, the level of statistical tests and their power function. A procedure for recognizing whether these differences can be neglected is given in the paper.
In nonlinear regression models an approximate value of the unknown parameter is frequently at our disposal. Then a linearization of the model is used and a linear estimate of the parameter can be calculated. Some criteria for recognizing whether linearization is admissible are developed. In the case that they are not satisfied, it is necessary either to take into account some quadratic corrections or to use the nonlinear least squares method. The aim of the paper is to find some criteria for an...
A linearization of a nonlinear regression model causes a bias in the estimators of the model parameters. It can be eliminated, e.g., either by a proper choice of the point at which the model is developed into a Taylor series or by quadratic corrections of the linear estimators. The aim of the paper is to obtain formulae for the biases and variances of estimators in linearized models and also for the corrected estimators.
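The linearization bias can be made visible by simulation. A hypothetical illustration (the exponential model, the noise level and the Monte Carlo design are all mine, not the paper's): a single Gauss-Newton step from an approximate value theta0 is the estimator in the linearized model, and its bias grows with the distance of theta0 from the true parameter.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 20)
theta_true = 1.0

def one_step(theta0, y):
    """LS estimate in the model linearized at theta0 (one Gauss-Newton step)."""
    f0 = np.exp(theta0 * x)
    J = x * f0                    # derivative of exp(theta*x) at theta0
    return theta0 + J @ (y - f0) / (J @ J)

def mc_bias(theta0, n_rep=2000):
    """Monte Carlo bias of the linearized estimator."""
    est = [one_step(theta0,
                    np.exp(theta_true * x) + 0.05 * rng.normal(size=x.size))
           for _ in range(n_rep)]
    return np.mean(est) - theta_true

b_near = mc_bias(0.95)   # linearization point close to the truth
b_far = mc_bias(0.5)     # linearization point far from the truth
```

With a good approximate value the bias is negligible; with a poor one it dominates, which is the situation where the quadratic corrections mentioned in the abstract pay off.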
Some remarks on problems of point and interval estimation, testing, and outliers are presented for the multivariate regression model.
In nonlinear regression models with constraints, a linearization of the model leads to a bias in the estimators of the parameters of the mean value of the observation vector. Some criteria for recognizing whether a linearization is admissible are developed. In the case that they are not satisfied, it is necessary to decide whether some quadratic corrections can improve the estimator. The aim of the paper is to contribute to the solution of this problem.
Necessary and sufficient conditions are given under which the best linear unbiased estimator (BLUE) is identical with the BLUE computed from subvectors of the random vector in a general regression model with a vector of unknown parameters; the design matrix, having a special so-called multistage structure, and the covariance matrix are given.