On a generalization of the orthogonal regression
An approximate value of a parameter in a nonlinear regression model is known in many cases. In such a situation a linearization of the model is possible; however, it is important to recognize whether the difference between the actual value of the parameter and the approximate value causes significant changes, e.g., in the bias of the estimator or in its variance. Some rules suitable for solving this problem are given in the paper.
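As a numerical illustration of the linearization step (a minimal sketch only; the model, the approximate value and the noise level below are assumptions, not taken from the paper), the following compares the correction estimated from the linearized model with the size of the neglected second-order term:

```python
# Minimal sketch (not the paper's method): linearizing a nonlinear model
# y = exp(beta * x) + e around an approximate value beta0 and estimating
# the correction delta = beta - beta0 by ordinary least squares.
# The model, the value beta0 and the noise level are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
beta_true, beta0, sigma = 0.50, 0.45, 0.05   # true value, approximation, noise sd
x = np.linspace(0.0, 2.0, 50)
y = np.exp(beta_true * x) + rng.normal(0.0, sigma, x.size)

# First-order Taylor expansion: exp(beta x) ~ exp(beta0 x) + x exp(beta0 x) (beta - beta0)
f0 = np.exp(beta0 * x)           # model evaluated at the approximate value
J = x * f0                       # derivative with respect to beta at beta0
delta_hat = np.linalg.lstsq(J[:, None], y - f0, rcond=None)[0][0]
beta_hat = beta0 + delta_hat

# A crude check of whether the approximation matters: the neglected second-order
# term compared with the estimator's standard error.
se = sigma / np.linalg.norm(J)
curvature = 0.5 * np.max(x**2 * f0) * (beta_true - beta0) ** 2
print(f"beta_hat = {beta_hat:.4f}, se ~ {se:.4f}, neglected term ~ {curvature:.5f}")
```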
The multivariate linear model in which the matrix of the first-order parameters is divided into two matrices, a matrix of useful parameters and a matrix of nuisance parameters, is considered. We examine eliminating transformations which remove the nuisance parameters without loss of information on the useful parameters and on the variance components.
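A standard way to eliminate nuisance parameters, shown here only as an illustrative sketch (the particular transformations studied in the paper are not reproduced; the designs and parameter values are made up), is to project the observations onto the orthogonal complement of the nuisance design matrix:

```python
# Minimal sketch (illustrative, not necessarily the transformation studied in the
# paper): eliminating nuisance parameters by projecting the observations onto the
# orthogonal complement of the nuisance design matrix. All matrices are made up.
import numpy as np

rng = np.random.default_rng(1)
n = 30
X = rng.normal(size=(n, 2))          # design for the useful parameters
S = rng.normal(size=(n, 3))          # design for the nuisance parameters
beta, kappa = np.array([1.0, -2.0]), np.array([0.5, 0.3, -0.7])
y = X @ beta + S @ kappa + rng.normal(0.0, 0.1, n)

# M projects onto the orthogonal complement of col(S); M @ S = 0, so the
# transformed model M y = M X beta + M e no longer contains kappa.
M = np.eye(n) - S @ np.linalg.pinv(S)
beta_hat = np.linalg.lstsq(M @ X, M @ y, rcond=None)[0]
print("beta estimated after eliminating the nuisance parameters:", beta_hat)
```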
There exist many different ways of determining the best linear unbiased estimator of the regression coefficients in the general regression model. In Part I of this article it is shown that all these ways are numerically equivalent almost everywhere. In Part II, conditions are considered under which all the unbiased estimators of the unknown scalar factor of the covariance matrix are numerically equivalent almost everywhere.
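A small numerical sketch of the kind of equivalence discussed in Part I (assuming a nonsingular covariance matrix; the data and the two computational routes below are illustrative choices, not the constructions analysed in the article):

```python
# Minimal sketch (illustrative assumptions throughout): two common routes to the
# best linear unbiased estimator in the model y = X beta + e, cov(e) = sigma^2 V
# with V nonsingular, shown to agree numerically. Data and V are made up.
import numpy as np

rng = np.random.default_rng(2)
n, p = 40, 3
X = rng.normal(size=(n, p))
A = rng.normal(size=(n, n))
V = A @ A.T + n * np.eye(n)            # a positive definite covariance matrix
y = X @ np.array([1.0, 0.5, -1.5]) + rng.multivariate_normal(np.zeros(n), V)

# Route 1: generalized least squares, beta = (X' V^-1 X)^-1 X' V^-1 y.
Vinv = np.linalg.inv(V)
b_gls = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)

# Route 2: whiten with a Cholesky factor V = L L' and run ordinary least squares.
L = np.linalg.cholesky(V)
b_ols = np.linalg.lstsq(np.linalg.solve(L, X), np.linalg.solve(L, y), rcond=None)[0]

print(np.allclose(b_gls, b_ols))       # True: the two routes coincide numerically
```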
We discuss some methods of estimation in bivariate errors-in-variables linear models. We also suggest a method of constructing consistent estimators for the case when the error disturbances are normally distributed with unknown parameters. The method is based on the theory of estimating variance components in linear models. A simulation study comparing this estimator with the maximum likelihood estimator is presented.
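For orientation, a simulation sketch in the spirit of such comparisons (using the classical maximum likelihood / orthogonal-regression estimator under a known error-variance ratio, not the variance-components estimator proposed here; all data are simulated):

```python
# Minimal simulation sketch (assumptions: equal, known error variances; this is
# the classical ML / orthogonal-regression estimator, not the variance-components
# estimator proposed in the paper): naive least squares is biased towards zero in
# an errors-in-variables model, while the ML estimator is consistent.
import numpy as np

rng = np.random.default_rng(3)
beta, n, reps = 2.0, 200, 500
naive, ml = [], []
for _ in range(reps):
    xi = rng.normal(0.0, 1.0, n)               # true regressor values
    x = xi + rng.normal(0.0, 0.5, n)           # observed with error
    y = beta * xi + rng.normal(0.0, 0.5, n)    # observed with error
    sxx, syy, sxy = np.var(x), np.var(y), np.cov(x, y, bias=True)[0, 1]
    naive.append(sxy / sxx)                    # attenuated least-squares slope
    ml.append((syy - sxx + np.hypot(syy - sxx, 2 * sxy)) / (2 * sxy))

print(f"true slope {beta}, naive mean {np.mean(naive):.3f}, ML mean {np.mean(ml):.3f}")
```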
In the paper we deal with the problem of parameter estimation in the linear normal mixed model with two variance components. We present solutions to the problems of finding the global maximizer of the likelihood function and of the REML likelihood function in this model.
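A minimal sketch of the REML criterion involved (the global-optimization results of the paper are not reproduced; the design, the group structure and the crude grid search below are illustrative assumptions):

```python
# Minimal sketch (not the paper's solution): the REML log-likelihood of a mixed
# model y = X beta + Z u + e with two variance components (sigma_u^2, sigma_e^2),
# maximized here by a crude grid search. Data are simulated.
import numpy as np

rng = np.random.default_rng(4)
n, q = 60, 6
X = np.column_stack([np.ones(n), rng.normal(size=n)])
Z = np.kron(np.eye(q), np.ones((n // q, 1)))           # group-indicator design
y = X @ np.array([1.0, 0.5]) + Z @ rng.normal(0, 1.0, q) + rng.normal(0, 0.5, n)

def reml_loglik(s2u, s2e):
    # REML criterion (up to an additive constant) for V = s2u Z Z' + s2e I.
    V = s2u * Z @ Z.T + s2e * np.eye(n)
    Vi = np.linalg.inv(V)
    XtViX = X.T @ Vi @ X
    P = Vi - Vi @ X @ np.linalg.inv(XtViX) @ X.T @ Vi
    _, logdetV = np.linalg.slogdet(V)
    _, logdetXtViX = np.linalg.slogdet(XtViX)
    return -0.5 * (logdetV + logdetXtViX + y @ P @ y)

grid = np.linspace(0.05, 3.0, 60)
s2u_hat, s2e_hat = max(((a, b) for a in grid for b in grid),
                       key=lambda ab: reml_loglik(*ab))
print(f"REML estimates: sigma_u^2 ~ {s2u_hat:.2f}, sigma_e^2 ~ {s2e_hat:.2f}")
```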
A generalization of a test for non-nested models in linear regression is derived for the case when there are several competing regression models with different sets of regressors.
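As background, a sketch of the classical two-model J-test of Davidson and MacKinnon for non-nested linear regressions (the generalization to several models is not reproduced; all data are simulated):

```python
# Minimal sketch of the classical two-model J-test for non-nested linear
# regressions, given only as background; the paper's generalization to several
# competing models is not reproduced here. All data below are simulated.
import numpy as np

rng = np.random.default_rng(5)
n = 100
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 2.0 * x1 + rng.normal(0.0, 1.0, n)          # data generated by model 1

X1 = np.column_stack([np.ones(n), x1])                # model 1 regressors
X2 = np.column_stack([np.ones(n), x2])                # model 2 regressors

def ols(X, y):
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    return b, X @ b

# Test model 1 against model 2: add the fitted values of model 2 as an extra
# regressor and inspect the t-statistic of its coefficient.
_, yhat2 = ols(X2, y)
Xa = np.column_stack([X1, yhat2])
b, fit = ols(Xa, y)
resid = y - fit
s2 = resid @ resid / (n - Xa.shape[1])
cov_b = s2 * np.linalg.inv(Xa.T @ Xa)
t_stat = b[-1] / np.sqrt(cov_b[-1, -1])
print(f"J-test t-statistic for the added fitted values: {t_stat:.2f}")
```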
A bicubic model for local smoothing of surfaces is constructed on the basis of pivot points. This approach reduces the dimension of the matrix of normal equations by more than half, which substantially increases the speed and stability of the computations. The algorithms constructed with the aid of the proposed model can be used both in applications and in the development of global methods for smoothing and approximation of surfaces.
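An illustrative sketch of a plain bicubic least-squares fit on a local patch (the pivot-point construction of the paper is not implemented; the sample surface and noise level are assumptions):

```python
# Minimal sketch (illustrative only; the pivot-point reduction described in the
# paper is not implemented): ordinary least-squares fit of a bicubic polynomial
# z = sum_{i,j<=3} a_ij x^i y^j to scattered surface data on a local patch.
import numpy as np

rng = np.random.default_rng(6)
m = 80
x, y = rng.uniform(-1, 1, m), rng.uniform(-1, 1, m)
z = np.sin(x) * np.cos(y) + rng.normal(0.0, 0.01, m)   # noisy surface samples

# Design matrix with the 16 bicubic basis functions x^i y^j, i, j = 0..3.
powers = [(i, j) for i in range(4) for j in range(4)]
B = np.column_stack([x**i * y**j for i, j in powers])
coef, *_ = np.linalg.lstsq(B, z, rcond=None)

# Evaluate the smoothed surface at a new point.
xs, ys = 0.3, -0.2
z_hat = sum(c * xs**i * ys**j for c, (i, j) in zip(coef, powers))
print(f"smoothed value at ({xs}, {ys}): {z_hat:.4f}, true: {np.sin(xs)*np.cos(ys):.4f}")
```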