Least squares for practitioners.
The paper considers two approaches to the estimation of transition probabilities. The approach of McCullagh and Nelder [5], based on the independent model and the quasi-likelihood function, is compared with an approach based on the marginal model and the standard likelihood function. The estimates obtained from the two approaches are illustrated on a simple example taken from McCullagh and Nelder.
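For orientation only, a hedged sketch of the quasi-likelihood estimating equation in the sense of McCullagh and Nelder, written in generic notation (mean function μ(β), working covariance V) rather than the notation of the paper:

```latex
U(\beta) \;=\; D^{\mathsf T} V^{-1}\bigl(y - \mu(\beta)\bigr) \;=\; 0,
\qquad D = \frac{\partial \mu(\beta)}{\partial \beta^{\mathsf T}} .
```

Under the independent working model V is taken diagonal, whereas the marginal-model approach maximizes the ordinary log-likelihood of the marginal distribution.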
For inference from random-effect models, Lee and Nelder (1996) proposed the use of hierarchical likelihood (h-likelihood). It allows inference from models that may include both fixed and random parameters. Because of the presence of unobserved random variables, the h-likelihood is not a likelihood in the Fisherian sense. The Fisher likelihood framework has advantages such as generality of application and statistical and computational efficiency. We introduce an extended likelihood framework and discuss why...
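As a reminder of the definition introduced by Lee and Nelder (1996), the h-likelihood augments the data likelihood with the density of the unobserved random effects; in generic notation (y the data, v the random effects, β the fixed parameters):

```latex
h(\beta, v) \;=\; \log f(y \mid v; \beta) \;+\; \log f(v; \beta),
```

and joint maximization of h over (β, v) yields estimates of both the fixed and the random parameters.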
The paper deals with the linear comparative calibration problem, i.e. the situation in which both variables are subject to errors. A quite general model is considered, which allows possibly correlated data (measurements) to be included. From the statistical point of view the model can be represented by the linear errors-in-variables (EIV) model. We suggest an iterative algorithm for estimating the parameters of the analysis function (the inverse of the calibration line) and we solve the problem of deriving the...
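As an illustration only, a minimal alternating (Deming-type) sketch of an errors-in-variables fit of a straight calibration line, assuming uncorrelated errors with a known variance ratio lam = var(err_Y)/var(err_X); this is a generic EIV estimator, not necessarily the algorithm proposed in the paper:

```python
import numpy as np

def deming_calibration(X, Y, lam=1.0, n_iter=50):
    """Alternating (coordinate-descent) fit of Y = a + b*x when both X and Y
    carry measurement error; lam = var(err_Y) / var(err_X).
    A generic EIV sketch, not the algorithm of the paper."""
    a, b = np.polyfit(X, Y, 1)[::-1]          # start from ordinary least squares
    for _ in range(n_iter):
        # latent x_i minimizing lam*(X - x)^2 + (Y - a - b*x)^2
        x_hat = (lam * X + b * (Y - a)) / (lam + b ** 2)
        # refit the line by least squares on the current latent values
        xc, yc = x_hat - x_hat.mean(), Y - Y.mean()
        b = np.sum(xc * yc) / np.sum(xc ** 2)
        a = Y.mean() - b * x_hat.mean()
    return a, b

rng = np.random.default_rng(0)
x_true = np.linspace(0.0, 10.0, 30)
X = x_true + rng.normal(scale=0.2, size=x_true.size)
Y = 1.0 + 2.0 * x_true + rng.normal(scale=0.2, size=x_true.size)
a, b = deming_calibration(X, Y)
print(a, b)                       # calibration line; analysis function: x = (y - a) / b
```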
The linear error propagation law (LEPL) has frequently been used for nonlinear functions as well. It can be adequate for the actual situation, but it need not be. It is therefore useful to have a rule for recognizing whether the LEPL is admissible. The aim of the paper is to find such a rule.
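As a purely illustrative check (not the rule derived in the paper), one can compare the first-order (LEPL) standard deviation with a Monte Carlo estimate; a large discrepancy warns that the linear law is inadequate. A minimal Python sketch with an assumed nonlinear function f(x) = exp(x):

```python
import numpy as np

# assumed example: f(x) = exp(x), input x ~ N(mu, sigma^2)
mu, sigma = 1.0, 0.3
f  = np.exp                       # the nonlinear function
df = np.exp                       # its first derivative (exp' = exp)

# first-order (linear) error propagation: sd[f(x)] ~ |f'(mu)| * sigma
sd_lepl = abs(df(mu)) * sigma

# Monte Carlo reference value
x = np.random.default_rng(1).normal(mu, sigma, 200_000)
sd_mc = f(x).std()

print(sd_lepl, sd_mc)             # a large relative difference suggests the
                                  # linear propagation law is not admissible
```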
In nonlinear regression models an approximate value of an unknown parameter is frequently at our disposal. The model is then linearized and a linear estimate of the parameter can be calculated. Some criteria for recognizing whether linearization is possible are developed. If they are not satisfied, it is necessary either to take some quadratic corrections into account or to use the nonlinear least squares method. The aim of the paper is to find some criteria for an...
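To make the distinction concrete, a small sketch (with an assumed exponential model, not the paper's criterion) comparing the linear estimate obtained from the model linearized at an approximate parameter value with the full nonlinear least squares solution:

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(2)
t = np.linspace(0.0, 4.0, 25)
theta_true = 1.3
y = np.exp(-theta_true * t) + rng.normal(scale=0.02, size=t.size)

theta0 = 1.0                                   # approximate value at our disposal
f = lambda th: np.exp(-th * t)                 # model mean value
J = lambda th: -t * np.exp(-th * t)            # derivative of f with respect to theta

# linearized model: y - f(theta0) ~ J(theta0) * (theta - theta0)
theta_lin = theta0 + np.sum(J(theta0) * (y - f(theta0))) / np.sum(J(theta0) ** 2)

# full nonlinear least squares solution
theta_nls = least_squares(lambda th: f(th[0]) - y, x0=[theta0]).x[0]

print(theta_lin, theta_nls)       # close agreement suggests linearization is acceptable
```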
In the case of the nonlinear regression model, methods and procedures have been developed to obtain estimates of the parameters. These methods are considerably more complicated than the procedures used when the model is linear. Moreover, unlike the linear case, the properties of the resulting estimators are unknown and usually depend on the true values of the estimated parameters. It is sometimes possible to approximate the nonlinear model by a linear one and use the much more developed linear...
The construction of confidence regions in nonlinear regression models is difficult, mainly when the dimension of the estimated vector parameter is large. Singularity also causes problems. Therefore a simple approximation of the exact confidence region is welcome. The aim of the paper is to give a small modification of the confidence ellipsoid constructed in the linearized model which, under some conditions, is sufficient as an approximation of the exact confidence region.
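For reference, the confidence ellipsoid in the model linearized at the least squares estimate has the familiar form below (generic notation, normal errors, m-dimensional parameter, s² the residual mean square; the modification proposed in the paper is not reproduced here):

```latex
\Bigl\{\theta : (\theta - \hat\theta)^{\mathsf T} J^{\mathsf T} J\,(\theta - \hat\theta)
\;\le\; m\,s^{2}\, F_{m,\,n-m}(1-\alpha)\Bigr\},
\qquad
J = \frac{\partial f(\theta)}{\partial \theta^{\mathsf T}}\Big|_{\theta=\hat\theta}.
```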
If the observation vector in a nonlinear regression model is normally distributed, then an algorithm for determining the exact (1 − α)-confidence region for the parameter of the mean value of the observation vector is well known. However, its numerical realization is tedious, and it is therefore of some interest to find a condition which enables us to construct this region in a simpler way.
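One well-known exact construction (stated here only as background, for normal errors with known variance σ², and not necessarily the algorithm referred to above) takes the region

```latex
\bigl\{\theta : \|y - f(\theta)\|^{2} \;\le\; \sigma^{2}\,\chi^{2}_{n}(1-\alpha)\bigr\},
```

which has exact coverage 1 − α because ‖y − f(θ)‖²/σ² follows a χ²_n distribution at the true θ; its numerical delimitation, however, requires scanning the parameter space, which is what makes the realization tedious.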
In nonlinear regression models with constraints, linearization of the model leads to a bias in the estimators of the parameters of the mean value of the observation vector. Some criteria for recognizing whether linearization is possible are developed. If they are not satisfied, it is necessary to decide whether some quadratic corrections can improve the estimator. The aim of the paper is to contribute to the solution of this problem.
Linearization of a nonlinear regression model causes a bias in the estimators of the model parameters. It can be eliminated, e.g., either by a proper choice of the point at which the model is expanded into a Taylor series or by quadratic corrections of the linear estimators. The aim of the paper is to obtain formulae for the biases and variances of the estimators in the linearized models and also for the corrected estimators.
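A standard second-order approximation of this bias, in the spirit of Box (1971) and written in generic notation (J the Jacobian of the mean function at the true parameter, H_i the Hessian of its i-th component, σ² the error variance), is

```latex
\operatorname{bias}(\hat\theta) \;\approx\;
-\tfrac{\sigma^{2}}{2}\,(J^{\mathsf T} J)^{-1} J^{\mathsf T} d,
\qquad
d_i = \operatorname{tr}\!\bigl[(J^{\mathsf T} J)^{-1} H_i\bigr],\quad i=1,\dots,n;
```

quadratic corrections subtract an estimate of such a term from the linear estimator. The formulae derived in the paper may differ in form and assumptions.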
The paper deals with the linear model with uncorrelated observations. The dispersions of the observed values are linear-quadratic functions of the unknown parameters of the mean (measurements by devices of a given class of precision). The locally best linear-quadratic unbiased estimators are investigated as improvements of the locally best linear unbiased estimators in the cases where the design matrix has no, one, or two linearly dependent rows.
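A typical variance model of this kind (stated generically; the parametrization used in the paper may differ) arises when a measuring device of a given precision class has a standard deviation consisting of an offset plus a term proportional to the measured value, so that

```latex
\operatorname{E}(y_i) = \mu_i = x_i^{\mathsf T}\beta,
\qquad
\operatorname{var}(y_i) = (a + b\,\mu_i)^{2} = a^{2} + 2ab\,\mu_i + b^{2}\mu_i^{2},
```

i.e. the dispersions are linear-quadratic functions of the mean-value parameters, which is why linear-quadratic functions of the observations are natural candidates for improved unbiased estimators.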