Displaying 161 – 180 of 388

Linear model with nuisance parameters and with constraints on useful and nuisance parameters

Pavla Kunderová, Jaroslav Marek (2006)

Acta Universitatis Palackianae Olomucensis. Facultas Rerum Naturalium. Mathematica

The properties of the regular linear model are well known (see [1], Chapter 1). In this paper the situation is considered where the vector of first-order parameters is divided into two parts: the vector of useful parameters and the vector of nuisance parameters. It is shown how the BLUEs of these parameters change under constraints imposed on them. The theory is illustrated by an example from practice.

Linear versus quadratic estimators in linearized models

Lubomír Kubáček (2004)

Applications of Mathematics

In nonlinear regression models an approximate value of an unknown parameter is frequently at our disposal. Then the model is linearized and a linear estimate of the parameter can be calculated. Some criteria for recognizing whether linearization is admissible are developed. If they are not satisfied, it is necessary either to take some quadratic corrections into account or to use the nonlinear least squares method. The aim of the paper is to find some criteria for an...

Linearization conditions for regression models with unknown variance parameter

Anna Jenčová (2000)

Applications of Mathematics

In the case of the nonlinear regression model, methods and procedures have been developed to obtain estimates of the parameters. These methods are much more complicated than the procedures used if the model considered is linear. Moreover, unlike the linear case, the properties of the resulting estimators are unknown and usually depend on the true values of the estimated parameters. It is sometimes possible to approximate the nonlinear model by a linear one and use the much more developed linear...

Linearization regions for a confidence ellipsoid in singular nonlinear regression models

Lubomír Kubáček, Eva Tesaříková (2009)

Acta Universitatis Palackianae Olomucensis. Facultas Rerum Naturalium. Mathematica

A construction of confidence regions in nonlinear regression models is difficult, mainly when the dimension of the estimated vector parameter is large. Singularity is a further problem. Therefore some simple approximation of the exact confidence region is welcome. The aim of the paper is to give a small modification of the confidence ellipsoid constructed in the linearized model which, under some conditions, suffices as an approximation of the exact confidence region.

Linearization regions for confidence ellipsoids

Lubomír Kubáček, Eva Tesaříková (2008)

Acta Universitatis Palackianae Olomucensis. Facultas Rerum Naturalium. Mathematica

If an observation vector in a nonlinear regression model is normally distributed, then an algorithm for determining the exact (1 − α)-confidence region for the parameter of the mean value of the observation vector is well known. However, its numerical realization is tedious, and it is therefore of some interest to find a condition which enables us to construct this region in a simpler way.

Linearized models with constraints of type I

Lubomír Kubáček (2003)

Applications of Mathematics

In nonlinear regression models with constraints, a linearization of the model leads to a bias in estimators of parameters of the mean value of the observation vector. Some criteria for recognizing whether linearization is admissible are developed. If they are not satisfied, it is necessary to decide whether some quadratic corrections can make the estimator better. The aim of the paper is to contribute to the solution of this problem.

Linearized regression model with constraints of type II

Lubomír Kubáček (2003)

Applications of Mathematics

A linearization of the nonlinear regression model causes a bias in estimators of model parameters. It can be eliminated, e.g., either by a proper choice of the point where the model is developed into the Taylor series or by quadratic corrections of linear estimators. The aim of the paper is to obtain formulae for biases and variances of estimators in linearized models and also for corrected estimators.
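The linearization bias discussed in the two abstracts above can be seen in a toy setting; the following is a minimal NumPy sketch (not from the papers, all data synthetic): for a nonlinear mean function f, the linearized model yields f(μ), while the exact mean E[f(X)] differs by a second-order (quadratic) term ½ f″(μ)σ².

```python
import numpy as np

# Toy illustration of linearization bias: f(x) = x^2 with X ~ N(mu, sigma^2).
# The linearized model gives f(mu) = mu^2, while E[f(X)] = mu^2 + sigma^2;
# the quadratic correction (1/2) f''(mu) sigma^2 = sigma^2 removes the bias.
rng = np.random.default_rng(2)
mu, sigma = 2.0, 0.5
x = rng.normal(mu, sigma, size=200_000)

f = lambda t: t ** 2
linearized = f(mu)                        # first-order Taylor value: mu^2 = 4.0
exact_mean = f(x).mean()                  # Monte Carlo estimate of E[f(X)] ~ 4.25
quadratic_correction = 0.5 * 2.0 * sigma ** 2   # (1/2) f''(mu) sigma^2 = 0.25

# The corrected linear value matches the exact mean up to sampling noise.
assert abs(exact_mean - (linearized + quadratic_correction)) < 0.02
```

This mirrors the papers' point that either the development point of the Taylor series or an explicit quadratic correction must compensate for the bias of the purely linear estimator.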

Locally and uniformly best estimators in replicated regression model

Júlia Volaufová, Lubomír Kubáček (1983)

Aplikace matematiky

The aim of the paper is to estimate a function γ = tr(Dββ′) + tr(CΣ) (with D, C known matrices) in a regression model (Y, Xβ, Σ) with an unknown parameter β and covariance matrix Σ. Stochastically independent replications Y₁, …, Y_m of the stochastic vector Y are considered, where the estimators of Xβ and Σ are Ȳ = (1/m) ∑_{i=1}^m Y_i and Σ̂ = (m − 1)⁻¹ ∑_{i=1}^m (Y_i − Ȳ)(Y_i − Ȳ)′, respectively. Locally and uniformly best unbiased estimators of the function γ, based on Ȳ and Σ̂, are given.
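The two estimators in the abstract above are the ordinary sample mean and unbiased sample covariance of the replications; a minimal NumPy sketch (synthetic data, not from the paper) computes both:

```python
import numpy as np

# Sample mean Y_bar = (1/m) sum Y_i and unbiased sample covariance
# Sigma_hat = (m-1)^{-1} sum (Y_i - Y_bar)(Y_i - Y_bar)' from m independent
# replications of an n-dimensional observation vector Y.
rng = np.random.default_rng(0)
m, n = 50, 3
Y = rng.normal(size=(m, n))              # rows are the replications Y_i

Y_bar = Y.mean(axis=0)                   # estimator of X beta
resid = Y - Y_bar
Sigma_hat = resid.T @ resid / (m - 1)    # estimator of Sigma

# Sigma_hat coincides with NumPy's unbiased covariance estimator.
assert np.allclose(Sigma_hat, np.cov(Y, rowvar=False))
```

The divisor m − 1 (rather than m) is what makes Σ̂ unbiased, matching the formula in the abstract.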

Making use of incomplete observations for regression in bivariate normal model

Joanna Tarasińska (2003)

Applications of Mathematics

Two estimates of the regression coefficient in bivariate normal distribution are considered: the usual one based on a sample and a new one making use of additional observations of one of the variables. They are compared with respect to variance. The same is done for two regression lines. The conclusion is that the additional observations are worth using only when the sample is very small.

Matrix rank and inertia formulas in the analysis of general linear models

Yongge Tian (2017)

Open Mathematics

Matrix mathematics provides a powerful tool set for addressing statistical problems; in particular, the theory of matrix ranks and inertias has been developed into an effective methodology for simplifying various complicated matrix expressions and for establishing equalities and inequalities that occur in statistical analysis. This paper describes how to establish exact formulas for calculating ranks and inertias of covariances of predictors and estimators of parameter spaces in general linear models (GLMs),...

Matrix rank/inertia formulas for least-squares solutions with statistical applications

Yongge Tian, Bo Jiang (2016)

Special Matrices

Least-Squares Solution (LSS) of a linear matrix equation and Ordinary Least-Squares Estimator (OLSE) of unknown parameters in a general linear model are two standard algebraic methods in computational mathematics and regression analysis. Assume that a symmetric quadratic matrix-valued function Φ(Z) = Q − ZPZ′ is given, where Z is taken as the LSS of the linear matrix equation AZ = B. In this paper, we establish a group of formulas for calculating maximum and minimum ranks and inertias of Φ(Z)...
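The two objects combined in the abstract above can be sketched numerically; the following is an illustrative NumPy example (arbitrary test matrices, not from the paper): the LSS Z of AZ = B and the evaluation of Φ(Z) = Q − ZPZ′.

```python
import numpy as np

# Least-squares solution Z of the matrix equation A Z = B, then the
# symmetric quadratic matrix-valued function Phi(Z) = Q - Z P Z'.
rng = np.random.default_rng(1)
A = rng.normal(size=(6, 3))              # full column rank with probability 1
B = rng.normal(size=(6, 4))
Z, *_ = np.linalg.lstsq(A, B, rcond=None)   # LSS of A Z = B, shape (3, 4)

P = np.eye(4)                            # assumed symmetric matrix P
Q = np.eye(3)                            # assumed symmetric matrix Q
Phi = Q - Z @ P @ Z.T                    # Phi(Z) is symmetric when P, Q are

# Z satisfies the normal equations A'(A Z - B) = 0 of the least-squares problem.
assert np.allclose(A.T @ (A @ Z - B), 0, atol=1e-8)
assert np.allclose(Phi, Phi.T)
```

The rank/inertia formulas of the paper concern how the eigenvalue signs of such a Φ(Z) vary over the set of least-squares solutions; this sketch only evaluates Φ at one particular Z.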
