Displaying 21 – 40 of 119


Bias of LS estimators in nonlinear regression models with constraints. Part I: General case

Andrej Pázman, Jean-Baptiste Denis (1999)

Applications of Mathematics

We derive expressions for the asymptotic approximation of the bias of the least squares estimators in nonlinear regression models with parameters which are subject to nonlinear equality constraints. The suggested approach modifies the normal equations of the estimator and approximates them up to o_p(N^{-1}), where N is the number of observations. The "bias equations" so obtained are solved under different assumptions on constraints and on the model. For functions of the parameters the invariance of the approximate...

Bias of LS estimators in nonlinear regression models with constraints. Part II: Biadditive models

Jean-Baptiste Denis, Andrej Pázman (1999)

Applications of Mathematics

General results giving approximate bias for nonlinear models with constrained parameters are applied to bilinear models in anova framework, called biadditive models. Known results on the information matrix and the asymptotic variance matrix of the parameters are summarized, and the Jacobians and Hessians of the response and of the constraints are derived. These intermediate results are the basis for any subsequent second order study of the model. Despite the large number of parameters involved,...

Change-point estimator in gradually changing sequences

Daniela Jarušková (1998)

Commentationes Mathematicae Universitatis Carolinae

Recently Hušková (1998) has studied the least squares estimator of a change-point in a gradually changing sequence, supposing that the sequence increases (or decreases) linearly after the change-point. The present paper shows that the limit behavior of the change-point estimator for more complicated gradual changes is similar. The limit variance of the estimator can be easily calculated from the covariance function of a limit process.
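The linear-trend case studied by Hušková can be sketched as follows: the mean is constant up to an unknown index k and increases linearly afterwards, and k is estimated by minimizing the residual sum of squares over all candidate change-points. The specific regressor form and the simulation below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gradual_change_point_ls(x):
    """Least squares estimator of a change-point k in a sequence whose
    mean is constant up to k and then changes linearly, i.e.
    E x_t = mu + delta * max(0, t - k).  For each candidate k the two
    remaining parameters (mu, delta) are fitted by ordinary least
    squares; the k with the smallest residual sum of squares wins.
    (Illustrative sketch of the gradual linear-change model.)"""
    n = len(x)
    t = np.arange(1, n + 1)
    best_k, best_rss = None, np.inf
    for k in range(1, n - 1):              # candidate change-points
        g = np.maximum(0.0, t - k)         # gradual-change regressor
        A = np.column_stack([np.ones(n), g])
        coef = np.linalg.lstsq(A, x, rcond=None)[0]
        rss = np.sum((x - A @ coef) ** 2)
        if rss < best_rss:
            best_k, best_rss = k, rss
    return best_k

# simulated sequence: constant mean up to t = 60, linear increase after
rng = np.random.default_rng(0)
t = np.arange(1, 101)
x = 2.0 + 0.3 * np.maximum(0, t - 60) + rng.normal(0, 0.5, 100)
print(gradual_change_point_ls(x))
```

With a gradual change the signal grows only slowly after k, so the estimator is noticeably more variable than in the abrupt-change case, which is the regime the paper's limit theory describes.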

Confidence regions in nonlinear regression models

Rastislav Potocký, Van Ban To (1992)

Applications of Mathematics

New curvature measures for nonlinear regression models are developed and methods of their computing are given. Using these measures, more accurate confidence regions for parameters than those based on linear or quadratic approximations are obtained.

Consistency of the least weighted squares under heteroscedasticity

Jan Ámos Víšek (2011)

Kybernetika

A robust version of the Ordinary Least Squares accommodating the idea of weighting the order statistics of the squared residuals (rather than the squared residuals themselves) is recalled and its properties are studied. The existence of a solution of the corresponding extremal problem and its consistency under heteroscedasticity are proved.
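The defining feature of the least weighted squares estimator is that the weights are attached to the order statistics of the squared residuals, not to fixed observations. A minimal numerical sketch, approximating the extremal problem by iterative rank-based reweighting (a hypothetical scheme for illustration, not the author's algorithm):

```python
import numpy as np

def least_weighted_squares(X, y, weights, n_iter=50):
    """Sketch of the least weighted squares estimator: minimize
    sum_i weights[i] * r2_(i)(beta), where r2_(1) <= ... <= r2_(n) are
    the ordered squared residuals.  Approximated here by iterative
    reweighting: rank the current squared residuals, give the i-th
    smallest the weight weights[i], and re-solve weighted LS."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # OLS start
    for _ in range(n_iter):
        r2 = (y - X @ beta) ** 2
        ranks = np.argsort(np.argsort(r2))        # rank of each squared residual
        w = weights[ranks]                        # weight by rank, not by index
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
    return beta

# nonincreasing weights downweight the largest squared residuals,
# which makes the fit resistant to gross outliers
rng = np.random.default_rng(1)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(0, 0.3, n)
y[:10] += 15.0                                    # gross outliers
w = np.linspace(1.0, 0.0, n)
print(least_weighted_squares(X, y, w))
```

Because an outlying observation ends up with a large squared residual and hence a high rank, it automatically receives a small weight regardless of its position in the sample.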

Dynamics of an artificial slope

Stanislav Bartoň (2021)

Programs and Algorithms of Numerical Mathematics

The slope shape is replaced by a 3D regression function which corresponds with high precision to the position of several hundred points which were determined on the surface of the slope body. The position of several points was repeatedly measured over several years. The time changes in the position of these points were used to create regression functions that describe the vertical movements (slope settlement) and the horizontal movements (slope movement). The model results are presented in the form of mathematical...

Empirical regression quantile processes

Jana Jurečková, Jan Picek, Martin Schindler (2020)

Applications of Mathematics

We address the problem of estimating quantile-based statistical functionals, when the measured or controlled entities depend on exogenous variables which are not under our control. As a suitable tool we propose the empirical process of the average regression quantiles. It partially masks the effect of covariates and has other properties convenient for applications, e.g. for coherent risk measures of various types in the situations with covariates.

Estimation in autoregressive model with measurement error

Jérôme Dedecker, Adeline Samson, Marie-Luce Taupin (2014)

ESAIM: Probability and Statistics

Consider an autoregressive model with measurement error: we observe Z_i = X_i + ε_i, where the unobserved X_i is a stationary solution of the autoregressive equation X_i = g_{θ0}(X_{i−1}) + ξ_i. The regression function g_{θ0} is known up to a finite dimensional parameter θ0 to be estimated. The distributions of ξ_1 and X_0 are unknown and g_θ belongs to a large class of parametric regression functions. The distribution of ε_0 is completely known. We propose an estimation procedure with a new criterion computed as...
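The model above can be simulated directly; taking the illustrative linear link g_θ(x) = θx (one assumed member of the parametric class, not the paper's choice) also shows why a dedicated criterion is needed: the naive autoregression fitted to the observed Z is attenuated by the measurement error.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate(theta0=0.5, n=5000, sig_xi=1.0, sig_eps=1.0):
    """Simulate the measurement-error autoregression
    Z_i = X_i + eps_i,  X_i = g_theta0(X_{i-1}) + xi_i,
    here with the illustrative linear link g_theta(x) = theta * x."""
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = theta0 * x[i - 1] + rng.normal(0, sig_xi)
    z = x + rng.normal(0, sig_eps, n)   # only z is observed
    return z

z = simulate()
# naive least squares regression of z_i on z_{i-1}: biased towards zero,
# since the regressor z_{i-1} is contaminated by eps_{i-1}
naive = np.sum(z[1:] * z[:-1]) / np.sum(z[:-1] ** 2)
print(naive)
```

In this linear special case the attenuation factor is Var(X)/(Var(X) + Var(ε)), so the naive estimate sits well below the true θ0 = 0.5; correcting for this bias in the general nonlinear case is what the proposed criterion accomplishes.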
