Displaying similar documents to “On the robustness of multiple regression coefficient estimators obtained by the p-point method”

Redescending M-estimators in regression analysis, cluster analysis and image analysis

Christine H. Müller (2004)

Discussiones Mathematicae Probability and Statistics

Similarity:

We give a review of the properties and applications of M-estimators with redescending score function. For regression analysis, some of these redescending M-estimators can attain the maximum breakdown point which is possible in this setup. Moreover, some of them are the solutions of the problem of maximizing the efficiency under a bounded influence function when the regression coefficient and the scale parameter are estimated simultaneously. Hence redescending M-estimators satisfy several...
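To illustrate what a redescending score function looks like, here is a minimal sketch (not taken from the article) of Tukey's biweight ψ, a standard redescending choice: it vanishes identically for large residuals, so gross outliers receive zero weight. The tuning constant c = 4.685, a common choice for roughly 95% efficiency under Gaussian errors, is our assumption, not a value from the paper.

```python
import numpy as np

def tukey_biweight_psi(u, c=4.685):
    """Redescending score (psi) function of Tukey's biweight.

    psi(u) = u * (1 - (u/c)^2)^2 for |u| <= c, and 0 otherwise,
    so residuals beyond the cutoff c get zero weight.
    c = 4.685 is an illustrative default (assumption, not from
    the article) giving ~95% Gaussian efficiency.
    """
    u = np.asarray(u, dtype=float)
    inside = np.abs(u) <= c
    return np.where(inside, u * (1 - (u / c) ** 2) ** 2, 0.0)
```

The redescending property is exactly the return to zero: unlike Huber's ψ, which stays constant beyond the cutoff, the biweight ψ drops back to zero so extreme observations have no influence at all.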

Sensitivity analysis in linear models

Shuangzhe Liu, Tiefeng Ma, Yonghui Liu (2016)

Special Matrices

Similarity:

In this work, we consider the general linear model or its variants with the ordinary least squares, generalised least squares or restricted least squares estimators of the regression coefficients and variance. We propose a new unified set of definitions of local sensitivity for both situations: one for the estimators of the regression coefficients, and the other for the estimators of the variance. Based on these definitions, we present the estimators' sensitivity results. We include...

Adaptive trimmed likelihood estimation in regression

Tadeusz Bednarski, Brenton R. Clarke, Daniel Schubert (2010)

Discussiones Mathematicae Probability and Statistics

Similarity:

In this paper we derive an asymptotic normality result for an adaptive trimmed likelihood estimator of regression, starting from initial high breakdown point robust regression estimates. The approach leads to quickly and easily computed robust and efficient estimates for regression. A highlight of the method is that, in a single algorithm, it automatically tends to expose the outliers and give least squares estimates with the outliers removed. The idea is to begin with a rapidly computed consistent...

Trimmed Estimators in Regression Framework

Tomáš Jurczyk (2011)

Acta Universitatis Palackianae Olomucensis. Facultas Rerum Naturalium. Mathematica

Similarity:

From a practical point of view, regression analysis and its least squares method is clearly one of the most used techniques in statistics. Unfortunately, if some problem is present in the data (for example, contamination), classical methods are no longer suitable. Many methods have been proposed to overcome these problematic situations. In this contribution we focus on a special kind of method based on trimming. There exist several approaches which use trimming off part...
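The basic trimming idea can be sketched in a few lines: fit least squares, discard the observations with the largest squared residuals, and refit on the retained subset. The following toy implementation (an illustrative sketch of concentration-style trimming, not the specific estimator of any article listed here) shows why trimming resists contamination that ruins ordinary least squares.

```python
import numpy as np

def trimmed_ls(X, y, trim_frac=0.2, n_iter=5):
    """Illustrative trimmed least squares.

    Iteratively fits OLS on the h observations with the smallest
    squared residuals, where h = ceil((1 - trim_frac) * n).
    A toy sketch of the trimming principle, not a production
    robust-regression routine.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(y)
    h = int(np.ceil((1 - trim_frac) * n))  # observations to keep
    keep = np.arange(n)                    # start from the full sample
    for _ in range(n_iter):
        beta, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        r2 = (y - X @ beta) ** 2
        keep = np.argsort(r2)[:h]          # keep h smallest residuals
    return beta
```

On data following y = 2x with one gross outlier, ordinary least squares is pulled far from the true slope, while the trimmed fit drops the outlier and recovers the slope of the clean points.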

Consistency of linear and quadratic least squares estimators in regression models with covariance stationary errors

František Štulajter (1991)

Applications of Mathematics

Similarity:

The least squares invariant quadratic estimator of an unknown covariance function of a stochastic process is defined, and a sufficient condition for the consistency of this estimator is derived. The mean value of the observed process is assumed to fulfil a linear regression model. A sufficient condition for the consistency of the least squares estimator of the regression parameters is derived, too.

On the Equivalence between Orthogonal Regression and Linear Model with Type-II Constraints

Sandra Donevska, Eva Fišerová, Karel Hron (2011)

Acta Universitatis Palackianae Olomucensis. Facultas Rerum Naturalium. Mathematica

Similarity:

Orthogonal regression, also known as the total least squares method, regression with errors-in-variables, or a calibration problem, analyzes the linear relationship between variables. In contrast to standard regression, both the dependent and explanatory variables are allowed to have measurement errors. In this paper we briefly discuss the orthogonal least squares, the least squares and the maximum likelihood methods for estimation of the orthogonal regression line. We also show that all mentioned...
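For intuition, the orthogonal regression line in two dimensions can be obtained from the leading principal direction of the centered data, since minimizing perpendicular distances is a total least squares problem solvable via the SVD. The sketch below (an illustrative construction under the assumption that the fitted line is not vertical, not the derivation of the article) returns the slope and intercept.

```python
import numpy as np

def orthogonal_regression_line(x, y):
    """Fit a line minimizing orthogonal (perpendicular) distances.

    Centers the data and takes the leading right singular vector
    of the n x 2 centered data matrix as the line's direction;
    the line passes through the centroid. Assumes the fitted
    line is not vertical (dx != 0).
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    # Leading right singular vector = direction of maximal variance.
    _, _, vt = np.linalg.svd(np.column_stack([xc, yc]))
    dx, dy = vt[0]
    slope = dy / dx            # same value for either sign of vt[0]
    intercept = y.mean() - slope * x.mean()
    return slope, intercept
```

Unlike ordinary least squares, which minimizes vertical distances only, this fit is symmetric in the two variables, which is the appropriate geometry when both are measured with error.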