
Adaptive trimmed likelihood estimation in regression

Tadeusz Bednarski, Brenton R. Clarke, Daniel Schubert (2010)

Discussiones Mathematicae Probability and Statistics

In this paper we derive an asymptotic normality result for an adaptive trimmed likelihood estimator of regression starting from initial high breakdown point robust regression estimates. The approach leads to quickly and easily computed robust and efficient estimates for regression. A highlight of the method is that, in a single algorithm, it automatically tends to expose the outliers and give least squares estimates with the outliers removed. The idea is to begin with a rapidly computed consistent robust...
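
As a rough illustration of the trim-and-refit idea underlying such estimators, the following Python sketch drops the observations with the largest absolute residuals and refits by least squares. It is not the paper's adaptive algorithm: it starts from a plain OLS fit rather than a high breakdown point robust estimate, and the trimming fraction is fixed rather than chosen adaptively; all names are illustrative.

    import numpy as np

    def trim_and_refit(X, y, trim_fraction=0.1):
        # Illustrative sketch only: a fixed trimming fraction and a non-robust
        # OLS starting fit stand in for the paper's adaptive, robustly
        # initialized procedure.
        X1 = np.column_stack([np.ones(len(y)), X])            # design with intercept
        beta0 = np.linalg.lstsq(X1, y, rcond=None)[0]          # initial (non-robust) fit
        abs_resid = np.abs(y - X1 @ beta0)
        keep = abs_resid <= np.quantile(abs_resid, 1.0 - trim_fraction)
        beta = np.linalg.lstsq(X1[keep], y[keep], rcond=None)[0]  # least squares on retained points
        return beta, keep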

An adaptive method of estimation and outlier detection in regression applicable for small to moderate sample sizes

Brenton R. Clarke (2000)

Discussiones Mathematicae Probability and Statistics

In small to moderate sample sizes it is important to make use of all the data when there are no outliers, for reasons of efficiency. It is equally important to guard against the possibility that there may be single or multiple outliers which can have disastrous effects on normal theory least squares estimation and inference. The purpose of this paper is to describe and illustrate the use of an adaptive regression estimation algorithm which can be used to highlight outliers, either single or multiple...

Bayes optimal stopping of a homogeneous Poisson process under LINEX loss function and variation in the prior

Marek Męczarski, Ryszard Zieliński (1997)

Applicationes Mathematicae

A homogeneous Poisson process (N(t), t ≥ 0) with the intensity function m(t) = θ is observed on the interval [0,T]. The problem consists in estimating θ while balancing the LINEX loss due to the estimation error against the cost of sampling, which depends linearly on T. The optimal T is given when the prior distribution of θ is not uniquely specified.
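
For reference, the LINEX loss in its usual parametrization is L(θ, d) = exp(a(d − θ)) − a(d − θ) − 1, and under this loss the Bayes estimate is d* = −(1/a) log E[exp(−aθ) | data]. The sketch below evaluates both for a gamma posterior, the conjugate case for a Poisson intensity; it does not address the paper's stopping problem or its treatment of an imprecisely specified prior.

    import numpy as np

    def linex_loss(theta, d, a=1.0):
        # LINEX loss: exponential penalty on one side of the error, roughly linear on the other.
        err = a * (d - theta)
        return np.exp(err) - err - 1.0

    def bayes_estimate_linex_gamma(shape, rate, a=1.0):
        # Bayes estimate under LINEX loss, d* = -(1/a) * log E[exp(-a*theta)],
        # evaluated in closed form for a Gamma(shape, rate) posterior (requires a > -rate).
        return (shape / a) * np.log(1.0 + a / rate)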

Bayes robustness via the Kolmogorov metric

Agata Boratyńska, Ryszard Zieliński (1993)

Applicationes Mathematicae

An upper bound for the Kolmogorov distance between the posterior distributions in terms of that between the prior distributions is given. For some likelihood functions the inequality is sharp. Applications to assessing Bayes robustness are presented.
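
The Kolmogorov metric referred to here is d_K(F, G) = sup_x |F(x) − G(x)|. A minimal numerical approximation on a finite grid looks as follows; the paper's bound relating the prior and posterior distances is not reproduced here.

    import numpy as np
    from scipy.stats import norm

    def kolmogorov_distance(F, G, grid):
        # Approximate sup_x |F(x) - G(x)| by the maximum over a dense finite grid.
        # F and G are vectorized CDFs (callables returning arrays).
        return float(np.max(np.abs(F(grid) - G(grid))))

    # Example: distance between two normal priors differing in location.
    grid = np.linspace(-10, 10, 10001)
    d = kolmogorov_distance(norm(0, 1).cdf, norm(0.5, 1).cdf, grid)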

Blended φ-divergences with examples

Václav Kůs (2003)

Kybernetika

Several new examples of divergences, called blended divergences, have emerged in the recent literature. Mostly these examples are constructed by modifying or parametrizing the old, well-known φ-divergences. The newly introduced parameter is often called the blending parameter. In this paper we present a compact theory of blended divergences which provides a generally applicable method for finding new classes of divergences containing any two divergences D₀ and D₁ given in advance. Several examples...
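
The simplest blending scheme is a convex combination D_α = (1 − α)D₀ + αD₁ with blending parameter α ∈ [0, 1]; the paper's construction is more general, but the sketch below conveys the idea. The divergence choices and the discrete-distribution setup are illustrative only.

    import numpy as np

    def blend_divergences(D0, D1, alpha=0.5):
        # Convex blend (1 - alpha)*D0 + alpha*D1 of two divergence functionals,
        # each mapping a pair of distributions to a nonnegative number.
        def blended(p, q):
            return (1.0 - alpha) * D0(p, q) + alpha * D1(p, q)
        return blended

    # Example with two classical divergences on discrete distributions
    # (assumes strictly positive probability vectors of equal length).
    kl = lambda p, q: float(np.sum(p * np.log(p / q)))      # Kullback-Leibler
    tv = lambda p, q: 0.5 * float(np.sum(np.abs(p - q)))    # total variation
    D_half = blend_divergences(kl, tv, alpha=0.5)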

Combining forecasts using the least trimmed squares

Jan Ámos Víšek (2001)

Kybernetika

Employing the recently derived asymptotic representation of the least trimmed squares estimator, combinations of forecasts with constraints are studied. Under the assumption of unbiasedness of the individual forecasts, it is shown that the combination without an intercept and with the constraint that the estimated regression coefficients sum to one is better than the others. A numerical example is included to support the theoretical conclusions.
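
For concreteness, the constrained combination described above (no intercept, weights summing to one) can be computed by the usual reparametrization against one of the forecasts. The sketch below uses plain least squares rather than the least trimmed squares version studied in the paper, and all names are illustrative.

    import numpy as np

    def combine_forecasts_sum_to_one(y, F):
        # Least-squares forecast combination with no intercept and weights that
        # sum to one.  Columns of F are the individual forecasts.  The constraint
        # is imposed by regressing y - f_m on the differences f_j - f_m.
        f_last = F[:, -1]
        Z = F[:, :-1] - f_last[:, None]
        v = np.linalg.lstsq(Z, y - f_last, rcond=None)[0]
        return np.append(v, 1.0 - v.sum())   # weights sum to one by construction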

Consistency of the least weighted squares under heteroscedasticity

Jan Ámos Víšek (2011)

Kybernetika

A robust version of ordinary least squares that weights the order statistics of the squared residuals (rather than the squared residuals directly) is recalled and its properties are studied. The existence of a solution of the corresponding extremal problem and its consistency under heteroscedasticity are proved.
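
In symbols, the least weighted squares estimator minimizes the weighted sum of the ordered squared residuals, with (typically nonincreasing) weights applied to the i-th smallest squared residual; least trimmed squares is the special case of zero-one weights. A minimal sketch of this objective follows, with the minimization over β left to a generic optimizer, since the problem is nonconvex and is usually attacked with resampling schemes similar to those used for least trimmed squares.

    import numpy as np

    def lws_objective(beta, X, y, weights):
        # Least weighted squares objective: the weights multiply the ORDER
        # STATISTICS of the squared residuals, not the residuals of fixed
        # observations.  `weights` should be a nonincreasing array of length n.
        r2 = (y - X @ beta) ** 2
        return float(np.sum(weights * np.sort(r2)))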
