Existence, Consistency and computer simulation for selected variants of minimum distance estimators

Václav Kůs, Domingo Morales, Jitka Hrabáková, Iva Frýdlová (2018)

Kybernetika

The paper deals with sufficient conditions for the existence of a general approximate minimum distance estimator (AMDE) of a probability density function f_0 on the real line. It shows that the AMDE always exists when the bounded φ-divergence, Kolmogorov, Lévy, Cramér, or discrepancy distance is used. Consequently, the n^{-1/2} consistency rate in any bounded φ-divergence is established for the Kolmogorov, Lévy, and discrepancy estimators under the condition that the degree of variations of the corresponding family...
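As a rough sketch of the kind of estimator studied here (not the authors' code), the snippet below computes a minimum Kolmogorov distance estimate for a normal location model: it minimizes the supremum distance between the empirical distribution function and the model CDF over the sample points. The model, sample size, and optimizer are illustrative assumptions.

```python
# Minimal sketch of a minimum Kolmogorov distance estimator (assumed setting:
# normal location model with known scale), illustrating the type of AMDE
# discussed in the abstract; not the paper's own simulation code.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(0)
x = np.sort(rng.normal(loc=2.0, scale=1.0, size=200))  # simulated sample
n = x.size
ecdf_hi = np.arange(1, n + 1) / n   # empirical CDF just after each sample point
ecdf_lo = np.arange(0, n) / n       # empirical CDF just before each sample point

def kolmogorov_distance(theta):
    """sup_x |F_n(x) - F_theta(x)|, evaluated over the sample points."""
    F = norm.cdf(x, loc=theta, scale=1.0)
    return max(np.max(np.abs(ecdf_hi - F)), np.max(np.abs(ecdf_lo - F)))

res = minimize_scalar(kolmogorov_distance, bounds=(x.min(), x.max()), method="bounded")
print("minimum Kolmogorov distance estimate of the location:", res.x)
```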

Extensions of the Frisch-Waugh-Lovell Theorem

Jürgen Groß, Simo Puntanen (2005)

Discussiones Mathematicae Probability and Statistics

In this paper we introduce extensions of the so-called Frisch-Waugh-Lovell Theorem. This is done by employing the close relationship between the concept of linear sufficiency and the appropriate reduction of linear models. Some specific reduced models which demonstrate alternatives to the Frisch-Waugh-Lovell procedure are discussed.
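The classical Frisch-Waugh-Lovell result that these extensions start from is easy to check numerically. The sketch below is an assumed illustration, not taken from the paper: it compares the coefficient of X2 in the full least-squares fit of y on (X1, X2) with the coefficient obtained after partialling X1 out of both y and X2; the two coincide.

```python
# Numerical illustration of the classical Frisch-Waugh-Lovell theorem
# (assumed example data; not from the paper).
import numpy as np

rng = np.random.default_rng(1)
n = 500
X1 = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + one regressor
X2 = rng.normal(size=(n, 1))                             # regressor of interest
y = X1 @ np.array([1.0, 2.0]) + X2 @ np.array([3.0]) + rng.normal(size=n)

# Full regression of y on (X1, X2)
X = np.hstack([X1, X2])
beta_full = np.linalg.lstsq(X, y, rcond=None)[0]

# FWL: residualize y and X2 on X1, then regress the residuals
M1 = np.eye(n) - X1 @ np.linalg.pinv(X1)                 # annihilator (residual maker) of X1
beta_fwl = np.linalg.lstsq(M1 @ X2, M1 @ y, rcond=None)[0]

print(beta_full[-1], beta_fwl[0])   # the two X2 coefficients agree
```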

Modified power divergence estimators in normal models – simulation and comparative study

Iva Frýdlová, Igor Vajda, Václav Kůs (2012)

Kybernetika

Point estimators based on minimization of information-theoretic divergences between the empirical and hypothetical distributions induce a problem when working with continuous families which are measure-theoretically orthogonal to the family of empirical distributions. In this case, the φ-divergence is always equal to its upper bound, and the minimum φ-divergence estimates are trivial. Broniatowski and Vajda [3] proposed several modifications of the minimum divergence rule to provide a solution to the...
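For orientation only, the sketch below shows one standard way to obtain a nontrivial minimum power divergence estimate for a continuous model: discretize the data onto a fixed partition and minimize the Cressie-Read power divergence between bin frequencies and the model's bin probabilities. This is an illustrative workaround under assumed settings, not the Broniatowski-Vajda modification analysed in the paper.

```python
# Hedged sketch: minimum power divergence estimation after discretization
# (assumed normal location model, fixed partition, Cressie-Read index 2/3).
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(2)
x = rng.normal(loc=1.5, scale=1.0, size=400)
edges = np.linspace(x.min() - 1.0, x.max() + 1.0, 21)   # 20 bins on a fixed grid
counts, _ = np.histogram(x, bins=edges)
p_emp = counts / counts.sum()                           # empirical bin probabilities

def power_divergence(theta, lam=2.0 / 3.0):
    """Cressie-Read power divergence between empirical and model bin probabilities."""
    q = np.diff(norm.cdf(edges, loc=theta, scale=1.0))  # model bin probabilities
    q = np.clip(q, 1e-12, None)
    return 2.0 / (lam * (lam + 1.0)) * np.sum(p_emp * ((p_emp / q) ** lam - 1.0))

res = minimize_scalar(power_divergence, bounds=(x.min(), x.max()), method="bounded")
print("minimum power divergence estimate of the location:", res.x)
```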

On unbiased Lehmann-estimators of a variance of an exponential distribution with quadratic loss function.

Jadwiga Kicińska-Słaby (1982)

Trabajos de Estadística e Investigación Operativa

Lehmann in [4] has generalised the notion of the unbiased estimator with respect to the assumed loss function. In [5] Singh considered admissible estimators of the function λ^{-r} of the unknown parameter λ of the gamma distribution with density f(x|λ, b) = λ^b e^{-λx} x^{b-1} / Γ(b), x > 0, where b is a known parameter, for the loss function L(λ̂^{-r}, λ^{-r}) = (λ̂^{-r} - λ^{-r})² / λ^{-2r}. Goodman in [1], choosing three loss functions of different shape, found unbiased Lehmann-estimators of the variance σ² of the normal distribution....
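As a small illustration of the setting (not the paper's derivation), the Monte Carlo check below verifies that, for an exponential sample with mean θ, the statistic n/(n+1)·X̄² is mean-unbiased for the variance θ², since E[X̄²] = θ²(n+1)/n. Lehmann-unbiasedness itself depends on the chosen loss function and is not checked here.

```python
# Assumed Monte Carlo check of ordinary (mean) unbiasedness of n/(n+1) * Xbar^2
# for the variance theta^2 of an exponential distribution with mean theta.
import numpy as np

rng = np.random.default_rng(3)
theta, n, reps = 2.0, 10, 200_000
samples = rng.exponential(scale=theta, size=(reps, n))
xbar = samples.mean(axis=1)
estimates = n / (n + 1) * xbar**2   # E[xbar^2] = theta^2 * (n+1)/n, so this is unbiased

print("true variance:", theta**2)
print("mean of estimates:", estimates.mean())
```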
