Blended φ-divergences with examples

Václav Kůs — 2003

Kybernetika

Several new examples of divergences, called blended divergences, have emerged in the recent literature. Mostly these examples are constructed by modifying or parametrizing old, well-known φ-divergences; the newly introduced parameter is often called the blending parameter. In this paper we present a compact theory of blended divergences which provides a generally applicable method for finding new classes of divergences containing any two divergences D₀ and D₁ given in advance. Several examples...

Modified power divergence estimators in normal models – simulation and comparative study

Iva Frýdlová, Igor Vajda, Václav Kůs — 2012

Kybernetika

Point estimators based on minimization of information-theoretic divergences between the empirical and a hypothetical distribution run into a problem when working with continuous families which are measure-theoretically orthogonal to the family of empirical distributions. In this case the φ-divergence is always equal to its upper bound, and the minimum φ-divergence estimates are trivial. Broniatowski and Vajda [3] proposed several modifications of the minimum divergence rule to provide a solution to the...
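The orthogonality problem in this abstract is concrete: an empirical distribution is discrete and a continuous model assigns it no mass, so a naive φ-divergence between them is degenerate. One common workaround (shown here purely as an illustration; it is not necessarily the modification proposed by Broniatowski and Vajda) is to discretize both distributions onto a common partition before minimizing:

```python
import math
import random

def normal_cdf(x, mu, sigma=1.0):
    # CDF of N(mu, sigma^2) via the error function
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

def binned_kl(sample, mu, edges):
    # KL divergence between the empirical histogram and the model's
    # bin probabilities on a shared partition -- both are now discrete,
    # so the divergence is finite and informative.
    n = len(sample)
    kl = 0.0
    for a, b in zip(edges, edges[1:]):
        p_emp = sum(a <= x < b for x in sample) / n        # empirical bin mass
        p_mod = normal_cdf(b, mu) - normal_cdf(a, mu)      # model bin mass
        if p_emp > 0:
            kl += p_emp * math.log(p_emp / p_mod)
    return kl

random.seed(0)
sample = [random.gauss(1.0, 1.0) for _ in range(500)]      # true mean = 1.0
edges = [-4 + 0.5 * i for i in range(21)]                  # partition of [-4, 6]

# Minimum binned-KL estimate of mu over a coarse parameter grid
mu_hat = min((binned_kl(sample, mu, edges), mu)
             for mu in [i / 10 for i in range(-20, 41)])[1]
print(mu_hat)
```

Without the binning step, every candidate μ would attain the same degenerate divergence value, which is exactly the triviality the abstract describes.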

Existence, consistency and computer simulation for selected variants of minimum distance estimators

The paper deals with sufficient conditions for the existence of a general approximate minimum distance estimator (AMDE) of a probability density function f₀ on the real line. It shows that the AMDE always exists when the bounded φ-divergence, Kolmogorov, Lévy, Cramér, or discrepancy distance is used. Consequently, the n^{-1/2} consistency rate in any bounded φ-divergence is established for Kolmogorov, Lévy, and discrepancy estimators under the condition that the degree of variations of the corresponding family...
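A minimal sketch of one of the distances the abstract mentions: a minimum Kolmogorov distance estimator picks the parameter whose model CDF stays closest, in sup norm, to the empirical CDF. The grid minimization below stands in for the paper's approximate MDE; the normal location family is an assumed example.

```python
import math
import random

def normal_cdf(x, mu, sigma=1.0):
    # CDF of N(mu, sigma^2) via the error function
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

def kolmogorov_distance(sample, mu):
    # sup |F_n(x) - F_mu(x)| is attained at the sample points;
    # check the empirical CDF just before and just after each jump.
    xs = sorted(sample)
    n = len(xs)
    return max(max(abs((i + 1) / n - normal_cdf(x, mu)),
                   abs(i / n - normal_cdf(x, mu)))
               for i, x in enumerate(xs))

random.seed(1)
sample = [random.gauss(0.5, 1.0) for _ in range(400)]      # true mean = 0.5

# Approximate minimum Kolmogorov distance estimate of mu over a grid
mu_hat = min((kolmogorov_distance(sample, mu), mu)
             for mu in [i / 20 for i in range(-40, 61)])[1]
print(mu_hat)
```

Since the Kolmogorov distance is bounded and well defined between a step function and a continuous CDF, no discretization trick is needed here, which is one reason such distances are convenient for minimum distance estimation.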
