Several new examples of divergences, called blended divergences, have emerged in the recent literature. Most of these examples are constructed by modifying or parametrizing the well-known φ-divergences; the newly introduced parameter is often called the blending parameter. In this paper we present a compact theory of blended divergences which provides a generally applicable method for finding new classes of divergences containing any two divergences given in advance. Several examples...
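The abstract only names the idea, so here is a minimal illustrative sketch of one common way a blended divergence can be built: a convex combination of two existing φ-divergences (here KL and squared Hellinger) controlled by a blending parameter α. This is an assumption-based illustration, not the construction from the paper itself.

```python
import numpy as np

def kl_divergence(p, q):
    # Kullback-Leibler divergence between two discrete pmfs (all entries > 0)
    return float(np.sum(p * np.log(p / q)))

def hellinger_sq(p, q):
    # squared Hellinger distance, another classical phi-divergence
    return float(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def blended_divergence(p, q, alpha):
    # one simple blending scheme: a convex combination, with alpha the
    # blending parameter interpolating between the two base divergences
    return alpha * kl_divergence(p, q) + (1.0 - alpha) * hellinger_sq(p, q)

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(blended_divergence(p, q, 0.0))  # pure squared Hellinger
print(blended_divergence(p, q, 1.0))  # pure KL
print(blended_divergence(p, q, 0.5))  # an equal blend of the two
```

Because both base quantities are φ-divergences and the combination is convex, the blend inherits nonnegativity and vanishes exactly when p = q.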
Point estimators based on minimizing an information-theoretic divergence between the empirical and a hypothetical distribution run into a problem when working with continuous families that are measure-theoretically orthogonal to the family of empirical distributions. In this case the φ-divergence is always equal to its upper bound, and the minimum φ-divergence estimates are trivial. Broniatowski and Vajda [3] proposed several modifications of the minimum divergence rule to provide a solution to the...
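For contrast with the continuous case described above, a minimum φ-divergence estimate is perfectly well behaved when the model family is discrete, since the empirical and model distributions then share the same support. The following sketch (my own illustration, not from the paper) estimates a Bernoulli parameter by grid-minimizing the KL divergence from the empirical pmf:

```python
import numpy as np

def kl(p, q):
    # KL divergence between discrete pmfs; terms with p[i] = 0 contribute 0
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def min_kl_bernoulli(data, grid_size=10001):
    # empirical pmf of 0/1-valued data
    p_hat = np.array([np.mean(data == 0), np.mean(data == 1)])
    # grid search over the Bernoulli parameter theta
    thetas = np.linspace(1e-4, 1 - 1e-4, grid_size)
    divs = [kl(p_hat, np.array([1.0 - t, t])) for t in thetas]
    return float(thetas[int(np.argmin(divs))])

data = np.array([1, 0, 1, 1, 0, 1, 0, 1])
print(min_kl_bernoulli(data))  # close to the sample mean 0.625
```

For Bernoulli models the minimum KL estimate coincides with the sample mean; it is exactly when the model is continuous and the empirical measure is purely atomic that the divergence saturates and this recipe degenerates, which is the problem the abstract addresses.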
The paper deals with sufficient conditions for the existence of a general approximate minimum distance estimator (AMDE) of a probability density function on the real line. It shows that the AMDE always exists when the bounded φ-divergence, Kolmogorov, Lévy, Cramér, or discrepancy distance is used. Consequently, a consistency rate in any bounded φ-divergence is established for the Kolmogorov, Lévy, and discrepancy estimators under the condition that the degree of variations of the corresponding family...
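As a concrete (and purely illustrative) instance of the minimum-distance estimation the abstract refers to, the sketch below estimates the location parameter of a unit-variance normal family by minimizing the Kolmogorov distance between the empirical CDF and the model CDF over a grid; all names and parameter choices here are mine, not the paper's.

```python
import numpy as np
from math import erf, sqrt

def normal_cdf(x, mu):
    # CDF of N(mu, 1) at x
    return 0.5 * (1.0 + erf((x - mu) / sqrt(2.0)))

def kolmogorov_distance(sample, mu):
    # sup-distance between the empirical CDF and the N(mu, 1) CDF,
    # evaluated via the standard order-statistic formula
    x = np.sort(sample)
    n = len(x)
    F = np.array([normal_cdf(v, mu) for v in x])
    i = np.arange(1, n + 1)
    return float(np.max(np.maximum(i / n - F, F - (i - 1) / n)))

def min_kolmogorov_mu(sample, lo=-5.0, hi=5.0, grid=2001):
    # grid search for the location minimizing the Kolmogorov distance
    mus = np.linspace(lo, hi, grid)
    dists = [kolmogorov_distance(sample, m) for m in mus]
    return float(mus[int(np.argmin(dists))])

rng = np.random.default_rng(0)
sample = rng.normal(loc=1.5, scale=1.0, size=400)
print(min_kolmogorov_mu(sample))  # should land near the true location 1.5
```

Unlike divergence-based rules, this distance is computed between CDFs, so the orthogonality issue of the previous abstract does not arise and the estimator is well defined for continuous families.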