On generalized information and divergence measures and their applications: a brief review.

The aim of this review is to give various two-parametric generalizations of the following measures: directed divergence (Kullback and Leibler, 1951), Jensen difference divergence (Burbea and Rao, 1982a,b; Rao, 1982) and Jeffreys invariant divergence (Jeffreys, 1946). These generalizations are put into a unified expression and their properties are studied. The applications of generalized information and divergence measures to the comparison of experiments and the connections with Fisher information...
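
For orientation, the three classical measures being generalized have the following standard forms for discrete distributions P = (p_1, ..., p_n) and Q = (q_1, ..., q_n); the notation here is ours, not the review's:

    D(P \| Q) = \sum_i p_i \log \frac{p_i}{q_i}  \quad \text{(directed divergence)}

    J(P, Q) = D(P \| Q) + D(Q \| P)  \quad \text{(Jeffreys invariant divergence)}

    R(P, Q) = H\Big(\frac{P+Q}{2}\Big) - \frac{H(P) + H(Q)}{2}, \qquad H(P) = -\sum_i p_i \log p_i  \quad \text{(Jensen difference divergence)}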

(h, Φ)-entropy differential metric

María Luisa Menéndez, Domingo Morales, Leandro Pardo, Miquel Salicrú — 1997

Applications of Mathematics

Burbea and Rao (1982a, 1982b) gave some general methods for constructing quadratic differential metrics on probability spaces. Using these methods, they obtained the Fisher information metric as a particular case. In this paper we apply the method based on entropy measures to obtain a Riemannian metric based on (h, Φ)-entropy measures (Salicrú et al., 1993). The geodesic distances based on that information metric have been computed for a number of parametric families of distributions. The use of geodesic...
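
For context, the Fisher information metric that the (h, Φ)-entropy construction generalizes has the standard form (our notation, not the paper's)

    g_{ij}(\theta) = \mathbb{E}_\theta\left[ \frac{\partial \log f(X;\theta)}{\partial \theta_i} \, \frac{\partial \log f(X;\theta)}{\partial \theta_j} \right],

and the geodesic distance between two parameter points is the length of the shortest curve joining them under g. For the normal family N(μ, σ²), for example, the metric reduces to ds² = (dμ² + 2 dσ²)/σ², a hyperbolic metric for which the geodesic distance is available in closed form.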

Order statistics and (r, s)-entropy measures

María Dolores Esteban, Domingo Morales, Leandro Pardo, María Luisa Menéndez — 1994

Applications of Mathematics

K. M. Wong and S. Chen [9] analyzed the Shannon entropy of a sequence of random variables under order restrictions. Using the (r, s)-entropies of I. J. Taneja [8], these results are generalized. Upper and lower bounds on the entropy reduction when the sequence is ordered are derived, together with conditions under which they are achieved. Theorems are presented showing the difference between the average entropy of the individual order statistics and the entropy of a member of the original independent identically distributed...
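
A quick Monte Carlo illustration of the Shannon-entropy case (a sketch of ours; the uniform parent distribution, sample size, and all names are illustrative, not from the paper): the average marginal entropy of the order statistics falls below the entropy of the parent distribution.

    import numpy as np

    def shannon(x):
        # Empirical Shannon entropy (in nats) of a discrete sample.
        _, counts = np.unique(x, return_counts=True)
        p = counts / counts.sum()
        return -(p * np.log(p)).sum()

    rng = np.random.default_rng(0)
    n, trials = 5, 200_000
    samples = rng.integers(0, 10, size=(trials, n))  # i.i.d. uniform on {0, ..., 9}
    ordered = np.sort(samples, axis=1)               # order statistics, row by row

    h_parent = shannon(samples[:, 0])
    h_order = [shannon(ordered[:, i]) for i in range(n)]
    print(f"parent entropy:          {h_parent:.4f}")
    print(f"mean order-stat entropy: {np.mean(h_order):.4f}")  # strictly smaller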

Minimum disparity estimators for discrete and continuous models

María Luisa Menéndez, Domingo Morales, Leandro Pardo, Igor Vajda — 2001

Applications of Mathematics

Disparities of discrete distributions are introduced as a natural and useful extension of the information-theoretic divergences. The minimum disparity point estimators are studied in regular discrete models with i.i.d. observations and their asymptotic efficiency of the first order, in the sense of Rao, is proved. These estimators are applied to continuous models with i.i.d. observations when the observation space is quantized by fixed points, or at random, by the sample quantiles of fixed orders....
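
As a concrete, hypothetical instance (the Poisson model, sample size, and function names below are our choices, not the paper's), the sketch fits a Poisson mean by minimizing the squared Hellinger distance, one classical disparity, between the empirical and model distributions:

    import numpy as np
    from scipy.optimize import minimize_scalar
    from scipy.stats import poisson

    rng = np.random.default_rng(1)
    data = rng.poisson(3.0, size=500)                # simulated discrete sample
    support = np.arange(data.max() + 1)
    emp = np.bincount(data, minlength=support.size) / data.size  # empirical pmf

    def sq_hellinger(lam):
        # Squared Hellinger disparity between the empirical pmf and the
        # Poisson(lam) pmf, computed over the observed support.
        model = poisson.pmf(support, lam)
        return np.sum((np.sqrt(emp) - np.sqrt(model)) ** 2)

    fit = minimize_scalar(sq_hellinger, bounds=(0.1, 10.0), method="bounded")
    print(f"minimum-Hellinger estimate: {fit.x:.3f} (sample mean: {data.mean():.3f})")

On clean data the estimate is close to the maximum likelihood estimate (the sample mean); the appeal of minimum disparity estimators is that they retain first-order efficiency while typically being more robust to contamination.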

Inference about stationary distributions of Markov chains based on divergences with observed frequencies

For data generated by stationary Markov chains, estimates of chain parameters minimizing φ-divergences between theoretical and empirical distributions of states are considered. Consistency and asymptotic normality are established and the asymptotic covariance matrices are evaluated. Testing of hypotheses about the stationary distributions based on φ-divergences between the estimated and empirical distributions is considered as well. Asymptotic distributions of φ-divergence test statistics are...
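
A minimal sketch of the estimation step (the five-state chain, the truncated-geometric stationary model, and all names are hypothetical choices of ours): fit a one-parameter stationary distribution to observed state frequencies by minimizing a φ-divergence, here the Kullback-Leibler divergence, i.e. φ(t) = t log t.

    import numpy as np
    from scipy.optimize import minimize_scalar

    freq = np.array([0.40, 0.27, 0.17, 0.10, 0.06])  # observed state frequencies
    states = np.arange(freq.size)

    def pi(theta):
        # Truncated-geometric stationary model: pi_i(theta) proportional to theta**i.
        w = theta ** states
        return w / w.sum()

    def kl(theta):
        # Kullback-Leibler divergence of the model from the empirical distribution.
        p = pi(theta)
        return np.sum(freq * np.log(freq / p))

    fit = minimize_scalar(kl, bounds=(0.01, 0.99), method="bounded")
    print(f"theta-hat: {fit.x:.3f}, fitted pi: {np.round(pi(fit.x), 3)}")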
