The aim of this review is to present several two-parametric generalizations of the following measures: the directed divergence (Kullback and Leibler, 1951), the Jensen difference divergence (Burbea and Rao, 1982a,b; Rao, 1982), and the Jeffreys invariant divergence (Jeffreys, 1946). These generalizations are put into a unified expression and their properties are studied. The applications of generalized information and divergence measures to the comparison of experiments and the connections with Fisher information...
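As a reference point for the three base measures being generalized, the sketch below computes the Kullback-Leibler directed divergence, the Jensen difference divergence, and the Jeffreys invariant divergence for discrete distributions. It is a minimal illustration of the classical (un-generalized) measures, not of the two-parametric families the review constructs; function names and the example distributions are our own.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler directed divergence D(p||q) for discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0  # terms with p_i = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jeffreys_divergence(p, q):
    """Jeffreys invariant divergence J(p, q) = D(p||q) + D(q||p)."""
    return kl_divergence(p, q) + kl_divergence(q, p)

def jensen_difference(p, q):
    """Jensen difference divergence H((p+q)/2) - (H(p) + H(q))/2,
    where H is the Shannon entropy."""
    def shannon(r):
        r = np.asarray(r, float)
        r = r[r > 0]
        return -float(np.sum(r * np.log(r)))
    m = (np.asarray(p, float) + np.asarray(q, float)) / 2
    return shannon(m) - (shannon(p) + shannon(q)) / 2

p, q = [0.2, 0.5, 0.3], [0.3, 0.3, 0.4]
print(kl_divergence(p, q), jeffreys_divergence(p, q), jensen_difference(p, q))
```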
Burbea and Rao (1982a, 1982b) gave general methods for constructing quadratic differential metrics on probability spaces, obtaining the Fisher information metric as a particular case. In this paper we apply the entropy-based method to obtain a Riemannian metric derived from (h, φ)-entropy measures (Salicrú et al., 1993). The geodesic distances induced by this information metric have been computed for a number of parametric families of distributions. The use of geodesic...
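For intuition about geodesic distances induced by an information metric, the snippet below evaluates the classical Fisher-Rao geodesic distance on the probability simplex: under the map r_i = sqrt(p_i) the simplex with the Fisher metric is isometric to an orthant of a sphere of radius 2, so the geodesic distance is a great-circle arc. This is the Shannon/Fisher special case only, not the (h, φ)-entropy metric of the paper; the function name is ours.

```python
import numpy as np

def rao_distance_multinomial(p, q):
    """Fisher-Rao geodesic distance between two discrete distributions:
    2 * arccos(sum_i sqrt(p_i * q_i)), the Bhattacharyya angle on the
    sphere of radius 2 onto which the Fisher metric maps the simplex."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    cos_angle = np.clip(np.sum(np.sqrt(p * q)), -1.0, 1.0)
    return 2.0 * np.arccos(cos_angle)

print(rao_distance_multinomial([0.2, 0.5, 0.3], [0.3, 0.3, 0.4]))
```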
K. M. Wong and S. Chen [9] analyzed the Shannon entropy of a sequence of random variables under order restrictions. Using the φ-entropies of I. J. Taneja [8], these results are generalized here. Upper and lower bounds on the entropy reduction when the sequence is ordered, together with conditions under which they are attained, are derived. Theorems are presented on the difference between the average entropy of the individual order statistics and the entropy of a member of the original independent identically distributed...
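The entropy reduction under ordering can be seen numerically in the Shannon case (the Wong-Chen setting, not the φ-entropy generalization). For an i.i.d. Uniform(0,1) sample of size n, the i-th order statistic is Beta(i, n-i+1), whose differential entropy is available in closed form via scipy; the parent uniform has differential entropy 0, so the average below directly shows the per-coordinate reduction. The choice n = 5 is illustrative.

```python
from scipy.stats import beta

n = 5
# Differential entropy of each order statistic U_(i) ~ Beta(i, n - i + 1).
entropies = [float(beta(i, n - i + 1).entropy()) for i in range(1, n + 1)]
avg = sum(entropies) / n

print("order-statistic entropies:", entropies)
# The parent Uniform(0,1) has entropy 0, so a negative average quantifies
# the entropy reduction caused by ordering.
print("average entropy of the order statistics:", avg)
```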
Disparities of discrete distributions are introduced as a natural and useful extension of the information-theoretic divergences. Minimum disparity point estimators are studied in regular discrete models with i.i.d. observations, and their first-order asymptotic efficiency, in the sense of Rao, is proved. These estimators are also applied to continuous models with i.i.d. observations when the observation space is quantized by fixed points or, at random, by sample quantiles of fixed orders....
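A minimum disparity estimator can be sketched concretely: below, the Poisson mean is estimated by minimizing the squared Hellinger disparity between the empirical frequencies and the model pmf. This is one hypothetical instance of the general class the paper studies; the model, the disparity, the support cutoff, and the helper name are all our illustrative choices.

```python
import numpy as np
from scipy.stats import poisson
from scipy.optimize import minimize_scalar

def min_disparity_poisson(sample, support_max=50):
    """Minimum squared-Hellinger-disparity estimate of a Poisson mean,
    truncating the support at support_max for numerical purposes."""
    sample = np.asarray(sample)
    support = np.arange(support_max + 1)
    emp = np.bincount(sample, minlength=support_max + 1)[:support_max + 1] / len(sample)

    def disparity(lam):
        model = poisson.pmf(support, lam)
        # squared Hellinger distance between empirical and model pmfs
        return float(np.sum((np.sqrt(emp) - np.sqrt(model)) ** 2))

    res = minimize_scalar(disparity, bounds=(1e-6, support_max), method="bounded")
    return res.x

rng = np.random.default_rng(0)
sample = rng.poisson(3.0, size=500)
print("minimum disparity estimate of the mean:", min_disparity_poisson(sample))
```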
For data generated by stationary Markov chains, estimates of the chain parameters that minimize φ-divergences between the theoretical and empirical distributions of states are considered. Consistency and asymptotic normality are established, and the asymptotic covariance matrices are evaluated. Testing of hypotheses about the stationary distributions, based on φ-divergences between the estimated and empirical distributions, is considered as well. Asymptotic distributions of the φ-divergence test statistics are...
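To make the estimation scheme tangible, the sketch below fits a one-parameter two-state chain by minimizing the Kullback-Leibler divergence (the φ-divergence with φ(t) = t log t) between the theoretical stationary distribution π(θ) and the empirical distribution of visited states. The transition-matrix parametrization and the divergence choice are illustrative assumptions of ours, not the paper's setup.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Assumed toy model: P(theta) = [[1 - theta, theta], [0.5, 0.5]], 0 < theta < 1,
# with stationary distribution pi(theta) = (0.5, theta) / (0.5 + theta).

def stationary(theta):
    return np.array([0.5, theta]) / (0.5 + theta)

def simulate(theta, n, rng):
    """Simulate a path of the two-state chain started at state 0."""
    P = np.array([[1 - theta, theta], [0.5, 0.5]])
    states = np.empty(n, dtype=int)
    states[0] = 0
    for t in range(1, n):
        states[t] = rng.choice(2, p=P[states[t - 1]])
    return states

rng = np.random.default_rng(1)
path = simulate(0.3, 2000, rng)
emp = np.bincount(path, minlength=2) / len(path)  # empirical state distribution

def kl(theta):
    # KL divergence of the empirical distribution from pi(theta);
    # emp has strictly positive entries for any reasonably long path.
    pi = stationary(theta)
    return float(np.sum(emp * np.log(emp / pi)))

res = minimize_scalar(kl, bounds=(1e-3, 1 - 1e-3), method="bounded")
print("minimum KL-divergence estimate of theta:", res.x)
```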