On the branching property of entropy
Kullback's relative information and Kerridge's inaccuracy are two information-theoretic measures associated with a pair of probability distributions of a discrete random variable. The authors study a generalized measure which, in particular, contains a parametric generalization of relative information and inaccuracy. Some important properties of this generalized measure, along with an inversion theorem, are also studied.
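For reference (the abstract does not reproduce the generalized measure itself), the two underlying measures for discrete distributions $P=(p_1,\dots,p_n)$ and $Q=(q_1,\dots,q_n)$ are standardly defined as

$$D(P\|Q)=\sum_{i=1}^{n} p_i \log\frac{p_i}{q_i} \qquad\text{(Kullback's relative information)},$$

$$K(P;Q)=-\sum_{i=1}^{n} p_i \log q_i \qquad\text{(Kerridge's inaccuracy)},$$

which are linked by the identity $K(P;Q)=H(P)+D(P\|Q)$, where $H(P)=-\sum_i p_i\log p_i$ is the Shannon entropy.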
Belis and Guiasu studied a generalization of Shannon entropy known as weighted or useful entropy. In this paper, the weighted entropy of type is defined and characterized, and some of its properties are studied. Further generalizations of weighted entropy involving more parameters are also specified.
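As a point of reference (the typed, multi-parameter versions studied in the paper are not reproduced in the abstract), the Belis–Guiasu weighted entropy attaches a nonnegative weight (utility) $w_i$ to each outcome of a discrete distribution $P=(p_1,\dots,p_n)$:

$$H_w(P) = -\sum_{i=1}^{n} w_i\, p_i \log p_i, \qquad w_i \ge 0,$$

which reduces to the Shannon entropy when all $w_i = 1$.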
The aim of this review is to give different two-parametric generalizations of the following measures: directed divergence (Kullback and Leibler, 1951), Jensen difference divergence (Burbea and Rao, 1982a,b; Rao, 1982), and Jeffreys invariant divergence (Jeffreys, 1946). These generalizations are put into a unified expression and their properties are studied. The applications of generalized information and divergence measures to the comparison of experiments and the connections with Fisher information...
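For orientation, the three base measures being generalized have the following standard forms for discrete $P$, $Q$ (the two-parametric generalizations themselves are not given in the abstract):

$$D(P\|Q)=\sum_i p_i\log\frac{p_i}{q_i},\qquad J(P,Q)=D(P\|Q)+D(Q\|P)=\sum_i (p_i-q_i)\log\frac{p_i}{q_i},$$

$$R(P,Q)=H\!\left(\frac{P+Q}{2}\right)-\frac{H(P)+H(Q)}{2},$$

where $D$ is the directed divergence, $J$ the Jeffreys invariant divergence, $R$ the Jensen difference divergence, and $H$ the Shannon entropy.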
We consider a measure of the diversity of a population based on the λ-measure of hypoentropy introduced by Ferreri (1980). Our purpose is to study its asymptotic distribution for testing hypotheses. A numerical example based on real data is given.
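Assuming the abstract refers to Ferreri's (1980) standard definition, the λ-measure of hypoentropy of a discrete distribution $P=(p_1,\dots,p_n)$ is

$$H_\lambda(P)=\left(1+\frac{1}{\lambda}\right)\log(1+\lambda)-\frac{1}{\lambda}\sum_{i=1}^{n}(1+\lambda p_i)\log(1+\lambda p_i),\qquad \lambda>0,$$

which recovers the Shannon entropy $H(P)$ in the limit $\lambda\to\infty$.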
Various information, divergence and distance measures have been used by researchers to compare experiments using classical approaches such as those of Blackwell, the Bayesian approach, etc. Blackwell's [1] idea of comparing two statistical experiments is based on the existence of stochastic transformations. Using this idea of Blackwell, as well as the classical Bayesian approach, we have compared statistical experiments by considering unified scalar parametric generalizations of the Jensen difference divergence measure....
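In Blackwell's formulation (recalled here in its discrete form as background; the abstract does not restate it), an experiment $\mathcal{E}=\{P_\theta\}$ is at least as informative as $\mathcal{F}=\{Q_\theta\}$ if there exists a single stochastic (Markov) matrix $M$, independent of $\theta$, with

$$Q_\theta = P_\theta M \quad\text{for all } \theta,$$

i.e. $\mathcal{F}$ can be obtained from $\mathcal{E}$ by a stochastic transformation of its outcomes.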