On the characterization of useful information-theoretic measures
In this paper, a new framework for the study of measures of dispersion for a class of n-dimensional lists is proposed. The concept of monotonicity with respect to a sharpened-type order is introduced. This type of monotonicity, together with other well-known conditions, makes it possible to build a reasonable and general setting in which the notion of a dispersion measure can be studied. Some properties are analyzed, and relations with other approaches to this subject carried out by different authors are established...
We prove the complete convergence of the Shannon, paired, genetic, and α-entropies for random partitions of the unit segment. We also derive exact expressions for the expectations and variances of these entropies in terms of special functions.
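For orientation (the abstract does not define its entropies; assuming the α-entropy is Rényi's, as is usual in this setting), for a random partition of $[0,1]$ with spacings $S_1,\dots,S_n$ the Shannon and α-entropies read
\[ H = -\sum_{i=1}^{n} S_i \log S_i, \qquad H_\alpha = \frac{1}{1-\alpha} \log \sum_{i=1}^{n} S_i^{\alpha}, \quad \alpha \neq 1, \]
with $H_\alpha \to H$ as $\alpha \to 1$.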
In Part I, we proved characterization theorems for several entropy-like functionals restricted to the class of all finite spaces within the class of all semimetric spaces equipped with a bounded measure. These theorems are now extended to the case where some of these functionals are defined on the whole of the latter class, and the remaining ones are restricted to a certain fairly wide subclass of it.
The notion of a cumulative past inaccuracy (CPI) measure has recently been proposed in the literature as a generalization of cumulative past entropy (CPE) in both the univariate and bivariate setups. In this paper, we introduce the notion of CPI of a given order and study the proposed measure for conditionally specified models of two components that failed at different time instants, called the generalized conditional CPI (GCCPI). Several properties, including the effect of monotone transformations and bounds of the GCCPI...
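As a reminder of the base definitions (standard in this literature; the order parameter itself does not survive in the abstract), for a nonnegative random variable $X$ with distribution function $F$ and a second variable $Y$ with distribution function $G$,
\[ \mathrm{CPE}(X) = -\int_0^\infty F(x)\,\log F(x)\,dx, \qquad \mathrm{CPI}(X,Y) = -\int_0^\infty F(x)\,\log G(x)\,dx, \]
so CPI reduces to CPE when $G = F$.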
The aim of this review is to give different two-parametric generalizations of the following measures: directed divergence (Kullback and Leibler, 1951), Jensen difference divergence (Burbea and Rao, 1982a,b; Rao, 1982), and Jeffreys invariant divergence (Jeffreys, 1946). These generalizations are put into a unified expression, and their properties are studied. The applications of the generalized information and divergence measures to the comparison of experiments and the connections with Fisher information...
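For reference, the three base measures being generalized are, for discrete distributions $P=(p_1,\dots,p_n)$ and $Q=(q_1,\dots,q_n)$,
\[ D(P\|Q) = \sum_{i} p_i \log\frac{p_i}{q_i}, \qquad J(P,Q) = D(P\|Q) + D(Q\|P), \qquad R(P,Q) = H\Big(\frac{P+Q}{2}\Big) - \frac{H(P)+H(Q)}{2}, \]
where $H$ denotes Shannon entropy: $D$ is the directed divergence, $J$ the Jeffreys invariant divergence, and $R$ the Jensen difference divergence (the symbol $R$ is ours; the abstract fixes no notation).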
Kullback's relative information and Kerridge's inaccuracy are two information-theoretic measures associated with a pair of probability distributions of a discrete random variable. The authors study a generalized measure which, in particular, contains a parametric generalization of both relative information and inaccuracy. Some important properties of this generalized measure, along with an inversion theorem, are also studied.
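The two measures are tied together by a simple identity that makes the pairing natural: writing $H(P)$ for Shannon entropy and $D(P\|Q)$ for relative information, Kerridge's inaccuracy decomposes as
\[ K(P;Q) = -\sum_{i} p_i \log q_i = H(P) + D(P\|Q), \]
so inaccuracy exceeds entropy by exactly the relative information between the two distributions.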
This work studies the standard exponential families of probability measures on Euclidean spaces that have finite supports. In such a family, parameterized by means, the mean is taken to move along a segment inside the convex support toward an endpoint on the boundary of the support. The limit behavior of several quantities related to the exponential family is described explicitly. In particular, the variance functions and information divergences are studied near the boundary.
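A minimal concrete illustration of this boundary behavior (a standard example, not taken from the paper): for the Bernoulli family with support $\{0,1\}$, parameterized by its mean $\mu\in(0,1)$, the variance function is
\[ V(\mu) = \mu(1-\mu) \longrightarrow 0 \quad \text{as } \mu \to 0 \text{ or } \mu \to 1, \]
while the information divergence $D(P_\mu\|P_\nu) = \mu\log\frac{\mu}{\nu} + (1-\mu)\log\frac{1-\mu}{1-\nu}$ blows up as $\nu$ is sent to an endpoint of the support with $\mu$ held fixed.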
In this paper, measurable solutions of a functional equation with four unknown functions are obtained. As an application of these measurable solutions, a joint characterization of Shannon's entropy and an entropy of a given type is obtained.
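The abstract leaves the type parameter unspecified; if the entropy of type $\beta$ in the Havrda–Charvát/Daróczy sense is meant, as is common in this line of work, it is
\[ H_\beta(P) = \frac{1}{2^{1-\beta}-1}\Big(\sum_{i=1}^{n} p_i^{\beta} - 1\Big), \qquad \beta \neq 1, \]
which recovers Shannon's entropy (to base 2) in the limit $\beta \to 1$.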
Standard properties of f-divergences of probability measures are widely applied in various areas of information processing. Among the desirable supplementary properties facilitating the employment of mathematical methods is the metricity of f-divergences, or the metricity of their powers. This paper extends the previously known family of f-divergences with these properties. The extension consists of a continuum of f-divergences which are squared metric distances and which are mostly new but include...
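For context, recall the definition and the best-known squared-metric instance, the squared Hellinger distance: for discrete $P, Q$ and a convex $f$ with $f(1)=0$,
\[ D_f(P,Q) = \sum_i q_i\, f\Big(\frac{p_i}{q_i}\Big), \qquad f(t) = (\sqrt{t}-1)^2 \;\Longrightarrow\; D_f(P,Q) = \sum_i \big(\sqrt{p_i}-\sqrt{q_i}\big)^2, \]
whose square root is a metric on the space of probability distributions.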