Displaying 161 – 180 of 268


On entropies for random partitions of the unit segment

Milena Bieniek, Dominik Szynal (2008)

Kybernetika

We prove the complete convergence of Shannon’s, paired, genetic and α-entropy for random partitions of the unit segment. We also derive exact expressions for expectations and variances of the above entropies using special functions.
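For readers unfamiliar with the setup, a random partition of the unit segment is generated by n − 1 i.i.d. uniform cut points, and Shannon's entropy is computed from the resulting spacings taken as probabilities. A minimal illustrative sketch (not from the paper; the function name is ours):

```python
import math
import random

def shannon_entropy_of_partition(n, seed=0):
    """Shannon entropy of the partition of [0, 1] induced by
    n - 1 i.i.d. uniform cut points (spacings as probabilities)."""
    rng = random.Random(seed)
    cuts = sorted(rng.random() for _ in range(n - 1))
    points = [0.0] + cuts + [1.0]
    spacings = [b - a for a, b in zip(points, points[1:])]
    return -sum(p * math.log(p) for p in spacings if p > 0)

# The entropy of n spacings is at most log(n), attained only by the
# equal partition; a random partition falls strictly below that bound.
h = shannon_entropy_of_partition(10)
print(h, "<=", math.log(10))
```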

On entropy-like functionals and codes for metrized probability spaces II

Miroslav Katětov (1992)

Commentationes Mathematicae Universitatis Carolinae

In Part I, we proved characterization theorems for the entropy-like functionals δ, λ, E, Δ and Λ restricted to the class consisting of all finite spaces P ∈ 𝔚, where 𝔚 is the class of all semimetric spaces equipped with a bounded measure. These theorems are now extended to the case of δ, λ and E defined on the whole of 𝔚, and of Δ and Λ restricted to a certain fairly wide subclass of 𝔚.

On generalized conditional cumulative past inaccuracy measure

Amit Ghosh, Chanchal Kundu (2018)

Applications of Mathematics

The notion of the cumulative past inaccuracy (CPI) measure has recently been proposed in the literature as a generalization of cumulative past entropy (CPE) in both the univariate and bivariate setups. In this paper, we introduce the notion of CPI of order α and study the proposed measure for conditionally specified models of two components that failed at different time instants, called the generalized conditional CPI (GCCPI). Several properties, including the effect of monotone transformation and bounds of GCCPI...

On generalized information and divergence measures and their applications: a brief review.

Inder Jeet Taneja, Leandro Pardo, Domingo Morales, María Luisa Menéndez (1989)

Qüestiió

The aim of this review is to give various two-parametric generalizations of the following measures: directed divergence (Kullback and Leibler, 1951), Jensen difference divergence (Burbea and Rao, 1982a,b; Rao, 1982) and Jeffreys invariant divergence (Jeffreys, 1946). These generalizations are put into a unified expression and their properties are studied. The applications of generalized information and divergence measures to the comparison of experiments and the connections with Fisher information...
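The three base measures being generalized here have standard definitions for discrete distributions; a minimal illustrative sketch using natural logarithms (function names are ours, not from the review):

```python
import math

def kl(p, q):
    """Directed divergence of Kullback and Leibler: sum p_i log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jeffreys(p, q):
    """Jeffreys invariant divergence: the symmetrized KL divergence."""
    return kl(p, q) + kl(q, p)

def entropy(p):
    """Shannon entropy, used to build the Jensen difference."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def jensen_difference(p, q):
    """Jensen difference divergence (Burbea-Rao):
    H((P + Q)/2) - (H(P) + H(Q))/2, nonnegative by concavity of H."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return entropy(m) - (entropy(p) + entropy(q)) / 2

p, q = [0.5, 0.5], [0.9, 0.1]
print(kl(p, q), jeffreys(p, q), jensen_difference(p, q))
```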

On generalized measures of relative information and inaccuracy

Inder Jeet Taneja, H. C. Gupta (1978)

Aplikace matematiky

Kullback's relative information and Kerridge's inaccuracy are two information-theoretic measures associated with a pair of probability distributions of a discrete random variable. The authors study a generalized measure which in particular contains a parametric generalization of relative information and inaccuracy. Some important properties of this generalized measure along with an inversion theorem are also studied.
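Both measures named in this abstract have standard discrete definitions: relative information D(P‖Q) = Σ p_i log(p_i/q_i) and inaccuracy H(P; Q) = −Σ p_i log q_i, linked by the well-known decomposition H(P; Q) = H(P) + D(P‖Q). A minimal illustrative sketch (the generalized parametric measure studied by the authors is not reproduced here):

```python
import math

def relative_information(p, q):
    """Kullback's relative information D(P || Q)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def inaccuracy(p, q):
    """Kerridge's inaccuracy H(P; Q) = -sum p_i log q_i."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def entropy(p):
    """Shannon entropy H(P), i.e. the inaccuracy of P against itself."""
    return inaccuracy(p, p)

p, q = [0.25, 0.75], [0.5, 0.5]
# Inaccuracy decomposes as entropy plus relative information,
# so it is never smaller than the entropy of the true distribution.
assert abs(inaccuracy(p, q) - (entropy(p) + relative_information(p, q))) < 1e-12
```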

On limiting towards the boundaries of exponential families

František Matúš (2015)

Kybernetika

This work studies the standard exponential families of probability measures on Euclidean spaces that have finite supports. In such a family parameterized by means, the mean is supposed to move along a segment inside the convex support towards an endpoint on the boundary of the support. Limit behavior of several quantities related to the exponential family is described explicitly. In particular, the variance functions and information divergences are studied around the boundary.

On metric divergences of probability measures

Igor Vajda (2009)

Kybernetika

Standard properties of φ-divergences of probability measures are widely applied in various areas of information processing. Among the desirable supplementary properties facilitating the employment of mathematical methods is the metricity of φ-divergences, or the metricity of their powers. This paper extends the previously known family of φ-divergences with these properties. The extension consists of a continuum of φ-divergences which are squared metric distances and which are mostly new but include...
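A classical instance of the phenomenon described here: the squared Hellinger distance is a φ-divergence (with φ(t) = (√t − 1)²) whose square root is a genuine metric, so the triangle inequality holds for the square root rather than for the divergence itself. A minimal illustrative sketch (the 1/2 normalization is one common convention; the new family from the paper is not reproduced):

```python
import math

def hellinger_sq(p, q):
    """Squared Hellinger distance: a phi-divergence with
    phi(t) = (sqrt(t) - 1)**2, here with the 1/2 normalization."""
    return 0.5 * sum((math.sqrt(pi) - math.sqrt(qi)) ** 2
                     for pi, qi in zip(p, q))

def hellinger(p, q):
    """Square root of the divergence: a metric on probability vectors."""
    return math.sqrt(hellinger_sq(p, q))

p, q, r = [0.7, 0.3], [0.5, 0.5], [0.1, 0.9]
# The triangle inequality holds for the square root of the divergence.
print(hellinger(p, r), "<=", hellinger(p, q) + hellinger(q, r))
```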

On solution sets of information inequalities

Nihat Ay, Walter Wenzel (2012)

Kybernetika

We investigate solution sets of a special kind of linear inequality system. In particular, we derive characterizations of these sets in terms of minimal solution sets. The studied inequalities emerge as information inequalities in the context of Bayesian networks. This makes it possible to deduce structural properties of Bayesian networks, which is important for causal inference.

On some functional equations from additive and nonadditive measures (III).

Palaniappan Kannappan (1980)

Stochastica

In this series, this paper is devoted to the study of two related functional equations primarily connected with weighted entropy and weighted entropy of degree β (which are weighted additive and weighted β-additive, respectively), and which include as special cases Shannon's entropy and inaccuracy (additive measures) and the entropy of degree β (nonadditive). These functional equations, which arise mainly from the representation and these 'additive' properties, are solved for fixed m...
