Displaying 41 – 60 of 268


Bounds on the information divergence for hypergeometric distributions

Peter Harremoës, František Matúš (2020)

Kybernetika

The hypergeometric distributions have many important applications, but they have not received sufficient attention in information theory. Hypergeometric distributions can be approximated by binomial or Poisson distributions. In this paper we present upper and lower bounds on the information divergence. These bounds are important for statistical testing and for a better understanding of the notion of exchangeability.
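The divergence in question can be computed directly for small parameters. The following sketch (an illustration, not the paper's bounds; the parameter values are arbitrary) evaluates the Kullback–Leibler divergence between a hypergeometric distribution and its binomial approximation using only the Python standard library:

```python
from math import comb, log

def hypergeom_pmf(N, K, n, k):
    # P(k marked items when drawing n without replacement from N items, K of them marked)
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

def binom_pmf(n, p, k):
    # binomial approximation: n draws with replacement, success probability p
    return comb(n, k) * p**k * (1 - p)**(n - k)

def kl_divergence(p, q):
    # D(p || q) = sum_k p_k log(p_k / q_k), in nats; zero-probability terms contribute 0
    return sum(pk * log(pk / qk) for pk, qk in zip(p, q) if pk > 0)

N, K, n = 50, 20, 10                                   # population, marked items, draws
p = [hypergeom_pmf(N, K, n, k) for k in range(n + 1)]  # hypergeometric law
q = [binom_pmf(n, K / N, k) for k in range(n + 1)]     # binomial approximation
print(kl_divergence(p, q))                             # small and nonnegative
```

Since n is small relative to N, sampling without replacement is close to sampling with replacement, and the divergence is correspondingly small.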

Canonical distributions and phase transitions

K.B. Athreya, J.D.H. Smith (2000)

Discussiones Mathematicae Probability and Statistics

Entropy maximization subject to known expected values is extended to the case where the random variables involved may take on positive infinite values. As a result, an arbitrary probability distribution on a finite set may be realized as a canonical distribution. The Rényi entropy of the distribution arises as a natural by-product of this realization. Starting with the uniform distribution on a proper subset of a set, the canonical distribution of equilibrium statistical mechanics may be used to exhibit...
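The canonical distribution of equilibrium statistical mechanics is the maximum-entropy distribution with a prescribed expected value. A minimal sketch of that construction (the energy levels and target mean are invented for illustration) solves for the inverse temperature β by bisection:

```python
from math import exp, log

def canonical(energies, beta):
    # Gibbs/canonical weights: p_i proportional to exp(-beta * E_i)
    w = [exp(-beta * e) for e in energies]
    z = sum(w)  # partition function
    return [x / z for x in w]

def mean_energy(energies, beta):
    return sum(p * e for p, e in zip(canonical(energies, beta), energies))

def solve_beta(energies, target, lo=-50.0, hi=50.0):
    # mean energy decreases as beta grows, so bisect for the matching beta
    for _ in range(200):
        mid = (lo + hi) / 2
        if mean_energy(energies, mid) > target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

energies = [0.0, 1.0, 2.0, 3.0]
beta = solve_beta(energies, target=1.0)
p = canonical(energies, beta)
H = -sum(pi * log(pi) for pi in p)  # Shannon entropy of the canonical distribution
print(beta, p, H)
```

Among all distributions on these four levels with mean energy 1.0, the one printed here has the largest entropy; target mean 1.5 would recover the uniform distribution (β = 0).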

Convexity inequalities for estimating generalized conditional entropies from below

Alexey E. Rastegin (2012)

Kybernetika

Generalized entropic functionals are an active area of research, so lower and upper bounds on these functionals are of interest. Lower bounds for estimating the Rényi conditional α-entropy and two kinds of non-extensive conditional α-entropy are obtained. These bounds are expressed in terms of the error probability of the standard decision and extend the inequalities known for the regular conditional entropy. The presented inequalities are mainly based on the convexity of some functions. In a certain...
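A simple instance of this kind of bound is that the regular conditional entropy H(X|Y) is bounded below by −log(1 − Pₑ), where Pₑ is the error probability of the optimal (MAP) decision; the sketch below checks this on an invented toy joint distribution (not one from the paper):

```python
from math import log

# hypothetical joint distribution p(x, y): rows index x, columns index y
joint = [
    [0.30, 0.10, 0.05],
    [0.05, 0.25, 0.05],
    [0.05, 0.05, 0.10],
]

def conditional_entropy(joint):
    # H(X|Y) = -sum_{x,y} p(x,y) log p(x|y), in nats
    py = [sum(row[j] for row in joint) for j in range(len(joint[0]))]
    h = 0.0
    for row in joint:
        for j, pxy in enumerate(row):
            if pxy > 0:
                h -= pxy * log(pxy / py[j])
    return h

def map_error_probability(joint):
    # the standard decision: for each observed y, guess the x maximizing p(x, y)
    p_success = sum(max(row[j] for row in joint) for j in range(len(joint[0])))
    return 1.0 - p_success

pe = map_error_probability(joint)
lower = -log(1.0 - pe)  # min-entropy-style lower bound on H(X|Y)
print(pe, lower, conditional_entropy(joint))
```

The bound holds because −log P(success | Y = y) never exceeds H(X | Y = y), and averaging over y preserves the inequality by the convexity of −log; the paper's results extend this pattern to α-entropies.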
