
Joint Range of Rényi Entropies

Peter Harremoës — 2009

Kybernetika

The exact range of the joint values of several Rényi entropies is determined. The method is based on topology, with special emphasis on the orientation of the objects studied. As in the case where only two orders of Rényi entropy are studied, the boundary of the range can be parametrized. No explicit formula can be given for a tight upper or lower bound on one order of entropy in terms of another order of entropy.
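As a rough illustration of the quantities involved, here is a minimal Python sketch that evaluates Rényi entropies of several orders along a simple one-parameter family of distributions. The family used below is purely illustrative and is not the boundary parametrization constructed in the paper.

import numpy as np

def renyi_entropy(p, alpha):
    # Rényi entropy of order alpha of a discrete distribution p:
    # H_alpha(p) = log(sum_i p_i**alpha) / (1 - alpha) for alpha != 1,
    # with the Shannon entropy as the limiting case alpha = 1.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # use the 0*log(0) = 0 convention
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

# Joint values (H_2, H_1/2) along a hypothetical family on 4 points,
# interpolating between a point mass (t = 0) and the uniform distribution (t = 1).
for t in np.linspace(0.0, 1.0, 5):
    p = np.array([1.0 - 3 * t / 4, t / 4, t / 4, t / 4])
    print(f"t = {t:.2f}   H_2 = {renyi_entropy(p, 2.0):.4f}   H_1/2 = {renyi_entropy(p, 0.5):.4f}")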

Bounds on tail probabilities for negative binomial distributions

Peter Harremoës — 2016

Kybernetika

In this paper we derive various bounds on tail probabilities of distributions whose generated exponential family has a linear or quadratic variance function. The main result is an inequality relating the signed log-likelihood of a negative binomial distribution to the signed log-likelihood of a Gamma distribution. This inequality leads to a new bound on the signed log-likelihood of a binomial distribution relative to a Poisson distribution that can be used to prove an intersection property...
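The Gamma-based inequality of the paper is not reproduced here; as a hedged illustration of how information divergence controls tail probabilities, the following Python sketch compares exact negative binomial tail probabilities with the generic exponential-family (Chernoff-type) bound P(X >= x) <= exp(-D(P_x || P)), where P_x denotes the negative binomial with the same r and mean x. The parameters r and p below are hypothetical.

import numpy as np
from scipy.stats import nbinom

def kl_negbin(r, p1, p2):
    # Information divergence D(NB(r, p1) || NB(r, p2)) in the
    # "number of failures before the r-th success" parametrization.
    mean1 = r * (1 - p1) / p1
    return r * np.log(p1 / p2) + mean1 * np.log((1 - p1) / (1 - p2))

def chernoff_upper_tail(r, p, x):
    # Generic bound P(X >= x) <= exp(-D(P_x || P)) for x above the mean,
    # where P_x is the member of the family with mean x.
    p_x = r / (r + x)                 # success probability giving mean x
    return np.exp(-kl_negbin(r, p_x, p))

r, p = 5, 0.4                          # hypothetical parameters, mean = 7.5
for x in [10, 15, 22]:
    exact = nbinom.sf(x - 1, r, p)     # exact P(X >= x)
    print(f"x = {x:2d}   exact tail = {exact:.3e}   divergence bound = {chernoff_upper_tail(r, p, x):.3e}")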

Bounds on the information divergence for hypergeometric distributions

Peter Harremoës, František Matúš — 2020

Kybernetika

The hypergeometric distributions have many important applications, but they have not received sufficient attention in information theory. Hypergeometric distributions can be approximated by binomial or Poisson distributions. In this paper we present upper and lower bounds on the information divergence. These bounds are important for statistical testing and for a better understanding of the notion of exchangeability.
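As a numerical illustration of the quantity being bounded, the Python sketch below computes the information divergence between a hypergeometric distribution and its binomial approximation directly, by summing over the common support. The population and sample sizes are hypothetical, and the explicit bounds from the paper are not reproduced here.

import numpy as np
from scipy.stats import hypergeom, binom

def divergence_hypergeom_vs_binom(M, n, N):
    # D(Hypergeometric(M, n, N) || Binomial(N, n/M)), where M is the
    # population size, n the number of successes in the population and
    # N the number of draws, computed by direct summation over the support.
    ks = np.arange(max(0, N - (M - n)), min(n, N) + 1)
    h = hypergeom.pmf(ks, M, n, N)
    b = binom.pmf(ks, N, n / M)
    mask = h > 0
    return np.sum(h[mask] * np.log(h[mask] / b[mask]))

# Hypothetical example: population of 100 with 30 successes, 10 draws.
print(divergence_hypergeom_vs_binom(100, 30, 10))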
