
Bounds for f-divergences under likelihood ratio constraints

Sever Silvestru Dragomir (2003)

Applications of Mathematics

In this paper we establish an upper and a lower bound for the f-divergence of two discrete random variables under likelihood ratio constraints, in terms of the Kullback-Leibler distance. Some particular cases, including the Hellinger and triangular discriminations, the χ²-distance, and Rényi's divergences, are also considered.
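As background for the quantities named in this abstract, the f-divergence of two discrete distributions p and q is D_f(p‖q) = Σᵢ qᵢ f(pᵢ/qᵢ); choosing different convex functions f recovers the Kullback-Leibler distance, the squared Hellinger distance, and the χ²-distance. The sketch below is only an illustration of these definitions, not the paper's bounds; the distributions `p` and `q` are made-up examples.

```python
import math

def f_divergence(p, q, f):
    # D_f(p || q) = sum_i q_i * f(p_i / q_i), assuming all q_i > 0
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

# Two illustrative discrete distributions on three points
p = [0.2, 0.5, 0.3]
q = [0.25, 0.25, 0.5]

# Special cases obtained from different choices of the convex function f
kl = f_divergence(p, q, lambda t: t * math.log(t))                 # Kullback-Leibler
hellinger = f_divergence(p, q, lambda t: (math.sqrt(t) - 1) ** 2)  # squared Hellinger
chi2 = f_divergence(p, q, lambda t: (t - 1) ** 2)                  # chi-squared
```

A likelihood ratio constraint in the sense of the title means that the ratios pᵢ/qᵢ are assumed to lie in a fixed interval [r, R], which is what allows divergences of different f to bound one another.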

Bounds on the information divergence for hypergeometric distributions

Peter Harremoës, František Matúš (2020)

Kybernetika

The hypergeometric distributions have many important applications, but they have not received sufficient attention in information theory. Hypergeometric distributions can be approximated by binomial distributions or Poisson distributions. In this paper we present upper and lower bounds on the information divergence. These bounds are important for statistical testing and for a better understanding of the notion of exchangeability.
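The information divergence discussed here is the Kullback-Leibler divergence between a hypergeometric distribution and its binomial approximation. A minimal sketch of computing this quantity numerically, using only the standard library; the parameter values `N, K, n` are arbitrary illustrative choices, not taken from the paper.

```python
import math

def hypergeom_pmf(k, N, K, n):
    # P(X = k) when drawing n items without replacement from N items, K of which are marked
    return math.comb(K, k) * math.comb(N - K, n - k) / math.comb(N, n)

def binom_pmf(k, n, p):
    # Binomial approximation: drawing with replacement, success probability p
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def kl_divergence(P, Q):
    # D(P || Q) = sum_k P(k) * log(P(k)/Q(k)); zero-probability terms contribute nothing
    return sum(pk * math.log(pk / qk) for pk, qk in zip(P, Q) if pk > 0)

# Illustrative parameters: urn of N = 50 items, K = 20 marked, sample of n = 10
N, K, n = 50, 20, 10
p = K / N
support = range(max(0, n - (N - K)), min(n, K) + 1)

P = [hypergeom_pmf(k, N, K, n) for k in support]
Q = [binom_pmf(k, n, p) for k in support]
d = kl_divergence(P, Q)  # divergence of the hypergeometric from its binomial approximation
```

The divergence is strictly positive whenever the two distributions differ, and bounds on it quantify how good the binomial (or Poisson) approximation is.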
