
Binomial-Poisson entropic inequalities and the M/M/∞ queue

Djalil Chafaï (2006)

ESAIM: Probability and Statistics

This article provides entropic inequalities for binomial-Poisson distributions, derived from the two-point space. They appear as local inequalities of the M/M/∞ queue and describe in particular the exponential dissipation of Φ-entropies along this process. This simple queueing process appears as a model of “constant curvature” and plays, for the simple Poisson process, the role played by the Ornstein-Uhlenbeck process for Brownian motion. Some of the inequalities are recovered by semi-group ...
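For orientation, the Φ-entropy mentioned in this abstract has a standard definition in the literature (stated here for context, not quoted from the article): for a convex function Φ and a probability measure μ, the Φ-entropy of a nonnegative function f is

\[ \mathrm{Ent}_\Phi^\mu(f) = \mu\big(\Phi(f)\big) - \Phi\big(\mu(f)\big), \]

which reduces to the variance for Φ(x) = x² and to the Boltzmann-Shannon entropy for Φ(x) = x log x.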

Bounds on extended f-divergences for a variety of classes

Pietro Cerone, Sever Silvestru Dragomir, Ferdinand Österreicher (2004)

Kybernetika

The concept of f-divergences was introduced by Csiszár in 1963 as a measure of the ‘hardness’ of a testing problem, depending on a convex real-valued function f on the interval [0, ∞). The choice of this parameter f can be adjusted to match the needs of specific applications. The definition and some of the most basic properties of f-divergences are given, and the class of χ^α-divergences is presented. Ostrowski’s inequality and a trapezoid inequality are utilized in order to prove bounds for an extension...
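For reference, the standard Csiszár definition (stated here from the general literature, for discrete distributions P = (p_i) and Q = (q_i)) is

\[ D_f(P, Q) = \sum_i q_i \, f\!\left(\frac{p_i}{q_i}\right), \]

with f convex on [0, ∞), usually normalized so that f(1) = 0. The χ^α-divergences correspond to the choice f(t) = |t - 1|^α with α ≥ 1; α = 1 gives the variational distance Σ|p_i − q_i| and α = 2 the χ²-distance.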

Bounds for f-divergences under likelihood ratio constraints

Sever Silvestru Dragomir (2003)

Applications of Mathematics

In this paper we establish an upper and a lower bound for the f-divergence of two discrete random variables under likelihood ratio constraints, in terms of the Kullback-Leibler distance. Some particular cases, for the Hellinger and triangular discriminations, the χ²-distance, Rényi’s divergences, etc., are also considered.
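As a pointer to the quantities named above (standard formulas from the literature, not reproduced from the paper), each arises as D_f for a specific convex f:

\[ \text{Kullback-Leibler: } f(t) = t \log t, \qquad \chi^2\text{-distance: } f(t) = (t-1)^2, \]
\[ \text{Hellinger: } f(t) = \tfrac{1}{2}\big(\sqrt{t} - 1\big)^2, \qquad \text{triangular: } f(t) = \frac{(t-1)^2}{t+1}. \]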

Bounds on the information divergence for hypergeometric distributions

Peter Harremoës, František Matúš (2020)

Kybernetika

The hypergeometric distributions have many important applications, but they have not received sufficient attention in information theory. Hypergeometric distributions can be approximated by binomial or Poisson distributions. In this paper we present upper and lower bounds on the information divergence. These bounds are important for statistical testing and for a better understanding of the notion of exchangeability.
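For reference (standard definitions, not taken from the paper): the hypergeometric distribution with population size N, K marked items and sample size n has point probabilities

\[ P(X = k) = \frac{\binom{K}{k}\binom{N-K}{n-k}}{\binom{N}{n}}, \]

and as N → ∞ with K/N → p it converges to the binomial distribution with parameters n and p, which is what motivates bounding the information divergence between the two.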
