-valuations of graphs
Some basic theorems and formulae (equations and inequalities) of several areas of mathematics that hold in Bernstein spaces are no longer valid in larger spaces. However, when a function f is, in some sense, close to a Bernstein space, the corresponding relation holds with a remainder or error term. This paper presents a new, unified approach to these errors in terms of the distance of f from the Bernstein space. The difficult situation of derivative-free error estimates is also covered.
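For orientation, the distance functional underlying such estimates is typically taken to be the following (a standard definition, used here for illustration rather than quoted from the paper, with B_σ^p denoting the Bernstein space of entire functions of exponential type σ whose restriction to the real line lies in L^p):
\[ \operatorname{dist}_p\bigl(f, B_\sigma^p\bigr) = \inf_{g \in B_\sigma^p} \lVert f - g \rVert_{L^p(\mathbb{R})} . \]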
This article provides entropic inequalities for binomial-Poisson distributions, derived from the two-point space. They appear as local inequalities for the M/M/∞ queue. They describe in particular the exponential dissipation of Φ-entropies along this process. This simple queueing process appears as a model of “constant curvature”, and plays for the simple Poisson process the role played by the Ornstein-Uhlenbeck process for Brownian motion. Some of the inequalities are recovered by semi-group ...
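For context, the Φ-entropy referred to here is usually defined as follows (standard notation assumed, not quoted from the article): for a smooth convex function Φ and a probability measure μ,
\[ \operatorname{Ent}_\mu^\Phi(f) = \mu\bigl(\Phi(f)\bigr) - \Phi\bigl(\mu(f)\bigr), \]
where μ(f) denotes the mean of f under μ; the choice Φ(x) = x log x gives the classical entropy functional and Φ(x) = x² gives the variance.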
The concept of f-divergences was introduced by Csiszár in 1963 as a measure of the ‘hardness’ of a testing problem, depending on a convex real-valued function f on the interval (0, ∞). The choice of this function can be adjusted to match the needs of specific applications. The definition and some of the most basic properties of f-divergences are given, and a class of f-divergences is presented. Ostrowski’s inequality and a trapezoid inequality are utilized in order to prove bounds for an extension...
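For reference, a standard form of the definition (notation assumed here, not quoted from the article): for probability distributions P and Q with densities p and q with respect to a common dominating measure μ, and a convex function f with f(1) = 0,
\[ D_f(P \,\|\, Q) = \int q \, f\!\left(\frac{p}{q}\right) \mathrm{d}\mu . \]
The choice f(t) = t log t recovers the Kullback-Leibler divergence, f(t) = (√t − 1)² the squared Hellinger distance (up to normalization conventions), and f(t) = (t − 1)² the χ²-divergence.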
In this paper we establish an upper and a lower bound for the f-divergence of two discrete random variables under likelihood ratio constraints in terms of the Kullback-Leibler distance. Some particular cases, such as the Hellinger and triangular discrimination, the χ²-distance and Rényi’s divergences, are also considered.
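To fix notation (an illustrative setup, assumed rather than quoted from the paper): for discrete distributions p = (p_1, …, p_n) and q = (q_1, …, q_n), a likelihood ratio constraint takes the form
\[ r \le \frac{p_i}{q_i} \le R, \qquad i = 1, \dots, n, \quad 0 < r \le 1 \le R < \infty , \]
and the Kullback-Leibler distance in terms of which such bounds are stated is
\[ KL(p \,\|\, q) = \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i} . \]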
The hypergeometric distributions have many important applications, but they have not received sufficient attention in information theory. Hypergeometric distributions can be approximated by binomial distributions or Poisson distributions. In this paper we present upper and lower bounds on the information divergence. These bounds are important for statistical testing and for a better understanding of the notion of exchangeability.
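As a purely illustrative sketch (not taken from the paper; the function name and parameter names are hypothetical), the divergence being bounded can be evaluated numerically for a hypergeometric distribution and its binomial approximation:

import numpy as np
from scipy.stats import hypergeom, binom

def kl_hypergeom_vs_binomial(M, K, n):
    # Information divergence D(Hypergeometric(M, K, n) || Binomial(n, K/M)) in nats.
    # M: population size, K: number of marked items, n: sample size.
    k = np.arange(n + 1)
    p = hypergeom.pmf(k, M, K, n)   # exact hypergeometric probabilities
    q = binom.pmf(k, n, K / M)      # binomial approximation
    mask = p > 0                    # convention: 0 * log(0/q) = 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Example: drawing 20 items without replacement from 1000, of which 300 are marked.
print(kl_hypergeom_vs_binomial(M=1000, K=300, n=20))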