Displaying similar documents to “An entropy proof of the Kahn-Lovász theorem.”

Entropy jumps for isotropic log-concave random vectors and spectral gap

Keith Ball, Van Hoang Nguyen (2012)

Studia Mathematica

Similarity:

We prove a quantitative dimension-free bound in the Shannon-Stam entropy inequality for the convolution of two log-concave distributions in dimension d in terms of the spectral gap of the density. The method relies on the analysis of the Fisher information production, which is the second derivative of the entropy along the (normalized) heat semigroup. We also discuss consequences of our result in the study of the isotropic constant of log-concave distributions (slicing problem). ...
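As background only (standard statements, not the paper's quantitative refinement): the Shannon-Stam entropy power inequality for independent random vectors X, Y in R^d reads

\[ N(X+Y) \ge N(X) + N(Y), \qquad N(X) := \frac{1}{2\pi e}\, e^{2h(X)/d}, \]

and by de Bruijn's identity, with Z a standard Gaussian independent of X,

\[ \frac{d}{dt}\, h\bigl(X + \sqrt{t}\, Z\bigr) = \tfrac{1}{2}\, I\bigl(X + \sqrt{t}\, Z\bigr), \]

so the Fisher information I is the first derivative of the entropy along the heat semigroup, and the "Fisher information production" mentioned above is the derivative of that quantity.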

Isomorphic random Bernoulli shifts

V. Gundlach, G. Ochs (2000)

Colloquium Mathematicae

Similarity:

We develop a relative isomorphism theory for random Bernoulli shifts by showing that two random Bernoulli shifts are relatively isomorphic if and only if they have the same fibre entropy. This allows random Bernoulli shifts to be identified with standard Bernoulli shifts.
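For orientation (a standard fact, not part of the abstract): in the non-random case the relevant invariant is the Kolmogorov-Sinai entropy of a Bernoulli shift with probability vector (p_1, ..., p_k),

\[ h = -\sum_{i=1}^{k} p_i \log p_i, \]

and Ornstein's theorem states that two standard Bernoulli shifts are isomorphic if and only if their entropies agree; the fibre entropy plays the analogous role in the relative, random setting above.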

A new approach to mutual information

Fumio Hiai, Dénes Petz (2007)

Banach Center Publications

Similarity:

A new expression for the mutual information of both continuous and discrete random variables is provided as a certain asymptotic limit via "discrete micro-states" of permutations.
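For orientation (standard definitions only; the paper's permutation micro-state limit is not reproduced here), the quantity being re-expressed is the mutual information

\[ I(X;Y) = H(X) + H(Y) - H(X,Y) \]

for discrete variables, or more generally the relative entropy of the joint law with respect to the product of the marginals,

\[ I(X;Y) = D\bigl(P_{(X,Y)} \,\big\|\, P_X \otimes P_Y\bigr). \]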