Entropy jumps for isotropic log-concave random vectors and spectral gap

Keith Ball; Van Hoang Nguyen

Studia Mathematica (2012)

  • Volume: 213, Issue: 1, page 81-96
  • ISSN: 0039-3223

Abstract

We prove a quantitative dimension-free bound in the Shannon-Stam entropy inequality for the convolution of two log-concave distributions in dimension d in terms of the spectral gap of the density. The method relies on the analysis of the Fisher information production, which is the second derivative of the entropy along the (normalized) heat semigroup. We also discuss consequences of our result in the study of the isotropic constant of log-concave distributions (slicing problem).
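For context (background not drawn from the record itself; notation is assumed, not quoted from the paper): the Shannon-Stam inequality and de Bruijn identity referenced in the abstract are standardly stated as follows.

```latex
% Shannon-Stam inequality (standard form): for independent random
% vectors X, Y in R^d with finite entropy,
\[
  H\!\left(\frac{X+Y}{\sqrt{2}}\right) \;\ge\; \frac{H(X)+H(Y)}{2},
\]
% with equality if and only if X and Y are Gaussian. De Bruijn's
% identity ties entropy to Fisher information along the heat semigroup:
\[
  \frac{d}{dt}\, H\!\left(X + \sqrt{2t}\,Z\right) \;=\; J\!\left(X + \sqrt{2t}\,Z\right),
\]
% where Z is a standard Gaussian independent of X and J denotes the
% Fisher information. An "entropy jump" is a quantitative lower bound
% on the deficit in the first inequality.
```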

How to cite


Ball, Keith, and Van Hoang Nguyen. "Entropy jumps for isotropic log-concave random vectors and spectral gap." Studia Mathematica 213.1 (2012): 81-96. <http://eudml.org/doc/285823>.

@article{KeithBall2012,
abstract = {We prove a quantitative dimension-free bound in the Shannon-Stam entropy inequality for the convolution of two log-concave distributions in dimension d in terms of the spectral gap of the density. The method relies on the analysis of the Fisher information production, which is the second derivative of the entropy along the (normalized) heat semigroup. We also discuss consequences of our result in the study of the isotropic constant of log-concave distributions (slicing problem).},
author = {Keith Ball and Van Hoang Nguyen},
journal = {Studia Mathematica},
keywords = {entropy gap; Fisher information; isotropic constant; isotropic log-concave random vector; spectral gap; Shannon-Stam difference},
language = {eng},
number = {1},
pages = {81-96},
title = {Entropy jumps for isotropic log-concave random vectors and spectral gap},
url = {http://eudml.org/doc/285823},
volume = {213},
year = {2012},
}

TY - JOUR
AU - Keith Ball
AU - Van Hoang Nguyen
TI - Entropy jumps for isotropic log-concave random vectors and spectral gap
JO - Studia Mathematica
PY - 2012
VL - 213
IS - 1
SP - 81
EP - 96
AB - We prove a quantitative dimension-free bound in the Shannon-Stam entropy inequality for the convolution of two log-concave distributions in dimension d in terms of the spectral gap of the density. The method relies on the analysis of the Fisher information production, which is the second derivative of the entropy along the (normalized) heat semigroup. We also discuss consequences of our result in the study of the isotropic constant of log-concave distributions (slicing problem).
LA - eng
KW - entropy gap; Fisher information; isotropic constant; isotropic log-concave random vector; spectral gap; Shannon-Stam difference
UR - http://eudml.org/doc/285823
ER -
