On the joint entropy of $d$-wise-independent variables

Dmitry Gavinsky; Pavel Pudlák

Commentationes Mathematicae Universitatis Carolinae (2016)

  • Volume: 57, Issue: 3, pages 333-343
  • ISSN: 0010-2628

Abstract

How low can the joint entropy of $n$ $d$-wise independent (for $d\ge 2$) discrete random variables be, subject to given constraints on the individual distributions (say, no value may be taken by a variable with probability greater than $p$, for $p< 1$)? This question has been posed and partially answered in a recent work of Babai [Entropy versus pairwise independence (preliminary version), http://people.cs.uchicago.edu/~laci/papers/13augEntropy.pdf, 2013]. In this paper we improve some of his bounds, prove new bounds in a wider range of parameters and show matching upper bounds in some special cases. In particular, we prove tight lower bounds for the min-entropy (as well as the entropy) of pairwise and three-wise independent balanced binary variables for infinitely many values of $n$.
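To make the question concrete, here is a minimal Python sketch (not taken from the paper) of the standard subset-XOR construction: starting from $k$ independent fair bits, the parities over all nonempty subsets of the seed give $n = 2^k - 1$ pairwise-independent balanced binary variables whose joint entropy is only $\log_2(n+1)$ bits, far below the $n$ bits of fully independent variables. The helper name xor_construction and the choice $k = 3$ are illustrative.

import itertools
import math

def xor_construction(k, seed_bits):
    # Illustrative helper: for every nonempty subset of the k seed bits,
    # output the XOR (parity) of that subset, giving 2^k - 1 bits.
    out = []
    for mask in range(1, 2 ** k):
        parity = 0
        for i in range(k):
            if (mask >> i) & 1:
                parity ^= seed_bits[i]
        out.append(parity)
    return out

k = 3
n = 2 ** k - 1

# Enumerate all 2^k equally likely seeds; each output string then has
# probability 2^-k, so the joint entropy (and min-entropy) is exactly k bits.
outputs = [tuple(xor_construction(k, list(seed)))
           for seed in itertools.product([0, 1], repeat=k)]
print("n =", n, "joint entropy =", math.log2(len(set(outputs))), "bits")

# Pairwise independence: every pair of distinct coordinates is uniform on {0,1}^2.
for i, j in itertools.combinations(range(n), 2):
    counts = {}
    for row in outputs:
        counts[(row[i], row[j])] = counts.get((row[i], row[j]), 0) + 1
    assert sorted(counts.values()) == [len(outputs) // 4] * 4
print("every pair of the", n, "bits is uniform on {0,1}^2")

Constructions of this kind are the sort of upper bound against which the paper's lower bounds for pairwise-independent balanced binary variables are compared.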

How to cite

Gavinsky, Dmitry, and Pudlák, Pavel. "On the joint entropy of $d$-wise-independent variables." Commentationes Mathematicae Universitatis Carolinae 57.3 (2016): 333-343. <http://eudml.org/doc/286806>.

@article{Gavinsky2016,
abstract = {How low can the joint entropy of $n$ $d$-wise independent (for $d\ge 2$) discrete random variables be, subject to given constraints on the individual distributions (say, no value may be taken by a variable with probability greater than $p$, for $p< 1$)? This question has been posed and partially answered in a recent work of Babai [Entropy versus pairwise independence (preliminary version), http://people.cs.uchicago.edu/~laci/papers/13augEntropy.pdf, 2013]. In this paper we improve some of his bounds, prove new bounds in a wider range of parameters and show matching upper bounds in some special cases. In particular, we prove tight lower bounds for the min-entropy (as well as the entropy) of pairwise and three-wise independent balanced binary variables for infinitely many values of $n$.},
author = {Gavinsky, Dmitry and Pudlák, Pavel},
journal = {Commentationes Mathematicae Universitatis Carolinae},
keywords = {$d$-wise-independent variables; entropy; lower bound},
language = {eng},
number = {3},
pages = {333-343},
publisher = {Charles University in Prague, Faculty of Mathematics and Physics},
title = {On the joint entropy of $d$-wise-independent variables},
url = {http://eudml.org/doc/286806},
volume = {57},
year = {2016},
}

TY - JOUR
AU - Gavinsky, Dmitry
AU - Pudlák, Pavel
TI - On the joint entropy of $d$-wise-independent variables
JO - Commentationes Mathematicae Universitatis Carolinae
PY - 2016
PB - Charles University in Prague, Faculty of Mathematics and Physics
VL - 57
IS - 3
SP - 333
EP - 343
AB - How low can the joint entropy of $n$ $d$-wise independent (for $d\ge 2$) discrete random variables be, subject to given constraints on the individual distributions (say, no value may be taken by a variable with probability greater than $p$, for $p< 1$)? This question has been posed and partially answered in a recent work of Babai [Entropy versus pairwise independence (preliminary version), http://people.cs.uchicago.edu/~laci/papers/13augEntropy.pdf, 2013]. In this paper we improve some of his bounds, prove new bounds in a wider range of parameters and show matching upper bounds in some special cases. In particular, we prove tight lower bounds for the min-entropy (as well as the entropy) of pairwise and three-wise independent balanced binary variables for infinitely many values of $n$.
LA - eng
KW - $d$-wise-independent variables; entropy; lower bound
UR - http://eudml.org/doc/286806
ER -

References

  1. Alon N., Babai L., Itai A., A fast and simple randomized parallel algorithm for the maximal independent set problem, J. Algorithms 7 (1986), no. 4, 567–583. Zbl 0631.68063, MR 0866792, DOI 10.1016/0196-6774(86)90019-2
  2. Alon N., Spencer J., The Probabilistic Method, John Wiley, Hoboken, NJ, 2008. Zbl 1333.05001, MR 2437651
  3. Babai L., Entropy versus pairwise independence (preliminary version), http://people.cs.uchicago.edu/~laci/papers/13augEntropy.pdf, 2013.
  4. Cantelli F.P., Intorno ad un teorema fondamentale della teoria del rischio, Bollettino dell'Associazione degli Attuari Italiani 24 (1910), 1–23.
  5. Lancaster H.O., Pairwise statistical independence, Ann. Math. Statist. 36 (1965), no. 4, 1313–1317. Zbl 0131.18105, MR 0176507, DOI 10.1214/aoms/1177700007
  6. Luby M., Wigderson A., Pairwise independence and derandomization, Found. Trends Theor. Comput. Sci. 1 (2005), no. 4, 237–301. Zbl 1140.68402, MR 2379508, DOI 10.1561/0400000009
  7. MacWilliams F.J., Sloane N.J.A., The Theory of Error-Correcting Codes, North Holland Publishing Co., Amsterdam-New York-Oxford, 1977. Zbl 0657.94010
