Mean mutual information and symmetry breaking for finite random fields

J. Buzzi; L. Zambotti

Annales de l'I.H.P. Probabilités et statistiques (2012)

  • Volume: 48, Issue: 2, page 343-367
  • ISSN: 0246-0203

Abstract

G. Edelman, O. Sporns and G. Tononi have introduced the neural complexity of a family of random variables, defining it as a specific average of mutual information over subfamilies. We show that their choice of weights satisfies two natural properties, namely invariance under permutations and additivity, and we call any functional satisfying these two properties an intricacy. We classify all intricacies in terms of probability laws on the unit interval and study the growth rate of maximal intricacies when the size of the system goes to infinity. For systems of a fixed size, we show that maximizers have small support and exchangeable systems have small intricacy. In particular, maximizing intricacy leads to spontaneous symmetry breaking and lack of uniqueness.
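The neural complexity described in the abstract can be illustrated with a small computation. The sketch below averages the mutual information I(X_S; X_{S^c}) over all proper nonempty subsets S of a finite system; the particular weights c_S = 1/((N+1)·binom(N,|S|)), and the helper names (`neural_complexity`, `marginal`), are assumptions made here for illustration — only the "average of mutual information over subfamilies" structure is taken from the abstract.

```python
# Sketch of a neural-complexity-style functional: a weighted average of the
# mutual information between each subsystem and its complement.
# The weight c_S = 1 / ((n + 1) * binom(n, |S|)) is an assumed normalization.
from itertools import product, combinations
from math import log2, comb

def entropy(p):
    """Shannon entropy (bits) of a distribution given as {outcome: prob}."""
    return -sum(q * log2(q) for q in p.values() if q > 0)

def marginal(joint, idx):
    """Marginal law of the coordinates listed in idx."""
    m = {}
    for x, p in joint.items():
        key = tuple(x[i] for i in idx)
        m[key] = m.get(key, 0.0) + p
    return m

def neural_complexity(joint, n):
    """Average of I(X_S; X_{S^c}) over proper nonempty subsets S of {0,...,n-1}."""
    h_full = entropy(joint)
    total = 0.0
    for k in range(1, n):
        for s in combinations(range(n), k):
            sc = tuple(i for i in range(n) if i not in s)
            # I(X_S; X_{S^c}) = H(X_S) + H(X_{S^c}) - H(X)
            mi = entropy(marginal(joint, s)) + entropy(marginal(joint, sc)) - h_full
            total += mi / ((n + 1) * comb(n, k))
    return total

# Three i.i.d. fair bits: every mutual information vanishes.
iid = {x: 1 / 8 for x in product((0, 1), repeat=3)}
print(neural_complexity(iid, 3))  # → 0.0

# Three fully synchronized bits: every I(X_S; X_{S^c}) equals 1 bit.
sync = {(0, 0, 0): 0.5, (1, 1, 1): 0.5}
print(neural_complexity(sync, 3))  # → 0.5
```

Independent variables give zero complexity and fully synchronized ones a moderate value, consistent with the idea that such functionals are maximized between order and disorder.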

How to cite


Buzzi, J., and Zambotti, L. "Mean mutual information and symmetry breaking for finite random fields." Annales de l'I.H.P. Probabilités et statistiques 48.2 (2012): 343-367. <http://eudml.org/doc/272043>.

@article{Buzzi2012,
abstract = {G. Edelman, O. Sporns and G. Tononi have introduced the neural complexity of a family of random variables, defining it as a specific average of mutual information over subfamilies. We show that their choice of weights satisfies two natural properties, namely invariance under permutations and additivity, and we call any functional satisfying these two properties an intricacy. We classify all intricacies in terms of probability laws on the unit interval and study the growth rate of maximal intricacies when the size of the system goes to infinity. For systems of a fixed size, we show that maximizers have small support and exchangeable systems have small intricacy. In particular, maximizing intricacy leads to spontaneous symmetry breaking and lack of uniqueness.},
author = {Buzzi, J. and Zambotti, L.},
journal = {Annales de l'I.H.P. Probabilités et statistiques},
keywords = {entropy; mutual information; complexity; discrete probability; exchangeable random variables; neural complexity},
language = {eng},
number = {2},
pages = {343-367},
publisher = {Gauthier-Villars},
title = {Mean mutual information and symmetry breaking for finite random fields},
url = {http://eudml.org/doc/272043},
volume = {48},
year = {2012},
}

TY - JOUR
AU - Buzzi, J.
AU - Zambotti, L.
TI - Mean mutual information and symmetry breaking for finite random fields
JO - Annales de l'I.H.P. Probabilités et statistiques
PY - 2012
PB - Gauthier-Villars
VL - 48
IS - 2
SP - 343
EP - 367
AB - G. Edelman, O. Sporns and G. Tononi have introduced the neural complexity of a family of random variables, defining it as a specific average of mutual information over subfamilies. We show that their choice of weights satisfies two natural properties, namely invariance under permutations and additivity, and we call any functional satisfying these two properties an intricacy. We classify all intricacies in terms of probability laws on the unit interval and study the growth rate of maximal intricacies when the size of the system goes to infinity. For systems of a fixed size, we show that maximizers have small support and exchangeable systems have small intricacy. In particular, maximizing intricacy leads to spontaneous symmetry breaking and lack of uniqueness.
LA - eng
KW - entropy; mutual information; complexity; discrete probability; exchangeable random variables; neural complexity
UR - http://eudml.org/doc/272043
ER -

References

[1] D. J. Aldous. Exchangeability and related topics. In École d'été de probabilités de Saint-Flour, XIII, 1–198. Lecture Notes in Math. 1117. Springer, Berlin, 1985. Zbl0562.60042 MR883646
[2] P. Bak and M. Paczuski. Complexity, contingency and criticality. Proc. Natl. Acad. Sci. USA 92 (1995) 6689–6696.
[3] L. Barnett, C. L. Buckley and S. Bullock. Neural complexity and structural connectivity. Phys. Rev. E 79 (2009) 051914. MR2551416
[4] C. Bennett. How to define complexity in physics and why. In Complexity, Entropy and the Physics of Information, Vol. VIII. W. Zurek (Ed.). Addison-Wesley, Redwood City, 1990.
[5] J. Bertoin. Random Fragmentation and Coagulation Processes. Cambridge Univ. Press, Cambridge, 2006. Zbl1107.60002
[6] J. Buzzi and L. Zambotti. Approximate maximizers of intricacy functionals. Probab. Theory Related Fields. To appear. Available at http://arxiv.org/abs/0909.2120. Zbl1261.94019 MR2948682
[7] T. Cover and J. Thomas. Elements of Information Theory. John Wiley & Sons, Hoboken, NJ, 2006. Zbl1140.94001
[8] J. Crutchfield and K. Young. Inferring statistical complexity. Phys. Rev. Lett. 63 (1989) 105–109. MR1001514
[9] M. De Lucia, M. Bottaccio, M. Montuori and L. Pietronero. A topological approach to neural complexity. Phys. Rev. E 71 (2005) 016114. MR2139320
[10] G. Edelman and J. Gally. Degeneracy and complexity in biological systems. Proc. Natl. Acad. Sci. USA 98 (2001) 13763–13768.
[11] N. Goldenfeld and L. Kadanoff. Simple lessons from complexity. Science 284 (1999) 87–89.
[12] A. Greven, G. Keller and G. Warnecke. Entropy. Princeton Univ. Press, Princeton, NJ, 2003. Zbl1187.00001 MR2035814
[13] S. Fujishige. Polymatroidal dependence structure of a set of random variables. Information and Control 39 (1978) 55–72. Zbl0388.94006 MR514262
[14] T. S. Han. Nonnegative entropy measures of multivariate symmetric correlations. Information and Control 36 (1978) 133–156. Zbl0367.94041 MR464499
[15] K. Holthausen and O. Breidbach. Analytical description of the evolution of neural networks: Learning rules and complexity. Biol. Cybern. 81 (1999) 169–176. Zbl0929.92004
[16] J. Krichmar, D. Nitz, J. Gally and G. Edelman. Characterizing functional hippocampal pathways in a brain-based device as it solves a spatial memory task. Proc. Natl. Acad. Sci. USA 102 (2005) 2111–2116.
[17] M. Madiman and P. Tetali. Information inequalities for joint distributions, with interpretations and applications. IEEE Trans. Inform. Theory 56 (2010) 2699–2713. MR2683430
[18] A. Seth, E. Izhikevich, G. Reeke and G. Edelman. Theories and measures of consciousness: An extended framework. Proc. Natl. Acad. Sci. USA 103 (2006) 10799–10804.
[19] A. Seth. Models of consciousness. Scholarpedia 2 (2007) 1328.
[20] M. P. Shanahan. Dynamical complexity in small-world networks of spiking neurons. Phys. Rev. E 78 (2008) 041924. MR2529582
[21] O. Sporns, G. Tononi and G. Edelman. Connectivity and complexity: The relationship between neuroanatomy and brain dynamics. Neural Netw. 13 (2000) 909–922.
[22] O. Sporns. Network analysis, complexity, and brain function. Complexity 8 (2002) 56–60. MR1969099
[23] O. Sporns. Complexity. Scholarpedia 2 (2007) 1623.
[24] M. Talagrand. Spin Glasses: A Challenge for Mathematicians. Springer, Berlin, 2003. Zbl1033.82002 MR1993891
[25] G. Tononi, O. Sporns and G. Edelman. A measure for brain complexity: Relating functional segregation and integration in the nervous system. Proc. Natl. Acad. Sci. USA 91 (1994) 5033–5037.
[26] G. Tononi, O. Sporns and G. Edelman. A complexity measure for selective matching of signals by the brain. Proc. Natl. Acad. Sci. USA 93 (1996) 3422–3427.
[27] G. Tononi, O. Sporns and G. Edelman. Measures of degeneracy and redundancy in biological networks. Proc. Natl. Acad. Sci. USA 96 (1999) 3257–3262.
