On typical encodings of multivariate ergodic sources

Michal Kupsa

Kybernetika (2020)

  • Volume: 56, Issue: 6, pages 1090-1110
  • ISSN: 0023-5954

Abstract

We show that the typical coordinate-wise encoding of a multivariate ergodic source into prescribed alphabets has an entropy profile close to the convolution of the entropy profile of the source with the modular polymatroid determined by the cardinalities of the output alphabets. The proportion of exceptional encodings, whose entropy profiles are not close to this convolution, goes to zero doubly exponentially. The result holds for a class of multivariate sources that satisfy an asymptotic equipartition property described via the mean fluctuation of the information functions; this class covers asymptotically mean stationary processes with ergodic mean, ergodic processes, and irreducible Markov chains with an arbitrary initial distribution. We also prove that typical encodings yield the asymptotic equipartition property for the output variables. These asymptotic results rest on an explicit lower bound on the proportion of encodings that transform a multivariate random variable into a variable whose entropy profile is close to the suitable convolution.
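
To make the statement concrete, here is a sketch using standard definitions from the entropy-region literature; the notation is ours, not the paper's. The entropy profile of a multivariate random variable $(\xi_i)_{i \in N}$ is the set function $h(A) = H(\xi_A)$, $A \subseteq N$; the modular polymatroid determined by output alphabets $Y_i$ is $g(A) = \sum_{i \in A} \log |Y_i|$; and the convolution of two polymatroids is

$$ (h \ast g)(A) \;=\; \min_{B \subseteq A} \bigl[\, h(B) + g(A \setminus B) \,\bigr], \qquad A \subseteq N. $$

Roughly, the result says that for all but a doubly exponentially small proportion of coordinate-wise encodings into the alphabets $Y_i$, the entropy profile of the encoded variables is close to $h \ast g$. A minimal computational sketch of the convolution itself, on a hypothetical toy profile (not taken from the paper):

    from itertools import combinations

    def subsets(s):
        """All subsets of s, as frozensets."""
        s = tuple(s)
        for r in range(len(s) + 1):
            for c in combinations(s, r):
                yield frozenset(c)

    def convolve(h, g, ground):
        """Polymatroid convolution: (h*g)(A) = min over B subset of A of h(B) + g(A - B)."""
        return {A: min(h[B] + g[A - B] for B in subsets(A)) for A in subsets(ground)}

    # Toy source: two fully correlated uniform bits, so every nonempty set has 1 bit of entropy.
    N = frozenset({1, 2})
    h = {frozenset(): 0.0, frozenset({1}): 1.0, frozenset({2}): 1.0, N: 1.0}
    # Binary output alphabets: the modular polymatroid assigns log2(2) = 1 bit per coordinate.
    g = {A: float(len(A)) for A in subsets(N)}

    print(convolve(h, g, N)[N])   # 1.0 -- the convolution never exceeds h or g

Note that $(h \ast g)(A) \le \min\{h(A), g(A)\}$ (take $B = A$ or $B = \emptyset$), so the convolution captures both constraints at once: the output can carry no more information than the source, and no more than the alphabets can hold.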

How to cite


Kupsa, Michal. "On typical encodings of multivariate ergodic sources." Kybernetika 56.6 (2020): 1090-1110. <http://eudml.org/doc/296932>.

@article{Kupsa2020,
abstract = {We show that the typical coordinate-wise encoding of a multivariate ergodic source into prescribed alphabets has an entropy profile close to the convolution of the entropy profile of the source with the modular polymatroid determined by the cardinalities of the output alphabets. The proportion of exceptional encodings, whose entropy profiles are not close to this convolution, goes to zero doubly exponentially. The result holds for a class of multivariate sources that satisfy an asymptotic equipartition property described via the mean fluctuation of the information functions; this class covers asymptotically mean stationary processes with ergodic mean, ergodic processes, and irreducible Markov chains with an arbitrary initial distribution. We also prove that typical encodings yield the asymptotic equipartition property for the output variables. These asymptotic results rest on an explicit lower bound on the proportion of encodings that transform a multivariate random variable into a variable whose entropy profile is close to the suitable convolution.},
author = {Kupsa, Michal},
journal = {Kybernetika},
keywords = {entropy; entropy rate; multivariate source; ergodic source; a.e.p. property},
language = {eng},
number = {6},
pages = {1090-1110},
publisher = {Institute of Information Theory and Automation AS CR},
title = {On typical encodings of multivariate ergodic sources},
url = {http://eudml.org/doc/296932},
volume = {56},
year = {2020},
}

TY - JOUR
AU - Kupsa, Michal
TI - On typical encodings of multivariate ergodic sources
JO - Kybernetika
PY - 2020
PB - Institute of Information Theory and Automation AS CR
VL - 56
IS - 6
SP - 1090
EP - 1110
AB - We show that the typical coordinate-wise encoding of a multivariate ergodic source into prescribed alphabets has an entropy profile close to the convolution of the entropy profile of the source with the modular polymatroid determined by the cardinalities of the output alphabets. The proportion of exceptional encodings, whose entropy profiles are not close to this convolution, goes to zero doubly exponentially. The result holds for a class of multivariate sources that satisfy an asymptotic equipartition property described via the mean fluctuation of the information functions; this class covers asymptotically mean stationary processes with ergodic mean, ergodic processes, and irreducible Markov chains with an arbitrary initial distribution. We also prove that typical encodings yield the asymptotic equipartition property for the output variables. These asymptotic results rest on an explicit lower bound on the proportion of encodings that transform a multivariate random variable into a variable whose entropy profile is close to the suitable convolution.
LA - eng
KW - entropy; entropy rate; multivariate source; ergodic source; a.e.p. property
UR - http://eudml.org/doc/296932
ER -

