On the amount of information contained in a sequence of independent observations

Igor Vajda

Kybernetika (1970)

  • Volume: 06, Issue: 4, page (306)-324
  • ISSN: 0023-5954

How to cite


Vajda, Igor. "On the amount of information contained in a sequence of independent observations." Kybernetika 06.4 (1970): (306)-324. <http://eudml.org/doc/28463>.

@article{Vajda1970,
author = {Vajda, Igor},
journal = {Kybernetika},
language = {eng},
number = {4},
pages = {(306)-324},
publisher = {Institute of Information Theory and Automation AS CR},
title = {On the amount of information contained in a sequence of independent observations},
url = {http://eudml.org/doc/28463},
volume = {06},
year = {1970},
}

TY - JOUR
AU - Vajda, Igor
TI - On the amount of information contained in a sequence of independent observations
JO - Kybernetika
PY - 1970
PB - Institute of Information Theory and Automation AS CR
VL - 06
IS - 4
SP - (306)
EP - 324
LA - eng
UR - http://eudml.org/doc/28463
ER -

References

  1. I. Vajda, On the convergence of information contained in a sequence of observations, Proc. Coll. on Inf. Th., Debrecen (Hungary), Budapest 1969. MR0258525
  2. A. Rényi, On some basic problems of statistics from the point of view of information theory, Proc. Coll. on Inf. Th., Debrecen (Hungary), Budapest 1969.
  3. H. Chernoff, A measure of efficiency for tests of a hypothesis based on the sum of observations, Ann. Math. Stat. 23 (1952), 493-507. MR0057518
  4. I. Vajda, Limit theorems for total variation of Cartesian product measures, Studia Sci. Math. Hungarica (in print). Zbl 0243.62034, MR0310950
  5. J. L. Doob, Stochastic processes, J. Wiley, N. Y. 1953. Zbl 0053.26802, MR0058896
  6. L. H. Koopmans, Asymptotic rate of discrimination for Markov processes, Ann. Math. Stat. 31 (1960), 982-994. Zbl 0096.12603, MR0119368
  7. S. Bochner, Lectures on Fourier integrals, Princeton Univ. Press, 1959. Zbl 0085.31802, MR0107124
  8. A. Perez, Notions généralisées d'incertitude, d'entropie et d'information du point de vue de la théorie de martingales, Trans. First Prague Conf. on Inf. Th., Prague 1957. Zbl 0102.13204
  9. I. Csiszár, Information-type measures of difference of probability distributions and indirect observations, Studia Sci. Math. Hungarica 2 (1967), 299-318. MR0219345
  10. A. Rényi, On measures of entropy and information, Proc. 4th Berkeley Symp. on Prob. and Math. Stat., Berkeley, Vol. I, 547-561. MR0132570
  11. O. Kraft, D. Plachky, Bounds for the power of likelihood ratio tests and their asymptotic properties, (Preliminary report, University of Münster).
  12. S. Kullback, Information theory and statistics, Wiley, N. Y. 1959. Zbl 0088.10406, MR0103557
  13. H. Chernoff, Large sample theory: Parametric case, Ann. Math. Stat. 27 (1956), 1-22. Zbl 0072.35703, MR0076245
  14. I. Vajda, Accumulation of information in case that sample variables depend on sample size, Studia Sci. Math. Hungarica (in print).
  15. I. N. Sanov, On the probability of large deviations of random variables, Mat. Sborník N. S. 42 (1957), 11-44.
  16. R. R. Bahadur, R. Ranga Rao, On deviations of the sample mean, Ann. Math. Stat. 31 (1960), 1015-1027. MR0117775
