Information-theoretic risk estimates in statistical decision

Albert Pérez

Kybernetika (1967)

  • Volume: 03, Issue: 1, pages (1)-21
  • ISSN: 0023-5954

How to cite

Pérez, Albert. "Information-theoretic risk estimates in statistical decision." Kybernetika 03.1 (1967): (1)-21. <http://eudml.org/doc/28680>.

@article{Pérez1967,
author = {Pérez, Albert},
journal = {Kybernetika},
keywords = {information, communication},
language = {eng},
number = {1},
pages = {(1)-21},
publisher = {Institute of Information Theory and Automation AS CR},
title = {Information-theoretic risk estimates in statistical decision},
url = {http://eudml.org/doc/28680},
volume = {03},
year = {1967},
}

TY - JOUR
AU - Pérez, Albert
TI - Information-theoretic risk estimates in statistical decision
JO - Kybernetika
PY - 1967
PB - Institute of Information Theory and Automation AS CR
VL - 03
IS - 1
SP - (1)
EP - 21
LA - eng
KW - information, communication
UR - http://eudml.org/doc/28680
ER -

References

  1. Perez A., Information, ε-Sufficiency and Data Reduction Problems, Kybernetika 1 (1965), 4, 297-323. (1965) MR0205410
  2. Perez A., Information and ε-Sufficiency, Paper No. 41 presented at the 35th Session of the International Statistical Institute, Beograd, 1965. (1965)
  3. Perez A., Information Theory Methods in Reducing Complex Decision Problems, to appear in: Transactions of the Fourth Prague Conference on Information Theory, Statistical Decision Problems and Random Processes (1965). (1965) MR0216676
  4. Csiszár I., Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten, Publications of the Mathematical Institute of the Hungarian Academy of Sciences VIII (1963), Series A, Fasc. 1-2, 85-108. (1963) MR0164374
  5. Perez A., Extensions of Shannon-McMillan's Limit Theorem to More General Stochastic Processes, In: Transactions of the Third Prague Conference on Information Theory, Statistical Decision Functions and Random Processes (1962), Prague 1964, 545-574. (1962) MR0165996
  6. Perez A., Notions généralisées d'incertitude, d'entropie et d'information du point de vue de la théorie des martingales, In: Transactions of the First Prague Conference on Information Theory, Statistical Decision Functions and Random Processes (1956), Prague 1957, 183-208. (1956) MR0099889
  7. Rényi A., On measures of entropy and information, In: Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, I, Berkeley, 1960, 547-561. (1960) MR0132570
  8. Kovalevskij V. A., The problem of pattern recognition from the point of view of mathematical statistics, In: Reading Automata, Kiev 1965, 8-37 (in Russian). (1965)

Citations in EuDML Documents

  1. Jana Zvárová, On measures of statistical dependence
  2. Albert Pérez, On the reducibility of a set of statistical hypotheses
  3. Albert Pérez, Information-theoretic approach to measurement reduction problems
  3. Albert Pérez, ε-admissible simplifications of the dependence structure of a set of random variables
  5. Jan T. Białasiewicz, Statistical data reduction via construction of sample space partitions
  6. Igor Vajda, Jana Zvárová, On generalized entropies, Bayesian decisions and statistical diversity
  7. Alexey E. Rastegin, Convexity inequalities for estimating generalized conditional entropies from below
