Generalized information criteria for Bayes decisions

Domingo Morales; Igor Vajda

Kybernetika (2012)

  • Volume: 48, Issue: 4, pages 714-749
  • ISSN: 0023-5954

Abstract

This paper deals with Bayesian models given by statistical experiments and standard loss functions. The Bayes probability of error and the Bayes risk are estimated by means of classical and generalized information criteria applicable to the experiment, and the accuracy of the estimation is studied. Among the information criteria studied in the paper is the class of posterior power entropies, which includes the Shannon entropy as the special case of power α = 1. It is shown that the most accurate estimate in this class is achieved by the quadratic posterior entropy of power α = 2. The paper also introduces and studies a new class of alternative power entropies which, in general, estimate the Bayes errors and risks more tightly than the classical power entropies. Concrete examples, tables and figures illustrate the obtained results.
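
For readers unfamiliar with the quantities named in the abstract, the sketch below (not from the paper) illustrates them numerically. It assumes the Havrda-Charvát form of the power entropy in one common normalization (the paper may normalize differently), computes the pointwise Bayes probability of error of a posterior distribution, and checks the elementary sandwich bound H_2(p)/2 <= P_e(p) <= H_2(p) for the quadratic case α = 2. The posterior `post` is invented for illustration; the paper's accuracy results concern expected errors and Bayes risks of a full experiment, not this single-posterior check.

import numpy as np

def power_entropy(p, alpha):
    # Havrda-Charvat power entropy: (1 - sum_i p_i**alpha) / (alpha - 1)
    # for alpha != 1; the limit alpha -> 1 recovers the Shannon entropy.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 log 0 = 0
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(p))  # Shannon entropy (natural log)
    return (1.0 - np.sum(p ** alpha)) / (alpha - 1.0)

def bayes_error(posterior):
    # Pointwise Bayes probability of error: 1 - max_i p(i | x).
    return 1.0 - np.max(np.asarray(posterior, dtype=float))

# Hypothetical posterior over three hypotheses at one observation x.
post = [0.6, 0.3, 0.1]

pe = bayes_error(post)               # 1 - 0.6 = 0.4
h2 = power_entropy(post, alpha=2.0)  # 1 - (0.36 + 0.09 + 0.01) = 0.54

# Elementary sandwich for the quadratic entropy (alpha = 2):
#   H_2(p)/2 <= P_e(p) <= H_2(p),
# since sum_i p_i^2 <= max_i p_i and sum_i p_i^2 >= (max_i p_i)^2 >= 2 max_i p_i - 1.
assert h2 / 2.0 <= pe <= h2
print(f"P_e = {pe:.3f},  H_2/2 = {h2 / 2:.3f},  H_2 = {h2:.3f}")

The factor-of-two sandwich available at α = 2 hints at why the quadratic posterior entropy estimates the error most tightly within this class; the paper makes the corresponding statements precise for expected errors and Bayes risks.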

How to cite


Morales, Domingo, and Vajda, Igor. "Generalized information criteria for Bayes decisions." Kybernetika 48.4 (2012): 714-749. <http://eudml.org/doc/246365>.

@article{Morales2012,
abstract = {This paper deals with Bayesian models given by statistical experiments and standard loss functions. The Bayes probability of error and the Bayes risk are estimated by means of classical and generalized information criteria applicable to the experiment, and the accuracy of the estimation is studied. Among the information criteria studied in the paper is the class of posterior power entropies, which includes the Shannon entropy as the special case of power $\alpha =1$. It is shown that the most accurate estimate in this class is achieved by the quadratic posterior entropy of power $\alpha =2$. The paper also introduces and studies a new class of alternative power entropies which, in general, estimate the Bayes errors and risks more tightly than the classical power entropies. Concrete examples, tables and figures illustrate the obtained results.},
author = {Morales, Domingo and Vajda, Igor},
journal = {Kybernetika},
keywords = {Shannon entropy; alternative Shannon entropy; power entropies; alternative power entropies; Bayes error; Bayes risk; sub-Bayes risk},
language = {eng},
number = {4},
pages = {714-749},
publisher = {Institute of Information Theory and Automation AS CR},
title = {Generalized information criteria for Bayes decisions},
url = {http://eudml.org/doc/246365},
volume = {48},
year = {2012},
}

TY - JOUR
AU - Morales, Domingo
AU - Vajda, Igor
TI - Generalized information criteria for Bayes decisions
JO - Kybernetika
PY - 2012
PB - Institute of Information Theory and Automation AS CR
VL - 48
IS - 4
SP - 714
EP - 749
AB - This paper deals with Bayesian models given by statistical experiments and standard loss functions. The Bayes probability of error and the Bayes risk are estimated by means of classical and generalized information criteria applicable to the experiment, and the accuracy of the estimation is studied. Among the information criteria studied in the paper is the class of posterior power entropies, which includes the Shannon entropy as the special case of power $\alpha =1$. It is shown that the most accurate estimate in this class is achieved by the quadratic posterior entropy of power $\alpha =2$. The paper also introduces and studies a new class of alternative power entropies which, in general, estimate the Bayes errors and risks more tightly than the classical power entropies. Concrete examples, tables and figures illustrate the obtained results.
LA - eng
KW - Shannon entropy; alternative Shannon entropy; power entropies; alternative power entropies; Bayes error; Bayes risk; sub-Bayes risk
UR - http://eudml.org/doc/246365
ER -

References

  1. M. Ben Bassat: f-Entropies, probability of error, and feature selection. Inform. Control 39 (1978), 227-242. Zbl 0394.94011, MR 0523439, DOI 10.1016/S0019-9958(78)90587-9
  2. M. Ben Bassat, J. Raviv: Rényi's entropy and the probability of error. IEEE Trans. Inform. Theory 24 (1978), 324-331. MR 0484747, DOI 10.1109/TIT.1978.1055890
  3. J. O. Berger: Statistical Decision Theory and Bayesian Analysis. Second edition. Springer, Berlin 1986. MR 0804611
  4. T. M. Cover, P. E. Hart: Nearest neighbor pattern classification. IEEE Trans. Inform. Theory 13 (1967), 21-27. Zbl 0154.44505, DOI 10.1109/TIT.1967.1053964
  5. P. Devijver, J. Kittler: Pattern Recognition: A Statistical Approach. Prentice Hall, Englewood Cliffs, New Jersey 1982. Zbl 0542.68071, MR 0692767
  6. L. Devroye, L. Györfi, G. Lugosi: A Probabilistic Theory of Pattern Recognition. Springer, Berlin 1996. MR 1383093
  7. D. K. Faddeev: Zum Begriff der Entropie eines endlichen Wahrscheinlichkeitsschemas. In: Arbeiten zur Informationstheorie I, Deutscher Verlag der Wissenschaften, Berlin 1957.
  8. M. Feder, N. Merhav: Relations between entropy and error probability. IEEE Trans. Inform. Theory 40 (1994), 259-266. Zbl 0802.94004, DOI 10.1109/18.272494
  9. P. Harremoës, F. Topsøe: Inequalities between entropy and index of coincidence derived from information diagrams. IEEE Trans. Inform. Theory 47 (2001), 2944-2960. MR 1872852, DOI 10.1109/18.959272
  10. J. Havrda, F. Charvát: Quantification method of classification processes. Concept of structural a-entropy. Kybernetika 3 (1967), 30-35. Zbl 0178.22401, MR 0209067
  11. L. Kanal: Patterns in pattern recognition. IEEE Trans. Inform. Theory 20 (1974), 697-707. MR 0356609
  12. V. A. Kovalevsky: The problem of character recognition from the point of view of mathematical statistics. In: Reading Automata and Pattern Recognition (in Russian), Naukova Dumka, Kyjev 1965. English translation in: Character Readers and Pattern Recognition, Spartan Books, New York 1968, pp. 3-30.
  13. D. Morales, L. Pardo, I. Vajda: Uncertainty of discrete stochastic systems: general theory and statistical inference. IEEE Trans. Systems, Man and Cybernetics, Part A 26 (1996), 1-17.
  14. A. Rényi: On measures of entropy and information. In: Proc. Fourth Berkeley Symp. on Math. Statist. and Probability, Vol. 1, University of California Press, Berkeley 1961, pp. 547-561. MR 0132570
  15. N. P. Salikhov: Confirmation of a hypothesis of I. Vajda (in Russian). Problemy Peredachi Informatsii 10 (1974), 114-115. MR 0464476
  16. D. L. Tebbe, S. J. Dwyer III: Uncertainty and probability of error. IEEE Trans. Inform. Theory 14 (1968), 516-518. DOI 10.1109/TIT.1968.1054135
  17. G. T. Toussaint: A generalization of Shannon's equivocation and the Fano bound. IEEE Trans. Systems, Man and Cybernetics 7 (1977), 300-302. Zbl 0363.94024, MR 0453269, DOI 10.1109/TSMC.1977.4309705
  18. I. Vajda: Bounds on the minimal error probability and checking a finite or countable number of hypotheses. Inform. Transmission Problems 4 (1968), 9-17. MR 0267685
  19. I. Vajda: A contribution to informational analysis of patterns. In: Methodologies of Pattern Recognition (M. S. Watanabe, ed.), Academic Press, New York 1969.
  20. I. Vajda, K. Vašek: Majorization, concave entropies and comparison of experiments. Problems Control Inform. Theory 14 (1985), 105-115. Zbl 0601.62006, MR 0806056
  21. I. Vajda, J. Zvárová: On generalized entropies, Bayesian decisions and statistical diversity. Kybernetika 43 (2007), 675-696. Zbl 1143.94006, MR 2376331
