Global information in statistical experiments and consistency of likelihood-based estimates and tests

Igor Vajda

Kybernetika (1998)

  • Volume: 34, Issue: 3, page [245]-263
  • ISSN: 0023-5954

Abstract

In the framework of the standard model of asymptotic statistics, we introduce a global information in the statistical experiment about the occurrence of the true parameter in a given set. Basic properties of this information are established, including its relations to the Kullback and Fisher information. Its applicability to point estimation and to testing statistical hypotheses is demonstrated.
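
This record does not reproduce the paper's definition of the global information functional itself. As background for the abstract's reference to the Kullback and Fisher information, the following are the standard definitions of those two quantities and their well-known local relation, stated here under the usual regularity assumptions; they are not taken from the paper:

\[
D(P_\theta \,\|\, P_{\theta'}) \;=\; \int \log\frac{\mathrm{d}P_\theta}{\mathrm{d}P_{\theta'}}\,\mathrm{d}P_\theta,
\qquad
I(\theta) \;=\; \mathrm{E}_\theta\!\left[\Big(\tfrac{\partial}{\partial\theta}\log p_\theta(X)\Big)^{2}\right],
\]
\[
D(P_\theta \,\|\, P_{\theta'}) \;=\; \tfrac{1}{2}\,I(\theta)\,(\theta'-\theta)^{2} \;+\; o\big((\theta'-\theta)^{2}\big)
\qquad\text{as } \theta' \to \theta.
\]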

How to cite


Vajda, Igor. "Global information in statistical experiments and consistency of likelihood-based estimates and tests." Kybernetika 34.3 (1998): [245]-263. <http://eudml.org/doc/33353>.

@article{Vajda1998,
abstract = {In the framework of standard model of asymptotic statistics we introduce a global information in the statistical experiment about the occurrence of the true parameter in a given set. Basic properties of this information are established, including relations to the Kullback and Fisher information. Its applicability in point estimation and testing statistical hypotheses is demonstrated.},
author = {Vajda, Igor},
journal = {Kybernetika},
keywords = {information divergence; point estimation; testing statistical hypotheses},
language = {eng},
number = {3},
pages = {[245]-263},
publisher = {Institute of Information Theory and Automation AS CR},
title = {Global information in statistical experiments and consistency of likelihood-based estimates and tests},
url = {http://eudml.org/doc/33353},
volume = {34},
year = {1998},
}

TY - JOUR
AU - Vajda, Igor
TI - Global information in statistical experiments and consistency of likelihood-based estimates and tests
JO - Kybernetika
PY - 1998
PB - Institute of Information Theory and Automation AS CR
VL - 34
IS - 3
SP - [245]
EP - 263
AB - In the framework of standard model of asymptotic statistics we introduce a global information in the statistical experiment about the occurrence of the true parameter in a given set. Basic properties of this information are established, including relations to the Kullback and Fisher information. Its applicability in point estimation and testing statistical hypotheses is demonstrated.
LA - eng
KW - information divergence; point estimation; testing statistical hypotheses
UR - http://eudml.org/doc/33353
ER -

References

  1. Bahadur R. R., Some Limit Theorems in Statistics, SIAM, Philadelphia 1971. Zbl 0257.62015, MR 0315820
  2. Berk R. H., Ann. Math. Statist. 37 (1966), 51–58. Zbl 0151.23802, MR 0189176, DOI 10.1214/aoms/1177699597
  3. Cover T. M., Thomas J. A., Elements of Information Theory, Wiley, New York 1991. Zbl 1140.94001, MR 1122806
  4. DeGroot M. H., Ann. Math. Statist. 33 (1962), 404–419. MR 0139242, DOI 10.1214/aoms/1177704567
  5. DeGroot M. H., Optimal Statistical Decisions, McGraw-Hill, New York 1970. MR 0356303
  6. Gallager R. G., Information Theory and Reliable Communication, Wiley, New York 1968. Zbl 0295.94001
  7. Huber P. J., Robust Statistics, Wiley, New York 1981. MR 0606374
  8. Kullback S., Information Theory and Statistics, Wiley, New York 1959. Zbl 0897.62003, MR 0103557
  9. Lehmann E. L., Testing Statistical Hypotheses, Second edition. Wiley, New York 1986.
  10. Liese F., Vajda I., Metrika 42 (1995), 93–114. Zbl 0834.62024, DOI 10.1007/BF01894328
  11. Lindley D. V., Ann. Math. Statist. 27 (1956), 986–1005. DOI 10.1214/aoms/1177728069
  12. Loève M., Probability Theory, Wiley, New York 1963.
  13. Perlman M. D., On the strong consistency of approximate maximum likelihood estimators, In: Proc. Sixth Berkeley Symp. Math. Statist. Probab., 1972, pp. 263–281.
  14. Pfanzagl J., Metrika 14 (1969), 249–272. DOI 10.1007/BF02613654
  15. Pfanzagl J., Parametric Statistical Theory, Walter de Gruyter, Berlin 1994. Zbl 0807.62016
  16. Rao C. R., Linear Statistical Inference and its Applications, Wiley, New York 1965. Zbl 0256.62002
  17. Rényi A., On the amount of information concerning an unknown parameter in a sequence of observations, Publ. Math. Inst. Hungar. Acad. Sci., Ser. A 9 (1964), 617–625.
  18. Rényi A., Statistics and information theory, Stud. Scient. Math. Hungar. 2 (1967), 249–256. Zbl 0155.27602
  19. Rockafellar R. T., Convex Analysis, Princeton Univ. Press, Princeton, N. J. 1970.
  20. Strasser H., Ann. Statist. 9 (1981), 1107–1113. Zbl 0483.62019, DOI 10.1214/aos/1176345590
  21. Strasser H., Mathematical Theory of Statistics, De Gruyter, Berlin 1985. Zbl 0594.62017
  22. Torgersen E., Comparison of Statistical Experiments, Cambridge Univ. Press, Cambridge 1991. Zbl 1158.62006
  23. Vajda I., Stochastic Process. Appl. 56 (1995), 35–56. Zbl 0817.62073, DOI 10.1016/0304-4149(94)00069-6
