(h, Φ)-entropy differential metric

María Luisa Menéndez; Domingo Morales; Leandro Pardo; Miquel Salicrú

Applications of Mathematics (1997)

  • Volume: 42, Issue: 2, pages 81–98
  • ISSN: 0862-7940

Abstract

Burbea and Rao (1982a, 1982b) gave some general methods for constructing quadratic differential metrics on probability spaces. Using these methods, they obtained the Fisher information metric as a particular case. In this paper we apply the method based on entropy measures to obtain a Riemannian metric based on (h, Φ)-entropy measures (Salicrú et al., 1993). The geodesic distances based on that information metric have been computed for a number of parametric families of distributions. The use of geodesic distances in testing statistical hypotheses is illustrated by an example within the Pareto family. We obtain the asymptotic distribution of the information matrices associated with the metric when the parameter is replaced by its maximum likelihood estimator. The relation between the information matrices and the Cramér-Rao inequality is also obtained.
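
For orientation, here is a brief sketch of the objects involved; this is a standard way of writing these quantities, not an excerpt from the paper. The $(h,\Phi)$-entropy of Salicrú et al. (1993) associated with a density $f(x,\theta)$ is commonly written as

\[ H^{h}_{\Phi}(\theta) \;=\; h\!\left( \int_{\mathcal{X}} \Phi\bigl(f(x,\theta)\bigr)\,\mathrm{d}\mu(x) \right), \]

and the Burbea–Rao entropy method attaches to such a functional a quadratic differential metric $\mathrm{d}s^{2} = \sum_{i,j} g_{ij}(\theta)\,\mathrm{d}\theta_{i}\,\mathrm{d}\theta_{j}$ obtained from its second-order variation. In the Shannon case $h(x) = x$ and $\Phi(t) = -t\log t$, one has $\Phi''(t) = -1/t$, and (up to sign conventions) the construction recovers the Fisher information metric

\[ g_{ij}(\theta) \;=\; \int_{\mathcal{X}} \frac{1}{f(x,\theta)}\, \frac{\partial f(x,\theta)}{\partial \theta_{i}}\, \frac{\partial f(x,\theta)}{\partial \theta_{j}}\,\mathrm{d}\mu(x). \]

Geodesic distances such as those computed in the paper are then lengths of shortest curves with respect to a metric of this type.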

How to cite


Menéndez, María Luisa, et al. "$(h,\Phi )$-entropy differential metric." Applications of Mathematics 42.2 (1997): 81-98. <http://eudml.org/doc/32970>.

@article{Menéndez1997,
abstract = {Burbea and Rao (1982a, 1982b) gave some general methods for constructing quadratic differential metrics on probability spaces. Using these methods, they obtained the Fisher information metric as a particular case. In this paper we apply the method based on entropy measures to obtain a Riemannian metric based on $(h,\Phi )$-entropy measures (Salicrú et al., 1993). The geodesic distances based on that information metric have been computed for a number of parametric families of distributions. The use of geodesic distances in testing statistical hypotheses is illustrated by an example within the Pareto family. We obtain the asymptotic distribution of the information matrices associated with the metric when the parameter is replaced by its maximum likelihood estimator. The relation between the information matrices and the Cramér-Rao inequality is also obtained.},
author = {Menéndez, María Luisa and Morales, Domingo and Pardo, Leandro and Salicrú, Miquel},
journal = {Applications of Mathematics},
keywords = {$(h,\Phi )$-entropy measures; information metric; geodesic distance between probability distributions; maximum likelihood estimators; asymptotic distributions; Cramér-Rao inequality; generalized entropies},
language = {eng},
number = {2},
pages = {81-98},
publisher = {Institute of Mathematics, Academy of Sciences of the Czech Republic},
title = {$(h,\Phi )$-entropy differential metric},
url = {http://eudml.org/doc/32970},
volume = {42},
year = {1997},
}

TY - JOUR
AU - Menéndez, María Luisa
AU - Morales, Domingo
AU - Pardo, Leandro
AU - Salicrú, Miquel
TI - $(h,\Phi )$-entropy differential metric
JO - Applications of Mathematics
PY - 1997
PB - Institute of Mathematics, Academy of Sciences of the Czech Republic
VL - 42
IS - 2
SP - 81
EP - 98
AB - Burbea and Rao (1982a, 1982b) gave some general methods for constructing quadratic differential metrics on probability spaces. Using these methods, they obtained the Fisher information metric as a particular case. In this paper we apply the method based on entropy measures to obtain a Riemannian metric based on $(h,\Phi )$-entropy measures (Salicrú et al., 1993). The geodesic distances based on that information metric have been computed for a number of parametric families of distributions. The use of geodesic distances in testing statistical hypotheses is illustrated by an example within the Pareto family. We obtain the asymptotic distribution of the information matrices associated with the metric when the parameter is replaced by its maximum likelihood estimator. The relation between the information matrices and the Cramér-Rao inequality is also obtained.
LA - eng
KW - $(h,\Phi )$-entropy measures; information metric; geodesic distance between probability distributions; maximum likelihood estimators; asymptotic distributions; Cramér-Rao inequality; generalized entropies
UR - http://eudml.org/doc/32970
ER -

References

  1. Amari, S.: A foundation of information geometry, Electronics and Communications in Japan 66-A (1983), 1–10. MR 0747878
  2. Atkinson, C., Mitchell, A. F. S.: Rao's distance measure, Sankhyā Ser. A 43 (1981), 345–365. MR 0665876
  3. Arimoto, S.: Information-theoretical considerations on estimation problems, Information and Control 19 (1971), 181–194. Zbl 0222.94022, MR 0309224, DOI 10.1016/S0019-9958(71)90065-9
  4. Burbea, J.: Informative geometry of probability spaces, Expositiones Mathematicae 4 (1986), 347–378. MR 0867963
  5. Burbea, J., Rao, C. R.: Entropy differential metric, distance and divergence measures in probability spaces: a unified approach, J. Multivariate Analysis 12 (1982a), 575–596. MR 0680530, DOI 10.1016/0047-259X(82)90065-3
  6. Burbea, J., Rao, C. R.: On the convexity of some divergence measures based on entropy functions, IEEE Trans. Inform. Theory IT-28 (1982b), 489–495. MR 0672884
  7. Burbea, J., Rao, C. R.: On the convexity of higher order Jensen differences based on entropy functions, IEEE Trans. Inform. Theory IT-28 (1982c), 961–963. MR 0687297
  8. Čencov, N. N.: Statistical Decision Rules and Optimal Inference, Translations of Mathematical Monographs 53, American Mathematical Society, 1982. MR 0645898
  9. Csiszár, I.: Information-type measures of difference of probability distributions and indirect observations, Studia Sci. Math. Hungar. 2 (1967), 299–318. MR 0219345
  10. New parametric measures of information, vol. 51 (1981), 193–208. MR 0686839
  11. Ferreri, C.: Hypoentropy and related heterogeneity divergence measures, Statistica 40 (1980), 55–118. MR 0586545
  12. Havrda, J., Charvát, F.: Concept of structural α-entropy, Kybernetika 3 (1967), 30–35.
  13. New parametric measures of information based on generalized R-divergences, 1993, pp. 473–488. MR 1268437
  14. Muirhead, R. J.: Aspects of Multivariate Statistical Theory, Wiley, New York, 1982. MR 0652932
  15. Onicescu, O.: Énergie informationnelle, C. R. Acad. Sci. Paris Sér. A 263 (1966), 841–842. MR 0229478
  16. Rao, C. R.: Information and accuracy attainable in the estimation of statistical parameters, Bull. Calcutta Math. Soc. 37 (1945), 81–91. MR 0015748
  17. Rao, C. R.: Differential metrics in probability spaces, in: Differential Geometry in Statistical Inference, IMS Lecture Notes–Monograph Series 10, 1987.
  18. Rényi, A.: On measures of entropy and information, Proc. Fourth Berkeley Symp. Math. Statist. Prob. 1 (1961), 547–561. MR 0132570
  19. Salicrú, M., Menéndez, M. L., Morales, D., Pardo, L.: Asymptotic distribution of (h, Φ)-entropies, Comm. Statist. Theory Methods 22(7) (1993), 2015–2031. MR 1238377
  20. Shannon, C. E.: A mathematical theory of communication, Bell System Tech. J. 27 (1948), 379–423. Zbl 1154.94303, MR 0026286
  21. Sharma, B. D., Taneja, I. J.: Entropy of type (α, β) and other generalized measures in information theory, Metrika 22 (1975), 205–215. MR 0398670
  22. New non-additive measures of relative information, vol. 2 (1975), 122–133. MR 0476167
  23. A study of generalized measures in information theory, 1975.
  24. Taneja, I. J.: On generalized information measures and their applications, Advances in Electronics and Electron Physics 76 (1989), 327–413.
  25. Majorization, concave entropies and comparison of experiments, vol. 14 (1985), 105–115. MR 0806056
  26. van der Lubbe, J. C. A.: R-norm information and a general class of measures for certainty and information, M.Sc. Thesis, Delft University of Technology, Dept. of Electrical Engineering, 1977. (In Dutch.)
  27. van der Lubbe, J. C. A.: A generalized probabilistic theory of the measurement of certainty and information, Ph.D. Thesis, Delft University of Technology, Dept. of Electrical Engineering, 1981.
  28. Varma, R. S.: Generalizations of Rényi's entropy of order α, J. Math. Sci. 1 (1966), 34–48. Zbl 0166.15401, MR 0210515
