On metric divergences of probability measures

Igor Vajda

Kybernetika (2009)

  • Volume: 45, Issue: 6, pages 885–900
  • ISSN: 0023-5954

Abstract

Standard properties of $\phi$-divergences of probability measures are widely applied in various areas of information processing. Among the desirable supplementary properties facilitating the employment of mathematical methods is the metricity of $\phi$-divergences, or the metricity of their powers. This paper extends the previously known family of $\phi$-divergences with these properties. The extension consists of a continuum of $\phi$-divergences which are squared metric distances and which are mostly new, but include also some classical cases such as the Le Cam squared distance. The paper also establishes basic properties of the $\phi$-divergences from the extended class, including the range of values and the upper and lower bounds attained under fixed total variation.
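
For orientation, a minimal LaTeX sketch of the objects the abstract refers to; the notation ($D_\phi$ and the particular generators $\phi$) is standard but assumed here for illustration, not quoted from the paper itself.

% A $\phi$-divergence: \phi is convex on (0,\infty) with \phi(1) = 0,
% and P, Q have densities p, q with respect to a dominating measure \mu.
\[
  D_\phi(P,Q) \;=\; \int \phi\!\left(\frac{p}{q}\right) q \,\mathrm{d}\mu .
\]
% Familiar special cases: \phi(t) = |t - 1| gives the total variation,
% \phi(t) = (\sqrt{t} - 1)^2 the squared Hellinger distance, and
% \phi(t) = (t - 1)^2 / (2(t + 1)) the Le Cam divergence
\[
  \mathrm{LC}(P,Q) \;=\; \frac{1}{2} \int \frac{(p-q)^2}{p+q} \,\mathrm{d}\mu .
\]
% "Squared metric distance" means that \sqrt{D_\phi} satisfies the
% triangle inequality on probability measures; the paper exhibits a
% continuum of generators \phi with this property, LC among them.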

How to cite


Vajda, Igor. "On metric divergences of probability measures." Kybernetika 45.6 (2009): 885-900. <http://eudml.org/doc/37684>.

@article{Vajda2009,
abstract = {Standard properties of $\phi$-divergences of probability measures are widely applied in various areas of information processing. Among the desirable supplementary properties facilitating the employment of mathematical methods is the metricity of $\phi$-divergences, or the metricity of their powers. This paper extends the previously known family of $\phi$-divergences with these properties. The extension consists of a continuum of $\phi$-divergences which are squared metric distances and which are mostly new, but include also some classical cases such as the Le Cam squared distance. The paper also establishes basic properties of the $\phi$-divergences from the extended class, including the range of values and the upper and lower bounds attained under fixed total variation.},
author = {Vajda, Igor},
journal = {Kybernetika},
keywords = {total variation; Hellinger divergence; Le Cam divergence; information divergence; Jensen-Shannon divergence; metric divergences},
language = {eng},
number = {6},
pages = {885-900},
publisher = {Institute of Information Theory and Automation AS CR},
title = {On metric divergences of probability measures},
url = {http://eudml.org/doc/37684},
volume = {45},
year = {2009},
}

TY - JOUR
AU - Vajda, Igor
TI - On metric divergences of probability measures
JO - Kybernetika
PY - 2009
PB - Institute of Information Theory and Automation AS CR
VL - 45
IS - 6
SP - 885
EP - 900
AB - Standard properties of $\phi$-divergences of probability measures are widely applied in various areas of information processing. Among the desirable supplementary properties facilitating the employment of mathematical methods is the metricity of $\phi$-divergences, or the metricity of their powers. This paper extends the previously known family of $\phi$-divergences with these properties. The extension consists of a continuum of $\phi$-divergences which are squared metric distances and which are mostly new, but include also some classical cases such as the Le Cam squared distance. The paper also establishes basic properties of the $\phi$-divergences from the extended class, including the range of values and the upper and lower bounds attained under fixed total variation.
LA - eng
KW - total variation; Hellinger divergence; Le Cam divergence; information divergence; Jensen-Shannon divergence; metric divergences
UR - http://eudml.org/doc/37684
ER -

References

  1. I. Csiszár, Information-type measures of difference of probability distributions and indirect observations, Studia Sci. Math. Hungar. 2 (1967), 299–318. MR 0219345
  2. I. Csiszár, On topological properties of f-divergences, Studia Sci. Math. Hungar. 2 (1967), 329–339. MR 0219346
  3. B. Fuglede, F. Topsøe, Jensen–Shannon divergence and Hilbert space embedding, In: Proc. IEEE Internat. Symposium on Inform. Theory, IEEE Publications, New York 2004, p. 31.
  4. P. Kafka, F. Österreicher, I. Vincze, On powers of f-divergences defining a distance, Studia Sci. Math. Hungar. 26 (1991), 329–339. MR 1197090
  5. M. Khosravifard, D. Fooladivanda, T. A. Gulliver, Confliction of the convexity and metric properties in f-divergences, IEICE Trans. Fundamentals E90-A (2007), 1848–1853. DOI 10.1093/ietfec/e90-a.9.1848
  6. V. Kůs, D. Morales, I. Vajda, Extensions of the parametric families of divergences used in statistical inference, Kybernetika 44 (2008), 95–112. Zbl 1142.62002, MR 2405058
  7. L. Le Cam, Asymptotic Methods in Statistical Decision Theory, Springer, New York 1986. Zbl 0605.62002, MR 0856411
  8. F. Liese, I. Vajda, Convex Statistical Distances, Teubner, Leipzig 1987. Zbl 0656.62004, MR 0926905
  9. F. Liese, I. Vajda, On divergences and informations in statistics and information theory, IEEE Trans. Inform. Theory 52 (2006), 4394–4412. MR 2300826, DOI 10.1109/TIT.2006.881731
  10. K. Matusita, Decision rules, based on the distance, for problems of fit, two samples, and estimation, Ann. Math. Statist. 26 (1955), 631–640. Zbl 0065.12101, MR 0073899, DOI 10.1214/aoms/1177728422
  11. F. Österreicher, On a class of perimeter-type distances of probability distributions, Kybernetika 32 (1996), 389–393. MR 1420130
  12. F. Österreicher, I. Vajda, A new class of metric divergences on probability spaces and its applicability in statistics, Ann. Inst. Statist. Math. 55 (2003), 639–653. MR 2007803, DOI 10.1007/BF02517812
  13. I. Vajda, On the f-divergence and singularity of probability measures, Period. Math. Hungar. 2 (1972), 223–234. Zbl 0248.62001, MR 0335163, DOI 10.1007/BF02018663
  14. I. Vincze, On the concept and measure of information contained in an observation, In: Contributions to Probability (J. Gani and V. F. Rohatgi, eds.), Academic Press, New York 1981, pp. 207–214. Zbl 0531.62002, MR 0618690

Citations in EuDML Documents

  1. László Györfi, Adam Krzyżak, Why view and what is next?
  2. Friedrich Liese, PHI-divergences, sufficiency, Bayes sufficiency, and deficiency
  3. Jukka Corander, Ulpu Remes, Timo Koski, On the Jensen-Shannon divergence and the variation distance for categorical probability distributions
