Discretization problems on generalized entropies and R-divergences

Luis Pardo; D. Morales; K. Ferentinos; K. Zografos

Kybernetika (1994)

  • Volume: 30, Issue: 4, page 445-460
  • ISSN: 0023-5954

How to cite

Pardo, Luis, et al. "Discretization problems on generalized entropies and $R$-divergences." Kybernetika 30.4 (1994): 445-460. <http://eudml.org/doc/27313>.

@article{Pardo1994,
author = {Pardo, Luis and Morales, D. and Ferentinos, K. and Zografos, K.},
journal = {Kybernetika},
keywords = {discretization of data; quadratic convergence theorems; continuous distribution; entropies; divergences; sample discretized estimates; loss of information},
language = {eng},
number = {4},
pages = {445-460},
publisher = {Institute of Information Theory and Automation AS CR},
title = {Discretization problems on generalized entropies and $R$-divergences},
url = {http://eudml.org/doc/27313},
volume = {30},
year = {1994},
}

TY - JOUR
AU - Pardo, Luis
AU - Morales, D.
AU - Ferentinos, K.
AU - Zografos, K.
TI - Discretization problems on generalized entropies and $R$-divergences
JO - Kybernetika
PY - 1994
PB - Institute of Information Theory and Automation AS CR
VL - 30
IS - 4
SP - 445
EP - 460
LA - eng
KW - discretization of data; quadratic convergence theorems; continuous distribution; entropies; divergences; sample discretized estimates; loss of information
UR - http://eudml.org/doc/27313
ER -

References

  1. M. W. Birch, A new proof of the Pearson-Fisher Theorem, Ann. Math. Statist. 35 (1964), 817-824. Zbl 0259.62017, MR 0169324
  2. J. Burbea, C. R. Rao, Entropy differential metric, distance and divergence measures in probability spaces: A unified approach, J. Multivariate Anal. 12 (1982), 575-596. Zbl 0526.60015, MR 0680530
  3. T. M. Cover, J. A. Thomas, Elements of Information Theory, J. Wiley, New York 1991. Zbl 0762.94001, MR 1122806
  4. I. Csiszár, Generalized entropy and quantization problem, In: Trans. of the Sixth Prague Conference, Academia, Prague 1973, pp. 159-174. MR 0359995
  5. S. G. Ghurye, B. Johnson, Discrete approximations to the information integral, Canad. J. Statist. 9 (1981), 27-37. Zbl 0473.62007, MR 0638384
  6. D. Morales, L. Pardo, M. Salicrú, M. L. Menéndez, Information measures associated to R-divergences, In: Multivariate Analysis: Future Directions 2 (C. M. Cuadras and C. R. Rao, eds.), Elsevier Science Publishers, B. V., 1982.
  7. M. Salicrú, M. L. Menéndez, L. Pardo, D. Morales, Asymptotic distribution of (h, φ)-entropies, Comm. Statist. A -- Theory Methods (to appear). MR 1238377
  8. I. J. Taneja, On generalized information measures and their applications, Adv. Elect. and Elect. Phys. 76 (1989), 327-413.
  9. I. Vajda, K. Vašek, Majorization, concave entropies and comparison of experiments, Problems Control Inform. Theory 14 (1985), 105-115. MR 0806056
  10. K. Zografos, K. Ferentinos, T. Papaioannou, Discrete approximations to the Csiszár, Rényi, and Fisher measures of information, Canad. J. Statist. 14 (1986), 4, 355-366. MR 0876762
