Order statistics and (r,s)-entropy measures

María Dolores Esteban; Domingo Morales; Leandro Pardo; María Luisa Menéndez

Applications of Mathematics (1994)

  • Volume: 39, Issue: 5, pages 321-337
  • ISSN: 0862-7940

Abstract

K. M. Wong and S. Chen [9] analyzed the Shannon entropy of a sequence of random variables under order restrictions. These results are generalized here using the $(r,s)$-entropies of I. J. Taneja [8]. Upper and lower bounds on the entropy reduction obtained when the sequence is ordered, and conditions under which they are achieved, are derived. Theorems are presented showing the difference between the average entropy of the individual order statistics and the entropy of a member of the original independent identically distributed (i.i.d.) population. Finally, the entropies of the individual order statistics are studied when the probability density function (p.d.f.) of the original i.i.d. sequence is symmetric about its mean.
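
For orientation, a sketch of one common form of the unified $(r,s)$-entropy of Taneja [8] follows; the exact normalization and logarithm base used in the paper may differ. For a p.d.f. $f$ and $r>0$, $r\neq 1$, $s\neq 1$,

$$H_r^s(f) = \frac{1}{1-s}\left[\left(\int f(x)^r\,dx\right)^{(1-s)/(1-r)} - 1\right],$$

with the Rényi entropy $H_r^1(f) = \frac{1}{1-r}\log\int f(x)^r\,dx$ recovered in the limit $s\to 1$ and the Shannon entropy $H_1^1(f) = -\int f(x)\log f(x)\,dx$, the case analyzed by Wong and Chen [9], recovered when $r,s\to 1$.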

How to cite


Esteban, María Dolores, et al. "Order statistics and $(r,s)$-entropy measures." Applications of Mathematics 39.5 (1994): 321-337. <http://eudml.org/doc/32888>.

@article{Esteban1994,
abstract = {K. M. Wong and S. Chen [9] analyzed the Shannon entropy of a sequence of random variables under order restrictions. These results are generalized here using the $(r,s)$-entropies of I. J. Taneja [8]. Upper and lower bounds on the entropy reduction obtained when the sequence is ordered, and conditions under which they are achieved, are derived. Theorems are presented showing the difference between the average entropy of the individual order statistics and the entropy of a member of the original independent identically distributed (i.i.d.) population. Finally, the entropies of the individual order statistics are studied when the probability density function (p.d.f.) of the original i.i.d. sequence is symmetric about its mean.},
author = {Esteban, María Dolores, Morales, Domingo, Pardo, Leandro, Menéndez, María Luisa},
journal = {Applications of Mathematics},
keywords = {unified $(r,s)$-entropy measure; order statistics; Shannon entropy; entropy reduction; logistic distribution},
language = {eng},
number = {5},
pages = {321-337},
publisher = {Institute of Mathematics, Academy of Sciences of the Czech Republic},
title = {Order statistics and $(r,s)$-entropy measures},
url = {http://eudml.org/doc/32888},
volume = {39},
year = {1994},
}

TY - JOUR
AU - Esteban, María Dolores
AU - Morales, Domingo
AU - Pardo, Leandro
AU - Menéndez, María Luisa
TI - Order statistics and $(r,s)$-entropy measures
JO - Applications of Mathematics
PY - 1994
PB - Institute of Mathematics, Academy of Sciences of the Czech Republic
VL - 39
IS - 5
SP - 321
EP - 337
AB - K. M. Wong and S. Chen [9] analyzed the Shannon entropy of a sequence of random variables under order restrictions. These results are generalized here using the $(r,s)$-entropies of I. J. Taneja [8]. Upper and lower bounds on the entropy reduction obtained when the sequence is ordered, and conditions under which they are achieved, are derived. Theorems are presented showing the difference between the average entropy of the individual order statistics and the entropy of a member of the original independent identically distributed (i.i.d.) population. Finally, the entropies of the individual order statistics are studied when the probability density function (p.d.f.) of the original i.i.d. sequence is symmetric about its mean.
LA - eng
KW - unified $(r,s)$-entropy measure; order statistics; Shannon entropy; entropy reduction; logistic distribution
UR - http://eudml.org/doc/32888
ER -

References

  1. Z. Daróczy, Generalized information functions, Information and Control 19 (1971), 181–194. MR0309224, DOI 10.1016/S0019-9958(71)90065-9.
  2. N. Balakrishnan, A. C. Cohen, Order Statistics and Inference: Estimation Methods, Academic Press, 1991. MR1084812.
  3. I. S. Gradshteyn, I. M. Ryzhik, Table of Integrals, Series and Products, Academic Press, 1980. MR1398882.
  4. J. Havrda, F. Charvát, Quantification method of classification processes: concept of structural α-entropy, Kybernetika 3 (1967), 30–35. MR0209067.
  5. A. Rényi, On measures of entropy and information, Proc. 4th Berkeley Symp. Math. Statist. and Prob. 1 (1961), 547–561. Zbl0106.33001, MR0132570.
  6. C. E. Shannon, A mathematical theory of communication, Bell Syst. Tech. J. 27 (1948), 379–423. MR0026286, DOI 10.1002/j.1538-7305.1948.tb01338.x.
  7. B. D. Sharma, D. P. Mittal, New nonadditive measures of entropy for discrete probability distributions, J. Math. Sci. 10 (1975), 28–40. MR0539493.
  8. I. J. Taneja, On generalized information measures and their applications, Advances in Electronics and Electron Physics 76 (1989), 327–413. DOI 10.1016/S0065-2539(08)60580-6.
  9. K. M. Wong, S. Chen, The entropy of ordered sequences and order statistics, IEEE Transactions on Information Theory 36(2) (1990), 276–284. MR1052779, DOI 10.1109/18.52473.
