A note on how Rényi entropy can create a spectrum of probabilistic merging operators
Kybernetika (2019)
- Volume: 55, Issue: 4, page 605-617
- ISSN: 0023-5954
How to cite
Adamčík, Martin. "A note on how Rényi entropy can create a spectrum of probabilistic merging operators." Kybernetika 55.4 (2019): 605-617. <http://eudml.org/doc/295078>.
@article{Adamčík2019,
abstract = {In this paper we present a result that relates the merging of closed convex sets of discrete probability functions by the squared Euclidean distance and by the Kullback-Leibler divergence, drawing inspiration from the Rényi entropy. While selecting the probability function with the highest Shannon entropy appears to be a convincingly justified way of representing a closed convex set of probability functions, the discussion on how to represent several closed convex sets of probability functions is still ongoing. The presented result provides a perspective on this discussion. Furthermore, for those who prefer the standard minimisation based on the squared Euclidean distance, it provides a connection to a probabilistic merging operator based on the Kullback-Leibler divergence, which is closely connected to the Shannon entropy.},
author = {Adamčík, Martin},
journal = {Kybernetika},
keywords = {probabilistic merging; information geometry; Kullback–Leibler divergence; Rényi entropy},
language = {eng},
number = {4},
pages = {605-617},
publisher = {Institute of Information Theory and Automation AS CR},
title = {A note on how Rényi entropy can create a spectrum of probabilistic merging operators},
url = {http://eudml.org/doc/295078},
volume = {55},
year = {2019},
}
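As a quick illustration of the quantities named in the abstract (this sketch is not taken from the paper itself), the Rényi entropy of order α generalises the Shannon entropy, and recovers it in the limit α → 1. A minimal Python sketch:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (natural log)."""
    return -sum(x * math.log(x) for x in p if x > 0)

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha), alpha != 1.

    As alpha -> 1 this converges to the Shannon entropy.
    """
    if alpha == 1:
        return shannon_entropy(p)
    return math.log(sum(x ** alpha for x in p if x > 0)) / (1 - alpha)

p = [0.5, 0.3, 0.2]
# The gap between the two entropies shrinks as alpha approaches 1.
for alpha in (2.0, 1.1, 1.001):
    print(alpha, abs(renyi_entropy(p, alpha) - shannon_entropy(p)))
```

For the uniform distribution every order gives the same value, log n; for non-uniform distributions the Rényi entropy is non-increasing in α.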
TY - JOUR
AU - Adamčík, Martin
TI - A note on how Rényi entropy can create a spectrum of probabilistic merging operators
JO - Kybernetika
PY - 2019
PB - Institute of Information Theory and Automation AS CR
VL - 55
IS - 4
SP - 605
EP - 617
AB - In this paper we present a result that relates the merging of closed convex sets of discrete probability functions by the squared Euclidean distance and by the Kullback-Leibler divergence, drawing inspiration from the Rényi entropy. While selecting the probability function with the highest Shannon entropy appears to be a convincingly justified way of representing a closed convex set of probability functions, the discussion on how to represent several closed convex sets of probability functions is still ongoing. The presented result provides a perspective on this discussion. Furthermore, for those who prefer the standard minimisation based on the squared Euclidean distance, it provides a connection to a probabilistic merging operator based on the Kullback-Leibler divergence, which is closely connected to the Shannon entropy.
LA - eng
KW - probabilistic merging; information geometry; Kullback–Leibler divergence; Rényi entropy
UR - http://eudml.org/doc/295078
ER -
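The paper merges closed convex sets of probability functions; as a much simpler special case (single probability functions rather than sets, and not the paper's construction), the two distance measures named in the abstract have standard closed-form merging minimisers: minimising the sum of squared Euclidean distances yields the arithmetic mean, while minimising the sum of Kullback-Leibler divergences KL(q || p_i) yields the normalised geometric mean. A hedged sketch, assuming all coordinates are strictly positive:

```python
import math

def euclidean_merge(ps):
    """Minimiser of sum_i ||q - p_i||^2 over the simplex: the arithmetic mean."""
    n = len(ps)
    return [sum(p[j] for p in ps) / n for j in range(len(ps[0]))]

def kl_merge(ps):
    """Minimiser of sum_i KL(q || p_i): the normalised geometric mean.

    Assumes every p_i has strictly positive coordinates.
    """
    n = len(ps)
    g = [math.exp(sum(math.log(p[j]) for p in ps) / n) for j in range(len(ps[0]))]
    total = sum(g)
    return [x / total for x in g]

ps = [[0.7, 0.2, 0.1], [0.4, 0.4, 0.2]]
print(euclidean_merge(ps))  # close to [0.55, 0.3, 0.15]
print(kl_merge(ps))         # normalised geometric mean, sums to 1
```

The two minimisers generally disagree, which is one reason the choice of divergence matters when merging probabilistic knowledge.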