Entropy, Inaccuracy and Information
P. Nath (1968)
Metrika
Similar items:
D.P. Mittal (1975), Metrika
I. Csiszár (1998), Metrika
A.B. El-Sayed (1978), Metrika
B.D. Sharma, I.J. Taneja (1975), Metrika
H.O. Georgii (1990), Metrika
J. S. Chawla (1977), Kybernetika
P. Nath, P.N. Arora (1972), Metrika
Martin Adamčík (2019), Kybernetika
In this paper we present a result relating the merging of closed convex sets of discrete probability functions by the squared Euclidean distance and by the Kullback-Leibler divergence, drawing inspiration from the Rényi entropy. While selecting the probability function with the highest Shannon entropy appears to be a convincingly justified way of representing a closed convex set of probability functions, the discussion on how to represent several closed convex sets of probability...
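The quantities this abstract builds on are standard for discrete probability functions: Shannon entropy, the Kullback-Leibler divergence, and the squared Euclidean distance. A minimal sketch of all three (function names and the example distributions are illustrative, not from the paper):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i, in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q); assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def squared_euclidean(p, q):
    """Squared Euclidean distance between two probability vectors."""
    return sum((pi - qi) ** 2 for pi, qi in zip(p, q))

# Over all probability functions on a fixed finite support, the uniform
# distribution maximizes Shannon entropy.
uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]

print(shannon_entropy(uniform))          # log 4 ≈ 1.3863
print(shannon_entropy(skewed))           # strictly smaller than log 4
print(kl_divergence(skewed, uniform))    # nonnegative, zero iff equal
print(squared_euclidean(skewed, uniform))
```

Representing a closed convex set by its maximum-entropy element, as the abstract describes, amounts to maximizing `shannon_entropy` over the set; the paper's comparison concerns which of the two dissimilarity measures is used when merging several such sets.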
Margrit Gauglhofer, A. T. Bharucha-Reid (1973), Annales de l'I.H.P. Probabilités et statistiques
Tim Austin (2015), Studia Mathematica
A number of recent works have sought to generalize the Kolmogorov-Sinai entropy of probability-preserving transformations to the setting of Markov operators acting on the integrable functions on a probability space (X,μ). These works have culminated in a proof by Downarowicz and Frej that various competing definitions all coincide, and that the resulting quantity is uniquely characterized by certain abstract properties. On the other hand, Makarov has shown that this...
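For intuition about the classical quantity being generalized here: for the shift generated by a finite-state stationary Markov chain, the Kolmogorov-Sinai entropy equals the chain's entropy rate, h = -Σ_i π_i Σ_j P_ij log P_ij, where π is the stationary distribution. A minimal sketch for a two-state chain (the matrix `P` below is an illustrative example, not from the works cited):

```python
import math

def stationary_distribution(P):
    """Stationary distribution of a 2x2 row-stochastic matrix P.

    With a = P[0][1] (rate of leaving state 0) and b = P[1][0]
    (rate of leaving state 1), pi = (b/(a+b), a/(a+b)).
    """
    a, b = P[0][1], P[1][0]
    return [b / (a + b), a / (a + b)]

def markov_entropy_rate(P):
    """Entropy rate h = -sum_i pi_i sum_j P_ij log P_ij (in nats),
    which is the Kolmogorov-Sinai entropy of the associated Markov shift."""
    pi = stationary_distribution(P)
    return -sum(
        pi[i] * P[i][j] * math.log(P[i][j])
        for i in range(2)
        for j in range(2)
        if P[i][j] > 0
    )

P = [[0.9, 0.1], [0.2, 0.8]]
print(markov_entropy_rate(P))
```

In the degenerate case P = [[0.5, 0.5], [0.5, 0.5]] the chain is an i.i.d. fair coin and the entropy rate is log 2, the familiar Shannon value.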