A note on characterizations of entropies
J. S. Chawla (1977)
Kybernetika
Similarity:
Margrit Gauglhofer, A. T. Bharucha-Reid (1973)
Annales de l'I.H.P. Probabilités et statistiques
Similarity:
Martin Adamčík (2019)
Kybernetika
Similarity:
In this paper we present a result that relates merging of closed convex sets of discrete probability functions by, respectively, the squared Euclidean distance and the Kullback-Leibler divergence, drawing inspiration from the Rényi entropy. While selecting the probability function with the highest Shannon entropy appears to be a convincingly justified way of representing a closed convex set of probability functions, the discussion on how to represent several closed convex sets of probability...
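The three quantities the abstract contrasts are standard and can be made concrete. A minimal sketch of Shannon entropy, Kullback-Leibler divergence, and squared Euclidean distance for discrete probability vectors (the example distributions below are illustrative, not taken from the paper):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (natural log, 0 log 0 := 0)."""
    return -sum(x * math.log(x) for x in p if x > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q); assumes q_i > 0 wherever p_i > 0."""
    return sum(x * math.log(x / y) for x, y in zip(p, q) if x > 0)

def sq_euclidean(p, q):
    """Squared Euclidean distance between probability vectors."""
    return sum((x - y) ** 2 for x, y in zip(p, q))

# Illustrative distributions on a 4-point space.
uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.7, 0.1, 0.1, 0.1]
```

Note the asymmetry: D(p || q) is generally not D(q || p), whereas the squared Euclidean distance is symmetric, which is part of why merging by the two criteria can behave differently.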
Peter Harremoës (2009)
Kybernetika
Similarity:
The exact range of the jointly attainable values of several Rényi entropies is determined. The method is based on topology, with special emphasis on the orientation of the objects studied. As in the case where only two orders of Rényi entropy are studied, one can parametrize the boundary of the range. An explicit formula for a tight upper or lower bound for one order of entropy in terms of another order of entropy cannot be given.
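For reference, the Rényi entropy of order α of a discrete distribution p is H_α(p) = (1/(1−α)) log Σ_i p_i^α for α ≠ 1, with the limit α → 1 recovering the Shannon entropy. A minimal sketch (the example distribution is illustrative only):

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha), alpha != 1.
    The limit alpha -> 1 gives the Shannon entropy, handled as a special case."""
    if alpha == 1:
        return -sum(x * math.log(x) for x in p if x > 0)
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

# H_alpha is nonincreasing in alpha for a fixed distribution,
# which constrains the joint range of several orders.
p = [0.5, 0.25, 0.25]
values = [renyi_entropy(p, a) for a in (0.5, 1, 2)]
```

For the uniform distribution on n points, H_α = log n for every order α, so all orders coincide at one corner of the joint range.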
Tim Austin (2015)
Studia Mathematica
Similarity:
A number of recent works have sought to generalize the Kolmogorov-Sinai entropy of probability-preserving transformations to the setting of Markov operators acting on the integrable functions on a probability space (X,μ). These works have culminated in a proof by Downarowicz and Frej that various competing definitions all coincide, and that the resulting quantity is uniquely characterized by certain abstract properties. On the other hand, Makarov has shown that this...
Anna Fioretto, Andrea Sgarro (1996)
Mathware and Soft Computing
Similarity:
We discuss pragmatic information measures (hypergraph entropy and fractional entropy) inspired by source-coding theory (rate-distortion theory). We rephrase the problem in the language of evidence theory, expressing the pragmatic requirements of the human agent in terms of suitable bodies of evidence, or BOEs. We tackle the situation when the overall uncertainty is removed in two steps. In the case when fractional entropy measures the first-step (partial, pragmatic) uncertainty,...