Information-theoretic risk estimates in statistical decision
Albert Pérez (1967)
Kybernetika
Similarity:
J. S. Chawla (1977)
Kybernetika
Similarity:
Martin Adamčík (2019)
Kybernetika
Similarity:
In this paper we present a result that relates merging of closed convex sets of discrete probability functions by the squared Euclidean distance and by the Kullback-Leibler divergence, respectively, drawing inspiration from the Rényi entropy. While selecting the probability function with the highest Shannon entropy appears to be a convincingly justified way of representing a closed convex set of probability functions, the discussion on how to represent several closed convex sets of probability...
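As a rough illustration of the quantities this abstract mentions (not the paper's actual construction), the following sketch computes Shannon entropy and Kullback-Leibler divergence for discrete probability functions and selects a maximum-entropy representative from a finite stand-in for a closed convex set; the candidate distributions are invented for the example.

```python
import math

def shannon_entropy(p):
    # Shannon entropy H(p) = -sum_i p_i * log(p_i), natural logarithm.
    return -sum(x * math.log(x) for x in p if x > 0)

def kl_divergence(p, q):
    # Kullback-Leibler divergence D(p || q) = sum_i p_i * log(p_i / q_i).
    return sum(x * math.log(x / y) for x, y in zip(p, q) if x > 0)

# A finite stand-in for a closed convex set of probability functions
# on a three-element space (illustrative values only).
candidates = [
    (0.7, 0.2, 0.1),
    (0.5, 0.3, 0.2),
    (1/3, 1/3, 1/3),
]

# Maximum-entropy selection: represent the set by the member with highest H.
best = max(candidates, key=shannon_entropy)
```

On this toy set the uniform distribution is selected, since the uniform distribution uniquely maximizes Shannon entropy (here H = log 3), and D(p || p) = 0 for any p.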
D. Vivona, M. Divari (2007)
Mathware and Soft Computing
Similarity:
Anna Fioretto, Andrea Sgarro (1996)
Mathware and Soft Computing
Similarity:
We discuss pragmatic information measures (hypergraph entropy and fractional entropy) inspired by source-coding theory (rate-distortion theory). We re-phrase the problem in the language of evidence theory, by expressing the pragmatic requirements of the human agent in terms of suitable bodies of evidence, or BOE's. We tackle the situation when the overall uncertainty is removed in two steps. In the case when fractional entropy measures the first-step (partial, pragmatic) uncertainty,...
Tim Austin (2015)
Studia Mathematica
Similarity:
A number of recent works have sought to generalize the Kolmogorov-Sinai entropy of probability-preserving transformations to the setting of Markov operators acting on the integrable functions on a probability space (X,μ). These works have culminated in a proof by Downarowicz and Frej that various competing definitions all coincide, and that the resulting quantity is uniquely characterized by certain abstract properties. On the other hand, Makarov has shown that this...