Displaying similar documents to “Bounds on guessing numbers and secret sharing combining information theory methods”

On Entropy Bumps for Calderón-Zygmund Operators

Michael T. Lacey, Scott Spencer (2015)

Concrete Operators

Similarity:

We study two-weight inequalities in the recent innovative language of ‘entropy’ due to Treil-Volberg. The inequalities are extended to Lp, for 1 < p ≠ 2 < ∞, with new short proofs. A result proved is as follows. Let ℇ be a monotonic increasing function on (1,∞) which satisfies [...] Let σ and w be two weights on Rd. If this supremum is finite, for a choice of 1 < p < ∞, [...] then any Calderón-Zygmund operator T satisfies the bound [...]

A pragmatic uncertainty measure based on rate-distortion theory and the uncertainty of BOE's.

Anna Fioretto, Andrea Sgarro (1996)

Mathware and Soft Computing

Similarity:

We discuss pragmatic information measures (hypergraph entropy and fractional entropy) inspired by source-coding theory (rate-distortion theory). We re-phrase the problem in the language of evidence theory, by expressing the pragmatic requirements of the human agent in terms of suitable bodies of evidence, or BOE's. We tackle the situation when the overall uncertainty is removed in two steps. In the case when fractional entropy measures the first-step (partial, pragmatic) uncertainty,...

A new approach to mutual information

Fumio Hiai, Dénes Petz (2007)

Banach Center Publications

Similarity:

A new expression as a certain asymptotic limit via "discrete micro-states" of permutations is provided for the mutual information of both continuous and discrete random variables.
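Hiai and Petz characterise mutual information through an asymptotic limit over "discrete micro-states"; as a hypothetical baseline for comparison (not their construction), the standard finite-alphabet definition I(X;Y) = Σ p(x,y) log₂ p(x,y)/(p(x)p(y)) can be computed directly from a joint distribution:

```python
import math

def mutual_information(joint):
    """Mutual information I(X;Y) in bits, with the joint distribution
    given as a dict {(x, y): probability}."""
    # Marginal distributions of X and Y.
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    # Sum p(x,y) * log2( p(x,y) / (p(x) p(y)) ) over the support.
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Perfectly correlated fair bits carry exactly one bit of mutual information.
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))  # → 1.0
```

Independent variables give I(X;Y) = 0, since every log term vanishes.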

A note on how Rényi entropy can create a spectrum of probabilistic merging operators

Martin Adamčík (2019)

Kybernetika

Similarity:

In this paper we present a result that relates merging of closed convex sets of discrete probability functions respectively by the squared Euclidean distance and the Kullback-Leibler divergence, using an inspiration from the Rényi entropy. While selecting the probability function with the highest Shannon entropy appears to be a convincingly justified way of representing a closed convex set of probability functions, the discussion on how to represent several closed convex sets of probability...
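As a toy illustration only (not Adamčík's construction for convex sets), the two distances mentioned lead to familiar centroids when merging finitely many probability functions: minimising the total squared Euclidean distance gives the arithmetic mean, while minimising Σᵢ KL(p ‖ qᵢ) gives the normalised geometric mean. A minimal sketch, with `shannon_entropy` included for the representation criterion mentioned in the abstract:

```python
import math

def euclidean_merge(dists):
    """Minimiser of sum_i ||p - q_i||^2 over the simplex: the arithmetic mean."""
    n = len(dists)
    return [sum(d[k] for d in dists) / n for k in range(len(dists[0]))]

def kl_merge(dists):
    """Minimiser of sum_i KL(p || q_i): the normalised geometric mean."""
    n = len(dists)
    gm = [math.prod(d[k] for d in dists) ** (1.0 / n)
          for k in range(len(dists[0]))]
    z = sum(gm)  # renormalise so the merge is again a probability function
    return [g / z for g in gm]

def shannon_entropy(p):
    """Shannon entropy in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

q1, q2 = [0.7, 0.3], [0.5, 0.5]
print(euclidean_merge([q1, q2]))
print(kl_merge([q1, q2]))
print(shannon_entropy([0.5, 0.5]))
```

Both merges agree on symmetric inputs but generally differ, which is the kind of spectrum between operators that the Rényi-entropy viewpoint interpolates.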