Displaying 181 – 200 of 268

On the amount of information resulting from empirical and theoretical knowledge.

Igor Vajda, Arnost Vesely, Jana Zvarova (2005)

Revista Matemática Complutense

We present a mathematical model that allows us to formally define the concepts of empirical and theoretical knowledge. The model consists of a finite set P of predicates and a probability space (Ω, S, P) over a finite set Ω, called the ontology, which consists of objects ω for which each predicate π ∈ P is either valid (π(ω) = 1) or not valid (π(ω) = 0). Since this is a first step in this area, our approach is as simple as possible, yet still nontrivial, as demonstrated by examples. More realistic approach...
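As a toy illustration of the kind of model this abstract describes (all object names, weights, and predicate values below are hypothetical, not taken from the paper): a finite ontology Ω carrying a probability measure, with predicates as 0/1 valuations on its objects, so that each predicate has a well-defined probability of being valid.

```python
# Hypothetical sketch of the abstract's model: a finite ontology Omega with a
# probability measure, and predicates as 0/1 valuations on its objects.
omega = ["o1", "o2", "o3", "o4"]                      # objects of the ontology
prob = {"o1": 0.4, "o2": 0.3, "o3": 0.2, "o4": 0.1}   # probability measure on Omega
predicates = {
    "pi1": {"o1": 1, "o2": 1, "o3": 0, "o4": 0},      # pi1 valid on o1, o2 only
}

def prob_valid(name):
    # probability that predicate `name` is valid on a randomly drawn object
    return sum(prob[w] for w in omega if predicates[name][w] == 1)

print(prob_valid("pi1"))  # ≈ 0.7 = 0.4 + 0.3
```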

On the computation of covert channel capacity

Eugene Asarin, Cătălin Dima (2010)

RAIRO - Theoretical Informatics and Applications

We address the problem of computing the capacity of a covert channel, modeled as a nondeterministic transducer. We give three possible statements of the notion of “covert channel capacity” and relate the different definitions. We then provide several methods for computing lower and upper bounds on the capacity of a channel. We show that in some cases, including the case of input-deterministic channels, the capacity of the channel can be computed exactly (e.g. in the form...
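A hedged numerical sketch (not the paper's construction): for an output-deterministic channel, one natural capacity notion is the growth rate (1/n)·log2 N(n), where N(n) counts distinct output words of length n. Counting labeled paths in the channel's output graph approximates this; the example graph below is the classical "golden mean" constraint (no two consecutive 1s), whose rate is log2 of the golden ratio.

```python
import math

# adj[i][j] = number of labeled edges from state i to state j in the
# output graph; this example forbids two consecutive 1s ("golden mean").
adj = [[1, 1],
       [1, 0]]

def count_words(adj, start, n):
    # number of length-n labeled paths starting in `start`
    vec = [1 if i == start else 0 for i in range(len(adj))]
    for _ in range(n):
        vec = [sum(adj[i][j] * vec[i] for i in range(len(adj)))
               for j in range(len(adj))]
    return sum(vec)

n = 30
rate = math.log2(count_words(adj, 0, n)) / n
print(rate)  # tends to log2((1+sqrt(5))/2) ≈ 0.694 as n grows
```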

On the g-entropy and its Hudetz correction

Beloslav Riečan (2002)

Kybernetika

The Hudetz correction of fuzzy entropy is applied to the g-entropy. The new invariant is expressed by means of the Hudetz correction of fuzzy entropy.

On the Jensen-Shannon divergence and the variation distance for categorical probability distributions

Jukka Corander, Ulpu Remes, Timo Koski (2021)

Kybernetika

We establish a decomposition of the Jensen-Shannon divergence into a linear combination of a scaled Jeffreys' divergence and a reversed Jensen-Shannon divergence. Upper and lower bounds for the Jensen-Shannon divergence are then found in terms of the squared (total) variation distance. The derivations rely upon the Pinsker inequality and the reverse Pinsker inequality. We use these bounds to prove the asymptotic equivalence of the maximum likelihood estimate and minimum Jensen-Shannon divergence...
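A small numerical sketch of the quantities this abstract relates (the distributions are chosen arbitrarily for illustration): the Jensen-Shannon divergence, the (total) variation distance, and the Pinsker inequality KL(P‖Q) ≥ 2·TV(P,Q)², all with natural logarithms.

```python
import math

def kl(p, q):
    # Kullback-Leibler divergence in nats; assumes q[i] > 0 wherever p[i] > 0
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js(p, q):
    # Jensen-Shannon divergence: average KL to the midpoint mixture
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def tv(p, q):
    # (total) variation distance
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

p = [0.5, 0.3, 0.2]
q = [0.2, 0.3, 0.5]
assert kl(p, q) >= 2 * tv(p, q) ** 2  # Pinsker inequality
assert js(p, q) <= math.log(2)        # JS divergence is bounded by ln 2
```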

On weighted entropy of type (α, β) and its generalizations

Gur Dial, Inder Jeet Taneja (1981)

Aplikace matematiky

Belis and Guiasu studied a generalization of Shannon entropy called weighted or useful entropy. In this paper, the weighted entropy of type (α, β) is defined and characterized, and some of its properties are studied. Further generalizations of weighted entropy involving more parameters are also specified.
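For reference, the Belis-Guiasu weighted ("useful") entropy that the type-(α, β) family generalizes is H_w(P) = −Σᵢ wᵢ pᵢ log₂ pᵢ; the paper's parameterized form is not reproduced here, so this sketch shows only the base case.

```python
import math

def weighted_entropy(weights, probs):
    # Belis-Guiasu weighted entropy H_w(P) = -sum_i w_i p_i log2(p_i), in bits;
    # zero-probability outcomes contribute nothing.
    return -sum(w * p * math.log2(p) for w, p in zip(weights, probs) if p > 0)

p = [0.5, 0.25, 0.25]
print(weighted_entropy([1, 1, 1], p))  # unit weights recover Shannon entropy: 1.5
```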