Matrix trace inequalities on the Tsallis entropies.
The problem to maximize the information divergence from an exponential family is generalized to the setting of Bregman divergences and suitably defined Bregman families.
G. Edelman, O. Sporns and G. Tononi have introduced the neural complexity of a family of random variables, defining it as a specific average of mutual information over subfamilies. We show that their choice of weights satisfies two natural properties, namely invariance under permutations and additivity, and we call any functional satisfying these two properties an intricacy. We classify all intricacies in terms of probability laws on the unit interval and study the growth rate of maximal intricacies...
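The abstract describes neural complexity as an average of mutual information over subfamilies but does not spell out the weights. The following is a minimal numpy sketch of one such functional, assuming finite-valued variables given by a joint pmf and, purely for illustration, uniform averaging over the subsets of each size (the actual Edelman-Sporns-Tononi weights may differ); `neural_complexity` and `subset_mutual_information` are hypothetical names.

```python
import itertools
import numpy as np

def entropy(p):
    """Shannon entropy (in nats) of a pmf stored in an array."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def subset_mutual_information(joint, S):
    """MI(X_S; X_{S^c}) for a joint pmf `joint` (one axis per variable).

    Uses MI(S; S^c) = H(X_S) + H(X_{S^c}) - H(X_1..X_n).
    """
    n = joint.ndim
    Sc = tuple(i for i in range(n) if i not in S)
    H_S = entropy(joint.sum(axis=Sc))        # marginal on S
    H_Sc = entropy(joint.sum(axis=tuple(S))) # marginal on S^c
    return H_S + H_Sc - entropy(joint)

def neural_complexity(joint):
    """Average MI(X_S; X_{S^c}) uniformly within each subset size k = 1..n-1,
    then uniformly over k (an illustrative choice of weights)."""
    n = joint.ndim
    total = 0.0
    for k in range(1, n):
        mis = [subset_mutual_information(joint, S)
               for S in itertools.combinations(range(n), k)]
        total += sum(mis) / len(mis)
    return total / (n - 1)
```

Permutation invariance holds because the average within each size treats all subsets alike; for an independent product distribution every subset MI vanishes, so the functional is zero.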
We discuss how the usual set-theoretic and arithmetic operations on fuzzy sets and fuzzy numbers affect energies and entropies, relating the energies and entropies of the operand fuzzy sets to those of the resulting fuzzy sets, and we also compare the entropies and energies of the results of several of those operations.
This paper deals with a characterization of the totally compositive measures of uncertainty which satisfy the branching property. A procedure to construct all continuous measures in this class is given.
The aim of this paper is to define global measures of uncertainty in the framework of Dempster-Shafer's Theory of Evidence. Starting from the concepts of entropy and specificity introduced by Yager, two measures are considered: the lower entropy and the upper entropy.
Total correlation (‘TC’) and dual total correlation (‘DTC’) are two classical ways to quantify the correlation among an $n$-tuple of random variables. They both reduce to mutual information when $n = 2$. The first part of this paper sets up the theory of TC and DTC for general random variables, not necessarily finite-valued. This generality has not been exposed in the literature before. The second part considers the structural implications when a joint distribution has small TC or DTC. If , then is...
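For finite-valued variables the two quantities have explicit entropy formulas: TC = Σᵢ H(Xᵢ) − H(X₁..Xₙ) and DTC = H(X₁..Xₙ) − Σᵢ H(Xᵢ | X₋ᵢ). A minimal numpy sketch, assuming the joint pmf is stored as an n-dimensional array with one axis per variable (function names are illustrative):

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in nats) of a pmf stored in an array."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def total_correlation(joint):
    """TC = sum_i H(X_i) - H(X_1..X_n)."""
    n = joint.ndim
    marginal_entropies = sum(
        entropy(joint.sum(axis=tuple(j for j in range(n) if j != i)))
        for i in range(n))
    return marginal_entropies - entropy(joint)

def dual_total_correlation(joint):
    """DTC = H(X_1..X_n) - sum_i H(X_i | X_{-i}),
    with H(X_i | X_{-i}) = H(joint) - H(marginal with axis i summed out)."""
    n = joint.ndim
    H_joint = entropy(joint)
    cond_sum = sum(H_joint - entropy(joint.sum(axis=i)) for i in range(n))
    return H_joint - cond_sum
```

For n = 2 both expressions collapse to H(X) + H(Y) − H(X, Y), i.e. the mutual information, and both vanish on product distributions, matching the reductions stated above.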