On Amount of Information of Type-β and Other Measures.
A general formalization is given for asynchronous multiple-access channels which admits different assumptions on delays. This general framework allows the analysis of hitherto unexplored models, leading to new and interesting capacity regions. The main result is a single-letter characterization of the capacity region in the case of three senders, two of which are synchronous with each other while the third is not synchronized with them.
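For background, the classical two-sender synchronous multiple-access channel (not the three-sender asynchronous model of the paper) has the well-known capacity region given, for fixed independent input distributions, by

\begin{align*}
R_1 &\le I(X_1; Y \mid X_2),\\
R_2 &\le I(X_2; Y \mid X_1),\\
R_1 + R_2 &\le I(X_1, X_2; Y),
\end{align*}

with the full region obtained as the convex closure over the product input distributions $p(x_1)\,p(x_2)$. Single-letter characterizations of the kind obtained in the paper generalize this form to the asynchronous setting.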
Kullback's relative information and Kerridge's inaccuracy are two information-theoretic measures associated with a pair of probability distributions of a discrete random variable. The authors study a generalized measure which contains, in particular, a parametric generalization of both relative information and inaccuracy. Some important properties of this generalized measure, along with an inversion theorem, are also established.
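For reference, the two measures named above are standard: for discrete distributions $P = (p_1, \dots, p_n)$ and $Q = (q_1, \dots, q_n)$,

\begin{align*}
D(P \,\|\, Q) &= \sum_i p_i \log \frac{p_i}{q_i} &&\text{(Kullback's relative information)},\\
K(P, Q) &= -\sum_i p_i \log q_i &&\text{(Kerridge's inaccuracy)},
\end{align*}

and they are linked by $K(P, Q) = H(P) + D(P \,\|\, Q)$, where $H(P)$ is the Shannon entropy; the generalized measure studied by the authors contains parametric analogues of both.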
In this paper, measurable solutions of a functional equation with four unknown functions are obtained. As an application of the measurable solutions, a joint characterization of Shannon's entropy and the entropy of type β is given.
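As background (assuming the usual Havrda–Charvát/Daróczy definition of the entropy of type β), the two entropies jointly characterized are

\begin{align*}
H(P) &= -\sum_i p_i \log_2 p_i,\\
H_\beta(P) &= \left(2^{1-\beta} - 1\right)^{-1} \left(\sum_i p_i^{\beta} - 1\right), \qquad \beta \neq 1,
\end{align*}

with $H_\beta(P) \to H(P)$ as $\beta \to 1$, so the type-β entropy is a one-parameter deformation of Shannon's entropy.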
We investigate the solution sets of a special kind of linear inequality system. In particular, we derive characterizations of these sets in terms of minimal solution sets. The inequalities studied arise as information inequalities in the context of Bayesian networks. This allows us to deduce structural properties of Bayesian networks, which is important for causal inference.
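To illustrate what an information inequality looks like concretely, the sketch below (not the authors' system; the joint distribution is an arbitrary illustrative table) checks the basic subadditivity inequality H(X) + H(Y) ≥ H(X, Y), one of the linear constraints on entropy vectors that such inequality systems are built from:

```python
import math

def H(pmf):
    # Shannon entropy in bits of a distribution given as {outcome: probability}.
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginal(joint, axis):
    # Sum the joint table over the other coordinate to get a marginal.
    m = {}
    for outcome, p in joint.items():
        m[outcome[axis]] = m.get(outcome[axis], 0.0) + p
    return m

# Hypothetical joint distribution of (X, Y); any nonnegative table summing to 1 works.
joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

hx, hy, hxy = H(marginal(joint, 0)), H(marginal(joint, 1)), H(joint)

# Subadditivity: H(X) + H(Y) >= H(X, Y), with equality iff X and Y are independent.
assert hx + hy >= hxy
```

The solution set of a system of such linear inequalities over entropy coordinates is exactly the kind of polyhedral object whose minimal solutions the paper characterizes.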