This article studies exponential families on finite sets for which the information divergence of an arbitrary probability distribution from the family is bounded by some constant. A particular class of low-dimensional exponential families with low values of this divergence bound can be obtained from partitions of the state space. The main results concern optimality properties of these partition exponential families. One case is studied in detail. This case is special because the corresponding family then contains all probability...
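As an illustrative sketch (not from the article itself): for a partition of the state space, the distribution that keeps each block's total probability but spreads it uniformly within the block lies in the partition family, and the divergence to it can serve as a concrete, computable example of the divergence from such a family. The helper names below are hypothetical.

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def project_onto_partition_family(p, blocks):
    """Map p to the member of the partition family that keeps each
    block's total probability, spread uniformly inside the block."""
    q = [0.0] * len(p)
    for block in blocks:
        mass = sum(p[i] for i in block)
        for i in block:
            q[i] = mass / len(block)
    return q

# Toy example: state space {0,1,2,3}, partition {{0,1},{2,3}}.
p = [0.5, 0.1, 0.3, 0.1]
blocks = [[0, 1], [2, 3]]
q = project_onto_partition_family(p, blocks)
d = kl(p, q)
# For this partition the divergence cannot exceed the log of the
# largest block size (here log 2), consistent with a constant bound.
```

The point of the sketch is only that the divergence stays below a small constant determined by the partition, matching the kind of bound the abstract describes.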
The problem of maximizing the information divergence from an exponential family is generalized to the setting of Bregman divergences and suitably defined Bregman families.
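For readers unfamiliar with the generalization, a minimal sketch of a Bregman divergence: given a convex function F, the divergence is D_F(p, q) = F(p) - F(q) - <∇F(q), p - q>, and choosing F to be negative entropy recovers the Kullback-Leibler divergence. The function names here are hypothetical.

```python
import numpy as np

def bregman(F, gradF, p, q):
    """Bregman divergence D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return F(p) - F(q) - gradF(q) @ (p - q)

# Negative entropy as the generating convex function.
neg_entropy = lambda x: float(np.sum(x * np.log(x)))
grad_neg_entropy = lambda x: np.log(x) + 1.0

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])
d_bregman = bregman(neg_entropy, grad_neg_entropy, p, q)
d_kl = float(np.sum(p * np.log(p / q)))
# On the probability simplex the two values coincide.
```

This is only the standard definition; the abstract's Bregman families are the corresponding generalization of exponential families.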
We compute the expected value of the Kullback-Leibler divergence of various fundamental statistical models with respect to Dirichlet priors. For the uniform prior, the expected divergence of any model containing the uniform distribution is bounded by a constant. For the models that we consider, this bound is approached as the cardinality of the sample space tends to infinity, provided the model dimension remains relatively small. For Dirichlet priors with reasonable concentration parameters the expected...
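A small Monte Carlo sketch of the simplest instance of this setup, under the assumption that the model is the single uniform distribution and the prior is the flat Dirichlet(1, ..., 1) (sampled as normalized i.i.d. exponentials). The estimate illustrates that the expected divergence stays below a modest constant even for a large sample space; the names are hypothetical.

```python
import math
import random

def sample_flat_dirichlet(n, rng):
    """Sample from Dirichlet(1, ..., 1) as normalized Exp(1) variables."""
    g = [rng.expovariate(1.0) for _ in range(n)]
    s = sum(g)
    return [x / s for x in g]

def kl_to_uniform(p):
    """D(p || u) in nats, where u is uniform on len(p) states."""
    n = len(p)
    return sum(pi * math.log(pi * n) for pi in p if pi > 0)

rng = random.Random(0)
n = 100        # cardinality of the sample space
samples = 5000
mean_div = sum(
    kl_to_uniform(sample_flat_dirichlet(n, rng)) for _ in range(samples)
) / samples
# mean_div is well below 1 nat, illustrating a constant bound
# independent of the (large) sample-space cardinality.
```

This only probes the one-point model; the abstract's results cover richer models containing the uniform distribution.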
We study the unique information function defined by Bertschinger et al. within the framework of information decompositions. In particular, we study uniqueness and support of the solutions to the convex optimization problem underlying its definition. We identify sufficient conditions for non-uniqueness of solutions with full support in terms of conditional independence constraints and in terms of the cardinalities of the underlying random variables. Our results are based on a reformulation of the first order conditions...