Optimally approximating exponential families

Johannes Rauh — 2013

Kybernetika

This article studies exponential families ℰ on finite sets such that the information divergence D(P‖ℰ) of an arbitrary probability distribution P from ℰ is bounded by some constant D > 0. A particular class of low-dimensional exponential families that have low values of D can be obtained from partitions of the state space. The main results concern optimality properties of these partition exponential families. The case where D = log(2) is studied in detail. This case is special, because if D < log(2), then ℰ contains all probability...
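
As a rough numerical illustration of the quantity in this abstract, here is a minimal Python sketch of the divergence of a distribution P from a partition exponential family. It relies on the standard fact that the reverse I-projection of P onto a partition model spreads each block's mass uniformly over that block; the helper name and the example partition are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def divergence_from_partition_family(p, blocks):
    """D(P || E) for the partition exponential family given by `blocks`.

    The reverse I-projection of P onto a partition model spreads each
    block's mass uniformly over the block, so the divergence can be
    computed in closed form. (Hypothetical helper, for illustration.)
    """
    p = np.asarray(p, dtype=float)
    q = np.empty_like(p)
    for block in blocks:                    # blocks partition {0, ..., len(p)-1}
        idx = np.asarray(block)
        q[idx] = p[idx].sum() / len(idx)    # uniform within the block
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Partition of a 4-element state space into two pairs: with all blocks
# of size two, the divergence of any P never exceeds log(2) ≈ 0.693,
# the threshold the abstract highlights.
p = np.array([0.7, 0.1, 0.15, 0.05])
print(divergence_from_partition_family(p, [[0, 1], [2, 3]]))
```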

Scaling of model approximation errors and expected entropy distances

Guido F. Montúfar, Johannes Rauh — 2014

Kybernetika

We compute the expected value of the Kullback-Leibler divergence of various fundamental statistical models with respect to Dirichlet priors. For the uniform prior, the expected divergence of any model containing the uniform distribution is bounded by a constant 1 − γ. For the models that we consider, this bound is approached as the cardinality of the sample space tends to infinity, if the model dimension remains relatively small. For Dirichlet priors with reasonable concentration parameters the expected...
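
Here γ is the Euler–Mascheroni constant, so the bound is 1 − γ ≈ 0.4228. For the simplest case, where the model is just the uniform distribution u and D(P‖u) = log n − H(P), a short Monte Carlo sketch (uniform Dirichlet prior assumed; helper name is hypothetical) shows the expected divergence approaching this limit as the sample space grows:

```python
import numpy as np

rng = np.random.default_rng(0)

def expected_divergence_from_uniform(n, samples=20000):
    """Monte Carlo estimate of E[D(P || uniform)] for P ~ Dirichlet(1, ..., 1)."""
    p = rng.dirichlet(np.ones(n), size=samples)
    entropy = -np.sum(np.where(p > 0, p * np.log(p), 0.0), axis=1)
    return float(np.mean(np.log(n) - entropy))   # D(P || u) = log n - H(P)

for n in (2, 10, 100, 1000):
    print(n, expected_divergence_from_uniform(n))
print("limit 1 - gamma:", 1 - np.euler_gamma)    # ≈ 0.4228
```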

Properties of unique information

Johannes Rauh, Maik Schünemann, Jürgen Jost — 2021

Kybernetika

We study the unique information function UI(T : X∖Y) defined by Bertschinger et al. within the framework of information decompositions. In particular, we study uniqueness and support of the solutions to the convex optimization problem underlying the definition of UI. We identify sufficient conditions for non-uniqueness of solutions with full support in terms of conditional independence constraints and in terms of the cardinalities of T, X and Y. Our results are based on a reformulation of the first order conditions...
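
The convex program behind UI minimizes the conditional mutual information I_Q(T; X | Y) over all joint distributions Q that agree with P on the (T, X)- and (T, Y)-marginals. The following is a rough brute-force sketch for binary variables using a generic SLSQP solver; it is an illustration of that convex program under these assumptions, not the authors' method or code.

```python
import numpy as np
from scipy.optimize import minimize

def cond_mutual_info(q):
    """I_Q(T; X | Y) for a flattened joint q over binary T, X, Y."""
    eps = 1e-12
    q = q.reshape(2, 2, 2)
    q_y, q_ty, q_xy = q.sum(axis=(0, 1)), q.sum(axis=1), q.sum(axis=0)
    total = 0.0
    for t in range(2):
        for x in range(2):
            for y in range(2):
                total += q[t, x, y] * np.log(
                    (q[t, x, y] * q_y[y] + eps) / (q_ty[t, y] * q_xy[x, y] + eps))
    return total

def unique_information(p):
    """min of I_Q(T; X | Y) over Q sharing P's (T,X)- and (T,Y)-marginals."""
    p_tx, p_ty = p.sum(axis=2), p.sum(axis=1)
    constraints = [
        {"type": "eq", "fun": lambda q: q.reshape(2, 2, 2).sum(axis=2).ravel() - p_tx.ravel()},
        {"type": "eq", "fun": lambda q: q.reshape(2, 2, 2).sum(axis=1).ravel() - p_ty.ravel()},
    ]
    res = minimize(cond_mutual_info, p.ravel(), bounds=[(0, 1)] * 8,
                   constraints=constraints, method="SLSQP")
    return res.fun

# T = X = Y, perfectly correlated: Y carries the same information as X,
# so the unique information should come out (numerically) zero.
p = np.zeros((2, 2, 2))
p[0, 0, 0] = p[1, 1, 1] = 0.5
print(unique_information(p))
```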
