Conditional probabilities and permutahedron
If conditional independence constraints define a family of positive distributions that is log-convex, then this family turns out to be a Markov model over an undirected graph. This is proved for distributions on products of finite sets and for regular Gaussian distributions. As a consequence, the assertion known as the Brook factorization theorem, the Hammersley–Clifford theorem or the Gibbs–Markov equivalence is obtained.
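A minimal numerical sketch of the factorization side of this equivalence (a toy example of my own, not the paper's proof): a strictly positive distribution on three binary variables is built from positive potentials on the two cliques of the chain graph 1 - 2 - 3, and the conditional independence X1 ⊥ X3 | X2 required by the Markov property is verified.

    import itertools
    import numpy as np

    rng = np.random.default_rng(0)
    psi12 = rng.uniform(0.5, 2.0, size=(2, 2))   # positive potential on the clique {1, 2}
    psi23 = rng.uniform(0.5, 2.0, size=(2, 2))   # positive potential on the clique {2, 3}

    p = np.zeros((2, 2, 2))
    for x1, x2, x3 in itertools.product(range(2), repeat=3):
        p[x1, x2, x3] = psi12[x1, x2] * psi23[x2, x3]
    p /= p.sum()                                 # strictly positive Gibbs distribution

    # Markov property of the chain: p(x1, x3 | x2) factorizes for every value of x2
    for x2 in range(2):
        joint = p[:, x2, :] / p[:, x2, :].sum()
        assert np.allclose(joint, np.outer(joint.sum(axis=1), joint.sum(axis=0)))
    print("the Gibbs factorization over the cliques implies X1 _||_ X3 | X2")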
The information divergence of a probability measure P from an exponential family E over a finite set is defined as the infimum of the divergences of P from Q subject to Q ∈ E. All directional derivatives of the divergence from E are explicitly found. To this end, the behaviour of the conjugate of a log-Laplace transform on the boundary of its domain is analysed. The first order conditions for P to be a maximizer of the divergence from E are presented, including new ones when P is not projectable to E.
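A small numerical companion (the ground set, the statistic f and the measure P below are my own choices): the divergence D(P||E), the infimum of D(P||Q) over Q ∈ E, is approximated for an exponential family on a four-element set by minimizing over the natural parameter.

    import numpy as np
    from scipy.optimize import minimize

    X = np.arange(4)                        # the finite ground set {0, 1, 2, 3}
    f = np.stack([X, X**2], axis=1)         # sufficient statistic f(x) = (x, x^2)
    P = np.array([0.4, 0.1, 0.1, 0.4])      # measure whose divergence from E is sought

    def Q(theta):                           # member of E, proportional to exp(<theta, f(x)>)
        w = np.exp(f @ theta)
        return w / w.sum()

    def divergence(theta):                  # D(P || Q_theta)
        return float(np.sum(P * np.log(P / Q(theta))))

    res = minimize(divergence, x0=np.zeros(2))
    print("D(P || E) is approximately", res.fun)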
This work studies the standard exponential families of probability measures on Euclidean spaces that have finite supports. In such a family parameterized by means, the mean is supposed to move along a segment inside the convex support towards an endpoint on the boundary of the support. Limit behavior of several quantities related to the exponential family is described explicitly. In particular, the variance functions and information divergences are studied around the boundary.
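A toy computation in the spirit of this limit analysis (the support and the parameter values are mine): for the standard exponential family with support {0, 1, 2} on the real line, the mean moves towards the boundary point 2 of the convex support as the natural parameter grows, while the variance function tends to 0.

    import numpy as np

    support = np.array([0.0, 1.0, 2.0])     # finite support on the real line

    def mean_and_variance(theta):
        w = np.exp(theta * support)         # unnormalized member of the family
        q = w / w.sum()
        m = np.sum(q * support)
        return m, np.sum(q * (support - m) ** 2)

    for theta in [0.0, 2.0, 5.0, 10.0, 20.0]:
        m, v = mean_and_variance(theta)
        print(f"theta = {theta:5.1f}   mean = {m:.6f}   variance = {v:.3e}")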
Four notions of factorizability over arbitrary directed graphs are examined. For acyclic graphs they coincide and are identical with the usual factorization of probability distributions in Markov models. Relations between the factorizations over circuits are described in detail, including nontrivial counterexamples. Restrictions on the cardinality of the state spaces imply that factorizability with respect to some special cyclic graphs entails factorizability with respect to their simpler...
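For the acyclic case, a minimal sketch of the usual recursive factorization (a toy example of mine): a distribution over three binary variables is assembled as p(x1) p(x2) p(x3 | x1, x2) along the DAG 1 -> 3 <- 2, and the marginal independence X1 ⊥ X2 implied by this factorization is checked.

    import numpy as np

    rng = np.random.default_rng(1)
    p1 = rng.dirichlet(np.ones(2))               # p(x1)
    p2 = rng.dirichlet(np.ones(2))               # p(x2)
    p3 = rng.dirichlet(np.ones(2), size=(2, 2))  # p(x3 | x1, x2), shape (2, 2, 2)

    p = p1[:, None, None] * p2[None, :, None] * p3
    assert np.isclose(p.sum(), 1.0)

    marg12 = p.sum(axis=2)                       # marginal distribution of (X1, X2)
    assert np.allclose(marg12, np.outer(p1, p2))
    print("the recursive factorization over the DAG yields X1 _||_ X2")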
The simultaneous occurrence of conditional independences among subvectors of a regular Gaussian vector is examined. All configurations of the conditional independences within four jointly regular Gaussian variables are found and completely characterized in terms of implications involving conditional independence statements. The statements induced by the separation in any simple graph are shown to correspond to such a configuration within a regular Gaussian vector.
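A minimal sketch of the Gaussian criterion behind such configurations (the concrete matrix is mine): for a regular Gaussian vector, Xi ⊥ Xj given the remaining variables holds exactly when the (i, j) entry of the concentration matrix vanishes; below the zero pattern follows the path graph 1 - 2 - 3 - 4.

    import numpy as np

    # concentration (inverse covariance) matrix with zeros off the path 1 - 2 - 3 - 4
    K = np.array([[2.0, 0.8, 0.0, 0.0],
                  [0.8, 2.0, 0.8, 0.0],
                  [0.0, 0.8, 2.0, 0.8],
                  [0.0, 0.0, 0.8, 2.0]])
    assert np.all(np.linalg.eigvalsh(K) > 0)     # regularity: positive definite

    Sigma = np.linalg.inv(K)                     # covariance of the Gaussian vector
    A, B = [0, 2], [1, 3]                        # test X1 _||_ X3 | {X2, X4}
    cond = Sigma[np.ix_(A, A)] - Sigma[np.ix_(A, B)] @ np.linalg.inv(Sigma[np.ix_(B, B)]) @ Sigma[np.ix_(B, A)]
    assert np.isclose(cond[0, 1], 0.0)           # zero, matching K[0, 2] == 0 and the separation of 1, 3 by {2, 4}
    print("X1 and X3 are conditionally independent given X2, X4")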
Integral functionals based on convex normal integrands are minimized subject to finitely many moment constraints. The integrands are finite on the positive numbers and infinite on the negative numbers, strictly convex but not necessarily differentiable. The minimization is viewed as a primal problem and studied together with a dual one in the framework of convex duality. The effective domain of the value function is described by a conic core, a modification of the earlier concept of convex core. Minimizers...
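A discretized toy version of the primal problem (the integrand gamma(t) = t log t - t, the moment functions and the prescribed moments below are all my own choices): the functional is minimized under two moment constraints, and the numerical minimizer is compared with the exponential form suggested by the dual problem for this particular integrand.

    import numpy as np
    from scipy.optimize import minimize

    x = np.linspace(0.0, 1.0, 41)
    mu = np.full_like(x, 1.0 / len(x))           # reference measure (uniform weights)
    g = np.stack([np.ones_like(x), x])           # moment functions g_0 = 1, g_1 = x
    a = np.array([1.0, 0.6])                     # prescribed moments

    def objective(phi):                          # sum_k gamma(phi_k) mu_k with gamma(t) = t log t - t
        phi = np.maximum(phi, 1e-12)             # stay inside the domain of gamma
        return float(np.sum((phi * np.log(phi) - phi) * mu))

    constraints = [{"type": "eq", "fun": lambda phi, i=i: float(np.sum(phi * g[i] * mu) - a[i])}
                   for i in range(len(a))]
    res = minimize(objective, x0=np.ones_like(x), bounds=[(1e-12, None)] * len(x),
                   constraints=constraints, method="SLSQP")

    phi = res.x
    coef = np.polyfit(x, np.log(phi), 1)         # dual form: log(phi) should be (nearly) affine in x
    print("achieved moments:", [float(np.sum(phi * gi * mu)) for gi in g])
    print("deviation of log(phi) from an affine function:",
          float(np.max(np.abs(np.log(phi) - np.polyval(coef, x)))))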
The problem of maximizing the information divergence from an exponential family is generalized to the setting of Bregman divergences and suitably defined Bregman families.
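A few lines recalling the divergence being generalized to (the generator and the points are arbitrary choices of mine): the Bregman divergence of a convex generator F, which for the negative-entropy generator on the probability simplex reduces to the information divergence.

    import numpy as np

    def bregman(F, gradF, p, q):
        # B_F(p, q) = F(p) - F(q) - <grad F(q), p - q>
        return F(p) - F(q) - np.dot(gradF(q), p - q)

    neg_entropy = lambda p: float(np.sum(p * np.log(p)))
    grad_neg_entropy = lambda p: np.log(p) + 1.0

    p = np.array([0.2, 0.3, 0.5])
    q = np.array([0.4, 0.4, 0.2])
    kl = float(np.sum(p * np.log(p / q)))
    print(bregman(neg_entropy, grad_neg_entropy, p, q), kl)   # the two numbers coincide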
Hypergeometric distributions have many important applications, but they have not received sufficient attention in information theory. Hypergeometric distributions can be approximated by binomial distributions or Poisson distributions. In this paper we present upper and lower bounds on the information divergence. These bounds are important for statistical testing and for a better understanding of the notion of exchangeability.
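A quick numerical companion (the parameters N, K, n are arbitrary choices of mine; the paper's bounds themselves are not reproduced here): the information divergence of a hypergeometric distribution from the binomial distribution with the same number of draws and success probability K/N.

    import numpy as np
    from scipy.stats import hypergeom, binom

    N, K, n = 50, 20, 10                      # population size, successes in it, draws
    k = np.arange(0, n + 1)
    p = hypergeom.pmf(k, N, K, n)             # hypergeometric(N, K, n)
    q = binom.pmf(k, n, K / N)                # approximating binomial(n, K/N)

    mask = p > 0                              # sum only over the support of p
    divergence = float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
    print("D(hypergeometric || binomial) is approximately", divergence)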
We prove that a rank Dowling geometry of a group G is partition representable if and only if G is a Frobenius complement. This implies that Dowling group geometries are secret-sharing if and only if they are multilinearly representable.