On conditional independence and log-convexity

František Matúš — 2012

Annales de l'I.H.P. Probabilités et statistiques

If conditional independence constraints define a family of positive distributions that is log-convex, then this family turns out to be a Markov model over an undirected graph. This is proved for distributions on products of finite sets and for regular Gaussian distributions. As a consequence, the assertion known as the Brook factorization theorem, the Hammersley–Clifford theorem or the Gibbs–Markov equivalence is obtained.
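
For reference, the equivalence in question is the standard one (stated here for orientation, not quoted from the paper): a strictly positive distribution p is Markov with respect to an undirected graph G if and only if it factorizes over the cliques of G,

    p(x) = \prod_{C \in \mathcal{C}(G)} \psi_C(x_C),

where \mathcal{C}(G) is the set of cliques of G and each factor \psi_C > 0 depends only on the coordinates indexed by C.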

Optimality conditions for maximizers of the information divergence from an exponential family

František Matúš — 2007

Kybernetika

The information divergence of a probability measure P from an exponential family \mathcal{E} over a finite set is defined as the infimum of the divergences of P from Q subject to Q \in \mathcal{E}. All directional derivatives of the divergence from \mathcal{E} are found explicitly. To this end, the behaviour of the conjugate of a log-Laplace transform on the boundary of its domain is analysed. First-order conditions for P to be a maximizer of the divergence from \mathcal{E} are presented, including new ones for the case when P is not projectable to \mathcal{E}.
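
In symbols (a restatement of the definition above, with \mathcal{E} denoting the exponential family), the quantity studied is

    D(P \,\|\, \mathcal{E}) = \inf_{Q \in \mathcal{E}} D(P \,\|\, Q),
    \qquad
    D(P \,\|\, Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}.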

On limiting towards the boundaries of exponential families

František Matúš — 2015

Kybernetika

This work studies standard exponential families of probability measures on Euclidean spaces that have finite supports. In such a family, parameterized by means, the mean is assumed to move along a segment inside the convex support towards an endpoint on the boundary of the support. The limit behavior of several quantities related to the exponential family is described explicitly. In particular, the variance functions and information divergences are studied around the boundary.
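
As a sketch of the setting (the notation below is illustrative, not taken from the paper): a standard exponential family with base measure \nu of finite support in \mathbb{R}^d has densities

    Q_\vartheta(x) \propto e^{\langle \vartheta, x \rangle} \nu(x),
    \qquad
    m(\vartheta) = \mathbb{E}_{Q_\vartheta}[X],

and the variance function assigns to each attainable mean m the covariance matrix V(m) = \mathrm{Cov}_{Q_{\vartheta(m)}}(X); the limits of V(m) and of related divergences as m tends to the boundary of the convex support are the quantities described.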

On factorization of probability distributions over directed graphs

František Matúš, Bernhard Strohmeier — 1998

Kybernetika

Four notions of factorizability over arbitrary directed graphs are examined. For acyclic graphs they coincide and are identical with the usual factorization of probability distributions in Markov models. Relations between the factorizations over circuits are described in detail, including nontrivial counterexamples. Restrictions on the cardinality of state spaces imply that factorizability with respect to some special cyclic graphs entails factorizability with respect to their simpler, ...
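
For acyclic graphs, the factorization referred to is the usual recursive one over parent sets (a standard fact, included for orientation):

    p(x) = \prod_{v \in V} p\bigl(x_v \mid x_{\mathrm{pa}(v)}\bigr),

where \mathrm{pa}(v) denotes the set of parents of the node v in the directed graph.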

On Gaussian conditional independence structures

Radim Lněnička, František Matúš — 2007

Kybernetika

The simultaneous occurrence of conditional independences among subvectors of a regular Gaussian vector is examined. All configurations of the conditional independences within four jointly regular Gaussian variables are found and completely characterized in terms of implications involving conditional independence statements. The statements induced by the separation in any simple graph are shown to correspond to such a configuration within a regular Gaussian vector.
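
For orientation, conditional independence among subvectors of a regular (nondegenerate) Gaussian vector reduces to a linear-algebraic condition on the covariance matrix \Sigma: for disjoint index sets A, B, C,

    X_A \perp\!\!\!\perp X_B \mid X_C
    \iff
    \Sigma_{AB} - \Sigma_{AC}\,\Sigma_{CC}^{-1}\,\Sigma_{CB} = 0,

i.e. the conditional covariance of X_A and X_B given X_C vanishes.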

Generalized minimizers of convex integral functionals, Bregman distance, Pythagorean identities

Imre Csiszár, František Matúš — 2012

Kybernetika

Integral functionals based on convex normal integrands are minimized subject to finitely many moment constraints. The integrands are finite on the positive and infinite on the negative numbers, strictly convex but not necessarily differentiable. The minimization is viewed as a primal problem and studied together with a dual one in the framework of convex duality. The effective domain of the value function is described by a conic core, a modification of the earlier concept of convex core. Minimizers...
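
Schematically (the notation here is illustrative; the paper works with general convex normal integrands \beta(x, s)): the primal problem and the associated Bregman distance take the form

    \min_{g} \int \beta\bigl(x, g(x)\bigr)\, d\mu(x)
    \quad \text{subject to} \quad
    \int \varphi_i(x)\, g(x)\, d\mu(x) = a_i, \quad i = 1, \dots, n,

    B_\beta(g, h) = \int \Bigl[ \beta(x, g) - \beta(x, h) - \beta'(x, h)\,(g - h) \Bigr]\, d\mu,

where \beta' denotes a (one-sided) derivative in the second argument, since \beta need not be differentiable.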

Bounds on the information divergence for hypergeometric distributions

Peter Harremoës, František Matúš — 2020

Kybernetika

The hypergeometric distributions have many important applications, but they have not received sufficient attention in information theory. Hypergeometric distributions can be approximated by binomial or Poisson distributions. In this paper we present upper and lower bounds on the information divergence. These bounds are important for statistical testing and for a better understanding of the notion of exchangeability.
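
As a numerical illustration of the quantity being bounded (a minimal sketch in Python; the parameter values are arbitrary, and the snippet computes the divergence directly rather than the paper's bounds):

    import numpy as np
    from scipy.stats import hypergeom, binom

    # Hypergeometric: n draws without replacement from a population of size M
    # containing K successes; the binomial approximation draws with
    # replacement, with success probability K/M.
    M, K, n = 100, 30, 10
    ks = np.arange(max(0, n - (M - K)), min(n, K) + 1)  # support of the hypergeometric

    p = hypergeom.pmf(ks, M, K, n)  # hypergeometric probabilities
    q = binom.pmf(ks, n, K / M)     # binomial approximation

    # Information (Kullback-Leibler) divergence D(P || Q), in nats
    D = np.sum(p * np.log(p / q))
    print(f"D(hypergeometric || binomial) = {D:.6f} nats")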
