Optimality conditions for maximizers of the information divergence from an exponential family

František Matúš (2007)

Kybernetika

The information divergence of a probability measure P from an exponential family ℰ over a finite set is defined as the infimum of the divergences of P from Q subject to Q ∈ ℰ. All directional derivatives of the divergence from ℰ are explicitly found. To this end, the behaviour of the conjugate of a log-Laplace transform on the boundary of its domain is analysed. The first order conditions for P to be a maximizer of the divergence from ℰ are presented, including new ones for the case when P is not projectable to ℰ.
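
For orientation, the quantity the abstract studies admits a standard formulation. The following LaTeX sketch spells out the divergence from the family under the usual conventions; the finite ground set Z and the notation D(·‖·) for the Kullback–Leibler divergence are assumptions introduced here, not notation fixed by the abstract itself.

% A sketch of the standard definition, assuming a finite ground set Z
% and the Kullback--Leibler divergence D(.||.); \mathcal{E} stands for
% the exponential family ℰ of the abstract.
\[
  D(P \,\|\, Q) \;=\; \sum_{z \in Z} P(z)\,\log \frac{P(z)}{Q(z)},
  \qquad
  D(P \,\|\, \mathcal{E}) \;=\; \inf_{Q \in \mathcal{E}} D(P \,\|\, Q).
\]

Under this reading, a maximizer of D(·‖ℰ) is a measure P for which the infimum on the right is as large as possible, and the first order conditions of the paper characterize such P, including those not projectable to ℰ.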
