Extreme symmetry and the directed divergence in information theory
We investigate the sets of joint probability distributions that maximize the average multi-information over a collection of margins. These functionals serve as proxies for maximizing the multi-information of a set of variables, or the mutual information of two subsets of variables, at lower computational and estimation complexity. We describe the maximizers and their relations to the maximizers of the multi-information and the mutual information.
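The multi-information discussed in this abstract is the sum of the marginal entropies minus the joint entropy. A minimal sketch of that quantity for a discrete joint distribution, using assumed helper names (`entropy`, `multi_information`) not taken from the paper:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in nats; zero-probability cells contribute 0."""
    p = np.asarray(p, dtype=float).ravel()
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def multi_information(joint):
    """Multi-information of an n-variable joint distribution stored as an
    n-dimensional array: sum of marginal entropies minus the joint entropy."""
    joint = np.asarray(joint, dtype=float)
    marginal_entropies = sum(
        entropy(joint.sum(axis=tuple(a for a in range(joint.ndim) if a != k)))
        for k in range(joint.ndim)
    )
    return marginal_entropies - entropy(joint)

# Two perfectly correlated fair bits: multi-information equals log 2 nats.
joint = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
print(multi_information(joint))  # ≈ 0.6931
```

For two variables this reduces to the ordinary mutual information, which is why the averaged version over marginal collections can stand in for it.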
This paper presents entropy measures that arise from the distance, in the sense of Csiszár, between a distribution and the distribution in which all events are equiprobable. Second, conditions for the concavity and non-negativity of the proposed measures are studied. Finally, the Φ-entropy functionals are obtained as particular cases of the measures studied.
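The construction above measures how far a distribution lies from the uniform one via a Csiszár f-divergence. A hedged sketch, with assumed function names not taken from the paper, showing that the choice f(t) = t log t recovers Shannon entropy up to sign and the constant log n:

```python
import numpy as np

def f_divergence(p, q, f):
    """Csiszar f-divergence D_f(P || Q) = sum_i q_i * f(p_i / q_i)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    nz = q > 0
    return np.sum(q[nz] * f(p[nz] / q[nz]))

def divergence_from_uniform(p, f):
    """Entropy-type measure: f-divergence of P from the uniform distribution."""
    n = len(p)
    return f_divergence(p, np.full(n, 1.0 / n), f)

# f(t) = t log t gives D_f(P || U) = log n - H(P), i.e. Shannon entropy
# reappears as a special case of the distance-to-uniform construction.
def f_kl(t):
    t = np.asarray(t, float)
    return np.where(t > 0, t * np.log(np.where(t > 0, t, 1.0)), 0.0)

p = np.array([0.7, 0.2, 0.1])
d = divergence_from_uniform(p, f_kl)
H = -np.sum(p * np.log(p))
print(d, np.log(3) - H)  # the two values agree
```

Other convex choices of f yield the broader family of entropy measures the paper studies; the concavity and non-negativity conditions constrain which f are admissible.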
This paper deals with Bayesian models given by statistical experiments and standard loss functions. The Bayes probability of error and the Bayes risk are estimated by means of classical and generalized information criteria applicable to the experiment, and the accuracy of the estimation is studied. Among the information criteria studied in the paper is the class of posterior power entropies, which includes the Shannon entropy as a special case for the power . It is shown that the most accurate estimate is in this...
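The power entropies mentioned in this abstract are commonly parameterized, in the Havrda-Charvát (Tsallis) form, as H_α(p) = (Σ p_i^α − 1)/(1 − α), with Shannon entropy recovered in the limit α → 1; the specific parameterization used by the paper is an assumption here, as is the function name:

```python
import numpy as np

def power_entropy(p, alpha):
    """Power (Havrda-Charvat) entropy of order alpha.
    As alpha -> 1 this converges to the Shannon entropy."""
    p = np.asarray(p, dtype=float)
    if abs(alpha - 1.0) < 1e-12:  # Shannon limit handled explicitly
        nz = p > 0
        return -np.sum(p[nz] * np.log(p[nz]))
    return (np.sum(p ** alpha) - 1.0) / (1.0 - alpha)

posterior = np.array([0.6, 0.3, 0.1])
print(power_entropy(posterior, 2.0))   # quadratic entropy, 1 - sum p_i^2
print(power_entropy(posterior, 1.0))   # Shannon entropy
```

Applied to a posterior distribution, such entropies bound the Bayes probability of error, which is the sense in which they serve as information criteria for the experiment.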