Displaying 81 – 100 of 268

Factorized mutual information maximization

Thomas Merkh, Guido F. Montúfar (2020)

Kybernetika

We investigate the sets of joint probability distributions that maximize the average multi-information over a collection of margins. These functionals serve as proxies for maximizing the multi-information of a set of variables, or the mutual information of two subsets of variables, at lower computational and estimation complexity. We describe the maximizers and their relations to the maximizers of the multi-information and the mutual information.

Entropy functions associated with Csiszár measures

Miquel Salicrú Pagès, Carles Maria Cuadras (1987)

Qüestiió

This paper presents entropy measures derived from the distance, in the sense of Csiszár, between a distribution and the distribution in which all events are equiprobable. We then study conditions for the concavity and non-negativity of the proposed measures. Finally, the Φ-entropy functionals are obtained as particular cases of the measures studied.

Generalized information criteria for Bayes decisions

Domingo Morales, Igor Vajda (2012)

Kybernetika

This paper deals with Bayesian models given by statistical experiments and standard loss functions. The Bayes probability of error and the Bayes risk are estimated by means of classical and generalized information criteria applicable to the experiment, and the accuracy of the estimation is studied. Among the information criteria studied in the paper is the class of posterior power entropies, which include the Shannon entropy as a special case for the power α = 1. It is shown that the most accurate estimate is in this...