On Two Conditional Entropies without Probability.
Belis and Guiasu studied a generalization of Shannon entropy known as weighted or useful entropy. In this paper, a weighted entropy of a parametric type is defined and characterized, and some of its properties are studied. Further generalizations of the weighted entropy involving more parameters are also specified.
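For orientation, the Belis-Guiasu weighted entropy attaches a nonnegative weight (utility) w_i to each outcome and takes the form H_w(P) = -sum_i w_i p_i log p_i; with all weights equal to one it reduces to Shannon entropy. A minimal Python sketch under these standard definitions (the function name and example values are illustrative, not taken from the paper):

```python
import numpy as np

def weighted_entropy(p, w, base=np.e):
    """Belis-Guiasu weighted entropy: H_w(P) = -sum_i w_i * p_i * log(p_i).

    p -- probability vector (nonnegative, sums to 1)
    w -- nonnegative weight (utility) attached to each outcome
    """
    p = np.asarray(p, dtype=float)
    w = np.asarray(w, dtype=float)
    mask = p > 0  # outcomes with p_i = 0 contribute nothing
    return -np.sum(w[mask] * p[mask] * np.log(p[mask])) / np.log(base)

# With all weights equal to 1, this reduces to the Shannon entropy.
print(weighted_entropy([0.5, 0.25, 0.25], [1.0, 1.0, 1.0], base=2))  # 1.5 bits
```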
We consider the optimal quantization problem for probabilities under constrained Rényi-α-entropy of the quantizers. We determine the optimal quantizers and the optimal quantization error of one-dimensional uniform distributions, including the known special cases α = 0 (restricted codebook size) and α = 1 (restricted Shannon entropy).
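The constraint here is the Rényi entropy of the quantizer's cell probabilities. The sketch below, with a made-up four-cell quantizer of the uniform distribution, illustrates why the orders α = 0 and α = 1 recover the codebook-size and Shannon-entropy constraints mentioned above:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha for a probability vector p (in nats).

    alpha = 0 gives the log of the support size (codebook size);
    alpha -> 1 recovers the Shannon entropy.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):   # limit alpha -> 1: Shannon entropy
        return -np.sum(p * np.log(p))
    if alpha == 0:               # order 0: log of the number of cells
        return np.log(len(p))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

# Cell probabilities of a hypothetical 4-level quantizer of Uniform[0, 1]:
cells = [0.4, 0.3, 0.2, 0.1]
print(renyi_entropy(cells, 0))   # log 4 -> constrains the codebook size
print(renyi_entropy(cells, 1))   # Shannon entropy of the cell probabilities
print(renyi_entropy(cells, 2))   # another admissible entropy constraint
```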
The information divergence of a probability measure P from an exponential family E over a finite set is defined as the infimum of the divergences of P from Q subject to Q in E. All directional derivatives of the divergence from E are explicitly found. To this end, the behaviour of the conjugate of a log-Laplace transform on the boundary of its domain is analysed. The first order conditions for P to be a maximizer of the divergence from E are presented, including new ones when P is not projectable to E.
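As a numerical companion to this definition (and not the analytical machinery of the paper), the divergence from E can be approximated by minimizing D(P || Q_theta) over the natural parameter theta of the family. The sketch below uses a generic optimizer; the sufficient statistic, the example distribution, and the function name are illustrative, and the boundary behaviour analysed in the paper is not treated:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

def divergence_from_family(p, f):
    """Approximate D(P || E) = inf over theta of D(P || Q_theta), where
    Q_theta(x) is proportional to exp(theta . f(x)) on a finite set.

    p -- probability vector over {0, ..., n-1}
    f -- (n, d) array of sufficient statistics
    """
    p = np.asarray(p, dtype=float)
    f = np.asarray(f, dtype=float)

    def kl_to_family_member(theta):
        logq = f @ theta - logsumexp(f @ theta)   # log Q_theta
        mask = p > 0
        return np.sum(p[mask] * (np.log(p[mask]) - logq[mask]))

    res = minimize(kl_to_family_member, x0=np.zeros(f.shape[1]))
    return res.fun

# Example: a 1-dimensional family on a 4-element set with statistic f(x) = x.
p = np.array([0.1, 0.2, 0.3, 0.4])
f = np.arange(4).reshape(-1, 1)
print(divergence_from_family(p, f))
```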
This article studies exponential families E on finite sets such that the information divergence of an arbitrary probability distribution from E is bounded by some constant D. A particular class of low-dimensional exponential families that have low values of D can be obtained from partitions of the state space. The main results concern optimality properties of these partition exponential families. The case where D = log 2 is studied in detail. This case is special, because if D < log 2, then E contains all probability...
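Assuming the usual construction, a partition exponential family consists of the distributions that are uniform inside each block of a partition of the state space, with the block masses left free. The minimizer of the divergence then keeps P's block masses and uniformizes P within each block, which gives the closed form used in this illustrative sketch (the partition and the distribution are made up):

```python
import numpy as np

def divergence_from_partition_family(p, blocks):
    """D(P || E_A) for the partition exponential family E_A: the distributions
    that are uniform inside each block of the partition A, with free block masses.

    The minimizing Q keeps P's block masses and is uniform within each block, so
    D(P || E_A) = sum_x P(x) * log( P(x) * |block(x)| / P(block(x)) ).
    """
    p = np.asarray(p, dtype=float)
    total = 0.0
    for block in blocks:
        block = np.asarray(block)
        mass = p[block].sum()                 # P(block)
        for x in block:
            if p[x] > 0:
                total += p[x] * np.log(p[x] * len(block) / mass)
    return total

# Example on a 4-element state space with partition {{0, 1}, {2, 3}}:
p = np.array([0.5, 0.1, 0.3, 0.1])
print(divergence_from_partition_family(p, [[0, 1], [2, 3]]))
```

Since the divergence of a distribution on a block of size m from the uniform distribution on that block is at most log m, a partition into blocks of size at most two keeps the divergence from the family at or below log 2, which is consistent with the threshold case highlighted above.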