Displaying similar documents to “Entropy estimate for k-monotone functions via small ball probability of integrated Brownian motions.”

(h, Φ)-entropy differential metric

María Luisa Menéndez, Domingo Morales, Leandro Pardo, Miquel Salicrú (1997)

Applications of Mathematics

Similarity:

Burbea and Rao (1982a, 1982b) gave some general methods for constructing quadratic differential metrics on probability spaces. Using these methods, they obtained the Fisher information metric as a particular case. In this paper we apply the method based on entropy measures to obtain a Riemannian metric based on (h, Φ)-entropy measures (Salicrú et al., 1993). The geodesic distances based on that information metric have been computed for a number of parametric families of distributions. The...
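As a concrete illustration of the Fisher information metric and its geodesic distances (an elementary textbook example, not taken from the paper above), for the Bernoulli family the Fisher information is I(p) = 1/(p(1−p)), and the resulting Fisher–Rao geodesic distance has the closed form d(p, q) = 2·|arcsin√p − arcsin√q|. A minimal Python sketch:

```python
import math

def bernoulli_fisher_information(p):
    """Fisher information I(p) = 1 / (p * (1 - p)) for Bernoulli(p), 0 < p < 1."""
    return 1.0 / (p * (1.0 - p))

def bernoulli_fisher_rao_distance(p, q):
    """Geodesic (Fisher-Rao) distance on the Bernoulli family:
    d(p, q) = 2 * |arcsin(sqrt(p)) - arcsin(sqrt(q))|."""
    return 2.0 * abs(math.asin(math.sqrt(p)) - math.asin(math.sqrt(q)))
```

For example, d(0.25, 0.75) = 2·|π/6 − π/3| = π/3; the distance is symmetric and vanishes only when p = q.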

A note on how Rényi entropy can create a spectrum of probabilistic merging operators

Martin Adamčík (2019)

Kybernetika

Similarity:

In this paper we present a result that relates the merging of closed convex sets of discrete probability functions by the squared Euclidean distance and by the Kullback-Leibler divergence, respectively, drawing inspiration from the Rényi entropy. While selecting the probability function with the highest Shannon entropy appears to be a convincingly justified way of representing a closed convex set of probability functions, the discussion on how to represent several closed convex sets of probability...
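For a concrete handle on the quantities this abstract refers to, the standard definitions for discrete probability functions can be sketched in a few lines of Python (an illustrative sketch, not the paper's construction):

```python
import math

def shannon_entropy(p):
    """H(p) = -sum_i p_i * log(p_i), natural log, with 0 * log 0 = 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha(p) = log(sum_i p_i**alpha) / (1 - alpha), alpha != 1.
    As alpha -> 1 this recovers the Shannon entropy."""
    return math.log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) = sum_i p_i * log(p_i / q_i),
    assuming q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

On the uniform distribution over two outcomes, both the Shannon entropy and every Rényi entropy equal log 2, and D(p ‖ p) = 0 for any p.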

A general class of entropy statistics

María Dolores Esteban (1997)

Applications of Mathematics

Similarity:

To study the asymptotic properties of entropy estimates, we use a unified expression, called the H_{h,v}^{ϕ₁,ϕ₂}-entropy. Asymptotic distributions for these statistics are given in several cases when maximum likelihood estimators are considered, so they can be used to construct confidence intervals and to test statistical hypotheses based on one or more samples. These results can also be applied to multinomial populations.
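To see how a single unified expression can cover many classical entropies, consider the simpler (h, φ)-entropy sub-family, H(p) = h(Σ_i φ(p_i)): Shannon and Rényi entropies arise from particular choices of h and φ. A hedged Python sketch (the function names are ours, and this shows only the (h, φ) special case, not the full H_{h,v}^{ϕ₁,ϕ₂} statistic of the paper):

```python
import math

def h_phi_entropy(p, h, phi):
    """Unified (h, phi)-entropy: H(p) = h(sum_i phi(p_i))."""
    return h(sum(phi(pi) for pi in p))

def shannon(p):
    """Shannon entropy via h(x) = x and phi(t) = -t log t (0 log 0 = 0)."""
    return h_phi_entropy(p, lambda x: x,
                         lambda t: -t * math.log(t) if t > 0 else 0.0)

def renyi(p, alpha):
    """Renyi entropy of order alpha != 1 via h(x) = log(x)/(1-alpha), phi(t) = t**alpha."""
    return h_phi_entropy(p, lambda x: math.log(x) / (1.0 - alpha),
                         lambda t: t ** alpha)
```

Both specializations agree with the direct definitions, e.g. they return log 2 on the two-point uniform distribution.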