Displaying 1 – 20 of 732

(h, Φ)-entropy differential metric

María Luisa Menéndez, Domingo Morales, Leandro Pardo, Miquel Salicrú (1997)

Applications of Mathematics

Burbea and Rao (1982a, 1982b) gave some general methods for constructing quadratic differential metrics on probability spaces. Using these methods, they obtained the Fisher information metric as a particular case. In this paper we apply the method based on entropy measures to obtain a Riemannian metric based on (h, Φ)-entropy measures (Salicrú et al., 1993). The geodesic distances based on that information metric have been computed for a number of parametric families of distributions. The use of geodesic...
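For reference, a hedged sketch of the central definition, following Salicrú et al. (1993); the measure space and regularity conditions on h and φ are suppressed here:

```latex
% (h, \phi)-entropy of a density f_\theta with respect to a measure \mu:
\[
  H_{h,\phi}(\theta) \;=\; h\!\left( \int \phi\bigl(f_\theta(x)\bigr)\, d\mu(x) \right)
\]
% Shannon entropy is the special case h(x) = x, \phi(x) = -x \log x;
% the Burbea--Rao construction differentiates such entropy functionals
% twice in \theta to obtain a quadratic differential metric, which for
% the Shannon case reduces to the Fisher information metric.
```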

A Characterization of Uniform Distribution

Joanna Chachulska (2005)

Bulletin of the Polish Academy of Sciences. Mathematics

Is the Lebesgue measure on [0,1]² the unique product measure on [0,1]² which is transformed again into a product measure on [0,1]² by the mapping ψ(x,y) = (x, (x+y) mod 1)? Here a somewhat stronger version of this problem in a probabilistic framework is answered. It is shown that for independent and identically distributed random variables X and Y, constancy of the conditional expectations of X+Y−I(X+Y > 1) and of its square, given X, identifies the uniform distribution, either absolutely continuous or discrete....
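The direction of the equivalence that is easy to see numerically: if X and Y are i.i.d. uniform on [0,1], then ψ(X,Y) = (X, (X+Y) mod 1) again has uniform, uncorrelated coordinates. A minimal Monte Carlo sketch (illustrative only, not from the paper):

```python
import random

# For X, Y i.i.d. uniform on [0,1], the second coordinate of
# psi(X, Y) = (X, (X+Y) mod 1) should again be uniform on [0,1]
# (mean ~1/2) and uncorrelated with the first coordinate.
random.seed(0)
n = 100_000
xs = [random.random() for _ in range(n)]
ys = [random.random() for _ in range(n)]
zs = [(x + y) % 1.0 for x, y in zip(xs, ys)]

mean_z = sum(zs) / n
mean_x = sum(xs) / n
cov = sum((x - mean_x) * (z - mean_z) for x, z in zip(xs, zs)) / n
print(mean_z, cov)
```

Note that (X+Y) mod 1 equals X+Y−I(X+Y > 1), which is exactly the quantity whose conditional moments the paper studies.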

A class of tests for exponentiality based on a continuum of moment conditions

Simos G. Meintanis (2009)

Kybernetika

The empirical moment process is utilized to construct a family of tests for the null hypothesis that a random variable is exponentially distributed. The tests are consistent against the 'new better than used in expectation' (NBUE) class of alternatives. Consistency is shown and the limit null distribution of the test statistic is derived, while efficiency results are also provided. The finite-sample properties of the proposed procedure in comparison to more standard procedures are investigated via...
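To illustrate the general idea of testing exponentiality over a continuum of moment conditions (this is a generic sketch, not the statistic of the paper): under exponentiality, the Laplace transform of the mean-scaled sample satisfies E[exp(−tX)] = 1/(1+t) for every t > 0, so an aggregate distance between the empirical transform and 1/(1+t) over a grid of t values should be small for exponential data and large for NBUE-type alternatives.

```python
import math
import random

def exp_moment_distance(data, ts):
    """Illustrative aggregate moment-condition distance (not the paper's
    statistic): compare the empirical Laplace transform of mean-scaled
    data with 1/(1+t), its value under exponentiality, over a grid ts."""
    m = sum(data) / len(data)
    scaled = [x / m for x in data]
    total = 0.0
    for t in ts:
        emp = sum(math.exp(-t * x) for x in scaled) / len(scaled)
        total += (emp - 1.0 / (1.0 + t)) ** 2
    return total

random.seed(1)
exp_sample = [random.expovariate(2.0) for _ in range(5000)]
unif_sample = [random.random() for _ in range(5000)]  # an NBUE alternative
grid = [0.25 * k for k in range(1, 21)]
d_exp = exp_moment_distance(exp_sample, grid)
d_unif = exp_moment_distance(unif_sample, grid)
print(d_exp, d_unif)
```

In this sketch the exponential sample yields a much smaller distance than the uniform one; an actual test would calibrate a critical value from the limit null distribution.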

A general class of entropy statistics

María Dolores Esteban (1997)

Applications of Mathematics

To study the asymptotic properties of entropy estimates, we use a unified expression, called the H^{ϕ₁,ϕ₂}_{h,v}-entropy. Asymptotic distributions for these statistics are given in several cases when maximum likelihood estimators are considered, so they can be used to construct confidence intervals and to test statistical hypotheses based on one or more samples. These results can also be applied to multinomial populations.
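The multinomial application can be sketched for the simplest member of the family, Shannon entropy, whose plug-in estimator is asymptotically normal with a delta-method variance; the hypothetical helper below is mine, not the paper's, and the general H-entropy covers many more functionals:

```python
import math

def shannon_entropy_ci(counts, z=1.96):
    """Plug-in Shannon entropy of a multinomial sample with a
    delta-method (asymptotic normal) confidence interval."""
    n = sum(counts)
    probs = [c / n for c in counts if c > 0]
    h = -sum(p * math.log(p) for p in probs)
    # Delta-method asymptotic variance of the plug-in estimator:
    # ( sum_i p_i log(p_i)^2 - (sum_i p_i log p_i)^2 ) / n
    s1 = sum(p * math.log(p) for p in probs)
    s2 = sum(p * math.log(p) ** 2 for p in probs)
    half = z * math.sqrt((s2 - s1 ** 2) / n)
    return h, h - half, h + half

# A sample of n = 1000 over four categories:
h, lo, hi = shannon_entropy_ci([400, 300, 200, 100])
print(h, lo, hi)
```

The same recipe (estimate, differentiate the functional, read off the asymptotic variance) is what the unified treatment carries out for the whole family.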
