
Displaying 1 – 20 of 179


(h, Φ)-entropy differential metric

María Luisa Menéndez, Domingo Morales, Leandro Pardo, Miquel Salicrú (1997)

Applications of Mathematics

Burbea and Rao (1982a, 1982b) gave some general methods for constructing quadratic differential metrics on probability spaces. Using these methods, they obtained the Fisher information metric as a particular case. In this paper we apply the method based on entropy measures to obtain a Riemannian metric based on (h, Φ)-entropy measures (Salicrú et al., 1993). The geodesic distances based on that information metric have been computed for a number of parametric families of distributions. The use of geodesic...
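For orientation, the Fisher information metric that the abstract mentions as a particular case induces a closed-form geodesic distance on some classical families. A minimal sketch for the Bernoulli family (the classical Fisher case, not the authors' (h, Φ) construction): the Fisher information is I(p) = 1/(p(1−p)) and the Fisher–Rao geodesic distance is 2|arcsin √p − arcsin √q|.

```python
import math

def fisher_information_bernoulli(p: float) -> float:
    """Fisher information of Bernoulli(p): I(p) = 1 / (p * (1 - p))."""
    return 1.0 / (p * (1.0 - p))

def fisher_rao_distance_bernoulli(p: float, q: float) -> float:
    """Geodesic distance under the Fisher information metric on the
    Bernoulli family: d(p, q) = 2 * |arcsin(sqrt(p)) - arcsin(sqrt(q))|."""
    return 2.0 * abs(math.asin(math.sqrt(p)) - math.asin(math.sqrt(q)))

# The distance is bounded: the two extreme points p = 0 and q = 1
# are at distance 2 * (pi / 2) = pi.
```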

(R, S)-information radius of type t and comparison of experiments

Inder Jeet Taneja, Luis Pardo, D. Morales (1991)

Applications of Mathematics

Various information, divergence and distance measures have been used by researchers to compare experiments using classical approaches such as those of Blackwell, Bayes, etc. Blackwell's [1] idea of comparing two statistical experiments is based on the existence of stochastic transformations. Using this idea of Blackwell, as well as the classical Bayesian approach, we have compared statistical experiments by considering unified scalar parametric generalizations of the Jensen difference divergence measure....

A general class of entropy statistics

María Dolores Esteban (1997)

Applications of Mathematics

To study the asymptotic properties of entropy estimates, we use a unified expression, called the H_{h,v}^{ϕ1,ϕ2}-entropy. Asymptotic distributions for these statistics are given in several cases when maximum likelihood estimators are considered, so they can be used to construct confidence intervals and to test statistical hypotheses based on one or more samples. These results can also be applied to multinomial populations.
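The unified (h, v; ϕ1, ϕ2) family is developed in the paper itself; as a minimal sketch of its best-known special case, the plug-in Shannon entropy statistic for a multinomial population, with the maximum likelihood estimator being the vector of relative frequencies:

```python
import math
from collections import Counter

def plug_in_entropy(sample) -> float:
    """Shannon entropy estimate H(p_hat), where p_hat is the MLE
    (relative frequencies) of a multinomial population."""
    n = len(sample)
    counts = Counter(sample)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# Asymptotically, sqrt(n) * (H(p_hat) - H(p)) is normally distributed,
# which is what licenses confidence intervals and hypothesis tests
# built from such entropy statistics.
```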

A new approach to mutual information

Fumio Hiai, Dénes Petz (2007)

Banach Center Publications

A new expression as a certain asymptotic limit via "discrete micro-states" of permutations is provided for the mutual information of both continuous and discrete random variables.
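The micro-state expression is the paper's contribution; for reference, the standard definition it recovers in the discrete case is I(X;Y) = Σ p(x,y) log(p(x,y) / (p(x) p(y))), sketched as:

```python
import math

def mutual_information(joint) -> float:
    """I(X;Y) = sum over (x, y) of p(x, y) * log(p(x, y) / (p(x) * p(y))),
    for a joint distribution given as a dict {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p  # marginal of X
        py[y] = py.get(y, 0.0) + p  # marginal of Y
    return sum(p * math.log(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Independent variables give I = 0; two perfectly correlated fair bits
# give I = log 2.
```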

A new approach to mutual information. II

Fumio Hiai, Takuho Miyamoto (2010)

Banach Center Publications

A new concept of mutual pressure is introduced for potential functions on both continuous and discrete compound spaces via discrete micro-states of permutations, and its relations with the usual pressure and the mutual information are established. This paper is a continuation of the paper of Hiai and Petz in Banach Center Publications, Vol. 78.

A note on quantifying the uncertainty corresponding to the utilities.

Norberto Corral Blanco, María Angeles Gil Alvarez (1983)

Trabajos de Estadística e Investigación Operativa

In a previous paper we studied the relevant analogies between the variance, applied to a compound scheme of probability and utility, and the measure we had defined to evaluate the unquietness of such a compound scheme. The purpose of the present note is to display the advantage of the second measure over the first in quantifying the uncertainty corresponding to the utilities. This advantage consists in its greater ability to distinguish among the different compound...

About the maximum information and maximum likelihood principles

Igor Vajda, Jiří Grim (1998)


Neural networks with radial basis functions are considered, and the Shannon information in their output concerning input. The role of information-preserving input transformations is discussed when the network is specified by the maximum information principle and by the maximum likelihood principle. A transformation is found which simplifies the input structure in the sense that it minimizes the entropy in the class of all information-preserving transformations. Such a transformation need not be unique...

Adaptive tests for periodic signal detection with applications to laser vibrometry

Magalie Fromont, Céline Lévy-Leduc (2006)

ESAIM: Probability and Statistics

Initially motivated by a practical issue in target detection via laser vibrometry, we are interested in the problem of periodic signal detection in a Gaussian fixed design regression framework. Assuming that the signal belongs to some periodic Sobolev ball and that the variance of the noise is known, we first consider the problem from a minimax point of view: we evaluate the so-called minimax separation rate which corresponds to the minimal l₂-distance between the signal and zero so that the detection...

Algunas consideraciones sobre la información generalizada.

M.ª del Pilar Zuluaga Arias (1987)

Trabajos de Estadística

De Groot has defined an information measure that generalizes the Kullback information as a measure of discrepancy between two distributions. In the present article we define the modal information, as a particular case of the generalized information, which will serve to show that the main properties of the Kullback information do not hold for the measure given by De Groot.
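The Kullback information that De Groot's measure generalizes has a simple discrete form, K(P, Q) = Σ p_i log(p_i / q_i). A minimal sketch of this classical measure (not De Groot's generalization, which is what the article studies):

```python
import math

def kullback_information(p, q) -> float:
    """Kullback information (divergence) K(P, Q) = sum_i p_i * log(p_i / q_i)
    between two discrete distributions given as equal-length lists."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# K is nonnegative and vanishes iff P = Q, but it is not symmetric:
# K(P, Q) != K(Q, P) in general.
```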

An exploratory canonical analysis approach for multinomial populations based on the φ-divergence measure

Julio A. Pardo, Leandro Pardo, María Del Carmen Pardo, K. Zografos (2004)


In this paper we consider an exploratory canonical analysis approach for multinomial populations based on the φ-divergence measure. We define the restricted minimum φ-divergence estimator, which is seen to be a generalization of the restricted maximum likelihood estimator. This estimator is then used in φ-divergence goodness-of-fit statistics, which are the basis of two new families of statistics for solving the problem of selecting the number of significant correlations as well as the appropriateness...

Análisis de distribuciones a priori bajo cierta información parcial.

José A. Cristóbal Cristóbal, Manuel Salvador Figueras (1989)

Trabajos de Estadística

The least informative distributions are computed when the useful entropy and Onicescu's informational energy are used as information measures, both when the state space Θ is continuous (an interval of R) and when it is discrete, assuming that the decision maker has information about some characteristics of the prior distribution (monotonicity of the density function, probabilities of subsets of Θ, monotonicity or bounds of the failure rate).
