Displaying 21 – 40 of 281

About the maximum information and maximum likelihood principles

Igor Vajda, Jiří Grim (1998)

Kybernetika

Neural networks with radial basis functions are considered, together with the Shannon information that their output carries about the input. The role of information-preserving input transformations is discussed when the network is specified by the maximum information principle and by the maximum likelihood principle. A transformation is found which simplifies the input structure in the sense that it minimizes the entropy within the class of all information-preserving transformations. Such a transformation need not be unique...
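
The "Shannon information in the output concerning the input" is, in standard terms, the mutual information between the network input X and output Y. As assumed background (standard definitions, not quoted from the paper):

\[
I(X;Y) \;=\; H(Y) - H(Y \mid X)
        \;=\; \int p_{X,Y}(x,y)\,\log\frac{p_{X,Y}(x,y)}{p_X(x)\,p_Y(y)}\;dx\,dy ,
\]

and an input transformation T is information-preserving when I(T(X);Y) = I(X;Y); the result cited above presumably seeks such a T with minimal entropy of the transformed input.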

Adaptive tests for periodic signal detection with applications to laser vibrometry

Magalie Fromont, Céline Lévy-Leduc (2006)

ESAIM: Probability and Statistics

Initially motivated by a practical issue in target detection via laser vibrometry, we are interested in the problem of periodic signal detection in a Gaussian fixed design regression framework. Assuming that the signal belongs to some periodic Sobolev ball and that the variance of the noise is known, we first consider the problem from a minimax point of view: we evaluate the so-called minimax separation rate, which corresponds to the minimal ℓ2-distance between the signal and zero such that the detection...
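
For orientation, the minimax separation rate mentioned above can be sketched as follows (a standard formulation in minimax testing, stated as assumed background rather than taken from the paper): for a prescribed error level γ, it is the smallest ℓ2-radius at which some test can reliably distinguish the signal from zero,

\[
\rho^{*} \;=\; \inf\Bigl\{\rho>0 \;:\; \inf_{\Phi}\Bigl[\mathbb{P}_{0}(\Phi=1)\;+\;\sup_{f\in\mathcal{F},\,\|f\|_{2}\ge\rho}\mathbb{P}_{f}(\Phi=0)\Bigr]\le\gamma\Bigr\},
\]

where the inner infimum runs over all tests Φ of the null hypothesis "no signal" and \mathcal{F} is the periodic Sobolev ball.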

Some considerations on generalized information.

M.ª del Pilar Zuluaga Arias (1987)

Trabajos de Estadística

De Groot has defined an information measure that generalizes the Kullback information as a measure of discrepancy between two distributions. In this paper we define the modal information, as a particular case of the generalized information, which will serve to show that the main properties of the Kullback information do not hold for the measure given by De Groot.
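
As assumed background (standard formulations, not quoted from the article): the Kullback information between densities p and q, and De Groot's generalized information built from a concave uncertainty function U on the set of priors, are

\[
I_{K}(p,q) \;=\; \int p(x)\,\log\frac{p(x)}{q(x)}\,dx,
\qquad
I_{U} \;=\; U(\pi) \;-\; \mathbb{E}_{X}\bigl[\,U\bigl(\pi(\cdot\mid X)\bigr)\,\bigr],
\]

where π is the prior and π(·|X) the posterior; choosing U to be the Shannon entropy recovers the usual mutual-information (Kullback-type) case, while other choices of U need not share its properties.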

An exploratory canonical analysis approach for multinomial populations based on the φ-divergence measure

Julio A. Pardo, Leandro Pardo, María Del Carmen Pardo, K. Zografos (2004)

Kybernetika

In this paper we consider an exploratory canonical analysis approach for multinomial populations based on the φ-divergence measure. We define the restricted minimum φ-divergence estimator, which is seen to be a generalization of the restricted maximum likelihood estimator. This estimator is then used in φ-divergence goodness-of-fit statistics, which are the basis of two new families of statistics for solving the problem of selecting the number of significant correlations as well as the appropriateness...
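
A sketch of the quantities involved, under standard conventions (the paper's exact notation may differ): for a multinomial probability vector p(θ) indexed by θ in a restricted parameter set Θ0, and observed relative frequencies \hat{p}, the φ-divergence and the restricted minimum φ-divergence estimator are

\[
D_{\phi}\bigl(\hat{p}, p(\theta)\bigr) \;=\; \sum_{i=1}^{k} p_{i}(\theta)\,\phi\!\left(\frac{\hat{p}_{i}}{p_{i}(\theta)}\right),
\qquad
\hat{\theta}_{\phi} \;=\; \arg\min_{\theta\in\Theta_{0}} D_{\phi}\bigl(\hat{p}, p(\theta)\bigr),
\]

with φ convex and φ(1) = 0; the choice φ(u) = u log u − u + 1 turns D_φ into the Kullback–Leibler divergence and \hat{θ}_φ into the restricted maximum likelihood estimator.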

Analysis of prior distributions under certain partial information.

José A. Cristóbal Cristóbal, Manuel Salvador Figueras (1989)

Trabajos de Estadística

We compute the least informative distributions when the useful entropy and Onicescu's informational energy are used as information measures, both when the state space Θ is continuous (an interval of R) and when it is discrete, assuming that the decision maker has information about some characteristics of the prior distribution (monotonicity of the density function, probabilities of subsets of Θ, monotonicity or bounds on the failure rate).
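
As assumed background (standard forms of these measures; the article's exact definitions may differ): for a discrete prior π = (π1, …, πn) on Θ with utilities u1, …, un attached to the states, the useful (weighted) entropy and Onicescu's informational energy are

\[
H_{u}(\pi) \;=\; -\sum_{i=1}^{n} u_{i}\,\pi_{i}\,\log\pi_{i},
\qquad
E(\pi) \;=\; \sum_{i=1}^{n} \pi_{i}^{2},
\]

and a "least informative" prior maximizes H_u (respectively minimizes E) subject to the partial-information constraints listed above, with integrals replacing the sums in the continuous case.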

Blended φ-divergences with examples

Václav Kůs (2003)

Kybernetika

Several new examples of divergences, called blended divergences, have emerged in the recent literature. These examples are mostly constructed by modification or parametrization of well-known φ-divergences. The newly introduced parameter is often called the blending parameter. In this paper we present a compact theory of blended divergences which provides a generally applicable method for finding new classes of divergences containing any two divergences D0 and D1 given in advance. Several examples...
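
To make the blending idea concrete, here is a deliberately simple illustration, given as an assumption rather than the construction used in the paper: the most basic family containing two given divergences D0 and D1 is their convex combination

\[
D_{\beta}(P,Q) \;=\; (1-\beta)\,D_{0}(P,Q) \;+\; \beta\,D_{1}(P,Q),
\qquad \beta\in[0,1],
\]

which returns D0 at β = 0 and D1 at β = 1, with β playing the role of the blending parameter; the paper's compact theory gives a general method for producing such families.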

Bound on extended f-divergences for a variety of classes

Pietro Cerone, Sever Silvestru Dragomir, Ferdinand Österreicher (2004)

Kybernetika

The concept of f-divergences was introduced by Csiszár in 1963 to measure the 'hardness' of a testing problem, depending on a convex real-valued function f on the interval [0, ∞). The choice of this parameter f can be adjusted so as to match the needs of specific applications. The definition and some of the most basic properties of f-divergences are given and the class of χα-divergences is presented. Ostrowski's inequality and a Trapezoid inequality are utilized in order to prove bounds for an extension...
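
For concreteness (standard definitions given as assumed background; the χα convention below is one common choice and may differ in detail from the paper's): the f-divergence of distributions P and Q with densities p and q, and the χα-divergences obtained from f(u) = |u − 1|^α, are

\[
D_{f}(P,Q) \;=\; \int q(x)\, f\!\left(\frac{p(x)}{q(x)}\right) dx,
\qquad
\chi^{\alpha}(P,Q) \;=\; \int \frac{|p(x)-q(x)|^{\alpha}}{q(x)^{\alpha-1}}\;dx,
\quad \alpha\ge 1,
\]

so that α = 1 gives the total variation distance and α = 2 the Pearson χ² divergence.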
