Displaying 201 – 220 of 281


Scaling of model approximation errors and expected entropy distances

Guido F. Montúfar, Johannes Rauh (2014)

Kybernetika

We compute the expected value of the Kullback-Leibler divergence of various fundamental statistical models with respect to Dirichlet priors. For the uniform prior, the expected divergence of any model containing the uniform distribution is bounded by the constant 1 - γ. For the models that we consider this bound is approached as the cardinality of the sample space tends to infinity, if the model dimension remains relatively small. For Dirichlet priors with reasonable concentration parameters the expected...
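
A quick Monte Carlo check of the limiting constant (a sketch under my own conventions, not code from the paper): for p drawn from the uniform Dirichlet prior, the expected KL divergence from p to the uniform distribution approaches 1 - γ ≈ 0.4228, with γ the Euler-Mascheroni constant.

import numpy as np

rng = np.random.default_rng(0)
n = 1000                                   # cardinality of the sample space
p = rng.dirichlet(np.ones(n), size=5000)   # uniform prior on the simplex
# D(p || u) = log n - H(p) when u is the uniform distribution on n outcomes
kl = np.log(n) + np.sum(p * np.log(p), axis=1)
print(kl.mean())                           # ~ 0.4228, i.e. 1 - Euler-Mascheroni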

Second order asymptotic distribution of the Rφ-divergence goodness-of-fit statistics

María Del Carmen Pardo (2000)

Kybernetika

The distribution of each member of the family of statistics based on the Rφ-divergence for testing goodness-of-fit is chi-squared up to o(1) (Pardo [pard96]). In this paper a closer approximation to the exact distribution is obtained by extracting the φ-dependent second order component from the o(1) term.
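
For flavor, here is a goodness-of-fit test from the closely related Cressie-Read power-divergence family, which is likewise chi-squared to first order under the null; this sketch uses scipy's statistic as a stand-in for the paper's Rφ family, and the counts are hypothetical.

import numpy as np
from scipy.stats import power_divergence, chi2

observed = np.array([18, 25, 22, 19, 16])             # hypothetical counts, uniform H0
stat, pval = power_divergence(observed, lambda_=2/3)  # Cressie-Read statistic
print(stat, pval, chi2.ppf(0.95, df=len(observed) - 1))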

Several applications of divergence criteria in continuous families

Michel Broniatowski, Igor Vajda (2012)

Kybernetika

This paper deals with four types of point estimators based on minimization of information-theoretic divergences between hypothetical and empirical distributions. These were introduced (i) by Liese and Vajda [9] and independently Broniatowski and Keziou [3], called here power superdivergence estimators, (ii) by Broniatowski and Keziou [4], called here power subdivergence estimators, (iii) by Basu et al. [2], called here power pseudodistance estimators, and (iv) by Vajda [18], called here Rényi pseudodistance...
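
As an illustration of the minimum-divergence idea, here is a sketch assuming Basu et al.'s density power divergence for a normal location model with known scale; the four estimator families above differ in their exact criteria, and this is only one representative.

import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(8.0, 1.0, 5)])  # 5% outliers

alpha = 0.5  # trades efficiency (alpha -> 0 recovers the MLE) against robustness

def dpd_objective(mu):
    # density power divergence criterion, dropping the mu-free term
    f = norm.pdf(data, loc=mu, scale=1.0)
    int_f = (1 + alpha) ** -0.5 * (2 * np.pi) ** (-alpha / 2)  # integral of f^(1+alpha) for N(mu, 1)
    return int_f - (1 + 1 / alpha) * np.mean(f ** alpha)

print(minimize_scalar(dpd_objective, bounds=(-5, 5), method="bounded").x)  # near 0 despite outliers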

Sobre el tamaño de muestra para experimentos aleatorios con imprecisión difusa [On the sample size for random experiments with fuzzy imprecision].

M.ª Angeles Gil Alvarez, Pedro Gil Alvarez (1988)

Trabajos de Estadística

Statistical Inference deals with drawing conclusions about a random experiment on the basis of the information contained in a sample from it. A random experiment can be defined by means of the set of its possible outcomes (sample space) and the observational ability of the experimenter. It is usually assumed that this ability allows the experimenter to describe the observable events as subsets of the sample space. In this paper, we will consider that the experimenter can only express the...

Some inequalities related to the Stam inequality

Abram Kagan, Tinghui Yu (2008)

Applications of Mathematics

Zamir showed in 1998 that Stam's classical inequality for the Fisher information (about a location parameter), 1/I(X+Y) ≥ 1/I(X) + 1/I(Y) for independent random variables X, Y, is a simple corollary of basic properties of the Fisher information (monotonicity, additivity and a reparametrization formula). The idea of his proof works for a special case of a general (not necessarily location) parameter. Stam-type inequalities are obtained for the Fisher information in a multivariate observation depending on a univariate location...
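
A numerical sketch of the location-parameter case (my own grid-based computation, not the paper's method): with X, Y independent standard Laplace, I(X) = I(Y) = 1, so Stam's inequality predicts 1/I(X+Y) ≥ 2.

import numpy as np

x = np.linspace(-40, 40, 2 ** 14)
dx = x[1] - x[0]
f = 0.5 * np.exp(-np.abs(x))              # Laplace(0,1) density, I = 1
g = np.convolve(f, f, mode="same") * dx   # density of X + Y
g /= g.sum() * dx                         # renormalize away discretization error
fisher = np.sum(np.gradient(g, dx) ** 2 / g) * dx   # I(X+Y) = integral of (g')^2 / g
print(1 / fisher)                         # ~ 3.35 >= 2 = 1/I(X) + 1/I(Y)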

Some results involving the concepts of moment generating function and affinity between distribution functions. Extension for r k-dimensional normal distribution functions.

Antonio Dorival Campos (1999)

Qüestiió

We present a function ρ(F1, F2, t) which contains Matusita's affinity and expresses the affinity between moment generating functions. An interesting result is expressed through the decomposition of this affinity ρ(F1, F2, t) when the functions considered are k-dimensional normal distributions. The same decomposition remains true for other families of distribution functions. Generalizations of these results are also presented.
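
For concreteness, Matusita's affinity between densities, ρ = ∫ √(f1 f2), has a closed form for univariate normals that a quick quadrature confirms; the paper's extended function ρ(F1, F2, t) for moment generating functions is not reproduced here, and the parameters below are hypothetical.

import numpy as np
from scipy.stats import norm

m1, s1, m2, s2 = 0.0, 1.0, 2.0, 1.5
x = np.linspace(-15, 20, 200001)
dx = x[1] - x[0]
numeric = np.sum(np.sqrt(norm.pdf(x, m1, s1) * norm.pdf(x, m2, s2))) * dx
closed = np.sqrt(2 * s1 * s2 / (s1**2 + s2**2)) * np.exp(-(m1 - m2)**2 / (4 * (s1**2 + s2**2)))
print(numeric, closed)   # both ~ 0.706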
