Realization of Closed, Convex, and Symmetric Subsets of the Unit Square as Regions of Risk for Testing Simple Hypotheses.
We compute the expected value of the Kullback-Leibler divergence of various fundamental statistical models with respect to Dirichlet priors. For the uniform prior, the expected divergence of any model containing the uniform distribution is bounded by a constant. For the models that we consider, this bound is approached as the cardinality of the sample space tends to infinity, provided the model dimension remains relatively small. For Dirichlet priors with reasonable concentration parameters the expected...
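For orientation, a standard digamma computation (assumed here, not quoted from the abstract) gives the expected divergence from the uniform distribution u on a sample space of cardinality n under the uniform prior Dir(1, ..., 1), using the Beta moment E[p_i log p_i] = (1/n)(ψ(2) − ψ(n+1)):

\[
\mathbb{E}_{p \sim \mathrm{Dir}(1,\dots,1)}\, D(p \,\|\, u)
= \log n + \psi(2) - \psi(n+1)
\;\longrightarrow\; \psi(2) = 1 - \gamma \approx 0.4228 \text{ nats} \quad (n \to \infty),
\]

where ψ is the digamma function and γ the Euler-Mascheroni constant. This illustrates the type of constant bound the abstract refers to; the paper's exact constant may differ.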
The distribution of each member of the family of statistics based on the φ-divergence for testing goodness-of-fit is chi-squared to the order o(n^{-1}) (Pardo [pard96]). In this paper a closer approximation to the exact distribution is obtained by extracting the φ-dependent second order component from the o(n^{-1}) term.
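As a reference point, a standard normalization of the φ-divergence goodness-of-fit statistic (this particular form is an assumption, not quoted from the paper) for an empirical frequency vector p̂ against a hypothesized p0, based on n observations over k cells, is

\[
T_\phi = \frac{2n}{\phi''(1)} \sum_{i=1}^{k} p_{0i}\, \phi\!\left(\frac{\hat p_i}{p_{0i}}\right),
\qquad \phi \text{ convex}, \ \phi(1) = 0,
\]

which is asymptotically χ² with k − 1 degrees of freedom under the null hypothesis; the second order term of its expansion depends on the choice of φ, which is the component the paper extracts.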
This paper deals with four types of point estimators based on minimization of information-theoretic divergences between hypothetical and empirical distributions. These were introduced (i) by Liese and Vajda [9] and independently by Broniatowski and Keziou [3], called here power superdivergence estimators, (ii) by Broniatowski and Keziou [4], called here power subdivergence estimators, (iii) by Basu et al. [2], called here power pseudodistance estimators, and (iv) by Vajda [18], called here Rényi pseudodistance...
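Schematically (a common template underlying all four classes; the precise definitions in [2], [3], [4], [9], [18] differ in the direction and dualization of the divergence), each estimator minimizes a power-type divergence between the model P_θ and the empirical distribution P_n. For discrete distributions p, q and order α ≠ 0, −1, the Cressie-Read power divergence and the resulting estimator read

\[
D_\alpha(p, q) = \frac{1}{\alpha(\alpha+1)} \sum_i p_i \left[ \left( \frac{p_i}{q_i} \right)^{\alpha} - 1 \right],
\qquad
\hat\theta = \arg\min_{\theta} D_\alpha(P_n, P_\theta),
\]

with the Kullback-Leibler divergence recovered in the limit α → 0.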
Statistical inference deals with drawing conclusions about a random experiment on the basis of the information contained in a sample from it. A random experiment can be defined by means of the set of its possible outcomes (the sample space) and the observational ability of the experimenter. It is usually assumed that this ability allows the experimenter to describe the observable events as subsets of the sample space. In this paper, we will consider that the experimenter can only express the...
Zamir showed in 1998 that the classical Stam inequality for the Fisher information (about a location parameter) of independent random variables is a simple corollary of basic properties of the Fisher information (monotonicity, additivity, and a reparametrization formula). The idea of his proof works for a special case of a general (not necessarily location) parameter. Stam-type inequalities are obtained for the Fisher information in a multivariate observation depending on a univariate location...
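For reference, the classical Stam inequality in question (stated here in its standard form, not quoted from the paper) says that for independent random variables X and Y with sufficiently smooth densities,

\[
\frac{1}{I(X+Y)} \;\ge\; \frac{1}{I(X)} + \frac{1}{I(Y)},
\]

where I denotes the Fisher information about a location parameter, with equality if and only if X and Y are Gaussian.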
We present a function ρ(F1, F2, t) which contains Matusita's affinity and expresses the affinity between moment generating functions. An interesting result is expressed through the decomposition of this affinity ρ(F1, F2, t) when the functions considered are k-dimensional normal distributions. The same decomposition remains true for other families of distribution functions. Generalizations of these results are also presented.
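For context, Matusita's classical affinity, which the function ρ(F1, F2, t) contains as a special case (the t-dependent form itself is specific to the paper and not reproduced here), is

\[
\rho(F_1, F_2) = \int \sqrt{f_1 f_2}\; d\mu,
\qquad
\rho\big(N(\mu_1, \Sigma), N(\mu_2, \Sigma)\big) = \exp\!\left( -\tfrac{1}{8} (\mu_1 - \mu_2)^{\top} \Sigma^{-1} (\mu_1 - \mu_2) \right),
\]

for densities f1, f2 with respect to a dominating measure μ; the closed form for k-dimensional normals with common covariance Σ is the kind of expression that makes the normal-case decomposition explicit.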