Minimum entropy of error estimate for multi-dimensional parameter and finite-state-space observations
Point estimators based on minimizing information-theoretic divergences between the empirical and a hypothetical distribution face a problem when working with continuous families that are measure-theoretically orthogonal to the family of empirical distributions. In this case, the φ-divergence is always equal to its upper bound, and the minimum φ-divergence estimates are trivial. Broniatowski and Vajda [3] proposed several modifications of the minimum divergence rule to provide a solution to the...
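The minimum-divergence idea above can be sketched for a finite state space, where the empirical distribution is not orthogonal to the model family and the estimator is non-trivial. This is a minimal illustration, not the modified estimator of Broniatowski and Vajda; the binomial family and grid search are assumptions chosen for concreteness.

```python
# Minimal sketch of a minimum phi-divergence estimator on a finite state
# space (illustrative; not the estimator of Broniatowski and Vajda [3]).
import numpy as np

def phi_divergence(p, q, phi=lambda t: t * np.log(t)):
    """D_phi(P||Q) = sum_x q(x) * phi(p(x)/q(x)); the default phi gives KL."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = q > 0  # convention: terms with q(x) = 0 are omitted here
    return float(np.sum(q[mask] * phi(p[mask] / q[mask])))

def min_phi_estimate(sample, states, model, grid):
    """Grid-search minimum phi-divergence estimate from i.i.d. observations."""
    emp = np.array([np.mean(sample == s) for s in states])  # empirical pmf
    divs = [phi_divergence(emp, model(theta)) for theta in grid]
    return grid[int(np.argmin(divs))]

# Hypothetical example: Binomial(2, theta) family on the states {0, 1, 2}
model = lambda th: np.array([(1 - th) ** 2, 2 * th * (1 - th), th ** 2])
rng = np.random.default_rng(0)
sample = rng.binomial(2, 0.3, size=1000)
theta_hat = min_phi_estimate(sample, [0, 1, 2], model,
                             np.linspace(0.01, 0.99, 99))
```

With the Kullback-Leibler choice of φ, the grid minimizer essentially recovers the maximum likelihood estimate, so theta_hat lands near the true value 0.3.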
Total correlation ('TC') and dual total correlation ('DTC') are two classical ways to quantify the correlation among an n-tuple of random variables. Both reduce to mutual information when n = 2. The first part of this paper sets up the theory of TC and DTC for general random variables, not necessarily finite-valued. This generality has not been exposed in the literature before. The second part considers the structural implications when a joint distribution has small TC or DTC. If , then is...
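For finite-valued random variables the two quantities above have standard entropy formulas, TC = Σᵢ H(Xᵢ) − H(X₁,…,Xₙ) and DTC = H(X₁,…,Xₙ) − Σᵢ H(Xᵢ | X₋ᵢ), which a short sketch can compute directly from a joint pmf (illustrative code, assuming these textbook definitions; the function names are ours):

```python
# Illustrative computation of TC and DTC for a finite joint distribution,
# using the standard entropy formulas (not code from the paper).
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array."""
    p = np.asarray(p, float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def tc_dtc(joint):
    """joint: n-dimensional array of probabilities summing to 1."""
    n = joint.ndim
    H_joint = entropy(joint)
    # TC = sum_i H(X_i) - H(X_1,...,X_n); marginalize out all axes but i
    tc = sum(entropy(joint.sum(axis=tuple(j for j in range(n) if j != i)))
             for i in range(n)) - H_joint
    # DTC = H(joint) - sum_i H(X_i | X_{-i}), with
    # H(X_i | X_{-i}) = H(joint) - H(X_{-i})
    dtc = H_joint - sum(H_joint - entropy(joint.sum(axis=i))
                        for i in range(n))
    return tc, dtc

# For n = 2, both TC and DTC reduce to the mutual information I(X; Y)
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
tc, dtc = tc_dtc(joint)
```

Running this on the 2×2 example returns the same value for TC and DTC, consistent with both reducing to mutual information at n = 2.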
We introduce new estimates and tests of independence in copula models with unknown margins, using φ-divergences and the duality technique. The asymptotic laws of the estimates and the test statistics are established both when the parameter is an interior point and when it is a boundary point of the parameter space. Simulation results show that a suitable choice of φ-divergence has good properties in terms of efficiency-robustness.
Partial orderings and measures of information for continuous univariate random variables, with special roles for the Gaussian and uniform distributions, are discussed. Information measures and measures of non-Gaussianity, including those based on the third and fourth cumulants, are commonly used as projection indices in the projection pursuit approach to independent component analysis. The connections between information, non-Gaussianity and statistical independence in the context of independent component analysis...
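A standard cumulant-based projection index of the kind mentioned above combines squared skewness (third cumulant) and squared excess kurtosis (fourth cumulant) of a standardized projection; a minimal sketch under these textbook definitions (not this paper's specific measures) is:

```python
# Illustrative third/fourth-cumulant non-Gaussianity index, as commonly
# used as a projection pursuit index in ICA (not this paper's measures).
import numpy as np

def nongaussianity_index(x):
    x = np.asarray(x, float)
    z = (x - x.mean()) / x.std()
    skew = np.mean(z**3)             # third cumulant of standardized data
    ex_kurt = np.mean(z**4) - 3.0    # fourth cumulant (excess kurtosis)
    return skew**2 + ex_kurt**2 / 4  # a simple combined index (assumed weights)

rng = np.random.default_rng(1)
gauss = nongaussianity_index(rng.normal(size=100_000))   # near 0
unif = nongaussianity_index(rng.uniform(size=100_000))   # near (1.2**2)/4
```

The Gaussian sample scores near zero, while the uniform sample scores near 0.36 because its excess kurtosis is −1.2, reflecting the special roles of the two distributions as the reference points of such orderings.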
In this work, new real-valued functionals of the Fisher information matrix are proposed as parametric information measures. The properties of these measures are analyzed. A simple method, based on the Fisher matrix, is presented for obtaining real-valued parametric information measures with the property of invariance under bijective transformations of the parameter space.
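Two familiar real-valued functionals of the Fisher information matrix are its trace and its determinant; a hedged sketch using the N(μ, σ) family, for which the matrix is known in closed form (this is a generic illustration, not the functionals proposed in the work above):

```python
# Illustrative scalar functionals of a Fisher information matrix
# (generic examples; not the measures proposed in the abstract above).
import numpy as np

def fisher_info_normal(mu, sigma):
    """Fisher information matrix of N(mu, sigma^2) in the (mu, sigma)
    parametrization: diag(1/sigma^2, 2/sigma^2)."""
    return np.diag([1.0 / sigma**2, 2.0 / sigma**2])

I = fisher_info_normal(0.0, 2.0)
trace_measure = float(np.trace(I))      # one real-valued functional
det_measure = float(np.linalg.det(I))   # another; sqrt(det) appears in
                                        # the Jeffreys prior
```

Note that the determinant transforms by the squared Jacobian under reparametrization, which is why constructions based on it (such as the Jeffreys prior) can achieve the invariance property discussed above.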
This paper deals with order identification for Markov chains with Markov regime (MCMR) in the context of finite alphabets. We define the joint order of an MCMR process in terms of the number k of states of the hidden Markov chain and the memory m of the conditional Markov chain. We study the properties of penalized maximum likelihood estimators for the unknown order (k, m) of an observed MCMR process, relying on information-theoretic arguments. The novelty of our work lies in the joint...
Point and region estimation may both be described as specific decision problems. In point estimation, the action space is the set of possible values of the quantity of interest; in region estimation, the action space is the set of its possible credible regions. Foundations dictate that the solution to these decision problems must depend on both the utility function and the prior distribution. Estimators intended for general use should surely be invariant under one-to-one transformations, and this...
The notion of the cumulative past inaccuracy (CPI) measure has recently been proposed in the literature as a generalization of cumulative past entropy (CPE) in both the univariate and the bivariate setup. In this paper, we introduce the notion of CPI of order and study the proposed measure for conditionally specified models of two components that failed at different time instants, called generalized conditional CPI (GCCPI). Several properties, including the effect of monotone transformations and bounds of GCCPI...
The aim of this review is to give different two-parametric generalizations of the following measures: the directed divergence (Kullback and Leibler, 1951), the Jensen difference divergence (Burbea and Rao, 1982a,b; Rao, 1982) and the Jeffreys invariant divergence (Jeffreys, 1946). These generalizations are put into a unified expression and their properties are studied. The applications of generalized information and divergence measures to the comparison of experiments and the connections with Fisher information...
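The three base measures named above have standard discrete forms, which a short sketch can implement (textbook definitions only; the two-parametric generalizations surveyed in the review are not reproduced here):

```python
# Textbook forms of the three base divergence measures (not the review's
# two-parametric generalizations).
import numpy as np

def kl(p, q):
    """Directed divergence of Kullback and Leibler (1951)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = p > 0
    return float(np.sum(p[m] * np.log(p[m] / q[m])))

def jeffreys(p, q):
    """Jeffreys invariant divergence: J(P, Q) = KL(P||Q) + KL(Q||P)."""
    return kl(p, q) + kl(q, p)

def jensen_difference(p, q):
    """Jensen difference divergence of Burbea and Rao (1982):
    H((P+Q)/2) - (H(P) + H(Q))/2, with H the Shannon entropy."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    H = lambda r: -np.sum(r[r > 0] * np.log(r[r > 0]))
    return float(H((p + q) / 2) - (H(p) + H(q)) / 2)

p, q = [0.2, 0.8], [0.5, 0.5]
```

The Jeffreys divergence is symmetric by construction, and the Jensen difference is non-negative by concavity of the entropy; both vanish exactly when P = Q.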