Displaying 181 – 200 of 281

On the Jensen-Shannon divergence and the variation distance for categorical probability distributions

Jukka Corander, Ulpu Remes, Timo Koski (2021)

Kybernetika

We establish a decomposition of the Jensen-Shannon divergence into a linear combination of a scaled Jeffreys' divergence and a reversed Jensen-Shannon divergence. Upper and lower bounds for the Jensen-Shannon divergence are then found in terms of the squared (total) variation distance. The derivations rely upon the Pinsker inequality and the reverse Pinsker inequality. We use these bounds to prove the asymptotic equivalence of the maximum likelihood estimate and minimum Jensen-Shannon divergence...
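A minimal numerical sketch of the quantities involved (function names are mine, and the bound shown is only the elementary one Pinsker's inequality yields for the midpoint mixture, not necessarily the sharpest bound derived in the paper): it computes the Jensen-Shannon divergence and the (total) variation distance for two categorical distributions and checks the squared-variation lower bound JSD ≥ TV²/2.

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence in nats; assumes q > 0 wherever p > 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js_divergence(p, q):
    """Jensen-Shannon divergence: average KL to the midpoint mixture."""
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def total_variation(p, q):
    """(Total) variation distance between categorical distributions."""
    return 0.5 * float(np.sum(np.abs(p - q)))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.4, 0.4])

jsd = js_divergence(p, q)
tv = total_variation(p, q)
# Pinsker applied to each KL term of the JSD, together with
# TV(P, M) = TV(P, Q) / 2, gives the lower bound JSD >= TV^2 / 2.
print(f"JSD = {jsd:.4f}, TV = {tv:.4f}, TV^2/2 = {tv**2 / 2:.4f}")
```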

On the Rao-Blackwell Theorem for fuzzy random variables

María Asunción Lubiano, María Angeles Gil, Miguel López-Díaz (1999)

Kybernetika

In a previous paper, conditions were given for computing iterated expectations of fuzzy random variables, irrespective of the order of integration. In another previous paper, a generalized real-valued measure quantifying the absolute variation of a fuzzy random variable with respect to its expected value was introduced and analyzed. In the present paper we combine the above conditions and generalized measure to state an extension of the basic Rao–Blackwell Theorem. An application of this...
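For reference, a sketch of the classical real-valued Rao–Blackwell theorem that the paper extends (the Bernoulli setting and all names are my choice): conditioning the crude unbiased estimator X₁ on the sufficient statistic T = ΣXᵢ gives E[X₁ | T] = T/n, which keeps the mean but shrinks the variance.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, reps = 10, 0.3, 100_000

samples = rng.binomial(1, p, size=(reps, n))
crude = samples[:, 0].astype(float)   # X_1: unbiased for p, but noisy
t = samples.sum(axis=1)               # sufficient statistic T = sum of X_i
rao_blackwell = t / n                 # E[X_1 | T] = T / n

print("crude: mean %.3f  var %.4f" % (crude.mean(), crude.var()))
print("R-B:   mean %.3f  var %.4f" % (rao_blackwell.mean(), rao_blackwell.var()))
```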

On unbiased Lehmann-estimators of a variance of an exponential distribution with quadratic loss function.

Jadwiga Kicinska-Slaby (1982)

Trabajos de Estadística e Investigación Operativa

Lehmann in [4] has generalised the notion of the unbiased estimator with respect to the assumed loss function. In [5] Singh considered admissible estimators of the function λ^(-r) of the unknown parameter λ of the gamma distribution with density f(x|λ, b) = λ^b e^(-λx) x^(b-1) / Γ(b), x > 0, where b is a known parameter, for the loss function L(λ̂^(-r), λ^(-r)) = (λ̂^(-r) - λ^(-r))² / λ^(-2r). Goodman in [1], choosing three loss functions of different shape, found unbiased Lehmann-estimators of the variance σ² of the normal distribution....
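As a worked aside (my computation, using the rate parameterization of the density quoted above and assuming b + r > 0; under quadratic loss, Lehmann-unbiasedness reduces to ordinary mean-unbiasedness): a moment calculation produces an unbiased estimator of λ^(-r) from a single observation.

```latex
% For X with density f(x \mid \lambda, b) = \lambda^b e^{-\lambda x} x^{b-1}/\Gamma(b)
% and b + r > 0:
\mathbb{E}[X^{r}]
  = \int_{0}^{\infty} x^{r}\,
    \frac{\lambda^{b} e^{-\lambda x} x^{b-1}}{\Gamma(b)}\,dx
  = \frac{\Gamma(b+r)}{\Gamma(b)\,\lambda^{r}},
\qquad\text{so}\qquad
T(X) = \frac{\Gamma(b)}{\Gamma(b+r)}\,X^{r}
\quad\text{satisfies}\quad
\mathbb{E}[T(X)] = \lambda^{-r}.
```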

Optimal solutions of multivariate coupling problems

Ludger Rüschendorf (1995)

Applicationes Mathematicae

Some necessary and some sufficient conditions are established for the explicit construction and characterization of optimal solutions of multivariate transportation (coupling) problems. The proofs are based on ideas from duality theory and nonconvex optimization theory. Applications are given to multivariate optimal coupling problems w.r.t. minimal l_p-type metrics, where fairly explicit and complete characterizations of optimal transportation plans (couplings) are obtained. The results are of interest...
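A toy finite-dimensional analogue of such a coupling problem (the marginals, support points, and squared-distance cost are my choices; the paper itself treats the general multivariate setting): the optimal coupling of two discrete marginals can be found as a linear program over transport plans.

```python
import numpy as np
from scipy.optimize import linprog

# Discrete coupling problem: marginals p, q on points xs, ys,
# squared-distance cost (an l_2-type cost in the spirit of the abstract).
xs = np.array([0.0, 1.0, 2.0])
ys = np.array([0.5, 1.5])
p = np.array([0.2, 0.5, 0.3])
q = np.array([0.6, 0.4])

cost = (xs[:, None] - ys[None, :]) ** 2   # C[i, j] = |x_i - y_j|^2
n, m = cost.shape

# Marginal constraints: row sums of the coupling equal p, column sums equal q.
A_eq = np.zeros((n + m, n * m))
for i in range(n):
    A_eq[i, i * m:(i + 1) * m] = 1.0      # sum_j pi[i, j] = p[i]
for j in range(m):
    A_eq[n + j, j::m] = 1.0               # sum_i pi[i, j] = q[j]
b_eq = np.concatenate([p, q])

res = linprog(cost.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
print("optimal cost:", res.fun)
print("optimal coupling:\n", res.x.reshape(n, m).round(3))
```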

Optimality conditions for maximizers of the information divergence from an exponential family

František Matúš (2007)

Kybernetika

The information divergence of a probability measure P from an exponential family ℰ over a finite set is defined as the infimum of the divergences of P from Q subject to Q ∈ ℰ. All directional derivatives of the divergence from ℰ are explicitly found. To this end, the behaviour of the conjugate of a log-Laplace transform on the boundary of its domain is analysed. The first-order conditions for P to be a maximizer of the divergence from ℰ are presented, including new ones when P is not projectable to ℰ.
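A small numerical illustration of the divergence from an exponential family (the one-parameter family, sufficient statistic, and measure P are all my choices; the paper's analysis is far more general): D(P‖ℰ) is obtained by minimizing the KL divergence over the family, and at the minimizer the family's mean parameter matches E_P[t], the first-order condition.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy setting on the finite set {0, 1, 2, 3}: a one-parameter
# exponential family Q_theta(x) proportional to exp(theta * t(x)).
t = np.array([0.0, 1.0, 2.0, 3.0])    # sufficient statistic
P = np.array([0.4, 0.1, 0.1, 0.4])    # measure to be projected

def q(theta):
    w = np.exp(theta * t)
    return w / w.sum()                 # normalize by the Laplace transform

def kl(p, qv):
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / qv[mask])))

res = minimize_scalar(lambda th: kl(P, q(th)), bounds=(-10, 10), method="bounded")
print("D(P || E) ~ %.4f at theta ~ %.4f" % (res.fun, res.x))
# First-order condition: the projection matches moments. Here E_P[t] = 1.5
# by symmetry, so the minimizer is theta ~ 0 (the uniform distribution).
print("E_P[t] =", float(P @ t), " E_Q[t] =", float(q(res.x) @ t))
```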

Optimally approximating exponential families

Johannes Rauh (2013)

Kybernetika

This article studies exponential families ℰ on finite sets such that the information divergence D(P‖ℰ) of an arbitrary probability distribution P from ℰ is bounded by some constant D > 0. A particular class of low-dimensional exponential families that have low values of D can be obtained from partitions of the state space. The main results concern optimality properties of these partition exponential families. The case where D = log(2) is studied in detail. This case is special, because if D < log(2), then ℰ contains all probability...
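One way to see the partition construction numerically (the partition, the distribution P, and the closed form for the projection are my working assumptions: for the family of distributions constant on each block, the rI-projection of P puts mass P(B)/|B| on each point of block B, so the divergence is bounded by the log of the largest block size):

```python
import numpy as np

# Partition exponential family on {0,...,5}: distributions that are
# constant on each block of the partition (blocks of size 2 here).
blocks = [[0, 1], [2, 3], [4, 5]]
P = np.array([0.30, 0.05, 0.20, 0.15, 0.10, 0.20])

def divergence_from_partition_family(P, blocks):
    """D(P || E) assuming the projection puts mass P(B)/|B| on each
    point of block B (uniform within blocks, matching block masses)."""
    d = 0.0
    for B in blocks:
        pB = P[B].sum()
        for x in B:
            if P[x] > 0:
                d += P[x] * np.log(P[x] * len(B) / pB)
    return d

print("D(P || E) = %.4f <= log(2) = %.4f"
      % (divergence_from_partition_family(P, blocks), np.log(2)))
```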

Order statistics and (r, s)-entropy measures

María Dolores Esteban, Domingo Morales, Leandro Pardo, María Luisa Menéndez (1994)

Applications of Mathematics

K. M. Wong and S. Chen [9] analyzed the Shannon entropy of a sequence of random variables under order restrictions. Using the (r, s)-entropies of I. J. Taneja [8], these results are generalized. Upper and lower bounds on the entropy reduction when the sequence is ordered, and conditions under which they are achieved, are derived. Theorems are presented showing the difference between the average entropy of the individual order statistics and the entropy of a member of the original independent identically distributed...
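For orientation, the classical Shannon-entropy identity behind the "entropy reduction" discussed here (stated for continuous i.i.d. samples; the paper generalizes this kind of statement to (r, s)-entropies): ordering a sample is an n!-to-one map, so the joint differential entropy drops by log n!.

```latex
h\bigl(X_{(1)},\dots,X_{(n)}\bigr)
  \;=\; h\bigl(X_{1},\dots,X_{n}\bigr) \;-\; \log n!
```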

A sequential sampling plan based on informational energy for an exponential population.

Leandro Pardo, Domingo Morales, Vicente Quesada (1985)

Trabajos de Estadística e Investigación Operativa

In Pardo (1984), a Sequential Sampling Plan based on Informational Energy (P.M.S.E.I.) was proposed, analogous to the plan Lindley (1956, 1957) derived from the Shannon entropy, with the aim of gathering information about an unknown parameter θ. In this communication the P.M.S.E.I. is applied to the specific case of gathering information about the parameter θ of an exponential distribution, and the concept of the P.M.S.E.I. is extended to the case in which the statistician is interested in gathering...
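A rough sketch of the quantity such a plan tracks (Onicescu's informational energy, the integral of the squared density; the gamma prior/posterior for the exponential rate θ and all names here are my assumptions, not the paper's specific stopping rule): as exponential observations accrue, the posterior concentrates and its informational energy grows.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.stats import gamma

def informational_energy(a, b, grid=np.linspace(1e-6, 20, 100_000)):
    """Onicescu informational energy of a Gamma(a, b) posterior density."""
    f = gamma.pdf(grid, a, scale=1.0 / b)
    return trapezoid(f ** 2, grid)

a0, b0 = 2.0, 1.0                # Gamma(a0, b0) prior on the rate theta
data = [0.8, 1.3, 0.5, 2.1]      # exponential observations
a, b = a0, b0
print("prior energy: %.4f" % informational_energy(a, b))
for x in data:
    a, b = a + 1, b + x          # conjugate update for Exp(theta) data
    print("after x=%.1f: energy %.4f" % (x, informational_energy(a, b)))
```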

Optimization problems relating Kullback information and the Neyman-Pearson risk set.

Ramiro Melendreras Gimeno (1983)

Trabajos de Estadística e Investigación Operativa

We consider the connection between Kullback information and optimal admissible tests in the Neyman-Pearson risk set, by studying mathematical programming problems of infinite type. Results are obtained that characterize a subset of Bayes solutions as a consequence of knowledge of the information, as well as a measure of discrimination between hypotheses for the risk set.
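A toy illustration of the two objects the paper relates (the hypotheses are my choice; the paper works in much greater generality): for two simple hypotheses on a finite sample space, likelihood-ratio tests trace out the vertices of the lower boundary of the Neyman-Pearson risk set, alongside the Kullback information K(P0, P1).

```python
import numpy as np

# Two simple hypotheses on a four-point sample space.
p0 = np.array([0.4, 0.3, 0.2, 0.1])   # H0
p1 = np.array([0.1, 0.2, 0.3, 0.4])   # H1

# Reject the most "H1-like" outcomes first (Neyman-Pearson lemma);
# each prefix of this ordering is an admissible deterministic test.
order = np.argsort(-(p1 / p0))
alpha = np.concatenate([[0.0], np.cumsum(p0[order])])       # type I error
beta = np.concatenate([[1.0], 1.0 - np.cumsum(p1[order])])  # type II error

kullback = float(np.sum(p0 * np.log(p0 / p1)))
print("K(P0, P1) =", round(kullback, 4))
for a, b in zip(alpha, beta):
    print("alpha = %.2f  beta = %.2f" % (a, b))
```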
