Displaying 121 – 140 of 179


On the curvature of the space of qubits

Attila Andai (2006)

Banach Center Publications

The Fisher informational metric is unique in some sense (it is the only Markovian monotone distance) in the classical case. A family of Riemannian metrics is called monotone if its members are decreasing under stochastic mappings. These metrics play the role of the Fisher metric in the quantum case. Monotone metrics can be labeled by special operator monotone functions, according to Petz's Classification Theorem. The aim of this paper is to present an idea of how one can narrow the set of monotone...
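For reference, the standard statement of the classification the abstract invokes (quoted from memory rather than from the paper, so treat the exact normalization as an assumption): monotone metrics are in bijection with operator monotone functions $f$ satisfying $f(1) = 1$ and $f(t) = t\,f(1/t)$, via the Morozova-Chentsov function $c$:

```latex
K_\rho(A, B) \;=\; \operatorname{Tr}\bigl( A \, c(L_\rho, R_\rho)(B) \bigr),
\qquad
c(x, y) \;=\; \frac{1}{y \, f(x/y)},
```

where $L_\rho$ and $R_\rho$ denote left and right multiplication by the density matrix $\rho$.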

On the Jensen-Shannon divergence and the variation distance for categorical probability distributions

Jukka Corander, Ulpu Remes, Timo Koski (2021)

Kybernetika

We establish a decomposition of the Jensen-Shannon divergence into a linear combination of a scaled Jeffreys' divergence and a reversed Jensen-Shannon divergence. Upper and lower bounds for the Jensen-Shannon divergence are then found in terms of the squared (total) variation distance. The derivations rely upon the Pinsker inequality and the reverse Pinsker inequality. We use these bounds to prove the asymptotic equivalence of the maximum likelihood estimate and minimum Jensen-Shannon divergence...
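For orientation, a minimal numerical sketch of the quantities the abstract relates (the function names are mine; this is not the paper's decomposition). In this convention the Pinsker inequality reads $D(P\|Q) \ge 2\,\delta(P,Q)^2$ in nats, with $\delta$ the (total) variation distance:

```python
import numpy as np

def kl(p, q):
    # Kullback-Leibler divergence in nats; assumes supp(p) is contained in supp(q)
    m = p > 0
    return float(np.sum(p[m] * np.log(p[m] / q[m])))

def jsd(p, q):
    # Jensen-Shannon divergence: average KL of p and q from their midpoint
    mid = 0.5 * (p + q)
    return 0.5 * kl(p, mid) + 0.5 * kl(q, mid)

def total_variation(p, q):
    # (total) variation distance, i.e. half the L1 distance
    return 0.5 * float(np.sum(np.abs(p - q)))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.5, 0.3])
print("JSD:", jsd(p, q), " TV:", total_variation(p, q))
```

The Jensen-Shannon divergence is symmetric and, in nats, bounded by log 2, which is why it pairs naturally with variation-distance bounds.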

Optimality conditions for maximizers of the information divergence from an exponential family

František Matúš (2007)

Kybernetika

The information divergence of a probability measure P from an exponential family ℰ over a finite set is defined as the infimum of the divergences of P from Q subject to Q ∈ ℰ. All directional derivatives of the divergence from ℰ are explicitly found. To this end, the behaviour of the conjugate of a log-Laplace transform on the boundary of its domain is analysed. The first order conditions for P to be a maximizer of the divergence from ℰ are presented, including new ones when P is not projectable to ℰ.
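A small numerical sketch of the quantity being maximized, for a standard special case not taken from the paper: when ℰ is the independence model of two finite variables, the divergence minimizer Q is the product of P's marginals, so inf over Q of D(P‖Q) equals the mutual information I(X;Y):

```python
import numpy as np

# Joint distribution P of (X, Y); the rI-projection onto the independence
# family is the product of the marginals of P.
P = np.array([[0.4, 0.1],
              [0.1, 0.4]])
Q = np.outer(P.sum(axis=1), P.sum(axis=0))  # product of marginals
m = P > 0
mi = float(np.sum(P[m] * np.log(P[m] / Q[m])))  # D(P || Q) = I(X;Y) in nats
print("divergence of P from the independence family:", mi)
```

Maximizing this divergence over P is then the same as maximizing mutual information, which illustrates why such maximizers are of interest.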

Optimally approximating exponential families

Johannes Rauh (2013)

Kybernetika

This article studies exponential families ℰ on finite sets such that the information divergence D(P ∥ ℰ) of an arbitrary probability distribution P from ℰ is bounded by some constant D > 0. A particular class of low-dimensional exponential families that have low values of D can be obtained from partitions of the state space. The main results concern optimality properties of these partition exponential families. The case where D = log(2) is studied in detail. This case is special, because if D < log(2), then ℰ contains all probability...
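A toy sketch of the partition families mentioned in the abstract (the helper name is mine, and it assumes the standard fact that the rI-projection onto the family generated by block indicators keeps each block's mass and is uniform inside the block). With blocks of size 2 the divergence is at most log 2, attained by a point mass:

```python
import numpy as np

def divergence_from_partition(p, blocks):
    # D(p || ℰ) for the partition exponential family: project p onto the
    # family (same block masses, uniform within each block) and sum the
    # per-block KL contributions.
    d = 0.0
    for b in blocks:
        pb = p[b]
        mass = pb.sum()
        if mass == 0:
            continue
        q = np.full(len(b), mass / len(b))  # uniform within the block
        m = pb > 0
        d += float(np.sum(pb[m] * np.log(pb[m] / q[m])))
    return d

p = np.zeros(4)
p[0] = 1.0                      # point mass on state 0
blocks = [[0, 1], [2, 3]]       # partition into blocks of size 2
print(divergence_from_partition(p, blocks))
```

More generally the divergence from such a family is bounded by the log of the largest block size, which is why size-2 blocks give the D = log(2) case.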

Order statistics and (r, s)-entropy measures

María Dolores Esteban, Domingo Morales, Leandro Pardo, María Luisa Menéndez (1994)

Applications of Mathematics

K. M. Wong and S. Chen [9] analyzed the Shannon entropy of a sequence of random variables under order restrictions. Using the (r, s)-entropies of I. J. Taneja [8], these results are generalized. Upper and lower bounds on the entropy reduction when the sequence is ordered, and conditions under which they are achieved, are derived. Theorems are presented showing the difference between the average entropy of the individual order statistics and the entropy of a member of the original independent identically distributed...

Sequential sampling plan based on informational energy for an exponential population

Leandro Pardo, Domingo Morales, Vicente Quesada (1985)

Trabajos de Estadística e Investigación Operativa

In Pardo (1984), a Sequential Sampling Plan based on Informational Energy (P.M.S.E.I., from its Spanish initials) was proposed, analogous to the one proposed by Lindley (1956, 1957) from Shannon entropy, with the aim of gathering information about an unknown parameter θ. In this communication the P.M.S.E.I. is applied to the concrete case of gathering information about the parameter θ of an exponential distribution, and the concept of the P.M.S.E.I. is extended to the case in which the statistician is interested in gathering...
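For orientation (a standard computation, not taken from the paper): Onicescu's informational energy of a density $f$ is $\int f^2$, and for the exponential density with parameter $\theta$ it evaluates to

```latex
\mathcal{E}(f_\theta) \;=\; \int_0^{\infty} \bigl(\theta e^{-\theta x}\bigr)^2 \, dx
\;=\; \theta^2 \cdot \frac{1}{2\theta} \;=\; \frac{\theta}{2},
```

so a larger $\theta$ (a more concentrated density) carries higher informational energy, which is the quantity the sampling plan tracks.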

Optimization problems relating Kullback information and the Neyman-Pearson risk set

Ramiro Melendreras Gimeno (1983)

Trabajos de Estadística e Investigación Operativa

We consider the connection between Kullback information and optimal admissible tests in the Neyman-Pearson risk set, using the study of infinite mathematical programming problems. Results are obtained that characterize a subset of Bayes solutions as a consequence of knowledge of the information, as well as a discrimination measure between hypotheses for the risk set.
