
On generalized measures of relative information and inaccuracy

Inder Jeet Taneja, H. C. Gupta — 1978

Aplikace matematiky

Kullback's relative information and Kerridge's inaccuracy are two information-theoretic measures associated with a pair of probability distributions of a discrete random variable. The authors study a generalized measure which in particular contains a parametric generalization of relative information and inaccuracy. Some important properties of this generalized measure along with an inversion theorem are also studied.
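The two base measures named in the abstract have standard textbook definitions. As a minimal sketch (standard formulas only; the paper's parametric generalization is not reproduced here):

```python
import math

def relative_information(p, q):
    """Kullback's relative information (KL divergence):
    D(P || Q) = sum_i p_i * log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def inaccuracy(p, q):
    """Kerridge's inaccuracy: H(P; Q) = -sum_i p_i * log(q_i)."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def entropy(p):
    """Shannon entropy H(P) = -sum_i p_i * log(p_i)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)
```

The two measures are linked by the identity H(P; Q) = H(P) + D(P || Q): inaccuracy decomposes into the uncertainty of P plus the penalty for describing P with the wrong distribution Q.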

On generalized information and divergence measures and their applications: a brief review.

The aim of this review is to present different two-parametric generalizations of the following measures: directed divergence (Kullback and Leibler, 1951), Jensen difference divergence (Burbea and Rao, 1982a,b; Rao, 1982) and Jeffreys invariant divergence (Jeffreys, 1946). These generalizations are put into a unified expression and their properties are studied. The applications of generalized information and divergence measures to comparison of experiments and the connections with Fisher information...
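The three measures being generalized have well-known base forms, which can be sketched as follows (standard one-parameter-free definitions; the review's two-parametric unified expression is not reproduced here):

```python
import math

def _H(p):
    """Shannon entropy of a discrete distribution."""
    return -sum(x * math.log(x) for x in p if x > 0)

def directed_divergence(p, q):
    """Kullback-Leibler directed divergence D(P || Q)."""
    return sum(x * math.log(x / y) for x, y in zip(p, q) if x > 0)

def jeffreys_divergence(p, q):
    """Jeffreys invariant divergence: the symmetrized form
    J(P, Q) = D(P || Q) + D(Q || P)."""
    return directed_divergence(p, q) + directed_divergence(q, p)

def jensen_difference(p, q):
    """Jensen difference divergence (Burbea-Rao):
    H((P + Q)/2) - (H(P) + H(Q))/2."""
    m = [(x + y) / 2 for x, y in zip(p, q)]
    return _H(m) - (_H(p) + _H(q)) / 2
```

Both symmetric measures vanish exactly when P = Q, which is the behaviour the parametric families in the review preserve.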

(R, S)-information radius of type t and comparison of experiments

Inder Jeet Taneja, Luis Pardo, D. Morales — 1991

Applications of Mathematics

Various information, divergence and distance measures have been used by researchers to compare experiments using classical approaches such as those of Blackwell, the Bayesian approach, etc. Blackwell's [1] idea of comparing two statistical experiments is based on the existence of stochastic transformations. Using this idea of Blackwell, as well as the classical Bayesian approach, we have compared statistical experiments by considering unified scalar parametric generalizations of the Jensen difference divergence measure...
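Blackwell's criterion mentioned in the abstract can be illustrated concretely: an experiment E1 (a matrix of likelihoods, one row per parameter value) is at least as informative as E2 if E2 can be obtained by post-processing E1 through a stochastic (garbling) kernel M, i.e. E2 = E1 · M. A minimal sketch with hypothetical matrices:

```python
def matmul(a, b):
    """Plain list-of-lists matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

def is_stochastic(m, tol=1e-9):
    """Each row is a probability distribution: nonnegative, sums to 1."""
    return all(abs(sum(row) - 1) < tol and all(x >= -tol for x in row)
               for row in m)

# Hypothetical experiment E1: rows are P(x | theta) for two parameter values.
E1 = [[0.9, 0.1],
      [0.2, 0.8]]

# Hypothetical stochastic garbling kernel M.
M = [[0.75, 0.25],
     [0.25, 0.75]]

# E2 = E1 * M is a garbled version of E1, so E1 is "more informative"
# than E2 in Blackwell's sense.
E2 = matmul(E1, M)
```

Since garbling can only lose information, any reasonable divergence-based comparison (such as the information radius used in the paper) should rank E1 above E2.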

