Generalized Jensen difference based on entropy functions
Prasanna K. Sahoo, Andrew K. C. Wong (1988)
Kybernetika
Imre Csiszár, František Matúš (2012)
Kybernetika
Integral functionals based on convex normal integrands are minimized subject to finitely many moment constraints. The integrands are finite on the positive and infinite on the negative numbers, strictly convex but not necessarily differentiable. The minimization is viewed as a primal problem and studied together with a dual one in the framework of convex duality. The effective domain of the value function is described by a conic core, a modification of the earlier concept of convex core. Minimizers...
Taneja, Inder Jeet (2004)
JIPAM. Journal of Inequalities in Pure & Applied Mathematics [electronic only]
Shih, Mau-Hsiang, Tsai, Feng-Sheng (2011)
Fixed Point Theory and Applications [electronic only]
Asha Garg (1981)
RAIRO - Operations Research - Recherche Opérationnelle
Teófilo Brezmes, Pedro Gil Alvarez (1985)
Trabajos de Estadística e Investigación Operativa
This paper introduces an axiomatization of measures of uncertainty (prior information) conditioned by an experiment, generalizing the one given for uncertainty conditioned by an event. The concept of conditionally composable uncertainty measures allows, under certain conditions (M-type composability), a construction of such measures. Finally, several examples are analyzed (the measures of Shannon, Rényi, etc.), verifying the equality of the constructions given...
Triclot, Mathieu (2007)
Journal Électronique d'Histoire des Probabilités et de la Statistique [electronic only]
Minaketan Behara, Prem Nath (1974)
Kybernetika
J. P. Benzécri, K. Ibrahim Hamouda (1983)
Cahiers de l'analyse des données
Milan Mareš, Radko Mesiar (2013)
Kybernetika
This paper deals with the concept of the "size" or "extent" of information, in the sense of measuring the improvement of our knowledge after obtaining a message. Standard approaches are based on the probabilistic parameters of the considered information source. Here we deal with situations in which the unknown probabilities are subjectively or vaguely estimated. For the considered fuzzy-quantity-valued probabilities we introduce and discuss information-theoretical concepts.
Milan Mareš (2011)
Acta Universitatis Palackianae Olomucensis. Facultas Rerum Naturalium. Mathematica
The structures of fuzzy information theory are focused on the concept of fuzzy entropy, where the individual information of symbols is considered only implicitly. This paper aims to fill this gap and to study the concepts of fuzzy information. Special attention is paid to the typical fuzzy-set-theoretical paradigm of monotonicity of operations.
Andrew Rukhin (1997)
Applicationes Mathematicae
The so-called ϕ-divergence is an important characteristic describing "dissimilarity" of two probability distributions. Many traditional measures of separation used in mathematical statistics and information theory, some of which are mentioned in the note, correspond to particular choices of this divergence. An upper bound on a ϕ-divergence between two probability distributions is derived when the likelihood ratio is bounded. The usefulness of this sharp bound is illustrated by several examples of...
Carla Poggi (1982)
Stochastica
The notion of relative measure of information in an abstract information space with generalized independence law is studied. The axiomatic definition is given and the form of dependence on the absolute measures is determined, as a solution of a system of functional equations.
Alberto Barchielli, Giancarlo Lupieri (2006)
Banach Center Publications
General quantum measurements are represented by instruments. In this paper the mathematical formalization is given of the idea that an instrument is a channel which accepts a quantum state as input and produces a probability and an a posteriori state as output. Then, by using mutual entropies on von Neumann algebras and the identification of instruments and channels, many old and new informational inequalities are obtained in a unified manner. Such inequalities involve various quantities which characterize...
J. S. Chawla (1987)
Kybernetika
Braz e Silva, Pablo, Papa, Andrés R.R. (2006)
International Journal of Mathematics and Mathematical Sciences
Peter Harremoës (2009)
Kybernetika
The exact range of the joint values of several Rényi entropies is determined. The method is based on topology, with special emphasis on the orientation of the objects studied. As in the case where only two orders of the Rényi entropies are studied, one can parametrize the boundary of the range. An explicit formula for a tight upper or lower bound for one order of entropy in terms of another order of entropy cannot be given.
Julio Angel Pardo Llorente (1990)
Trabajos de Estadística
This paper establishes a criterion for comparing fuzzy information systems based on maximizing the expected informational energy of order α and type β, and verifies that it satisfies the most relevant properties that, in our judgment, a comparison criterion should satisfy.
Domingo Morales González (1986)
Trabajos de Estadística
In this paper the Kagan divergence measure is extended in order to establish a measure of the information that a random sample gives about a Dirichlet process as a whole. After studying some of its properties, the expression obtained in sampling from step n to step n+1 is given, and its Bayesian properties are studied. We conclude by proving the good behaviour of a stopping rule, defined on the basis of the information obtained in sampling, when passing from one step to the next.
Olivier Catoni (2003)
Annales de l'I.H.P. Probabilités et statistiques