Displaying 241 – 260 of 268

The entropy of Łukasiewicz-languages

Ludwig Staiger (2005)

RAIRO - Theoretical Informatics and Applications - Informatique Théorique et Applications

The paper presents an elementary approach for the calculation of the entropy of a class of languages. This approach is based on the consideration of roots of a real polynomial and is also suitable for calculating the Bernoulli measure. The class of languages we consider here is a generalisation of the Łukasiewicz language.
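
To make the root-based method concrete, here is a minimal numerical sketch for the classical Łukasiewicz language over a two-letter alphabet, under the standard grammar L → b | aLL (an illustration of the general idea, not the paper's own computation): the structure generating function s(t) satisfies t·s² − s + t = 0, its radius of convergence r is the smallest positive root of the discriminant 1 − 4t², and the entropy is log₂(1/r).

```python
import math

def smallest_positive_root(f, lo=1e-9, hi=1.0, tol=1e-12):
    """Bisection for a root of f on (lo, hi), assuming f(lo) > 0 > f(hi)
    and a single sign change on the interval."""
    assert f(lo) > 0 > f(hi)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# For L = b | a L L, the structure function s(t) solves t*s^2 - s + t = 0,
# so s(t) = (1 - sqrt(1 - 4 t^2)) / (2 t); its radius of convergence is the
# smallest positive root of the discriminant 1 - 4 t^2.
discriminant = lambda t: 1.0 - 4.0 * t * t
r = smallest_positive_root(discriminant)   # converges to 1/2
entropy = math.log2(1.0 / r)               # 1.0: maximal entropy over {a, b}
```

The answer, entropy 1, reflects the known fact that the Łukasiewicz language has maximal entropy: the number of its words of length 2n+1 is the Catalan number Cₙ ~ 4ⁿ/n^{3/2}.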

The irrelevant information principle for collective probabilistic reasoning

Martin Adamčík, George Wilmers (2014)

Kybernetika

Within the framework of discrete probabilistic uncertain reasoning a large literature exists justifying the maximum entropy inference process, ME, as being optimal in the context of a single agent whose subjective probabilistic knowledge base is consistent. In particular Paris and Vencovská completely characterised the ME inference process by means of an attractive set of axioms which an inference process should satisfy. More recently the second author extended the Paris-Vencovská axiomatic approach...
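
As a minimal illustration of maximum-entropy inference for a single agent (a sketch only, far simpler than the axiomatic framework the paper studies): given one linear constraint E[X] = μ on a finite outcome space, the maximum-entropy distribution has Gibbs form pᵢ ∝ exp(λxᵢ), with λ fixed by the constraint. Since the constrained mean is increasing in λ, bisection suffices:

```python
import math

def maxent_with_mean(xs, mu, lo=-50.0, hi=50.0, tol=1e-12):
    """Maximum-entropy distribution on outcomes xs subject to E[X] = mu.
    The solution has Gibbs form p_i ∝ exp(lam * x_i); we bisect on lam,
    using that the mean of the Gibbs distribution is increasing in lam."""
    def mean(lam):
        w = [math.exp(lam * x) for x in xs]
        z = sum(w)
        return sum(x * wi for x, wi in zip(xs, w)) / z
    assert min(xs) < mu < max(xs)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean(mid) < mu:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * x) for x in xs]
    z = sum(w)
    return [wi / z for wi in w]

p = maxent_with_mean([1, 2, 3], mu=2.0)
```

With the symmetric constraint μ = 2 on {1, 2, 3}, λ = 0 and the uniform distribution is recovered, as maximum entropy demands.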

Tropical probability theory and an application to the entropic cone

Rostislav Matveev, Jacobus W. Portegies (2020)

Kybernetika

In a series of articles, we have been developing a theory of tropical diagrams of probability spaces, expecting it to be useful for information optimization problems in information theory and artificial intelligence. In this article, we give a summary of our work so far and apply the theory to derive a dimension-reduction statement about the shape of the entropic cone.

Un contraste de normalidad basado en la energía informacional.

M.ª del Carmen Pardo (1993)

Qüestiió

This paper presents a normality test based on Informational Energy, constructed in parallel with the test Vasicek (1976) derived from Shannon entropy. The power of the test is estimated for several alternatives and compared with that of other normality tests. The results show that in some cases this test is preferable to certain classical normality tests.
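
For context, the entropy-based approach this paper parallels works from spacings of the order statistics: Vasicek's estimator of the Shannon entropy of a sample is H = (1/n) Σᵢ log( n/(2m) (X₍ᵢ₊ₘ₎ − X₍ᵢ₋ₘ₎) ), with indices clipped to [1, n]. A sketch of that estimator under the usual definitions (not the paper's own informational-energy statistic):

```python
import math, random

def vasicek_entropy(sample, m):
    """Vasicek (1976) spacing estimator of Shannon entropy (in nats):
    H = (1/n) * sum_i log( n/(2m) * (X_(i+m) - X_(i-m)) ),
    with order-statistic indices clipped to the sample range."""
    xs = sorted(sample)
    n = len(xs)
    total = 0.0
    for i in range(n):
        upper = xs[min(i + m, n - 1)]
        lower = xs[max(i - m, 0)]
        total += math.log(n / (2 * m) * (upper - lower))
    return total / n

random.seed(0)
data = [random.gauss(0, 1) for _ in range(2000)]
h = vasicek_entropy(data, m=20)
# For N(0,1) the true entropy is 0.5*log(2*pi*e) ≈ 1.4189 nats;
# the estimate should be close for a sample this large.
```

Vasicek's normality test rejects when this estimate falls too far below the Gaussian maximum; the paper's idea is to build the analogous test from Onicescu's informational energy instead.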

Una medida de incertidumbre probabilística para sucesos difusos.

María Teresa López García, Pedro Gil Alvarez (1986)

Trabajos de Estadística

The aim of this article is to propose an uncertainty measure, associated with a fuzzy set on a finite universe, that generalises Shannon entropy; that is, a measure which takes into account not only the probability distribution defined on the universe but also the membership function of the fuzzy set. Some properties of the proposed measure are then studied.
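
One classical way to combine a probability distribution with a membership function, close in spirit to what the abstract describes, is Zadeh's (1968) entropy of a fuzzy event, H(A) = −Σᵢ μ_A(xᵢ) pᵢ log₂ pᵢ, which reduces to Shannon entropy when μ_A ≡ 1. This is shown purely as an illustration; the measure proposed in the paper may differ.

```python
import math

def zadeh_fuzzy_entropy(p, mu):
    """Zadeh's (1968) entropy of a fuzzy event:
    H(A) = -sum_i mu_A(x_i) * p_i * log2(p_i).
    With mu_A ≡ 1 this is exactly the Shannon entropy of p."""
    assert abs(sum(p) - 1.0) < 1e-9
    return -sum(m * pi * math.log2(pi) for m, pi in zip(mu, p) if pi > 0)

p = [0.5, 0.25, 0.25]
crisp = zadeh_fuzzy_entropy(p, [1.0, 1.0, 1.0])  # Shannon entropy: 1.5 bits
fuzzy = zadeh_fuzzy_entropy(p, [1.0, 0.5, 0.0])  # membership discounts terms
```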

Une condition asymptotique pour le calcul de constantes de Sobolev logarithmiques sur la droite

Laurent Miclo (2009)

Annales de l'I.H.P. Probabilités et statistiques

We present an explicit formula for the logarithmic Sobolev constant of real diffusions or of integer-valued birth-and-death processes, under the assumption that certain quantities, naturally associated with Hardy inequalities in this context, approach their supremum at the boundary of their domain of definition. The proof reduces to the case of the Poincaré constant, via exact comparisons between entropy and appropriate variances.

Universally typical sets for ergodic sources of multidimensional data

Tyll Krüger, Guido F. Montúfar, Ruedi Seiler, Rainer Siegmund-Schultze (2013)

Kybernetika

We lift important results about universally typical sets, typically sampled sets, and empirical entropy estimation in the theory of samplings of discrete ergodic information sources from the usual one-dimensional discrete-time setting to a multidimensional lattice setting. We use techniques of packings and coverings with multidimensional windows to construct sequences of multidimensional array sets which in the limit build the generated samples of any ergodic source of entropy rate below an h₀ with...
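
As a small illustration of empirical entropy estimation from samples (a one-dimensional plug-in sketch, far simpler than the universal window constructions of the paper): estimate the entropy rate of a stationary source by the Shannon entropy of the empirical distribution of overlapping k-blocks, divided by k.

```python
import math, random
from collections import Counter

def empirical_block_entropy(seq, k):
    """Plug-in estimate of the entropy rate (bits per symbol): Shannon
    entropy of the empirical distribution of overlapping k-blocks of seq,
    divided by the block length k."""
    counts = Counter(tuple(seq[i:i + k]) for i in range(len(seq) - k + 1))
    n = sum(counts.values())
    h = -sum(c / n * math.log2(c / n) for c in counts.values())
    return h / k

random.seed(1)
# i.i.d. fair bits: the true entropy rate is 1 bit per symbol.
bits = [random.randrange(2) for _ in range(100000)]
rate = empirical_block_entropy(bits, k=5)
```

For an ergodic source this converges to the entropy rate as k grows slowly with the sample size; the paper's universal constructions achieve this uniformly over all sources of entropy rate below a threshold.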

Velocity and Entropy of Motion in Periodic Potentials

Andreas Knauf (1996/1997)

Séminaire Équations aux dérivées partielles

This is a report on recent joint work with J. Asch, and with T. Hudetz and F. Benatti. We consider classical, quantum and semiclassical motion in periodic potentials and prove various results on the distribution of asymptotic velocities. The Kolmogorov-Sinai entropy and its quantum generalization, the Connes-Narnhofer-Thirring entropy, of the single particle and of a gas of noninteracting particles are related.
