
A note on how Rényi entropy can create a spectrum of probabilistic merging operators

Martin Adamčík — 2019

Kybernetika

In this paper we present a result that relates merging of closed convex sets of discrete probability functions respectively by the squared Euclidean distance and the Kullback-Leibler divergence, using an inspiration from the Rényi entropy. While selecting the probability function with the highest Shannon entropy appears to be a convincingly justified way of representing a closed convex set of probability functions, the discussion on how to represent several closed convex sets of probability functions...
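The "spectrum" alluded to here comes from varying the Rényi parameter α, which interpolates between different entropies of the same distribution, with Shannon entropy recovered in the limit α → 1. A minimal sketch of that idea (the function name and example distribution are illustrative, not taken from the paper):

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha), in nats.

    Varying alpha traces out a spectrum of entropies for one distribution;
    the limit alpha -> 1 is the Shannon entropy, handled here as a special case.
    """
    if abs(alpha - 1.0) < 1e-12:
        # Shannon entropy: -sum p_i log p_i (terms with p_i = 0 contribute 0)
        return -sum(x * math.log(x) for x in p if x > 0)
    return math.log(sum(x ** alpha for x in p)) / (1.0 - alpha)

p = [0.5, 0.25, 0.25]
for a in (0.5, 1.0, 2.0):
    print(f"H_{a}(p) = {renyi_entropy(p, a):.4f}")
```

For a fixed distribution, H_α is non-increasing in α, and for the uniform distribution every α gives the same value, log n.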

The irrelevant information principle for collective probabilistic reasoning

Martin Adamčík, George Wilmers — 2014

Kybernetika

Within the framework of discrete probabilistic uncertain reasoning a large literature exists justifying the maximum entropy inference process, ME, as being optimal in the context of a single agent whose subjective probabilistic knowledge base is consistent. In particular Paris and Vencovská completely characterised the ME inference process by means of an attractive set of axioms which an inference process should satisfy. More recently the second author extended the Paris-Vencovská axiomatic approach...
