The irrelevant information principle for collective probabilistic reasoning
Martin Adamčík; George Wilmers
Kybernetika (2014)
- Volume: 50, Issue: 2, pages 175-188
- ISSN: 0023-5954
How to cite
Adamčík, Martin, and Wilmers, George. "The irrelevant information principle for collective probabilistic reasoning." Kybernetika 50.2 (2014): 175-188. <http://eudml.org/doc/261847>.
@article{Adamčík2014,
abstract = {Within the framework of discrete probabilistic uncertain reasoning a large literature exists justifying the maximum entropy inference process, $\operatorname{\mathbf{ME}}$, as being optimal in the context of a single agent whose subjective probabilistic knowledge base is consistent. In particular Paris and Vencovská completely characterised the $\operatorname{\mathbf{ME}}$ inference process by means of an attractive set of axioms which an inference process should satisfy. More recently the second author extended the Paris-Vencovská axiomatic approach to inference processes in the context of several agents whose subjective probabilistic knowledge bases, while individually consistent, may be collectively inconsistent. In particular he defined a natural multi-agent extension of the inference process $\operatorname{\mathbf{ME}}$ called the social entropy process, $\operatorname{\mathbf{SEP}}$. However, while $\operatorname{\mathbf{SEP}}$ has been shown to possess many attractive properties, those which are known are almost certainly insufficient to uniquely characterise it. It is therefore of particular interest to study those Paris-Vencovská principles valid for $\operatorname{\mathbf{ME}}$ whose immediate generalisations to the multi-agent case are not satisfied by $\operatorname{\mathbf{SEP}}$. One of these principles is the Irrelevant Information Principle, a powerful and appealing principle which very few inference processes satisfy even in the single agent context. In this paper we will investigate whether $\operatorname{\mathbf{SEP}}$ can satisfy an interesting modified generalisation of this principle.},
author = {Adamčík, Martin and Wilmers, George},
journal = {Kybernetika},
keywords = {uncertain reasoning; discrete probability function; social inference process; maximum entropy; Kullback-Leibler; irrelevant information principle},
language = {eng},
number = {2},
pages = {175-188},
publisher = {Institute of Information Theory and Automation AS CR},
title = {The irrelevant information principle for collective probabilistic reasoning},
url = {http://eudml.org/doc/261847},
volume = {50},
year = {2014},
}
TY - JOUR
AU - Adamčík, Martin
AU - Wilmers, George
TI - The irrelevant information principle for collective probabilistic reasoning
JO - Kybernetika
PY - 2014
PB - Institute of Information Theory and Automation AS CR
VL - 50
IS - 2
SP - 175
EP - 188
AB - Within the framework of discrete probabilistic uncertain reasoning a large literature exists justifying the maximum entropy inference process, $\operatorname{\mathbf {ME}}$, as being optimal in the context of a single agent whose subjective probabilistic knowledge base is consistent. In particular Paris and Vencovská completely characterised the $\operatorname{\mathbf {ME}}$ inference process by means of an attractive set of axioms which an inference process should satisfy. More recently the second author extended the Paris-Vencovská axiomatic approach to inference processes in the context of several agents whose subjective probabilistic knowledge bases, while individually consistent, may be collectively inconsistent. In particular he defined a natural multi-agent extension of the inference process $\operatorname{\mathbf {ME}}$ called the social entropy process, $\operatorname{\mathbf {SEP}}$. However, while $\operatorname{\mathbf {SEP}}$ has been shown to possess many attractive properties, those which are known are almost certainly insufficient to uniquely characterise it. It is therefore of particular interest to study those Paris-Vencovská principles valid for $\operatorname{\mathbf {ME}}$ whose immediate generalisations to the multi-agent case are not satisfied by $\operatorname{\mathbf {SEP}}$. One of these principles is the Irrelevant Information Principle, a powerful and appealing principle which very few inference processes satisfy even in the single agent context. In this paper we will investigate whether $\operatorname{\mathbf {SEP}}$ can satisfy an interesting modified generalisation of this principle.
LA - eng
KW - uncertain reasoning; discrete probability function; social inference process; maximum entropy; Kullback-Leibler; irrelevant information principle
UR - http://eudml.org/doc/261847
ER -
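
The record above centres on the maximum entropy inference process $\operatorname{\mathbf{ME}}$: among all probability functions on a finite set of atoms that satisfy an agent's linear constraints, ME selects the one of greatest Shannon entropy. The sketch below is a minimal single-agent illustration only and is not code from the paper; the three-atom toy knowledge base, the constraint p1 + p2 = 0.8, and the use of NumPy/SciPy are assumptions made for this example.

# Illustrative sketch only -- not taken from the paper.
# ME picks, from all probability functions satisfying the agent's linear
# constraints, the one of greatest Shannon entropy.
import numpy as np
from scipy.optimize import minimize

def neg_entropy(p):
    # Negative Shannon entropy, with the convention 0 * log 0 = 0.
    p = np.clip(p, 1e-12, 1.0)
    return float(np.sum(p * np.log(p)))

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},    # p must be a probability function
    {"type": "eq", "fun": lambda p: p[0] + p[1] - 0.8},  # hypothetical agent's knowledge base
]
result = minimize(neg_entropy, x0=np.full(3, 1.0 / 3.0),
                  bounds=[(0.0, 1.0)] * 3, constraints=constraints)
print(np.round(result.x, 3))  # approx. [0.4 0.4 0.2]: mass spread as evenly as the constraint allows
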
References
- Adamčík, M., Wilmers, G. M., Probabilistic merging operators. Logique et Analyse (2013), to appear.
- Carnap, R., On the application of inductive logic. Philosophy and Phenomenological Research 8 (1947), 133-148. MR 0023216, DOI 10.2307/2102920
- French, S., Group consensus probability distributions: A critical survey. In: Bayesian Statistics (J. M. Bernardo, M. H. De Groot, D. V. Lindley, and A. F. M. Smith, eds.), Elsevier, North Holland 1985, pp. 183-201. Zbl 0671.62010, MR 0862490
- Hardy, G. H., Littlewood, J. E., Pólya, G., Inequalities. Cambridge University Press, 1934. Zbl 0634.26008
- Hawes, P., An Investigation of Properties of Some Inference Processes. Ph.D. Thesis, The University of Manchester, Manchester 2007.
- Jaynes, E. T., Where do we stand on maximum entropy? In: The Maximum Entropy Formalism (R. D. Levine and M. Tribus, eds.), M.I.T. Press, Cambridge 1979. MR 0521743
- Kern-Isberner, G., Rödder, W., Belief revision and information fusion on optimum entropy. Internat. J. of Intelligent Systems 19 (2004), 837-857. Zbl 1101.68944, DOI 10.1002/int.20027
- Kracík, J., Cooperation Methods in Bayesian Decision Making with Multiple Participants. Ph.D. Thesis, Czech Technical University, Prague 2009.
- Matúš, F., On Iterated Averages of I-projections. Universität Bielefeld, Germany 2007.
- Osherson, D., Vardi, M., Aggregating disparate estimates of chance. Games and Economic Behavior 56 (2006), 1, 148-173. Zbl 1127.62129, MR 2235941, DOI 10.1016/j.geb.2006.04.001
- Paris, J. B., The Uncertain Reasoner's Companion. Cambridge University Press, Cambridge 1994. Zbl 0838.68104, MR 1314199
- Paris, J. B., Vencovská, A., On the applicability of maximum entropy to inexact reasoning. Internat. J. of Approximate Reasoning 3 (1989), 1-34. Zbl 0665.68079, MR 0975613, DOI 10.1016/0888-613X(89)90012-1
- Paris, J. B., Vencovská, A., A note on the inevitability of maximum entropy. Internat. J. of Approximate Reasoning 4 (1990), 183-224. MR 1051032, DOI 10.1016/0888-613X(90)90020-3
- Predd, J. B., Osherson, D. N., Kulkarni, S. R., Poor, H. V., Aggregating probabilistic forecasts from incoherent and abstaining experts. Decision Analysis 5 (2008), 4, 177-189. DOI 10.1287/deca.1080.0119
- Shore, J. E., Johnson, R. W., Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy. IEEE Trans. Inform. Theory 26 (1980), 1, 26-37. Zbl 0532.94004, MR 0560389, DOI 10.1109/TIT.1980.1056144
- Vomlel, J., Methods of Probabilistic Knowledge Integration. Ph.D. Thesis, Czech Technical University, Prague 1999.
- Wilmers, G. M., The social entropy process: Axiomatising the aggregation of probabilistic beliefs. In: Probability, Uncertainty and Rationality (H. Hosni and F. Montagna, eds.), CRM Series 10, Scuola Normale Superiore, Pisa 2010, pp. 87-104. Zbl 1206.03025, MR 2731977
- Wilmers, G. M., Generalising the Maximum Entropy Inference Process to the Aggregation of Probabilistic Beliefs. Available from http://manchester.academia.edu/GeorgeWilmers/Papers