Rank of tensors of ℓ-out-of-k functions: An application in probabilistic inference

Jiří Vomlel

Kybernetika (2011)

  • Volume: 47, Issue: 3, page 317-336
  • ISSN: 0023-5954

Abstract

Bayesian networks are a popular model for reasoning under uncertainty. We study the problem of efficient probabilistic inference with these models when some of the conditional probability tables represent deterministic or noisy ℓ-out-of-k functions. These tables appear naturally in real-world applications when we observe a state of a variable that depends on its parents via an addition or noisy addition relation. We provide a lower bound of the rank and an upper bound for the symmetric border rank of tensors representing ℓ-out-of-k functions. We propose an approximation of tensors representing noisy ℓ-out-of-k functions by a sum of r tensors of rank one, where r is an upper bound of the symmetric border rank of the approximated tensor. We applied the suggested approximation to probabilistic inference in probabilistic graphical models. Numerical experiments reveal that we can get a gain in the order of two magnitudes but at the expense of a certain loss of precision.
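To make the objects in the abstract concrete, the following pure-Python sketch builds the tensor of a deterministic ℓ-out-of-k function (assuming the "exactly ℓ of the k binary parents are active" reading) and then reconstructs it as a sum of k+1 rank-one tensors via a discrete-Fourier interpolation. The DFT construction is a standard textbook device used here only to illustrate what a sum-of-rank-one representation looks like; it is not the paper's (tighter) symmetric border rank bound, and the function names are illustrative.

```python
import cmath
from itertools import product

def ell_out_of_k_tensor(ell, k):
    """Tensor of the deterministic 'exactly ell out of k' function:
    entry is 1 when exactly ell of the k binary parents are active."""
    return {idx: 1.0 if sum(idx) == ell else 0.0
            for idx in product((0, 1), repeat=k)}

def rank_one_sum(ell, k):
    """Rebuild the same tensor as a sum of k+1 rank-one terms using
    roots of unity: delta(s == ell) = (1/(k+1)) * sum_j w^(j*(s-ell)),
    where s is the number of active parents and w = exp(2*pi*i/(k+1)).
    Each j contributes one rank-one term with factor vectors w^(j*i)."""
    omega = cmath.exp(2j * cmath.pi / (k + 1))
    tensor = {}
    for idx in product((0, 1), repeat=k):
        val = 0.0
        for j in range(k + 1):
            term = omega ** (-j * ell) / (k + 1)  # scalar weight of term j
            for i in idx:                          # product over factor vectors
                term *= omega ** (j * i)
            val += term
        tensor[idx] = val.real  # imaginary parts cancel exactly
    return tensor

exact = ell_out_of_k_tensor(2, 4)
approx = rank_one_sum(2, 4)
assert all(abs(exact[idx] - approx[idx]) < 1e-9 for idx in exact)
```

Storing the k+1 rank-one factors instead of the full 2^k table is what yields the inference speedup the abstract reports; for the noisy variant the paper approximates rather than reconstructs the tensor.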

How to cite


Vomlel, Jiří. "Rank of tensors of $\ell $-out-of-$k$ functions: An application in probabilistic inference." Kybernetika 47.3 (2011): 317-336. <http://eudml.org/doc/196828>.

@article{Vomlel2011,
abstract = {Bayesian networks are a popular model for reasoning under uncertainty. We study the problem of efficient probabilistic inference with these models when some of the conditional probability tables represent deterministic or noisy $\ell $-out-of-$k$ functions. These tables appear naturally in real-world applications when we observe a state of a variable that depends on its parents via an addition or noisy addition relation. We provide a lower bound of the rank and an upper bound for the symmetric border rank of tensors representing $\ell $-out-of-$k$ functions. We propose an approximation of tensors representing noisy $\ell $-out-of-$k$ functions by a sum of $r$ tensors of rank one, where $r$ is an upper bound of the symmetric border rank of the approximated tensor. We applied the suggested approximation to probabilistic inference in probabilistic graphical models. Numerical experiments reveal that we can get a gain in the order of two magnitudes but at the expense of a certain loss of precision.},
author = {Vomlel, Jiří},
journal = {Kybernetika},
keywords = {Bayesian networks; probabilistic inference; uncertainty in artificial intelligence; tensor rank; symmetric rank; border rank},
language = {eng},
number = {3},
pages = {317-336},
publisher = {Institute of Information Theory and Automation AS CR},
title = {Rank of tensors of $\ell $-out-of-$k$ functions: An application in probabilistic inference},
url = {http://eudml.org/doc/196828},
volume = {47},
year = {2011},
}

TY - JOUR
AU - Vomlel, Jiří
TI - Rank of tensors of $\ell $-out-of-$k$ functions: An application in probabilistic inference
JO - Kybernetika
PY - 2011
PB - Institute of Information Theory and Automation AS CR
VL - 47
IS - 3
SP - 317
EP - 336
AB - Bayesian networks are a popular model for reasoning under uncertainty. We study the problem of efficient probabilistic inference with these models when some of the conditional probability tables represent deterministic or noisy $\ell $-out-of-$k$ functions. These tables appear naturally in real-world applications when we observe a state of a variable that depends on its parents via an addition or noisy addition relation. We provide a lower bound of the rank and an upper bound for the symmetric border rank of tensors representing $\ell $-out-of-$k$ functions. We propose an approximation of tensors representing noisy $\ell $-out-of-$k$ functions by a sum of $r$ tensors of rank one, where $r$ is an upper bound of the symmetric border rank of the approximated tensor. We applied the suggested approximation to probabilistic inference in probabilistic graphical models. Numerical experiments reveal that we can get a gain in the order of two magnitudes but at the expense of a certain loss of precision.
LA - eng
KW - Bayesian networks; probabilistic inference; uncertainty in artificial intelligence; tensor rank; symmetric rank; border rank
UR - http://eudml.org/doc/196828
ER -

References

  1. Bini, D., Lotti, G., Romani, F., Approximate solutions for the bilinear form computational problem, SIAM J. Comput. 9 (1980), 692–697. (1980) Zbl0446.68035MR0592760DOI10.1137/0209053
  2. Carroll, J. D., Chang, J. J., Analysis of individual differences in multidimensional scaling via an N-way generalization of “Eckart–Young” decomposition, Psychometrika 35 (1970), 283–319. (1970) Zbl0202.19101DOI10.1007/BF02310791
  3. De Lathauwer, L., De Moor, B., From matrix to tensor: multilinear algebra and signal processing, In: 4th Internat. Conf. on Mathematics in Signal Processing, Part I, Warwick 1996, IMA Conf. Series, pp. 1–11. (1996) 
  4. Díez, F. J., Galán, S. F., Efficient computation for the noisy max, Internat. J. Intell. Systems 18 (2003), 2, 165–177. (2003) DOI10.1002/int.10080
  5. Harshman, R. A., Foundations of the PARAFAC procedure: Models and conditions for an “explanatory" multi-mode factor analysis, UCLA Working Papers in Phonetics 16 (1970), 1–84. (1970) 
  6. Jensen, F. V., Bayesian Networks and Decision Graphs, Springer-Verlag, New York 2001. (2001) Zbl0973.62005MR1876880
  7. Jensen, F. V., Lauritzen, S. L., Olesen, K. G., Bayesian updating in recursive graphical models by local computation, Computational Statistics Quarterly 4 (1990), 269–282. (1990) MR1073446
  8. Lauritzen, S. L., Spiegelhalter, D. J., Local computations with probabilities on graphical structures and their application to expert systems (with discussion), J. Roy. Statist. Soc., Ser. B 50 (1988), 157–224. (1988) MR0964177
  9. Madsen, A. L., Jensen, F. V., Lazy propagation: A junction tree inference algorithm based on lazy evaluation, Artificial Intelligence 113 (1999), 1–2, 203–245. (1999) Zbl0939.68848MR1724104DOI10.1016/S0004-3702(99)00062-4
  10. Olmsted, S. M., On Representing and Solving Decision Problems, Ph.D. Thesis, Stanford University 1983. (1983) 
  11. R Development Core Team, R: A Language and Environment for Statistical Computing, R Foundation for Statistical Computing, Vienna 2008. (2008) 
  12. Rose, D. J., A graph-theoretic study of the numerical solution of sparse positive definite systems of linear equations, Graph Theory Comput. (1972), 183–217. (1972) Zbl0266.65028MR0341833
  13. Savický, P., Vomlel, J., Exploiting tensor rank-one decomposition in probabilistic inference, Kybernetika 43 (2007), 5, 747–764. (2007) Zbl1148.68539MR2376335
  14. Savický, P., Vomlel, J., Triangulation heuristics for BN2O networks, In: Tenth European Conference on Symbolic and Quantitative Approaches to Reasoning with Uncertainty, ECSQARU 2009, Lecture Notes in Comput. Sci. 5590, Springer, Berlin – Heidelberg 2009, pp. 566–577. (2009) Zbl1245.62019MR2893316
  15. Shafer, G. R., Shenoy, P. P., Probability propagation, Ann. Math. Artif. Intell. 2 (1990), 1–4, 327–351. (1990) Zbl0875.68676MR1215075DOI10.1007/BF01531015
  16. Vomlel, J., Exploiting functional dependence in Bayesian network inference, In: Proc. 18th Conference on Uncertainty in AI (UAI), Morgan Kaufmann Publ. 2002, pp. 528–535. (2002) 
  17. Vomlel, J., Savický, P., Arithmetic circuits of the noisy-or models, In: Proc. Fourth European Workshop on Probabilistic Graphical Models (PGM’08), Hirtshals 2008, pp. 297–304. (2008) 
  18. Wiegerinck, W., Kappen, B., Burgers, W., Bayesian Networks for Expert Systems: Theory and Practical Applications, In: Interactive Collaborative Information Systems – Studies in Computational Intelligence (R. Babuška and F. C. A. Groen, eds.), Springer-Verlag, Berlin – Heidelberg 2010, pp. 547–578. (2010) 
  19. Wikipedia: Minesweeper (Computer game), http://en.wikipedia.org/wiki/Minesweeper_(computer_game). 
