Neuromorphic features of probabilistic neural networks

Jiří Grim

Kybernetika (2007)

  • Volume: 43, Issue: 5, pages 697–712
  • ISSN: 0023-5954

Abstract

We summarize the main results on probabilistic neural networks recently published in a series of papers. Within the framework of statistical pattern recognition, we approximate the class-conditional distributions by finite mixtures of product components. The probabilistic neurons correspond to mixture components and can be interpreted in neurophysiological terms. In this way we can identify a possible theoretical background for the functional properties of neurons. For example, the general formula for synaptic weights provides a statistical justification of the well-known Hebbian principle of learning. Similarly, the mean effect of lateral inhibition can be expressed by means of a formula proposed by Perez as a measure of the dependence tightness of the involved variables.
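The core construction the abstract describes — one finite mixture of product components per class, fitted by EM, with each component playing the role of a probabilistic neuron — can be sketched as follows. This is a minimal illustration, not the paper's method in full: the Bernoulli components, the two synthetic classes, and all parameter values are assumptions chosen for a runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)

def em_bernoulli_mixture(X, n_components, n_iter=50):
    """Fit a finite mixture of product (Bernoulli) components by EM.

    Component m has weight w[m] and product distribution
    P(x | m) = prod_n theta[m, n]**x_n * (1 - theta[m, n])**(1 - x_n),
    so each component is a "probabilistic neuron" over binary inputs.
    """
    N, D = X.shape
    w = np.full(n_components, 1.0 / n_components)
    theta = rng.uniform(0.25, 0.75, size=(n_components, D))
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component,
        # i.e. the normalized output of each probabilistic neuron
        log_p = (X @ np.log(theta).T + (1 - X) @ np.log(1 - theta).T
                 + np.log(w))
        log_p -= log_p.max(axis=1, keepdims=True)
        q = np.exp(log_p)
        q /= q.sum(axis=1, keepdims=True)
        # M-step: re-estimate component weights and parameters
        nk = q.sum(axis=0)
        w = nk / N
        theta = np.clip((q.T @ X) / nk[:, None], 1e-3, 1 - 1e-3)
    return w, theta

def log_likelihood(X, w, theta):
    """Per-sample log-likelihood under the fitted mixture (log-sum-exp)."""
    log_p = (X @ np.log(theta).T + (1 - X) @ np.log(1 - theta).T
             + np.log(w))
    m = log_p.max(axis=1, keepdims=True)
    return m.squeeze(1) + np.log(np.exp(log_p - m).sum(axis=1))

# Two well-separated synthetic classes of binary patterns;
# fit one class-conditional mixture per class
X0 = (rng.random((200, 10)) < 0.2).astype(float)
X1 = (rng.random((200, 10)) < 0.8).astype(float)
model = {c: em_bernoulli_mixture(X, 2) for c, X in {0: X0, 1: X1}.items()}

def classify(X):
    """Bayes decision: pick the class with the higher mixture likelihood."""
    scores = np.column_stack([log_likelihood(X, *model[c]) for c in (0, 1)])
    return scores.argmax(axis=1)

print((classify(X0) == 0).mean(), (classify(X1) == 1).mean())
```

The E-step responsibilities are what give the network interpretation: each component's posterior weight on an input pattern is the neuron's response, and the normalization across components plays the role of lateral inhibition discussed in the abstract.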

How to cite


Grim, Jiří. "Neuromorphic features of probabilistic neural networks." Kybernetika 43.5 (2007): 697-712. <http://eudml.org/doc/33889>.

@article{Grim2007,
abstract = {We summarize the main results on probabilistic neural networks recently published in a series of papers. Within the framework of statistical pattern recognition, we approximate the class-conditional distributions by finite mixtures of product components. The probabilistic neurons correspond to mixture components and can be interpreted in neurophysiological terms. In this way we can identify a possible theoretical background for the functional properties of neurons. For example, the general formula for synaptic weights provides a statistical justification of the well-known Hebbian principle of learning. Similarly, the mean effect of lateral inhibition can be expressed by means of a formula proposed by Perez as a measure of the dependence tightness of the involved variables.},
author = {Grim, Jiří},
journal = {Kybernetika},
keywords = {probabilistic neural networks; distribution mixtures; sequential EM algorithm; pattern recognition},
language = {eng},
number = {5},
pages = {697-712},
publisher = {Institute of Information Theory and Automation AS CR},
title = {Neuromorphic features of probabilistic neural networks},
url = {http://eudml.org/doc/33889},
volume = {43},
year = {2007},
}

TY - JOUR
AU - Grim, Jiří
TI - Neuromorphic features of probabilistic neural networks
JO - Kybernetika
PY - 2007
PB - Institute of Information Theory and Automation AS CR
VL - 43
IS - 5
SP - 697
EP - 712
AB - We summarize the main results on probabilistic neural networks recently published in a series of papers. Within the framework of statistical pattern recognition, we approximate the class-conditional distributions by finite mixtures of product components. The probabilistic neurons correspond to mixture components and can be interpreted in neurophysiological terms. In this way we can identify a possible theoretical background for the functional properties of neurons. For example, the general formula for synaptic weights provides a statistical justification of the well-known Hebbian principle of learning. Similarly, the mean effect of lateral inhibition can be expressed by means of a formula proposed by Perez as a measure of the dependence tightness of the involved variables.
LA - eng
KW - probabilistic neural networks; distribution mixtures; sequential EM algorithm; pattern recognition
UR - http://eudml.org/doc/33889
ER -

References

  1. Bialasiewicz J., Statistical data reduction via construction of sample space partitions, Kybernetika 6 (1970), 6, 371–379 (1970) Zbl0218.94005MR0283910
  2. Dempster A. P., Laird N. M., Rubin D. B., Maximum likelihood from incomplete data via the EM algorithm, J. Royal Statist. Soc. B 39 (1977), 1–38 (1977) Zbl0364.62022MR0501537
  3. Grim J., On numerical evaluation of maximum-likelihood estimates for finite mixtures of distributions, Kybernetika 18 (1982), 3, 173–190 (1982) Zbl0489.62028MR0680154
  4. Grim J., Design and optimization of multilevel homogeneous structures for multivariate pattern recognition, In: Fourth FORMATOR Symposium 1982, Academia, Prague 1982, pp. 233–240 (1982) MR0726960
  5. Grim J., Multivariate statistical pattern recognition with non-reduced dimensionality, Kybernetika 22 (1986), 6, 142–157 (1986) 
  6. Grim J., Maximum-likelihood design of layered neural networks, In: Proc. Internat. Conference Pattern Recognition. IEEE Computer Society Press, Los Alamitos 1996, pp. 85–89 (1996) 
  7. Grim J., Design of multilayer neural networks by information preserving transforms, In: Third European Congress on Systems Science (E. Pessa, M. P. Penna, and A. Montesanto, eds.). Edizioni Kappa, Roma 1996, pp. 977–982 (1996) 
  8. Grim J., Information approach to structural optimization of probabilistic neural networks, In: Fourth European Congress on Systems Science (L. Ferrer and A. Caselles, eds.). SESGE, Valencia 1999, pp. 527–539 (1999) 
  9. Grim J., Discretization of probabilistic neural networks with bounded information loss, In: Computer–Intensive Methods in Control and Data Processing. (Preprints of the 3rd European IEEE Workshop CMP’98, Prague 1998, J. Rojicek et al., eds.), ÚTIA AV ČR, Prague 1998, pp. 205–210 (1998) 
  10. Grim J., A sequential modification of EM algorithm, In: Proc. Classification in the Information Age (W. Gaul and H. Locarek-Junge, eds., Studies in Classification, Data Analysis, and Knowledge Organization), Springer, Berlin 1999, pp. 163–170 (1999) 
  11. Grim J., Self-organizing maps and probabilistic neural networks, Neural Network World 10 (2000), 3, 407–415 
  12. Grim J., Probabilistic Neural Networks (in Czech), In: Umělá inteligence IV. (V. Mařík, O. Štěpánková, and J. Lažanský, eds.), Academia, Praha 2003, pp. 276–312 
  13. Grim J., Just, P., Pudil P., Strictly modular probabilistic neural networks for pattern recognition, Neural Network World 13 (2003), 6, 599–615 
  14. Grim J., Kittler J., Pudil, P., Somol P., Combining multiple classifiers in probabilistic neural networks, In: Multiple Classifier Systems (Lecture Notes in Computer Science 1857, J. Kittler and F. Roli, eds.). Springer, Berlin 2000, pp. 157–166 
  15. Grim J., Kittler J., Pudil, P., Somol P., Information analysis of multiple classifier fusion, In: Multiple Classifier Systems 2001 (Lecture Notes in Computer Science 2096, J. Kittler and F. Roli, eds.). Springer, Berlin – New York 2001, pp. 168–177 Zbl0987.68898MR2043268
  16. Grim J., Kittler J., Pudil, P., Somol P., Multiple classifier fusion in probabilistic neural networks, Pattern Analysis & Applications 5 (2002), 7, 221–233 Zbl1021.68079MR1930448
  17. Grim J., Pudil, P., Somol P., Recognition of handwritten numerals by structural probabilistic neural networks, In: Proc. Second ICSC Symposium on Neural Computation (H. Bothe and R. Rojas, eds.). ICSC, Wetaskiwin 2000, pp. 528–534 
  18. Grim J., Pudil, P., Somol P., Boosting in probabilistic neural networks, In: Proc. 16th International Conference on Pattern Recognition (R. Kasturi, D. Laurendeau and C. Suen, eds.). IEEE Computer Society, Los Alamitos 2002, pp. 136–139 
  19. Grim J., Somol P., Pudil, P., Just P., Probabilistic neural network playing a simple game, In: Artificial Neural Networks in Pattern Recognition (S. Marinai and M. Gori, eds.). University of Florence, Florence 2003, pp. 132–138 
  20. Grim J., Somol, P., Pudil P., Probabilistic neural network playing and learning Tic-Tac-Toe, Pattern Recognition Letters, Special Issue 26 (2005), 12, 1866–1873 
  21. Haykin S., Neural Networks: A Comprehensive Foundation, Morgan Kaufman, San Mateo 1993 Zbl0934.68076
  22. McLachlan G. J., Peel D., Finite Mixture Models, Wiley, New York – Toronto 2000 Zbl0963.62061MR1789474
  23. Perez A., Information, ε -sufficiency and data reduction problems, Kybernetika 1 (1965), 4, 297–323 (1965) MR0205410
  24. Perez A., ε -admissible simplification of the dependence structure of a set of random variables, Kybernetika 13 (1977), 6, 439–449 (1977) MR0472224
  25. Schlesinger M. I., Relation between learning and self-learning in pattern recognition (in Russian), Kibernetika (1968), 6, 81–88 (1968) 
  26. Specht D. F., Probabilistic neural networks for classification, mapping or associative memory, In: Proc. IEEE Internat. Conference on Neural Networks 1988, Vol. I, pp. 525–532 (1988) 
  27. Streit L. R., Luginbuhl T. E., Maximum-likelihood training of probabilistic neural networks, IEEE Trans. Neural Networks 5 (1994), 764–783 (1994) 
  28. Vajda I., Grim J., About the maximum information and maximum likelihood principles in neural networks, Kybernetika 34 (1998), 4, 485–494 (1998) MR0359208
  29. Watanabe S., Fukumizu K., Probabilistic design of layered neural networks based on their unified framework, IEEE Trans. Neural Networks 6 (1995), 3, 691–702 (1995) 
  30. Xu L., Jordan M. I., On convergence properties of the EM algorithm for Gaussian mixtures, Neural Computation 8 (1996), 129–151 (1996) 
