Displaying 1 – 4 of 4

Neural network realizations of Bayes decision rules for exponentially distributed data

Igor Vajda, Belomír Lonek, Viktor Nikolov, Arnošt Veselý (1998)

Kybernetika

We consider perceptron approximations of general Bayes decision rules based on sufficient-statistic inputs. Particular attention is paid to Bayes discrimination and classification. For exponentially distributed data with a known model, it is shown that a perceptron with one hidden layer is sufficient and that learning is restricted to the synaptic weights of the output neuron. If only the dimension of the exponential model is known, then the number of hidden layers will increase...
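The single-neuron claim can be illustrated for the simplest case of two exponential classes: the log posterior ratio is linear in the sufficient statistic t = Σx_i, so the Bayes rule is a threshold unit on t. A minimal sketch (the rates `lam0`, `lam1` and the sample sizes are assumed for illustration, not taken from the paper):

```python
import math
import random

def bayes_neuron(x, lam0, lam1, prior1=0.5):
    """Bayes rule for n i.i.d. Exp(lam_k) observations as one neuron:
    the sufficient statistic t = sum(x) feeds a linear threshold unit."""
    n = len(x)
    t = sum(x)                       # sufficient statistic
    w = lam0 - lam1                  # synaptic weight on t
    b = n * math.log(lam1 / lam0) + math.log(prior1 / (1 - prior1))
    return 1 if w * t + b > 0 else 0

def bayes_direct(x, lam0, lam1, prior1=0.5):
    """Reference implementation: compare the two log posteriors directly."""
    def loglik(lam):
        return sum(math.log(lam) - lam * xi for xi in x)
    score1 = loglik(lam1) + math.log(prior1)
    score0 = loglik(lam0) + math.log(1 - prior1)
    return 1 if score1 > score0 else 0

random.seed(0)
lam0, lam1 = 1.0, 3.0
agree = all(
    bayes_neuron(x, lam0, lam1) == bayes_direct(x, lam0, lam1)
    for _ in range(200)
    for x in [[random.expovariate(random.choice([lam0, lam1])) for _ in range(5)]]
)
print(agree)
```

The linear-threshold form and the direct likelihood comparison agree on random samples, which is the sense in which a single neuron on the sufficient statistic realizes the Bayes rule when the model is known.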

Neural networks using Bayesian training

Gabriela Andrejková, Miroslav Levický (2003)

Kybernetika

Bayesian probability theory provides a framework for data modeling in which it is possible to find models that are well matched to the data and to use them to make nearly optimal predictions. In connection with neural networks, and especially with neural network learning, the theory is interpreted as inference of the most probable parameters of the model given the training data. This article describes an application of neural networks with Bayesian training to the problem...
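"Inference of the most probable parameters" is MAP estimation: with a Gaussian prior on the weights, maximizing the posterior equals minimizing squared error plus weight decay. A minimal sketch for a linear model (the data, the prior strength `alpha`, and the learning rate are assumed for illustration), comparing the closed-form MAP solution with the gradient descent a network would run:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=50)

# Gaussian prior on weights -> MAP = ridge regression closed form.
alpha = 1.0  # assumed prior-precision / noise-precision ratio
w_map = np.linalg.solve(X.T @ X + alpha * np.eye(3), X.T @ y)

# Gradient-descent view: the same penalized objective a neural
# network minimizes during Bayesian (MAP) training.
w_gd = np.zeros(3)
for _ in range(5000):
    grad = X.T @ (X @ w_gd - y) + alpha * w_gd
    w_gd -= 0.001 * grad

print(np.allclose(w_map, w_gd, atol=1e-4))
```

For a nonlinear network the closed form disappears, but the objective keeps the same structure: data misfit plus a prior-induced penalty on the weights.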

Neuromorphic features of probabilistic neural networks

Jiří Grim (2007)

Kybernetika

We summarize the main results on probabilistic neural networks recently published in a series of papers. Within the framework of statistical pattern recognition, we approximate the class-conditional distributions by finite mixtures of product components. The probabilistic neurons correspond to mixture components and can be interpreted in neurophysiological terms. In this way we can identify a possible theoretical background for the functional properties of neurons. For example, the general...
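The mixture-of-products construction can be sketched concretely: each "probabilistic neuron" evaluates the log of a product density, i.e. a sum of per-dimension log densities, and classification follows the Bayes rule over the class mixtures. A minimal sketch with Gaussian product components (the component means, variances, and weights are assumed for illustration):

```python
import numpy as np

def log_component(x, mean, var):
    # Product of univariate Gaussians -> sum of per-dimension log
    # densities: the computation of one "probabilistic neuron".
    return np.sum(-0.5 * np.log(2 * np.pi * var) - (x - mean) ** 2 / (2 * var))

def log_mixture(x, weights, means, variances):
    # Finite mixture of product components for one class.
    comps = [np.log(w) + log_component(x, m, v)
             for w, m, v in zip(weights, means, variances)]
    top = max(comps)  # log-sum-exp for numerical stability
    return top + np.log(sum(np.exp(c - top) for c in comps))

def classify(x, classes, priors):
    # Bayes rule: pick the class maximizing prior * mixture density.
    scores = [np.log(p) + log_mixture(x, *params)
              for p, params in zip(priors, classes)]
    return int(np.argmax(scores))

# Hypothetical two-class model, two product components per class.
class0 = ([0.5, 0.5], [np.zeros(2), np.ones(2)], [np.ones(2)] * 2)
class1 = ([0.5, 0.5], [np.full(2, 4.0), np.full(2, 5.0)], [np.ones(2)] * 2)
print(classify(np.array([0.2, -0.1]), [class0, class1], [0.5, 0.5]))
```

The product form is what makes each component local and interpretable: its response decomposes into independent per-input contributions, which is the property the neurophysiological reading rests on.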
