Displaying similar documents to “Neural network realizations of Bayes decision rules for exponentially distributed data”

Neural networks using Bayesian training

Gabriela Andrejková, Miroslav Levický (2003)

Kybernetika

Bayesian probability theory provides a framework for data modeling. In this framework it is possible to find models that are well matched to the data, and to use these models to make nearly optimal predictions. In connection with neural networks, and especially with neural network learning, the theory is interpreted as an inference of the most probable parameters for the model given the training data. This article describes an application of neural networks using Bayesian training...
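
The inference of "the most probable parameters given the training data" can be illustrated by its simplest instance, which is not taken from the paper: for a linear model with a Gaussian likelihood and a Gaussian prior over the weights, the maximum a posteriori (MAP) estimate reduces to ridge regression. A minimal sketch, with illustrative values for the noise variance and prior precision:

```python
import numpy as np

# MAP estimate for a linear model y = X w + noise:
# a Gaussian likelihood (variance sigma2) and a Gaussian prior N(0, alpha^-1 I)
# give the ridge solution w = (X^T X + lam I)^-1 X^T y, with lam = sigma2 * alpha.
def map_weights(X, y, sigma2=0.1, alpha=1.0):
    lam = sigma2 * alpha
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=100)
w_map = map_weights(X, y)   # close to w_true, slightly shrunk by the prior
```

With enough data the likelihood dominates and the MAP estimate approaches the maximum-likelihood one; the prior acts as weight decay.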

Evolutionary learning of rich neural networks in the Bayesian model selection framework

Matteo Matteucci, Dario Spadoni (2004)

International Journal of Applied Mathematics and Computer Science

In this paper we focus on the problem of using a genetic algorithm for model selection within a Bayesian framework. We propose to reduce the model selection problem to a search problem solved using evolutionary computation to explore a posterior distribution over the model space. As a case study, we introduce ELeaRNT (Evolutionary Learning of Rich Neural Network Topologies), a genetic algorithm which evolves a particular class of models, namely, Rich Neural Networks (RNN), in order to...
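The idea of treating model selection as an evolutionary search over a model space scored by a Bayesian criterion can be sketched generically; the toy below is not ELeaRNT but a minimal stand-in: a genetic algorithm over binary feature masks, scored by BIC of a least-squares fit.

```python
import numpy as np

rng = np.random.default_rng(1)

# BIC of a least-squares fit restricted to the features selected by `mask`
# (a crude proxy for a Bayesian posterior over model structures).
def bic(mask, X, y):
    k = int(mask.sum())
    if k == 0:
        resid = y
    else:
        Xs = X[:, mask.astype(bool)]
        w, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        resid = y - Xs @ w
    n = len(y)
    return n * np.log(resid @ resid / n + 1e-12) + k * np.log(n)

def evolve(X, y, pop=20, gens=30):
    d = X.shape[1]
    P = rng.integers(0, 2, size=(pop, d))           # random initial masks
    for _ in range(gens):
        scores = np.array([bic(m, X, y) for m in P])
        P = P[np.argsort(scores)]                   # lower BIC is better
        children = P[:pop // 2].copy()
        flips = rng.random(children.shape) < 0.1    # bit-flip mutation
        children[flips] ^= 1
        P = np.vstack([P[:pop // 2], children])     # elitism + offspring
    scores = np.array([bic(m, X, y) for m in P])
    return P[np.argmin(scores)]

X = rng.normal(size=(200, 6))
y = X[:, 0] - 2 * X[:, 3] + 0.05 * rng.normal(size=200)
best = evolve(X, y)   # should select features 0 and 3
```

The BIC penalty plays the role of the Bayesian Occam factor: adding a spurious feature reduces the residual less than it costs in complexity.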

Neuromorphic features of probabilistic neural networks

Jiří Grim (2007)

Kybernetika

We summarize the main results on probabilistic neural networks recently published in a series of papers. Within the framework of statistical pattern recognition we assume approximation of the class-conditional distributions by finite mixtures of product components. The probabilistic neurons correspond to the mixture components and can be interpreted in neurophysiological terms. In this way we can identify a possible theoretical background of the functional properties of neurons. For example,...
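
The core construction can be illustrated independently of the paper: each class-conditional density is a finite mixture whose components factorize over the input dimensions, and each component plays the role of one probabilistic neuron. A minimal sketch for binary data with Bernoulli product components (the parameters below are illustrative, not fitted):

```python
import numpy as np

# Mixture of product components: probs[m, d] = P(x_d = 1 | component m),
# so each component's density is a product over the input dimensions.
def mixture_logpdf(x, weights, probs):
    comp = (np.log(probs) * x + np.log(1 - probs) * (1 - x)).sum(axis=1)
    return np.log(weights @ np.exp(comp))

# Two classes, each a 2-component mixture over 3 binary features.
w0, p0 = np.array([0.5, 0.5]), np.array([[0.9, 0.9, 0.1], [0.8, 0.7, 0.2]])
w1, p1 = np.array([0.5, 0.5]), np.array([[0.1, 0.2, 0.9], [0.2, 0.1, 0.8]])

# Bayes decision rule: pick the class with the larger posterior.
def classify(x, prior0=0.5):
    l0 = np.log(prior0) + mixture_logpdf(x, w0, p0)
    l1 = np.log(1 - prior0) + mixture_logpdf(x, w1, p1)
    return 0 if l0 > l1 else 1
```

Here `classify(np.array([1, 1, 0]))` matches the high-probability pattern of class 0 and is assigned accordingly.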

About the maximum information and maximum likelihood principles

Igor Vajda, Jiří Grim (1998)

Kybernetika

Neural networks with radial basis functions are considered, together with the Shannon information that their output carries about the input. The role of information-preserving input transformations is discussed when the network is specified by the maximum information principle and by the maximum likelihood principle. A transformation is found which simplifies the input structure in the sense that it minimizes the entropy in the class of all information-preserving transformations. Such transformation need...
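
One concrete property behind the role of information-preserving input transformations, shown here as a generic illustration rather than the paper's construction: an orthogonal transformation of the inputs is invertible (hence information-preserving), and a Gaussian RBF layer is invariant to it once the centers are transformed the same way, because rotations preserve distances.

```python
import numpy as np

# Gaussian RBF layer: outputs exp(-||x - c_k||^2 / (2 s^2)) for each center c_k.
def rbf(X, centers, s=1.0):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * s * s))

rng = np.random.default_rng(2)
X = rng.normal(size=(5, 3))     # 5 inputs in R^3
C = rng.normal(size=(4, 3))     # 4 RBF centers

Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))   # random orthogonal matrix
out_orig = rbf(X, C)
out_rot = rbf(X @ Q, C @ Q)     # identical outputs: ||(x - c)Q|| = ||x - c||
```

Since the outputs are unchanged, the Shannon information they carry about the input is unchanged as well, which is what makes such transformations a natural class within which to seek the entropy-minimizing one.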