
Global exponential stability of pseudo almost automorphic solutions for delayed Cohen-Grossberg neural networks with measure

Chaouki Aouiti, Hediene Jallouli, Mohsen Miraoui (2022)

Applications of Mathematics

We investigate Cohen-Grossberg differential equations with mixed delays and time-varying coefficients. Several useful results on the functional space of such functions, such as completeness and composition theorems, are established. By using the fixed-point theorem and some properties of doubly measure pseudo almost automorphic functions, a set of sufficient criteria is established to ensure the existence, uniqueness and global exponential stability of a (μ, ν)-pseudo almost automorphic solution. The...
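
A display equation appears to have been dropped from this abstract during extraction. As a hedged placeholder, the generic Cohen-Grossberg system with mixed (discrete and distributed) delays and time-varying coefficients reads as follows; this is the standard form of such networks, not necessarily the authors' exact model:

```latex
x_i'(t) = -a_i\bigl(x_i(t)\bigr)\Bigl[ b_i\bigl(t, x_i(t)\bigr)
  - \sum_{j=1}^{n} c_{ij}(t)\, f_j\bigl(x_j(t)\bigr)
  - \sum_{j=1}^{n} d_{ij}(t)\, g_j\bigl(x_j(t - \tau_{ij}(t))\bigr)
  - \sum_{j=1}^{n} e_{ij}(t) \int_{-\infty}^{t} K_{ij}(t - s)\, h_j\bigl(x_j(s)\bigr)\, ds
  + I_i(t) \Bigr], \qquad i = 1, \dots, n.
```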

Image recall using a large scale generalized Brain-State-in-a-Box neural network

Cheolhwan Oh, Stanisław Żak (2005)

International Journal of Applied Mathematics and Computer Science

An image recall system based on a large-scale associative memory employing the generalized Brain-State-in-a-Box (gBSB) neural network model is proposed. The gBSB neural network can store binary vectors as stable equilibrium points, and this property is used to store images in the gBSB memory. When a noisy image is presented as input, the gBSB network processes it to filter out the noise. The overlapping decomposition method is utilized to efficiently process images using their binary...
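
As a rough illustration of the recall mechanism described above, the sketch below iterates a gBSB-style update x(k+1) = sat(x(k) + α(Wx(k) + b)), where sat clips to [-1, 1], until it settles on a stored bipolar pattern. The outer-product weight matrix and all parameter values are illustrative assumptions; the paper's construction (including the overlapping decomposition of large images) is not reproduced here.

```python
import numpy as np

def gbsb_step(x, W, b, alpha=0.1):
    # One gBSB-style update: x(k+1) = sat(x(k) + alpha * (W x(k) + b)).
    return np.clip(x + alpha * (W @ x + b), -1.0, 1.0)

def recall(x0, W, b, alpha=0.1, max_iters=200):
    # Iterate from a noisy pattern until a fixed point is reached.
    x = x0.copy()
    for _ in range(max_iters):
        x_next = gbsb_step(x, W, b, alpha)
        if np.allclose(x_next, x):
            break
        x = x_next
    return x

# Toy usage: store one bipolar pattern via a Hebbian outer product
# (an illustrative storage rule, not the paper's design procedure).
p = np.array([1.0, -1.0, 1.0, 1.0])
W = np.outer(p, p) / p.size
noisy = p * np.array([1.0, 1.0, -1.0, 1.0])   # flip the third component
print(recall(noisy, W, np.zeros(4)))          # recovers p
```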

Local stability conditions for discrete-time cascade locally recurrent neural networks

Krzysztof Patan (2010)

International Journal of Applied Mathematics and Computer Science

The paper deals with a specific kind of discrete-time recurrent neural network designed with dynamic neuron models. Dynamics are reproduced within each single neuron, hence the network considered is locally recurrent and globally feedforward. Crucial problems for neural networks of this dynamic type are stability and stabilization during learning. The paper formulates local stability conditions for the analysed class of neural networks using Lyapunov's first method. Moreover, a stabilization...
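
A minimal sketch of Lyapunov's first (indirect) method for a discrete-time map x(k+1) = f(x(k)), as invoked above: linearize at an equilibrium and require every eigenvalue of the Jacobian to lie strictly inside the unit circle. The dynamic-neuron map used here is a hypothetical stand-in, not the locally recurrent network model analysed in the paper.

```python
import numpy as np

def jacobian(f, x_eq, eps=1e-6):
    # Numerical Jacobian of f at x_eq via central differences.
    n = x_eq.size
    J = np.zeros((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = eps
        J[:, j] = (f(x_eq + e) - f(x_eq - e)) / (2 * eps)
    return J

def is_locally_stable(f, x_eq):
    # Discrete-time first method: spectral radius of the Jacobian < 1.
    return np.max(np.abs(np.linalg.eigvals(jacobian(f, x_eq)))) < 1.0

# Illustrative neuron: internal linear dynamics followed by tanh activation.
A = np.array([[0.5, 0.2],
              [0.0, 0.4]])                 # eigenvalues 0.5 and 0.4
f = lambda x: np.tanh(A @ x)
print(is_locally_stable(f, np.zeros(2)))   # True: the origin is locally stable
```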

Maximizing multi-information

Nihat Ay, Andreas Knauf (2006)

Kybernetika

Stochastic interdependence of a probability distribution on a product space is measured by its Kullback–Leibler distance from the exponential family of product distributions (called multi-information). Here we investigate low-dimensional exponential families that contain the maximizers of stochastic interdependence in their closure. Based on a detailed description of the structure of probability distributions with globally maximal multi-information we obtain our main result: The exponential family...
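
For concreteness, the sketch below computes multi-information for a discrete joint distribution: the Kullback–Leibler distance from the product of the marginals, which reduces to the sum of the marginal entropies minus the joint entropy. The two-bit example is ours, not the paper's.

```python
import numpy as np

def entropy(p):
    # Shannon entropy in bits of a probability vector.
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def multi_information(p_joint):
    # p_joint: n-dimensional array of joint probabilities summing to 1.
    total = 0.0
    for axis in range(p_joint.ndim):
        others = tuple(a for a in range(p_joint.ndim) if a != axis)
        total += entropy(p_joint.sum(axis=others))   # marginal entropy H(X_i)
    return total - entropy(p_joint.ravel())          # minus joint entropy H(X)

# Two perfectly correlated fair bits: multi-information = 1 bit, the maximum
# achievable for a pair of binary variables.
p = np.array([[0.5, 0.0],
              [0.0, 0.5]])
print(multi_information(p))   # -> 1.0
```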

Mean almost periodicity and moment exponential stability of discrete-time stochastic shunting inhibitory cellular neural networks with time delays

Tianwei Zhang, Lijun Xu (2019)

Kybernetika

By using the semi-discrete method of differential equations, a new version of the discrete analogue of stochastic shunting inhibitory cellular neural networks (SICNNs) is formulated, which characterizes continuous-time stochastic SICNNs more accurately than the Euler scheme does. Firstly, the existence of a 2nd-mean almost periodic sequence solution of the discrete-time stochastic SICNNs is investigated with the help of the Minkowski inequality, the Hölder inequality and Krasnoselskii's fixed...
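
The semi-discrete idea can be seen on a single deterministic SICNN cell x'(t) = -a x(t) - c f(x(t - τ)) x(t) + L: the linear decay is integrated exactly over each step while the remaining terms are frozen, instead of taking a first-order Euler difference. The scalar model, coefficient values, activation, and one-step delay below are all illustrative assumptions, and the stochastic terms are omitted.

```python
import math

a, c, L, h = 1.0, 0.5, 1.0, 0.1   # decay, coupling, input, step size (made up)
f = abs                            # a typical SICNN activation (an assumption)

def euler_step(x, x_delayed):
    # First-order Euler discretization of the cell.
    return x + h * (-a * x - c * f(x_delayed) * x + L)

def semi_discrete_step(x, x_delayed):
    # Integrate x' = -a*x + g exactly over [t, t+h] with g frozen at time t.
    g = -c * f(x_delayed) * x + L
    return x * math.exp(-a * h) + (1.0 - math.exp(-a * h)) / a * g

def run(step, n_steps=300):
    hist = [0.0, 0.0]              # buffer implementing a one-step delay
    for _ in range(n_steps):
        hist.append(step(hist[-1], hist[-2]))
    return hist[-1]

# Both schemes settle near the cell's equilibrium x* = sqrt(3) - 1 ~ 0.732,
# but the semi-discrete map reproduces the exponential decay exactly.
print(run(euler_step), run(semi_discrete_step))
```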

Mean mutual information and symmetry breaking for finite random fields

J. Buzzi, L. Zambotti (2012)

Annales de l'I.H.P. Probabilités et statistiques

G. Edelman, O. Sporns and G. Tononi have introduced the neural complexity of a family of random variables, defining it as a specific average of mutual information over subfamilies. We show that their choice of weights satisfies two natural properties, namely invariance under permutations and additivity, and we call any functional satisfying these two properties an intricacy. We classify all intricacies in terms of probability laws on the unit interval and study the growth rate of maximal intricacies...
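
As a toy illustration, the sketch below averages the mutual information I(X_S; X_{S^c}) over all proper nonempty subfamilies S with uniform weights. This is one simple permutation-invariant choice, not necessarily an intricacy in the authors' axiomatic sense, and the XOR example is ours.

```python
import itertools
import numpy as np

def entropy(p):
    # Shannon entropy in bits of a probability vector.
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def marginal(p_joint, keep):
    # Marginal distribution over the axes listed in `keep`.
    drop = tuple(a for a in range(p_joint.ndim) if a not in keep)
    return p_joint.sum(axis=drop).ravel()

def mean_subfamily_mi(p_joint):
    # Uniform average of I(X_S; X_{S^c}) = H(X_S) + H(X_{S^c}) - H(X)
    # over all proper nonempty subsets S of the index set.
    n = p_joint.ndim
    h_full = entropy(p_joint.ravel())
    total, count = 0.0, 0
    for k in range(1, n):
        for S in itertools.combinations(range(n), k):
            Sc = tuple(a for a in range(n) if a not in S)
            total += entropy(marginal(p_joint, S)) + entropy(marginal(p_joint, Sc)) - h_full
            count += 1
    return total / count

# Three bits where the third is the XOR of two independent fair bits.
p = np.zeros((2, 2, 2))
for i, j in itertools.product((0, 1), repeat=2):
    p[i, j, (i + j) % 2] = 0.25
print(mean_subfamily_mi(p))   # -> 1.0 bit: every split carries one bit of MI
```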

Mixture of experts architectures for neural networks as a special case of conditional expectation formula

Jiří Grim (1998)

Kybernetika

Recently a new and interesting architecture of neural networks called “mixture of experts” has been proposed as a tool for real multivariate approximation or prediction. We show that the underlying problem is closely related to approximating the joint probability density of the involved variables by a finite mixture. In particular, assuming normal mixtures, we can explicitly write the conditional expectation formula, which can be interpreted as a mixture-of-experts network. In this way the related optimization...
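
The conditional-expectation view can be made concrete for a bivariate normal mixture: E[y | x] is a gating-weighted sum of per-component linear regressions, which is exactly a mixture-of-experts with linear experts. All parameter values below are invented for illustration.

```python
import numpy as np

def normal_pdf(x, mu, var):
    # Univariate normal density.
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

# Two-component mixture over (x, y): weights, means (mu_x, mu_y), 2x2 covariances.
w = np.array([0.6, 0.4])
mu = np.array([[0.0, 1.0],
               [3.0, -1.0]])
cov = np.array([[[1.0, 0.5], [0.5, 1.0]],
                [[1.0, -0.3], [-0.3, 0.5]]])

def conditional_expectation(x):
    # Gate: posterior probability of each component given x.
    gate = w * np.array([normal_pdf(x, mu[m, 0], cov[m, 0, 0]) for m in range(2)])
    gate /= gate.sum()
    # Experts: per-component conditional means E[y | x, m], linear in x.
    experts = np.array([mu[m, 1] + cov[m, 1, 0] / cov[m, 0, 0] * (x - mu[m, 0])
                        for m in range(2)])
    return float(gate @ experts)

print(conditional_expectation(0.5))   # gated blend of two linear experts
```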
