Global dissipativity on uncertain discrete-time neural networks with time-varying delays.
We investigate Cohen-Grossberg differential equations with mixed delays and time-varying coefficients. Several useful results on the functional space of such functions, such as completeness and composition theorems, are established. By using a fixed-point theorem and some properties of doubly measure pseudo almost automorphic functions, a set of sufficient criteria is established to ensure the existence, uniqueness and global exponential stability of a doubly measure pseudo almost automorphic solution. The...
An image recall system using a large-scale associative memory employing the generalized Brain-State-in-a-Box (gBSB) neural network model is proposed. The gBSB neural network can store binary vectors as stable equilibrium points, and this property is used to store images in the gBSB memory. When a noisy image is presented as an input to the gBSB network, the network processes it to filter out the noise. The overlapping decomposition method is utilized to efficiently process images using their binary...
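The recall mechanism summarized above can be sketched in a few lines. The update rule below (linear feedback followed by clipping to the hypercube) is one standard Brain-State-in-a-Box formulation, and the Hebbian outer-product weights, function names, and parameter values are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def gbsb_step(x, W, alpha=0.3):
    # one BSB-style update: linear feedback, then clip the state into [-1, 1]
    return np.clip(x + alpha * (W @ x), -1.0, 1.0)

# store one bipolar pattern via a Hebbian outer-product weight matrix
p = np.array([1, -1, 1, 1, -1, -1, 1, -1], dtype=float)
W = np.outer(p, p) / p.size

# corrupt one component of the pattern, then iterate until the state settles
x = p.copy()
x[0] = -1.0
for _ in range(50):
    x = gbsb_step(x, W)

recalled = np.sign(x)  # recovers the stored pattern
```

With a single stored pattern the state is driven back to the corresponding vertex of the hypercube; storing many patterns requires a more careful weight design than this outer-product sketch.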
The paper deals with a specific kind of discrete-time recurrent neural network designed with dynamic neuron models. Dynamics are reproduced within each single neuron; hence the network considered is locally recurrent and globally feedforward. A crucial problem with neural networks of the dynamic type is stability, as well as stabilization in learning problems. The paper formulates local stability conditions for the analysed class of neural networks using Lyapunov's first method. Moreover, a stabilization...
Stochastic interdependence of a probability distribution on a product space is measured by its Kullback–Leibler distance from the exponential family of product distributions (called multi-information). Here we investigate low-dimensional exponential families that contain the maximizers of stochastic interdependence in their closure. Based on a detailed description of the structure of probability distributions with globally maximal multi-information we obtain our main result: The exponential family...
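The multi-information described above can be computed directly for a finite joint distribution as the Kullback–Leibler distance from the product of its marginals. The sketch below handles the two-variable case (where multi-information coincides with mutual information); the function name and the base-2 logarithm are illustrative choices:

```python
import numpy as np

def multi_information(P):
    # KL distance (in bits) of a joint table P from the product of its marginals
    P = np.asarray(P, dtype=float)
    px = P.sum(axis=1)          # marginal of the first variable
    py = P.sum(axis=0)          # marginal of the second variable
    Q = np.outer(px, py)        # nearest product distribution's candidate form
    mask = P > 0
    return float(np.sum(P[mask] * np.log2(P[mask] / Q[mask])))

# perfectly correlated pair of bits: one bit of multi-information,
# the global maximum for two binary variables
P = np.array([[0.5, 0.0],
              [0.0, 0.5]])
mi = multi_information(P)  # → 1.0
```

Distributions such as this synchronized pair are exactly the kind of globally maximal interdependence the abstract's structural description concerns.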
By using the semi-discretization method of differential equations, a new version of a discrete analogue of stochastic shunting inhibitory cellular neural networks (SICNNs) is formulated, which characterizes continuous-time stochastic SICNNs more accurately than the Euler scheme does. Firstly, the existence of a square-mean almost periodic sequence solution of the discrete-time stochastic SICNNs is investigated with the help of the Minkowski inequality, the Hölder inequality and Krasnoselskii's fixed...
G. Edelman, O. Sporns and G. Tononi have introduced the neural complexity of a family of random variables, defining it as a specific average of mutual information over subfamilies. We show that their choice of weights satisfies two natural properties, namely invariance under permutations and additivity, and we call any functional satisfying these two properties an intricacy. We classify all intricacies in terms of probability laws on the unit interval and study the growth rate of maximal intricacies...
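An intricacy of the kind discussed above averages the mutual information I(X_S; X_{S^c}) across bipartitions of the family. The sketch below uses a uniform weighting over all proper subsets, which is one permutation-invariant choice for illustration; the exact Edelman–Sporns–Tononi weights (averaging within each subfamily size) differ, and the function names are assumptions:

```python
import itertools
import numpy as np

def entropy(p):
    # Shannon entropy in bits of a probability vector
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def average_bipartition_mi(P):
    # P: joint table over n binary variables, shape (2,)*n.
    # Average I(X_S; X_{S^c}) = H(X_S) + H(X_{S^c}) - H(X) uniformly
    # over all proper subsets S -- one permutation-invariant weighting.
    n = P.ndim
    axes = set(range(n))
    H_full = entropy(P.ravel())
    total, count = 0.0, 0
    for k in range(1, n):
        for S in itertools.combinations(range(n), k):
            rest = tuple(axes - set(S))
            H_S = entropy(P.sum(axis=rest).ravel())
            H_rest = entropy(P.sum(axis=S).ravel())
            total += H_S + H_rest - H_full
            count += 1
    return total / count

# fully synchronized triple of bits: every bipartition carries 1 bit
P = np.zeros((2, 2, 2))
P[0, 0, 0] = P[1, 1, 1] = 0.5
c = average_bipartition_mi(P)  # → 1.0
```

Both halves of each bipartition appear in the sum, but since the result is an average this only reweights symmetrically.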
Recently, a new and interesting architecture of neural networks called the "mixture of experts" has been proposed as a tool for real multivariate approximation or prediction. We show that the underlying problem is closely related to approximating the joint probability density of the involved variables by a finite mixture. In particular, assuming normal mixtures, we can explicitly write the conditional expectation formula, which can be interpreted as a mixture-of-experts network. In this way the related optimization...
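The conditional-expectation reading can be made concrete: for a Gaussian mixture over (x, y), E[y | x] is a gating-weighted sum of per-component linear regressions, i.e. a mixture-of-experts prediction. The sketch below assumes scalar x and y with per-component 2×2 covariances; the function names and parameterization are illustrative, not the paper's notation:

```python
import numpy as np

def normal_pdf(x, mu, var):
    # univariate Gaussian density
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

def moe_predict(x, weights, means, covs):
    # means[k] = (mu_x, mu_y); covs[k] = 2x2 covariance of (x, y) for component k.
    # Gating: posterior responsibility of each component given x.
    gates = np.array([w * normal_pdf(x, m[0], C[0, 0])
                      for w, m, C in zip(weights, means, covs)])
    gates /= gates.sum()
    # Experts: the conditional mean E[y | x, component k] is linear in x.
    experts = np.array([m[1] + C[1, 0] / C[0, 0] * (x - m[0])
                        for m, C in zip(means, covs)])
    return float(gates @ experts)

# one component reduces to ordinary linear regression: slope cov_xy / var_x
y_hat = moe_predict(2.0,
                    weights=[1.0],
                    means=[np.array([0.0, 0.0])],
                    covs=[np.array([[1.0, 0.5],
                                    [0.5, 1.0]])])  # → 1.0
```

With several components, the gates act as a soft input-dependent switch between the local linear experts, which is exactly the mixture-of-experts form.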