Displaying similar documents to “Global stability of Clifford-valued Takagi-Sugeno fuzzy neural networks with time-varying delays and impulses”

New results on stability of periodic solution for CNNs with proportional delays and D operator

Bo Du (2019)

Kybernetika

Similarity:

The problems related to periodic solutions of cellular neural networks (CNNs) involving the D operator and proportional delays are considered. Topological degree theory and a differential inequality technique are used to establish the existence of periodic solutions of the considered neural networks. Furthermore, the Lyapunov functional method is applied to study the global asymptotic stability of periodic solutions of the above system.

Existence, uniqueness and global asymptotic stability for a class of complex-valued neutral-type neural networks with time delays

Manchun Tan, Desheng Xu (2018)

Kybernetika

Similarity:

This paper explores the problem of delay-independent and delay-dependent stability for a class of complex-valued neutral-type neural networks with time delays. For these neutral-type neural networks, an appropriate function is constructed to establish the existence of an equilibrium point. On the basis of homeomorphism theory, the Lyapunov functional method and linear matrix inequality techniques, several LMI-based sufficient conditions on the existence, uniqueness and global asymptotic stability...

Generating series and asymptotics of classical spin networks

Francesco Costantino, Julien Marché (2015)

Journal of the European Mathematical Society

Similarity:

We study classical spin networks with group SU(2). In the first part, using Gaussian integrals, we compute their generating series in the case where the edges are equipped with holonomies; this generalizes Westbury's formula. In the second part, we use an integral formula for the square of the spin network and perform a stationary phase approximation under a non-degeneracy hypothesis. This gives a precise asymptotic behavior when the labels are rescaled by a constant going to infinity. ...

Stability analysis and absolute synchronization of a three-unit delayed neural network

Lin Jun Wang, You Xiang Xie, Zhou Chao Wei, Jian Peng (2015)

Kybernetika

Similarity:

In this paper, we consider a three-unit delayed neural network system, investigate its linear stability, and obtain some sufficient conditions ensuring the absolute synchronization of the system via a Lyapunov function. Numerical simulations show that the theoretically predicted results are in excellent agreement with the numerically observed behavior.

Delay dependent complex-valued bidirectional associative memory neural networks with stochastic and impulsive effects: An exponential stability approach

Chinnamuniyandi Maharajan, Chandran Sowmiya, Changjin Xu (2024)

Kybernetika

Similarity:

This paper investigates the exponential stability of complex-valued Bidirectional Associative Memory (BAM) neural networks with time delays under stochastic and impulsive effects. By utilizing the contraction mapping theorem, the existence and uniqueness of the equilibrium point of the proposed complex-valued neural networks are verified. Moreover, based on the construction of a Lyapunov-Krasovskii functional, matrix inequality techniques and stability theory, some novel...

Stability analysis for neutral-type impulsive neural networks with delays

Bo Du, Yurong Liu, Dan Cao (2017)

Kybernetika

Similarity:

By using the linear matrix inequality (LMI) approach and the Lyapunov functional method, we obtain some new sufficient conditions ensuring global asymptotic stability and global exponential stability of a generalized neutral-type impulsive neural network with delays. A simulation example is provided to demonstrate the usefulness of the main results. The main contribution of this paper is that a new neutral-type impulsive neural network with variable delays is studied by constructing...

Local stability conditions for discrete-time cascade locally recurrent neural networks

Krzysztof Patan (2010)

International Journal of Applied Mathematics and Computer Science

Similarity:

The paper deals with a specific kind of discrete-time recurrent neural network designed with dynamic neuron models. Dynamics are reproduced within each single neuron, hence the network considered is locally recurrent and globally feedforward. A crucial problem with neural networks of this dynamic type is stability, as well as stabilization in learning problems. The paper formulates local stability conditions for the analysed class of neural networks using Lyapunov's first method. Moreover,...

Self-similarly expanding networks to curve shortening flow

Oliver C. Schnürer, Felix Schulze (2007)

Annali della Scuola Normale Superiore di Pisa - Classe di Scienze

Similarity:

We consider a network in the Euclidean plane that consists of three distinct half-lines with a common starting point. Starting from that network as initial condition, there exists a network consisting of three curves that all start at one point, where they form 120 degree angles, and that expands homothetically under curve shortening flow. We also prove uniqueness of these networks.