
Global exponential stability of the periodic solution of a delayed neural network with discontinuous activations

Papini D.
;
2005-01-01

Abstract

We study the stability of a delayed Hopfield neural network with periodic coefficients and inputs and an arbitrary constant delay. We consider non-decreasing activation functions that may have jump discontinuities, modelling the ideal situation in which the gain of the neuron amplifiers tends to infinity. In particular, we drop the assumption of Lipschitz continuity on the activation functions, which most papers require. Under suitable assumptions on the interconnection matrices, we prove that the delayed neural network has a unique periodic solution that is globally exponentially stable, independently of the size of the delay. The assumptions, which involve the theory of M-matrices, are easy to check. Because the activation functions may be discontinuous, the convergence of the output of the neural network is also studied through a suitable notion of limit. The existence, uniqueness and continuability of the solutions of suitable initial value problems are proved. © 2005 Elsevier B.V. All rights reserved.
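The abstract states that the stability hypotheses involve M-matrix theory and are easy to check. The paper's exact condition on the interconnection matrices is not reproduced in this record, but as an illustration, the following sketch checks the standard nonsingular M-matrix property (non-positive off-diagonal entries and eigenvalues with strictly positive real part), which is the kind of test such hypotheses typically reduce to. The function name and tolerance are illustrative choices, not from the paper.

```python
import numpy as np

def is_nonsingular_m_matrix(a, tol=1e-12):
    """Check whether `a` is a nonsingular M-matrix.

    Standard characterisation: `a` is a Z-matrix (all off-diagonal
    entries <= 0) and every eigenvalue of `a` has a strictly
    positive real part.
    """
    a = np.asarray(a, dtype=float)
    # Zero out the diagonal so only off-diagonal entries are inspected.
    off_diag = a - np.diag(np.diag(a))
    if np.any(off_diag > tol):
        return False  # a positive off-diagonal entry: not a Z-matrix
    # All eigenvalues must lie strictly in the right half-plane.
    return bool(np.min(np.linalg.eigvals(a).real) > tol)
```

For example, `[[2, -1], [-1, 2]]` passes (eigenvalues 1 and 3), while `[[1, -2], [-2, 1]]` fails because it has the eigenvalue −1, even though its sign pattern is that of a Z-matrix.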
Files in this record:
File: PaTa_PLA2005.pdf
Description: Main article
Type: Publisher's version (PDF)
Licence: Not public (access on request)
Size: 169.74 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11390/1197782
Citations
  • PMC: n/a
  • Scopus: 84
  • Web of Science: 81