Global Exponential Attractive Sets for Recurrent Neural Networks with Infinite Distributed Delays


Authors:
Xiaohong Wang
Abstract
This paper is concerned with the problem of global exponential attractive sets for recurrent neural networks (RNNs) with general activation functions and infinite distributed delays. By employing a new differential inequality and relaxing the usual restrictions on the activation functions, several sufficient conditions yielding detailed estimates of the global exponential attractive sets are derived in terms of linear matrix inequalities (LMIs), which can be easily checked with the LMI Control Toolbox in MATLAB. Compared with previous methods, the results obtained are independent of the time-varying delays and do not require the differentiability of the delay functions; they extend and improve upon earlier publications. Finally, a numerical example is provided to demonstrate the effectiveness of the proposed results.
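The LMI conditions mentioned above are typically verified in MATLAB's LMI Control Toolbox. As a minimal illustrative sketch (the matrix below is hypothetical and not taken from the paper), the core numerical test behind such a check, deciding whether a symmetric candidate matrix P is positive definite, can be done via an attempted Cholesky factorization:

```python
def cholesky(A):
    """Attempt a Cholesky factorization A = L L^T.

    Returns the lower-triangular factor L if the symmetric matrix A
    is positive definite, or None if the factorization breaks down
    (i.e. A is not positive definite).
    """
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                d = A[i][i] - s
                if d <= 0:  # a non-positive pivot means A is not PD
                    return None
                L[i][j] = d ** 0.5
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return L

# Hypothetical symmetric candidate matrix P (for illustration only)
P = [[4.0, 1.0],
     [1.0, 3.0]]
print("P is positive definite:", cholesky(P) is not None)
```

A full LMI feasibility check would hand the matrix inequalities to a semidefinite-programming solver; the factorization test above only illustrates the positive-definiteness criterion that such solvers certify.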
Keywords
Recurrent Neural Network; Global Exponential Attractive Sets; Infinite Distributed Delays; Linear Matrix Inequalities
Pages: 27–35
DOI: 10.18005/JCSE0203001