Mean-square exponential stability for the hysteretic Hopfield neural networks with stochastic disturbances

ABSTRACT In this paper, the exponential stability problem is considered for a class of hysteretic Hopfield neural networks with stochastic disturbances. The hysteretic nonlinearities are characterized by a Lipschitz-type constraint in which the internal parameters of the hysteretic function are reflected. By resorting to the Lyapunov function approach and stochastic analysis, a sufficient condition is obtained under which the underlying hysteretic Hopfield neural network is exponentially stable in the mean square. The obtained condition is expressed in terms of linear matrix inequalities (LMIs), which can be easily checked via the Matlab toolbox. Finally, an illustrative example is provided to show the effectiveness of the results derived in this paper.


Introduction
For decades, neural networks have attracted much attention from researchers due primarily to their wide applications such as signal processing, combinatorial optimization, pattern recognition and associative memory (Lian, Zhang, & Feng, 2012; Lu, Ho, & Wang, 2009). It is worth mentioning that the successes of these applications are highly dependent on the understanding of the dynamic behaviours of neural networks. As such, the dynamic analysis issues for neural networks have gained a growing research interest in the past few years, see e.g. (Chen, 2001; He, Liu, Rees, & Wu, 2007; Liu & Du, 2014; Sakthivel, Raja, & Anthoni, 2011; Song, Park, Wu, & Zhang, 2013; Wang, Shu, Fang, & Liu, 2006; Xiong & Zhang, 2018; Zhang, Han, & Wang, 2018) and the references therein. In particular, as one of the most desirable dynamic properties, the stability of neural networks has recently become a focus of research and a great many results have been reported on the stability problem for various types of neural networks, such as Hopfield neural networks (Chen, 2001), Cohen-Grossberg neural networks (Wang, Liu, Li, & Liu, 2006) and memristive neural networks (Qi, Li, & Huang, 2014; Wang, Li, Huang, & Duan, 2014).
In practice, due to external environmental fluctuations, neural networks are often affected by disturbances, and such stochastic disturbances should be adequately taken into account in the network analysis in order to avoid undesirable instability. As such, in the past decade, the stability problems for neural networks with stochastic disturbances have received considerable attention from a variety of communities and a series of results have been reported on this topic, see e.g. (Chen & Zheng, 2013; Ding, Wang, & Shen, 2012; Kan, Wang, & Shu, 2013; Liang, Lam, & Wang, 2009; Liang, Wang, & Liu, 2009; Liu, Wang, & Liu, 2008; Ren, Wang, & Lu, 2017; Shan, Zhang, Wang, & Zhang, 2018; Shen, Wang, Ding, & Shu, 2013; Wang, Liu, Liu, & Shi, 2010; Wang, Wang, & Liang, 2010) and the references therein. For example, in Shan et al. (2018), a general noise disturbance (that may be non-white) has been introduced to neural networks and the global mean-square asymptotic stability problem has been analyzed by using a random field approach. In Chen and Zheng (2013), the stability analysis problem has been studied for time-delay neural networks subject to stochastic perturbations. As a typical kind of nonlinearity, hysteresis occurs frequently in many engineering systems. Note that, if it is not dealt with appropriately, hysteresis might deteriorate the system performance or even cause undesirable instability of the overall system. Therefore, it is of great importance to consider the hysteresis in the modelling process of neural networks and examine its effects on the stability property of neural networks. Up to now, the investigation on the stability of hysteretic Hopfield neural networks has received some initial research interest, see e.g. (Bharitkar & Mendel, 2000; Schonfeld, 1993). However, when the stochastic disturbances are taken into account simultaneously, the corresponding results have not been reported yet. It is, therefore, the main motivation of this paper to bridge such a gap by studying the stability problem for the hysteretic Hopfield neural network with stochastic disturbances.
Summarizing the above discussions, the aim of this paper is to investigate the mean-square exponential stability problem for a class of hysteretic Hopfield neural networks with stochastic disturbances. The main difficulties encountered in the proposed research are outlined as follows: (1) how to develop an appropriate technique to examine the influence from the hysteretic nonlinearities on the stability performance; and (2) how to analyze the stability issue of the hysteretic Hopfield neural networks subject to hysteretic nonlinearities and stochastic disturbances.
The primary contributions of this paper can be briefly stated as follows: (1) the stability problem is addressed for hysteretic Hopfield neural networks in the presence of stochastic disturbances and (2) by using a combination of Lyapunov function approach and stochastic analysis, a stability criterion is established for hysteretic Hopfield neural networks with stochastic disturbance. Finally, a simulation example is provided to show the effectiveness of the stability criterion established in this paper.

Notation:
The notation used here is fairly standard except where otherwise stated. R^N and R^{N×M} denote, respectively, the N-dimensional Euclidean space and the set of all N × M real matrices. I denotes the identity matrix of compatible dimension. The notation X ≥ Y (respectively, X > Y), where X and Y are symmetric matrices, means that X − Y is positive semi-definite (respectively, positive definite). A^T represents the transpose of A. If A is a symmetric matrix, λ_max(A) and λ_min(A) denote the maximum and minimum eigenvalues of A, respectively. E{x} stands for the expectation of the stochastic variable x. ‖x‖ denotes the Euclidean norm of a vector x. In symmetric block matrices, the symbol * is used as an ellipsis for terms induced by symmetry.

Problem formulation and preliminaries
In this paper, the hysteretic Hopfield neural network is described as follows:
ẋ(t) = −x(t) + W y(x(t)) + l, (1)
where x(t) = [x_1(t), x_2(t), . . . , x_N(t)]^T ∈ R^N is the state vector, y(x(t)) = [y(x_1(t)), y(x_2(t)), . . . , y(x_N(t))]^T ∈ R^N is the hysteretic neuron gain function vector, W = (w_ij) ∈ R^{N×N} is the connection weight matrix of the neural network and l = [l_1, l_2, . . . , l_N]^T is a constant external input vector. In model (1), the hysteretic neuron gain function y(x_j(t)) takes one of two hyperbolic tangent branches according to the sign of the input derivative δx_j(t)/δt, where α, β, γ_α, γ_β are the four parameters of the hysteretic neuron's activation function. It is seen that the neuron's output depends not only on its input x, but also on its derivative information, namely ẋ. Suppose that x* is an equilibrium point of system (1) and let e(t) = x(t) − x* denote the deviation of the state from the equilibrium. Actually, the neuron states themselves are often subject to noises. Then, we consider the following class of comprehensive hysteretic Hopfield neural networks:
de(t) = [−e(t) + W ỹ(e(t))] dt + g(t, e(t)) dω(t), (6)
where ỹ(e(t)) = y(e(t) + x*) − y(x*), g(·, ·) : R × R^N → R^N is the noise intensity function and ω(t) is a scalar Brownian motion satisfying E{dω(t)} = 0 and E{dω²(t)} = dt. Throughout this paper, we make the following assumption.
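The error dynamics above can be visualized by a simple Euler-Maruyama discretization. The sketch below is illustrative only: the two-branch tanh activation stands in for the paper's hysteretic gain function, and all parameter values (α, β, γ_α, γ_β, W, the noise scale) are hypothetical choices rather than data from the paper.

```python
import math
import random

# Euler-Maruyama simulation sketch of a two-neuron stochastic hysteretic
# Hopfield network in error coordinates: de = [-e + W y(e)] dt + g(e) dw.
# The activation below is an assumed two-branch tanh; alpha, beta, gamma_a,
# gamma_b, W and the noise scale are hypothetical, not taken from the paper.

ALPHA, BETA, GAMMA_A, GAMMA_B = 0.1, 0.1, 0.5, 0.5

def hysteretic_gain(x, x_dot):
    """Select one of two tanh branches by the sign of the input derivative."""
    if x_dot >= 0.0:
        return math.tanh(GAMMA_A * (x + ALPHA))
    return math.tanh(GAMMA_B * (x - BETA))

def simulate(e0, W, m_scale, T=10.0, dt=1e-3, seed=0):
    rng = random.Random(seed)
    e = list(e0)
    prev = list(e0)  # previous state, used to approximate the derivative sign
    for _ in range(int(T / dt)):
        dw = rng.gauss(0.0, math.sqrt(dt))  # scalar Brownian increment
        y = [hysteretic_gain(e[i], e[i] - prev[i]) for i in range(2)]
        drift = [-e[i] + sum(W[i][j] * y[j] for j in range(2)) for i in range(2)]
        noise = [m_scale * e[i] * dw for i in range(2)]  # g(e) = m_scale * e
        prev = list(e)
        e = [e[i] + drift[i] * dt + noise[i] for i in range(2)]
    return e

W = [[0.2, -0.1], [0.1, 0.2]]
e_final = simulate([-3.4, -1.3], W, m_scale=0.1)
```

With these illustrative values the linear drift −e dominates, so the error trajectory contracts toward a small neighbourhood of the origin.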
Assumption 2.1: The noise intensity function g(t, e(t)) : R × R^N → R^N satisfies g^T(t, e(t)) g(t, e(t)) ≤ e^T(t) M^T M e(t), where M ∈ R^{N×N} is a known constant matrix.
Definition 2.1: The hysteretic Hopfield neural network (6) is said to be exponentially stable in the mean square if there exist constants α_1 > 0 and α_2 > 0 such that E{‖e(t)‖²} ≤ α_1 e^{−α_2 t} ‖e(0)‖² for all t ≥ 0. In this paper, we aim to deal with the exponential mean-square stability problem for the hysteretic Hopfield neural network (6), and establish an LMI-based sufficient condition under which the hysteretic neural network (6) is mean-square exponentially stable.

Main results
In this section, before stating our main results, we introduce the following lemma, which will be used in the derivation.

Lemma 3.1: If f(·) is a continuously differentiable function whose derivative satisfies |f′(s)| ≤ C for all s ∈ R, where C is a given positive scalar, then |f(a) − f(b)| ≤ C|a − b| holds for all a, b ∈ R.
Note that the hyperbolic tangent function is continuously differentiable and its derivative is bounded. By using Lemma 3.1, the constants C_1, C_2, C_3 and C_4 can be determined by calculating the derivatives of the hyperbolic tangent functions.
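Lemma 3.1 is the mean-value-theorem bound underlying the Lipschitz constants. As a quick numerical sanity check, for a tanh branch f(s) = tanh(g(s − a)) the derivative g·sech²(g(s − a)) is bounded by |g|, so f should be Lipschitz with constant C = |g|; the values of g and a below are arbitrary test values, not the paper's parameters.

```python
import math

# Sanity check of Lemma 3.1 for a tanh branch f(s) = tanh(g*(s - a)):
# |f'(s)| = |g| * sech^2(g*(s - a)) <= |g|, so |f(u)-f(v)| <= |g| * |u-v|.
# g and a are arbitrary illustrative values.

def f(s, g=0.7, a=0.3):
    return math.tanh(g * (s - a))

C = 0.7  # bound on |f'(s)|, i.e. the Lipschitz constant from Lemma 3.1
pairs = [(-5.0, 4.2), (-0.1, 0.1), (2.0, 2.5), (-3.3, -3.2)]
violations = [abs(f(u) - f(v)) - C * abs(u - v) for u, v in pairs]
lipschitz_ok = all(d <= 1e-12 for d in violations)
```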
For each of the four cases, the derivative of the corresponding hyperbolic tangent branch can be calculated and bounded, which determines the constants C_1, C_2, C_3 and C_4, respectively. By letting C* = max{C_1, C_2, C_3, C_4} = 4(|γ_α| + |γ_β|)² + e^{−2(|αγ_α|+|βγ_β|)}, (25) and noting the structure of the hysteretic gain function, we arrive at a Lipschitz-type bound ‖ỹ(e(t))‖ ≤ C*‖e(t)‖ on the hysteretic neuron gain function. The main result of this paper is given in the following theorem.
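The constant C* in (25) can be evaluated directly from the four activation parameters. The snippet below does so for illustrative parameter values (not the paper's):

```python
import math

# Direct evaluation of the bound C* from (25):
#   C* = 4(|gamma_a| + |gamma_b|)^2 + exp(-2(|alpha*gamma_a| + |beta*gamma_b|)).
# The parameter values passed in below are illustrative, not the paper's.

def c_star(alpha, beta, gamma_a, gamma_b):
    quad = 4.0 * (abs(gamma_a) + abs(gamma_b)) ** 2
    expo = math.exp(-2.0 * (abs(alpha * gamma_a) + abs(beta * gamma_b)))
    return quad + expo

C = c_star(alpha=0.1, beta=0.1, gamma_a=0.5, gamma_b=0.5)
# For these values, C = 4*(1.0)^2 + exp(-0.2).
```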

Computing the infinitesimal operator LV(t, e(t)) along the trajectories of (6), and using Assumption 2.1 together with the Lipschitz-type bound on the hysteretic gain function, yields terms of the form 2y^T(e(t))W^T P e(t) + δ_1 y^T(e(t))y(e(t)) − δ_2 e^T(t)Pe(t), so that LV(t, e(t)) can be written as a quadratic form ξ^T(t)Θξ(t), where ξ(t) = [e^T(t) y^T(e(t))]^T. By taking the mathematical expectation on both sides of (31), it is derived from (36) that E{LV(t, e(t))} < 0 whenever the LMI conditions of Theorem 3.1 hold, from which we can infer the existence of a scalar ε > 0 such that E{V(t, e(t))} ≤ V(0, e(0)) e^{−εt}. Then, we can derive E{‖e(t)‖²} ≤ α_0 e^{−εt}, where α_0 = V(0, e(0))/λ_min(P). According to Definition 2.1, the hysteretic neural network (6) is exponentially stable in the mean square and hence the proof is complete.
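The final step of the proof, passing from a negative expected operator to the exponential estimate of Definition 2.1, is the standard Lyapunov argument and can be written out as follows (ε denotes a decay rate whose existence follows from the negative-definite quadratic form):

```latex
% If there exists \varepsilon > 0 such that
% \mathcal{L}V(t, e(t)) \le -\varepsilon V(t, e(t)),
% then by It\^o's formula and taking expectations,
\frac{d}{dt}\,\mathbb{E}\{V(t, e(t))\}
  \le -\varepsilon\, \mathbb{E}\{V(t, e(t))\}
\;\Longrightarrow\;
\mathbb{E}\{V(t, e(t))\} \le V(0, e(0))\, e^{-\varepsilon t}.
% Since \lambda_{\min}(P)\|e(t)\|^2 \le e^T(t) P e(t) = V(t, e(t)),
\mathbb{E}\{\|e(t)\|^2\}
  \le \frac{V(0, e(0))}{\lambda_{\min}(P)}\, e^{-\varepsilon t}
  = \alpha_0\, e^{-\varepsilon t}.
```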
Remark 3.1: The hysteretic Hopfield neural network described by (6) includes hysteresis nonlinearities and stochastic disturbances, which make the stability analysis complicated. In Theorem 3.1, these two factors have both been properly handled. Furthermore, note that all the information on the addressed problem (i.e. the network parameters, hysteresis nonlinearities and external disturbances) has been reflected in the LMI-based sufficient conditions established in Theorem 3.1. In the next section, the effectiveness of the established stability criterion will be verified by a numerical simulation example.

A numerical example
In this section, a numerical example is provided to demonstrate the effectiveness of the derived results.
The hysteretic Hopfield neural network (6) is assumed to have two neurons. With the chosen parameters, by using the Matlab LMI toolbox, we can solve the LMIs (28)-(29) and obtain a feasible solution. Therefore, according to Theorem 3.1, the neural network (6) with the given parameters is exponentially stable in the mean square. In the simulation, the initial values are taken as e_1(0) = −3.4 and e_2(0) = −1.3. The state trajectories of e_1(t) and e_2(t) are plotted in Figure 1, where the hysteretic Hopfield neural network (6) is indeed seen to be exponentially stable, which demonstrates our results.
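The paper's exact LMIs (28)-(29) are not reproduced in this extract. As a rough, simplified stand-in of the same flavour: with V = e^T P e, the Lipschitz bound ‖ỹ(e)‖ ≤ C‖e‖ and the noise bound of Assumption 2.1, Young's inequality gives LV ≤ e^T Θ e with Θ = −2P + PWW^TP + C²I + λ_max(P)M^TM, so Θ < 0 is a checkable sufficient condition. All matrices below are illustrative, not the paper's example data.

```python
import numpy as np

# Simplified stand-in for the feasibility check behind Theorem 3.1 (the
# paper's exact LMIs (28)-(29) are not reproduced here). Sufficient condition:
#   Theta = -2P + P W W^T P + C^2 I + lambda_max(P) M^T M < 0,
# obtained from 2 e^T P W y <= e^T P W W^T P e + y^T y and ||y|| <= C ||e||.
# W, M, C and P below are illustrative values, not the paper's data.

W = np.array([[0.1, -0.2], [0.05, 0.1]])
M = 0.1 * np.eye(2)           # noise intensity bound of Assumption 2.1
C = 1.0                       # Lipschitz constant of the activation
P = np.eye(2)                 # candidate Lyapunov matrix

lam_max_P = np.linalg.eigvalsh(P).max()
Theta = -2 * P + P @ W @ W.T @ P + C**2 * np.eye(2) + lam_max_P * M.T @ M
stable = np.linalg.eigvalsh(Theta).max() < 0  # True iff Theta < 0
```

This mirrors the role of the LMI toolbox call in the example: verifying that a candidate P renders the stability matrix negative definite.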

Conclusions
In this paper, the mean-square exponential stability problem has been investigated for a class of hysteretic Hopfield neural networks with stochastic disturbances. The hysteretic nonlinearities have been characterised by a Lipschitz-type condition which includes the internal parameters of the defined hysteretic function. By employing the Lyapunov function method combined with stochastic analysis techniques, a mean-square exponential stability criterion has been established for the hysteretic Hopfield neural networks. Finally, a simulation example has been employed to show the usefulness of the derived results.

Disclosure statement
No potential conflict of interest was reported by the authors.