UDC 517.9
GLOBAL ROBUST EXPONENTIAL STABILITY FOR HOPFIELD NEURAL
NETWORKS WITH NON-LIPSCHITZ ACTIVATION FUNCTIONS*
ГЛОБАЛЬНА РОБАСТНА ЕКСПОНЕНЦIАЛЬНА СТIЙКIСТЬ
ДЛЯ НЕЙРОННИХ МЕРЕЖ ХОПФIЛЬДА
З НЕЛIПШИЦЕВОЮ ФУНКЦIЄЮ АКТИВАЦIЇ
Hongtao Yu
College Inform. Sci. and Engineering, Yanshan Univ.
Qinhuangdao 066004, China
e-mail: yu5771@163.com
Huaiqin Wu
College Sci., Yanshan Univ.
Qinhuangdao 066001, China
e-mail: huaiqinwu@ysu.edu.cn
This paper is concerned with the problem of the global robust exponential stability for Hopfield neural networks with norm-bounded parameter uncertainties and inverse Hölder neuron activation functions. By applying Brouwer degree properties and some analysis techniques, the existence and uniqueness of the equilibrium point are investigated. Based on the Lyapunov stability theory, a global robust exponential stability criterion is derived in terms of a linear matrix inequality (LMI). Two numerical examples are provided to demonstrate the effectiveness and validity of the proposed robust stability results.
Розглянуто задачу глобальної робастної експоненцiальної стiйкостi для нейронних мереж Хопфiльда з обмеженими за нормою параметричною невизначенiстю та оберненими функцiями Гельдера нейронної активацiї. Використовуючи властивостi ступеня Брауера та результати з аналiзу, вивчено питання iснування та єдиностi точки рiвноваги. Критерiй глобальної робастної експоненцiальної стiйкостi в термiнах лiнiйної матричної нерiвностi отримано з використанням теорiї стiйкостi Ляпунова. Наведено два числових приклади для iлюстрацiї ефективностi та дiєвостi наведених результатiв.
∗ This work was supported by the Natural Science Foundation of Hebei Province of China (A2011203103) and the Hebei Province Education Foundation of China (2009157).

© Hongtao Yu, Huaiqin Wu, 2012

1. Introduction. In recent years there has been increasing interest in the dynamic analysis of artificial neural networks. Among the most popular models in the literature are the Hopfield neural networks (HNNs) proposed by Hopfield. These networks have attracted considerable attention due to their promising applications in various engineering problems, such as pattern classification, optimization, and the design of associative memories. It has been observed that such applications rely heavily on the dynamical analysis of the neural network, in particular on its stability analysis. As is well known, the stability problem is central to the design of a neural network. However, in the implementation of neural networks, parametric uncertainty, which often destroys the stability of a neural network, is commonly encountered due to modeling inaccuracies and changes in the environment of the model. For example, in practical applications of neural networks, some vital data, such as the neuron firing rates and the weight coefficients, are usually acquired and processed by means of statistical estimates. Thus, the robust stability analysis of various uncertain neural networks in the presence of parametric uncertainties has gained much research attention; see, e.g., [1 – 21] and the references therein. In general, there are two forms of parametric uncertainty, namely interval uncertainty and norm-bounded uncertainty. Based on matrix norms and Lyapunov stability theory, Refs. [1 – 6] derived delay-independent or delay-dependent conditions, in terms of LMIs, for the existence, uniqueness and robust stability of interval uncertain HNNs with constant or interval time-varying delays. By applying a matrix decomposition method, the Halanay inequality and LMI techniques, Refs. [7, 8] established delay-independent robust stability criteria for uncertain HNNs with multiple time-varying delays and continuously distributed delays. By using Jensen's integral inequality and the Lyapunov – Krasovskii method, Refs. [9 – 11] obtained delay-dependent robust stability criteria for norm-bounded uncertain HNNs with time-varying delays. By applying the free-weighting matrix method, LMIs and Jensen's integral inequality, Refs. [12 – 17] proposed delay-dependent robust stability criteria for norm-bounded uncertain HNNs with multiple time-varying delays. In addition, by using Jensen's integral inequality and LMIs, Refs. [18 – 20] proposed delay-dependent robust stability criteria for HNNs with time-varying delays and linear fractional or nonlinear uncertainties. Based on Brouwer degree properties, Ref. [21] proved the existence of the equilibrium point of an interval neural network model with delays and inverse Hölder neuron activation functions and, by applying a Lyapunov functional approach, presented a sufficient LMI condition ensuring interval robust stability of the network.
It should be noted that, in the existing literature, almost all results on the robust stability of neural networks with parametric uncertainties are obtained under special assumptions on the neuron activation functions, such as Lipschitz continuity, boundedness and/or monotonicity. To the best of our knowledge, few papers deal with the global robust stability of neural networks with non-Lipschitz activation functions. However, neural networks whose activation functions are not Lipschitz continuous frequently appear in the theoretical study of neural network dynamics. Moreover, in many practical engineering problems one also has to decide whether a neural network with non-Lipschitz activation functions is stable or robustly stable. Hence, conditions for the robust stability of neural networks without Lipschitz continuous activation functions are valuable both in theory and in practice.
Motivated by the preceding discussion, the aim of this paper is to study the global robust exponential stability of HNNs with norm-bounded parameter uncertainties and inverse Hölder neuron activation functions. A sufficient condition is derived which ensures that the network is globally robustly exponentially stable for all admissible parameter uncertainties. The rest of the paper is organized as follows. In Section 2, the model formulation and some preliminaries are given. The main result is stated in Section 3. In Section 4, two numerical examples are presented to demonstrate the effectiveness and validity of the proposed stability results. Finally, some conclusions are drawn in Section 5.
Notations. The notations used throughout this paper are standard. $A^T$ and $A^{-1}$ denote the transpose and the inverse of a square matrix $A$. $A > 0$ ($A < 0$) means that $A$ is positive (negative) definite. $\mathbb{R}$ denotes the set of real numbers, $\mathbb{R}^n$ the $n$-dimensional Euclidean space, and $\mathbb{R}^{m \times n}$ the set of all $m \times n$ real matrices. $I$ denotes the identity matrix of appropriate dimension. For a column vector $x = (x_1, \ldots, x_n)^T \in \mathbb{R}^n$, the norm is the Euclidean norm, i.e., $\|x\| = \left(\sum_{i=1}^{n} x_i^2\right)^{1/2}$.
2. Model of neural network and preliminaries. Consider the following HNN with parametric uncertainties, described by the system of differential equations
$$\frac{dx}{dt} = -(D + \Delta D(t))x(t) + (A + \Delta A(t))g(x(t)) + I, \tag{1}$$
where $x(t) = (x_1(t), \ldots, x_n(t))^T$ is the vector of neuron states at time $t$; $D = \mathrm{diag}(d_1, \ldots, d_n)$ is an $n \times n$ constant diagonal matrix, where $d_i > 0$, $i = 1, \ldots, n$, are the neural self-inhibitions; $A = (a_{ij})_{n \times n}$ is an $n \times n$ interconnection matrix; $g(x) = (g_1(x_1), \ldots, g_n(x_n))^T$, where $g_i$, $i = 1, \ldots, n$, are called the neuron activation functions; $I = (I_1, \ldots, I_n)^T$ denotes the external input. $\Delta D(t) = \mathrm{diag}(\Delta d_1(t), \ldots, \Delta d_n(t))$ and $\Delta A(t) = (\Delta a_{ij}(t))_{n \times n}$ are continuous matrix-valued functions of $t$ and represent the parametric uncertainties in the network.

Correspondingly, the HNN without parametric uncertainties,
$$\frac{dx}{dt} = -Dx(t) + Ag(x(t)) + I, \tag{2}$$
is called the reference neural network of (1).
The parametric uncertainties $\Delta D(t)$ and $\Delta A(t)$ are assumed to satisfy the following:

$A_1$: $\Delta d_i(t) \colon \mathbb{R} \to \mathbb{R}$ is a continuous function and satisfies $|\Delta d_i(t)| < d_i$, $i = 1, 2, \ldots, n$.

$A_2$: $\Delta A(t) = HF(t)E$, where $H$ and $E$ are known constant matrices with appropriate dimensions. The uncertain matrix $F(t)$ is an unknown time-varying matrix with Lebesgue measurable elements and satisfies
$$F^T(t)F(t) \le I \quad \forall\, t \in \mathbb{R}. \tag{3}$$

Definition 2.1. The equilibrium point $x^*$ of the system (2) is said to be globally exponentially stable with convergence rate $\gamma$ if there exist positive constants $\gamma$, $T$ and $\beta$ such that, for any solution $x(t, 0, x_0)$ of the system (2) with initial value $x(0) = x_0$,
$$\|x(t, 0, x_0) - x^*\| \le \beta e^{-\gamma t}, \quad t \ge T.$$
If the equilibrium point $x^*$ of the system (1) is globally exponentially stable for all admissible parametric uncertainties, then the system (2) is said to be globally robustly exponentially stable.

Definition 2.2. A continuous function $G \colon \mathbb{R} \to \mathbb{R}$ is said to be an $\alpha$-inverse Hölder function if

(i) $G$ is a monotonically nondecreasing function;

(ii) for any $\rho \in \mathbb{R}$ there exist constants $q_\rho > 0$ and $r_\rho > 0$, depending on $\rho$, such that
$$|G(\theta) - G(\rho)| \ge q_\rho\, |\theta - \rho|^\alpha \quad \forall\, |\theta - \rho| \le r_\rho,$$
where $\alpha > 0$ is a constant.
The class of $\alpha$-inverse Hölder functions is denoted by $IL(\alpha)$. When $\alpha = 1$, $1$-inverse Hölder functions are called inverse Lipschitz functions. It is easy to check that $G(\theta) = \arctan\theta \in IL(1)$ and $G(\theta) = \theta^3 \in IL(3)$.
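As a brief check of the second claim (our own verification sketch, not part of the original text): since $\theta^2 + \theta\rho + \rho^2 = (\theta + \rho/2)^2 + 3\rho^2/4 \ge 3\rho^2/4$ for all real $\theta$ and $\rho$, for $\rho \ne 0$ and $|\theta - \rho| \le r_\rho := |\rho|$ one has
$$|\theta^3 - \rho^3| = |\theta - \rho|\,\big|\theta^2 + \theta\rho + \rho^2\big| \ge \frac{3\rho^2}{4}\,|\theta - \rho| \ge \frac{3\rho^2}{4 r_\rho^2}\,|\theta - \rho|^3 = \frac{3}{4}\,|\theta - \rho|^3,$$
while for $\rho = 0$, $|\theta^3 - 0| = |\theta|^3$; together with monotonicity this shows $\theta^3 \in IL(3)$ with $q_\rho = 3/4$ for $\rho \ne 0$ and $q_0 = 1$.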
Remark 1. It is obvious that $\alpha$-inverse Hölder functions form a class of non-Lipschitz functions.
Lemma 2.1 [22]. If $G(\theta) \in IL(\alpha)$, then for any $\rho_0 \in \mathbb{R}$ we have
$$\int_{\rho_0}^{+\infty} [G(\theta) - G(\rho_0)]\, d\theta = \int_{\rho_0}^{-\infty} [G(\theta) - G(\rho_0)]\, d\theta = +\infty.$$

Lemma 2.2 [23]. If $G(\theta) \in IL(\alpha)$ and $G(0) = 0$, then there exist constants $q_0 > 0$ and $r_0 > 0$ such that
$$|G(\theta)| \ge q_0 |\theta|^\alpha \quad \forall\, |\theta| \le r_0.$$
Moreover,
$$|G(\theta)| \ge q_0 r_0^\alpha \quad \forall\, |\theta| \ge r_0.$$
Let $\Omega$ be a nonempty, bounded and open subset of $\mathbb{R}^n$. The closure of $\Omega$ is denoted by $\overline{\Omega}$, and the boundary of $\Omega$ by $\partial\Omega$.

Lemma 2.3 [24]. (1) Let $\mathcal{H} \colon [0, 1] \times \overline{\Omega} \to \mathbb{R}^n$ be a continuous mapping. If $p \notin \mathcal{H}(\lambda, \partial\Omega)$ for all $\lambda \in [0, 1]$, then the Brouwer degree $\deg(\mathcal{H}(\lambda, \cdot), \Omega, p)$ is constant for all $\lambda \in [0, 1]$; in this case, $\deg(\mathcal{H}(0, \cdot), \Omega, p) = \deg(\mathcal{H}(1, \cdot), \Omega, p)$.

(2) Let $\mathcal{H} \colon \overline{\Omega} \to \mathbb{R}^n$ be a continuous mapping. If $\deg(\mathcal{H}, \Omega, p) \ne 0$, then the equation $\mathcal{H}(x) = p$ has at least one solution in $\Omega$.
Lemma 2.4 (Schur complement). Given constant matrices $\Sigma_1$, $\Sigma_2$ and $\Sigma_3$ with appropriate dimensions, where $\Sigma_1 = \Sigma_1^T$ and $\Sigma_2 = \Sigma_2^T > 0$, we have
$$\Sigma_1 + \Sigma_3^T \Sigma_2^{-1} \Sigma_3 < 0 \iff \begin{bmatrix} \Sigma_1 & \Sigma_3^T \\ \Sigma_3 & -\Sigma_2 \end{bmatrix} < 0 \quad\text{or}\quad \begin{bmatrix} -\Sigma_2 & \Sigma_3 \\ \Sigma_3^T & \Sigma_1 \end{bmatrix} < 0.$$
Lemma 2.5. For matrices $P \in \mathbb{R}^{n \times n}$, $M \in \mathbb{R}^{n \times k}$, $N \in \mathbb{R}^{l \times n}$ and $F \in \mathbb{R}^{k \times l}$ with $P > 0$, $F^T F \le I$, and a scalar $\varepsilon > 0$, the following matrix inequality holds:
$$PMFN + (MFN)^T P \le \varepsilon PMM^T P + \varepsilon^{-1} N^T N.$$
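A short justification of Lemma 2.5 (our own sketch; it is stated without proof in the text above): for any matrices $X$ and $Y$ of compatible dimensions and any $\varepsilon > 0$, expanding $0 \le (\sqrt{\varepsilon}\,X - \varepsilon^{-1/2}\,Y)^T(\sqrt{\varepsilon}\,X - \varepsilon^{-1/2}\,Y)$ gives $X^T Y + Y^T X \le \varepsilon X^T X + \varepsilon^{-1} Y^T Y$. Taking $X = M^T P$ and $Y = FN$ and using $F^T F \le I$ yields
$$PMFN + (MFN)^T P \le \varepsilon PMM^T P + \varepsilon^{-1} N^T F^T F N \le \varepsilon PMM^T P + \varepsilon^{-1} N^T N.$$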
3. Main results.

Theorem 3.1. Suppose $g_i \in IL(\alpha)$, $i = 1, 2, \ldots, n$. Under the assumptions $A_1$ and $A_2$, if there exist a positive diagonal matrix $P = \mathrm{diag}(p_1, \ldots, p_n)$ and a scalar $\varepsilon > 0$ such that
$$\mathcal{E} = \begin{bmatrix} PA + A^T P + \varepsilon^{-1} E^T E & PH \\ H^T P & -\varepsilon^{-1} I \end{bmatrix} < 0, \tag{4}$$
then the neural network (1) has a unique equilibrium point which is globally exponentially stable, i.e., the reference neural network (2) is globally robustly exponentially stable.
Proof. The proof is divided into three steps.

Step 1. In this step we prove the existence of an equilibrium point.

Let $\mathcal{H}(x, t) = (D + \Delta D(t))x - (A + \Delta A(t))g(x) - I$. A point $x^* \in \mathbb{R}^n$ is an equilibrium point of the system (1) if and only if $\mathcal{H}(x^*, t) = 0$. Rewrite $\mathcal{H}(x, t)$ as
$$\mathcal{H}(x, t) = (D + \Delta D(t))x - (A + \Delta A(t))\tilde{g}(x) + \mathcal{H}(0, t),$$
where $\tilde{g}(x) = g(x) - g(0)$. Since $g_i \in IL(\alpha)$, it follows that $\tilde{g}_i \in IL(\alpha)$, $\tilde{g}_i(0) = 0$ and $x_i \tilde{g}_i(x_i) > 0$ for $x_i \ne 0$. Set $\Omega_R = \{x \in \mathbb{R}^n : \|x\| < R\}$, $R > 0$, and $\overline{\Omega}_R = \{x \in \mathbb{R}^n : \|x\| \le R\}$. Define the mapping $\mathcal{H} \colon [0, 1] \times \overline{\Omega}_R \to \mathbb{R}^n$ by
$$\mathcal{H}(\lambda, x) = (D + \Delta D(t))x - \lambda(A + \Delta A(t))\tilde{g}(x) + \lambda \mathcal{H}(0, t) \quad \forall\, t.$$
By means of Lemma 2.5,
$$\begin{aligned}
(\tilde{g}(x))^T P \mathcal{H}(\lambda, x) &= (\tilde{g}(x))^T P\big((D + \Delta D(t))x + \lambda \mathcal{H}(0, t)\big) - \lambda(\tilde{g}(x))^T P (A + \Delta A(t))\tilde{g}(x) = \\
&= (\tilde{g}(x))^T P\big((D + \Delta D(t))x + \lambda \mathcal{H}(0, t)\big) - \frac{\lambda}{2}(\tilde{g}(x))^T \big(P(A + HF(t)E) + (A + HF(t)E)^T P\big)\tilde{g}(x) \ge \\
&\ge (\tilde{g}(x))^T P\big((D + \Delta D(t))x + \lambda \mathcal{H}(0, t)\big) - \frac{\lambda}{2}(\tilde{g}(x))^T \big(PA + A^T P + \varepsilon PH(PH)^T + \varepsilon^{-1} E^T E\big)\tilde{g}(x).
\end{aligned}$$
Let $\tilde{d}_i = \min_{t>0}(d_i - |\Delta d_i(t)|)$. By Lemma 2.4, (4) is equivalent to $PA + A^T P + \varepsilon (PH)(PH)^T + \varepsilon^{-1} E^T E < 0$. Hence,
$$(\tilde{g}(x))^T P \mathcal{H}(\lambda, x) \ge (\tilde{g}(x))^T P\big((D + \Delta D(t))x + \lambda \mathcal{H}(0, t)\big) \ge \sum_{i=1}^n \big[p_i(d_i + \Delta d_i(t))|\tilde{g}_i(x_i)|\,|x_i| - \lambda p_i |\tilde{g}_i(x_i)|\, |\mathcal{H}(0, t)_i|\big] \ge$$
$$\ge \sum_{i=1}^n \tilde{d}_i p_i |\tilde{g}_i(x_i)| \left[|x_i| - \frac{\max_{t>0}|\mathcal{H}(0, t)_i|}{\tilde{d}_i}\right],$$
where $\mathcal{H}(0, t)_i$ denotes the $i$th element of $\mathcal{H}(0, t)$.
By Lemma 2.2, there exist constants $q_{0_i} > 0$ and $r_{0_i} > 0$ such that
$$|\tilde{g}_i(x_i)| \ge q_{0_i} r_{0_i}^{\alpha} \quad \forall\, |x_i| \ge r_{0_i}, \quad i = 1, 2, \ldots, n. \tag{5}$$
Let $r_0 = \max_{1 \le i \le n} r_{0_i}$, $a = \max_{1 \le i \le n} \dfrac{\max_{t>0}|\mathcal{H}(0, t)_i|}{\tilde{d}_i}$, $N_k = \{n_1, \ldots, n_k\} \subset \{1, 2, \ldots, n\}$ and $\Omega_{N_k} = \{x \in \mathbb{R}^k : |x_i| \le a,\ i \in N_k\}$ for all $k < n$. Define
$$\tilde{g}_{N_k}(x) = \sum_{i \in N_k} \tilde{d}_i p_i |\tilde{g}_i(x_i)|\,\big[|x_i| - a\big].$$
Noting that $\Omega_{N_k}$ is a compact subset of $\mathbb{R}^k$ and $\tilde{g}_{N_k}$ is continuous on $\Omega_{N_k}$, $\tilde{g}_{N_k}$ attains its minimum $\min_{x \in \Omega_{N_k}} \tilde{g}_{N_k}(x)$ on $\Omega_{N_k}$.

Let $l = \min_{1 \le i \le n}\{\tilde{d}_i p_i q_{0_i} r_{0_i}^{\alpha}\}$, $M_{N_k} = \min_{x \in \Omega_{N_k}} \tilde{g}_{N_k}(x)$ and $M = \min\{M_{N_k} : N_k \subset \{1, 2, \ldots, n\}\}$. Take
$$R > \max\left\{\sqrt{n}\left(a - \frac{M}{l}\right),\ \sqrt{n}\, r_0\right\}$$
and let $x \in \partial\Omega_R$. Then there exist two index sets $\mathcal{N}$ and $\overline{\mathcal{N}}$ such that
$$|x_i| \le a,\ i \in \mathcal{N}, \qquad |x_i| > a,\ i \in \overline{\mathcal{N}},$$
where $\mathcal{N} \cup \overline{\mathcal{N}} = \{1, 2, \ldots, n\}$. Furthermore, there exists an index $i_0 \in \overline{\mathcal{N}}$ such that
$$|x_{i_0}| \ge \frac{R}{\sqrt{n}} \ge \max\{a, r_0\}. \tag{6}$$
By using (5) and (6), for any $x \in \partial\Omega_R$ and $\lambda \in [0, 1]$,
$$\begin{aligned}
(\tilde{g}(x))^T P \mathcal{H}(\lambda, x) &\ge \sum_{i=1}^n \tilde{d}_i p_i |\tilde{g}_i(x_i)| \left[|x_i| - \frac{\max_{t>0}|\mathcal{H}(0, t)_i|}{\tilde{d}_i}\right] \ge \\
&\ge \sum_{i \in \mathcal{N}} \tilde{d}_i p_i |\tilde{g}_i(x_i)|\,\big[|x_i| - a\big] + \sum_{i \in \overline{\mathcal{N}}} \tilde{d}_i p_i |\tilde{g}_i(x_i)|\,\big[|x_i| - a\big] \ge \\
&\ge \tilde{d}_{i_0} p_{i_0} q_{0_{i_0}} r_{0_{i_0}}^{\alpha}\,\big[|x_{i_0}| - a\big] + M \ge \\
&\ge \tilde{d}_{i_0} p_{i_0} q_{0_{i_0}} r_{0_{i_0}}^{\alpha}\left[|x_{i_0}| - a + \frac{M}{l}\right] \ge \\
&\ge \tilde{d}_{i_0} p_{i_0} q_{0_{i_0}} r_{0_{i_0}}^{\alpha}\left[\frac{R}{\sqrt{n}} - a + \frac{M}{l}\right] > 0.
\end{aligned}$$
Hence $\mathcal{H}(\lambda, x) \ne 0$ for $x \in \partial\Omega_R$ and $\lambda \in [0, 1]$. By Lemma 2.3(1),
$$\deg(\mathcal{H}(0, x), \Omega_R, 0) = \deg(\mathcal{H}(1, x), \Omega_R, 0),$$
i.e., $\deg(\mathcal{H}(x, t), \Omega_R, 0) = \deg((D + \Delta D(t))x, \Omega_R, 0) = \operatorname{sgn}|D + \Delta D(t)| \ne 0$, where $|D + \Delta D(t)|$ is the determinant of $D + \Delta D(t)$, which is positive by assumption $A_1$. By Lemma 2.3(2), $\mathcal{H}(x, t) = 0$ has at least one solution in $\Omega_R$. Thus, the system (1) has at least one equilibrium point.
Step 2. In this step the uniqueness of the equilibrium point of the system (1) is proved by contradiction.

Assume that $x_1^*$ and $x_2^*$ are two different equilibrium points of the system (1). Then
$$(D + \Delta D(t))(x_1^* - x_2^*) = (A + \Delta A(t))\big(g(x_1^*) - g(x_2^*)\big).$$
Hence,
$$\begin{aligned}
0 &< \min_{1 \le i \le n}(\tilde{d}_i p_i)\,(g(x_1^*) - g(x_2^*))^T (x_1^* - x_2^*) \le (g(x_1^*) - g(x_2^*))^T P (D + \Delta D(t))(x_1^* - x_2^*) = \\
&= (g(x_1^*) - g(x_2^*))^T P (A + \Delta A(t))\big(g(x_1^*) - g(x_2^*)\big) = \\
&= \frac{1}{2}(g(x_1^*) - g(x_2^*))^T \big[P(A + HF(t)E) + (A + HF(t)E)^T P\big]\big(g(x_1^*) - g(x_2^*)\big) \le \\
&\le \frac{1}{2}(g(x_1^*) - g(x_2^*))^T \big(PA + A^T P + \varepsilon PHH^T P + \varepsilon^{-1}E^T E\big)\big(g(x_1^*) - g(x_2^*)\big) < 0.
\end{aligned}$$
This is a contradiction. Hence $x_1^* = x_2^*$, which implies that the equilibrium point of the system (1) is unique.
Step 3. In this step, by applying the Lyapunov function method, the global robust exponential stability of the system (2) is established.

Let $\mathcal{F}(x, t) = -(D + \Delta D(t))x + (A + \Delta A(t))g(x) + I$; $\mathcal{F} \colon \mathbb{R}^n \to \mathbb{R}^n$ is continuous and locally bounded in $x$. Hence the existence of a local solution of the system (1) with initial value $x(0) = x_0$ on $[0, t^*(x_0))$ is obvious, where $[0, t^*(x_0))$ is the maximal right-hand existence interval of the local solution. This local solution is denoted by $x(t, 0, x_0)$. Let $x^*$ be the unique equilibrium point of the system (1). Under the transformation $y(t) = x(t) - x^*$, the system (1) is transformed into
$$\frac{dy}{dt} = -(D + \Delta D(t))y(t) + (A + \Delta A(t))\hat{g}(y(t)), \tag{7}$$
where $y(t) = (y_1(t), \ldots, y_n(t))^T$, $\hat{g}(y) = (\hat{g}_1(y_1), \ldots, \hat{g}_n(y_n))^T$, and $\hat{g}_i(y_i) = g_i(y_i + x_i^*) - g_i(x_i^*)$, $i = 1, 2, \ldots, n$. Then $y(t, 0, y_0) = x(t, 0, x_0) - x^*$ is a solution of the system (7) with initial value $y(0) = x_0 - x^*$ on $[0, t^*(x_0))$.

Consider the following Lyapunov function:
$$V(t) = 2e^{\eta t}\sum_{i=1}^n p_i \int_0^{y_i(t,0,y_{0_i})} \hat{g}_i(\theta)\, d\theta,$$
where $0 < \eta < \min_{1 \le i \le n} \tilde{d}_i$ is a scalar. Calculating the derivative of $V(t)$ along the solution $y(t, 0, y_0)$ of the system (7) on $[0, t^*(x_0))$, it follows that
$$\begin{aligned}
\frac{dV}{dt} &= 2\eta e^{\eta t}\sum_{i=1}^n p_i \int_0^{y_i(t,0,y_{0_i})} \hat{g}_i(\theta)\, d\theta + \\
&\quad + 2e^{\eta t}(\hat{g}(y(t,0,y_0)))^T P\big[-(D + \Delta D(t))y(t,0,y_0) + (A + \Delta A(t))\hat{g}(y(t,0,y_0))\big] \le \\
&\le 2e^{\eta t}(\hat{g}(y(t,0,y_0)))^T\big(\eta P - P(D + \Delta D(t))\big)y(t,0,y_0) + \\
&\quad + e^{\eta t}(\hat{g}(y(t,0,y_0)))^T\big[P(A + HF(t)E) + (A + HF(t)E)^T P\big]\hat{g}(y(t,0,y_0)) \le \\
&\le 2\Big(\eta - \min_{1 \le i \le n}\tilde{d}_i\Big)e^{\eta t}(\hat{g}(y(t,0,y_0)))^T P y(t,0,y_0) + \\
&\quad + e^{\eta t}(\hat{g}(y(t,0,y_0)))^T\big[PA + A^T P + \varepsilon PHH^T P + \varepsilon^{-1}E^T E\big]\hat{g}(y(t,0,y_0)) \le 0.
\end{aligned}$$
This implies $V(t) \le V(0)$. Hence
$$2\sum_{i=1}^n p_i \int_0^{y_i(t,0,y_{0_i})} \hat{g}_i(\theta)\, d\theta \le V(0)e^{-\eta t} \le V(0). \tag{8}$$
By (8) and Lemma 2.1, it is easy to derive that $y_i(t, 0, y_{0_i})$, $i = 1, 2, \ldots, n$, are bounded on $[0, t^*(x_0))$, i.e., $y(t, 0, y_0)$ is bounded on $[0, t^*(x_0))$. By the continuation theorem [26], the system (7) has a solution $y(t, 0, y_0)$ with the initial value $y(0) = x_0 - x^*$ on $[0, +\infty)$. Moreover, by (8),
$$\lim_{t \to +\infty} y_i(t, 0, y_{0_i}) = 0, \quad i = 1, 2, \ldots, n.$$
Hence, there exists a constant $T > 0$ such that $y_i(t) \in [-r_0, r_0]$, $i = 1, 2, \ldots, n$, for all $t \ge T$. By Lemma 2.2, when $t \ge T$,
$$\sum_{i=1}^n p_i \int_0^{y_i(t,0,y_{0_i})} \hat{g}_i(\theta)\, d\theta \ge \sum_{i=1}^n p_i \int_0^{|y_i(t,0,y_{0_i})|} q_0 |\theta|^\alpha\, d\theta \ge \frac{p q_0}{\alpha + 1}\left\{\max_{1 \le i \le n}|y_i(t, 0, y_{0_i})|\right\}^{\alpha+1},$$
i.e.,
$$\max_{1 \le i \le n}|y_i(t, 0, y_{0_i})| \le \left[\frac{\alpha + 1}{2pq_0}V(0)\right]^{\frac{1}{1+\alpha}} e^{-\frac{\eta}{\alpha+1}t},$$
$$\|x(t, 0, x_0) - x^*\| \le \sqrt{n}\left[\frac{\alpha + 1}{2pq_0}V(0)\right]^{\frac{1}{1+\alpha}} e^{-\frac{\eta}{\alpha+1}t},$$
where $p = \min_{1 \le i \le n} p_i$. This shows that the equilibrium point of the system (1) is globally exponentially stable, i.e., the system (2) is globally robustly exponentially stable. The proof is completed.
Remark 2. (i) The condition (4) in Theorem 3.1 is an LMI once $\varepsilon$ is given. Hence, for a given $\varepsilon$, the problem of finding a feasible diagonal matrix $P$, which is used to check the global robust exponential stability of the system (2), can be solved by using an appropriate LMI solver in Matlab [25].
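For readers who prefer an open-source toolchain, the following is a minimal sketch of such a feasibility check written in Python with CVXPY instead of the Matlab LMI toolbox; the helper name check_lmi and the use of CVXPY are our own choices, not part of the paper, and $H$, $E$ are assumed to be square of the same size as $A$, as in the examples below.

import numpy as np
import cvxpy as cp

def check_lmi(A, H, E, eps=1.0, margin=1e-6):
    # Search for a positive diagonal P = diag(p_1, ..., p_n) such that the
    # block matrix of condition (4) is negative definite for the fixed eps.
    n = A.shape[0]
    p = cp.Variable(n)
    P = cp.diag(p)
    top_left = P @ A + A.T @ P + (1.0 / eps) * (E.T @ E)
    lmi = cp.bmat([[top_left, P @ H],
                   [H.T @ P, -(1.0 / eps) * np.eye(n)]])
    lmi = 0.5 * (lmi + lmi.T)                        # symmetrize before the PSD constraint
    constraints = [p >= margin,                      # P positive diagonal
                   lmi << -margin * np.eye(2 * n)]   # strict negative definiteness
    problem = cp.Problem(cp.Minimize(0), constraints)
    problem.solve()
    return p.value if problem.status in ("optimal", "optimal_inaccurate") else None

# Usage (data of Example 1 below): check_lmi(np.array([[-1., -2.], [2., -1.]]),
#                                            0.5 * np.eye(2), 0.5 * np.eye(2), eps=1.0)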
(ii) Let $\mathcal{P} = \{(\varepsilon, p_1, p_2, \ldots, p_n) \mid \varepsilon, p_i > 0,\ i = 1, 2, \ldots, n,\ \mathcal{E} < 0\}$. In general, it is difficult to describe $\mathcal{P}$ in detail. However, the bounds of $\mathcal{P}$ can be determined if the following optimization problems can be solved:
$$\text{Maximize } \varepsilon, p_1, p_2, \ldots, p_n \quad \text{subject to } (\varepsilon, p_1, p_2, \ldots, p_n) \in \mathcal{P}, \tag{9}$$
$$\text{Minimize } \varepsilon, p_1, p_2, \ldots, p_n \quad \text{subject to } (\varepsilon, p_1, p_2, \ldots, p_n) \in \mathcal{P}. \tag{10}$$
Problems (9) and (10) are quasiconvex optimization problems and can be easily solved by using the Matlab Toolbox. Let $\overline{\varepsilon}, \overline{p}_1, \overline{p}_2, \ldots, \overline{p}_n$ and $\underline{\varepsilon}, \underline{p}_1, \underline{p}_2, \ldots, \underline{p}_n$ be the feasible solutions of (9) and (10), respectively, and let $\mathcal{D} = \{(\varepsilon, p_1, p_2, \ldots, p_n) \mid \underline{\varepsilon} < \varepsilon < \overline{\varepsilon},\ \underline{p}_i < p_i < \overline{p}_i,\ i = 1, 2, \ldots, n\}$. Then $\mathcal{D}$ is a polytopic domain. Obviously, based on Theorem 3.1, any vector $(\varepsilon, p_1, p_2, \ldots, p_n)$ which ensures that the system (2) is globally robustly exponentially stable should be contained in $\mathcal{D}$. If $\mathcal{D} = \varnothing$, then the global robust exponential stability of the system (2) cannot be checked by using Theorem 3.1.
4. Illustrative examples. Example 1. Consider the second-order neural network (1) described by $D = \mathrm{diag}(1, 1)$, $\Delta D(t) = \mathrm{diag}(0.5\sin t,\ 0.4\cos t)$,
$$A = \begin{pmatrix} -1 & -2 \\ 2 & -1 \end{pmatrix}, \quad E = H = \begin{pmatrix} 0.5 & 0 \\ 0 & 0.5 \end{pmatrix}, \quad F(t) = \begin{pmatrix} 0.5\sin t & 0 \\ 0 & 0.5\cos t \end{pmatrix},$$
$I = (0, 0)^T$ and $g(\theta) = \theta^3 \in IL(3)$. It is easy to check that $(0, 0)^T$ is the equilibrium point of the network. Choose $\varepsilon = 1$. Solving the LMI (4) by using an appropriate LMI solver in Matlab, a feasible positive diagonal matrix $P$ is
$$P = \begin{pmatrix} 1.1580 & 0 \\ 0 & 1.1580 \end{pmatrix}.$$
By Theorem 3.1, the unique equilibrium point of this neural network is globally exponentially stable.
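As an additional sanity check (our own sketch, not part of the paper), one can substitute the data above and the reported $P$ directly into the block matrix of (4) and inspect its eigenvalues, which should all be negative:

import numpy as np

A = np.array([[-1.0, -2.0], [2.0, -1.0]])
H = E = 0.5 * np.eye(2)
P = 1.1580 * np.eye(2)
eps = 1.0

# Block matrix of condition (4) for the data of Example 1.
top_left = P @ A + A.T @ P + (1.0 / eps) * E.T @ E
M = np.block([[top_left, P @ H],
              [H.T @ P, -(1.0 / eps) * np.eye(2)]])
print(np.linalg.eigvalsh(M))   # expected: four negative eigenvalues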
Figures 1 and 2 display the time-domain behavior of the network. It can be seen that the state trajectories of this network with 10 initial values converge to the equilibrium point $(0, 0)^T$. This is in accordance with the conclusion of Theorem 3.1.

Fig. 1. The state trajectory $x_1$ of the network with 10 initial values in Example 1.
Fig. 2. The state trajectory $x_2$ of the network with 10 initial values in Example 1.
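The figures themselves are not reproduced here; the following minimal simulation sketch (our own construction, with randomly chosen initial values rather than the authors' ones) illustrates how such trajectories of Example 1 can be generated numerically:

import numpy as np
from scipy.integrate import solve_ivp

A = np.array([[-1.0, -2.0], [2.0, -1.0]])
H = E = 0.5 * np.eye(2)

def rhs(t, x):
    # Right-hand side of (1) for Example 1: D = I, external input I = 0, g(x) = x^3.
    dD = np.diag([0.5 * np.sin(t), 0.4 * np.cos(t)])   # Delta D(t)
    F = np.diag([0.5 * np.sin(t), 0.5 * np.cos(t)])    # F(t)
    dA = H @ F @ E                                     # Delta A(t) = H F(t) E
    return -(np.eye(2) + dD) @ x + (A + dA) @ (x ** 3)

rng = np.random.default_rng(0)
for x0 in rng.uniform(-1.0, 1.0, size=(10, 2)):
    sol = solve_ivp(rhs, (0.0, 20.0), x0, max_step=0.01)
    print(x0, "->", sol.y[:, -1])   # final states are expected to lie near (0, 0)^T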
Example 2. Consider the third-order neural network (1) described by
$$D = \mathrm{diag}(1, 1, 1), \quad \Delta D(t) = \mathrm{diag}(0.5\sin t,\ 0.4\cos t,\ 0.5\sin t),$$
$$A = \begin{pmatrix} -3 & -2 & 1 \\ 2 & -1 & 1 \\ -1 & 2 & -3 \end{pmatrix}, \quad H = E = \begin{pmatrix} 0.5 & 0 & 0 \\ 0 & 0.5 & 0 \\ 0 & 0 & 0.5 \end{pmatrix}, \quad F(t) = \begin{pmatrix} 0.2\sin t & 0 & 0 \\ 0 & 0.5\cos t & 0 \\ 0 & 0 & 0.4\sin t \end{pmatrix},$$
$I = (0, 0, 0)^T$ and $g(\theta) = \arctan\theta \in IL(1)$.
It is easy to check that $(0, 0, 0)^T$ is the equilibrium point of the network. Choose $\varepsilon = 1$. Solving the LMI (4) by using an appropriate LMI solver in Matlab, a feasible positive diagonal matrix $P$ is
$$P = \begin{pmatrix} 1.7917 & 0 & 0 \\ 0 & 1.3466 & 0 \\ 0 & 0 & 0.7271 \end{pmatrix}.$$
By Theorem 3.1, the unique equilibrium point of this neural network is globally exponentially stable.
Figures 3, 4 and 5 display the time-domain behavior of the network. It can be seen that the state trajectories of this network with 10 initial values converge to the equilibrium point $(0, 0, 0)^T$. This is in accordance with the conclusion of Theorem 3.1.

Fig. 3. The state trajectory $x_1$ of the network with 10 initial values in Example 2.
Fig. 4. The state trajectory $x_2$ of the network with 10 initial values in Example 2.
Fig. 5. The state trajectory $x_3$ of the network with 10 initial values in Example 2.
5. Conclusion. In this paper, a new class of Hopfield neural networks with norm-bounded parameter uncertainties and inverse Hölder neuron activation functions has been presented. A sufficient condition for the existence, uniqueness and global robust stability of the equilibrium point of such neural networks has been derived by employing Brouwer degree properties and the Lyapunov method. The results obtained in this paper are given in the form of LMIs, which can be checked and applied easily in practice. Two numerical examples have also been given to show the use of the proposed LMI-based stability criteria.

As is well known, when the neuron activation functions are non-Lipschitz, the neural network system may fail to have a global solution and/or an equilibrium point. This makes the stability problem for neural networks with non-Lipschitz activation functions difficult to solve. In the future, when the neuron activations belong to other classes of non-Lipschitz functions, the stability, and in particular the robust stability, of neural networks with norm-bounded parameter uncertainties or interval uncertainty will remain an important yet challenging problem, which is expected to be addressed by researchers in this field.
1. Singh V. A new criterion for global robust stability of interval delayed neural networks // J. Comput. Appl.
Math. — 2008. — 221. — P. 219 – 225.
2. Singh V. Improved global robust stability for interval-delayed Hopfield neural networks // Neural. Process
Lett. — 2008. — 27. — P. 257 – 265.
3. Zhang B., Xu S., Li Y. Delay-dependent robust exponential stability for uncertain recurrent neural networks
with time-varying delays // Int. J. Neural. Syst. — 2007. — 17. — P. 207 – 218.
4. Zhang H., Wang Z., Liu D. Robust stability analysis for interval Cohen-Grossberg neural networks with
unknown time-varying delays // IEEE Trans Neural Network. — 2008. — 19. — P. 1942 – 1955.
5. Zhao W., Zhu Q. New results of global robust exponential stability of neural networks with delays // Nonlinear Anal.: Real World Appl. — 2010. — 11. — P. 1190 – 1197.
6. Kwon O., Park J., Lee S. On robust stability for uncertain neural networks with interval time-varying delays //
IET Control Theory Appl. — 2008. — 2. — P. 625 – 634.
7. Gau R., Lien C., Hsieh J. Global exponential stability for uncertain cellular neural networks with multiple
time-varying delays via LMI approach // Chaos Soliton Fract. — 2007. — 32. — P. 1258 – 1267.
8. Shao J., Huang T., Zhou S. An analysis on global robust exponential stability of neural networks with time-
varying delays // Neurocomputing. — 2009. — 72. — P. 1993 – 1998.
9. Zheng C., Zhang H., Wang Z. Novel delay-dependent criteria for global robust exponential stability of
delayed cellular neural networks with norm-bounded uncertainties // Neurocomputing. — 2009. — 72. — P. 1744 – 1754.
10. Qiu J., Zhang J., Wang J., Xia Y., Shi P. A new global robust stability criteria for uncertain neural networks
with fast time-varying delays // Chaos Soliton Fract. — 2008. — 37. — P. 360 – 368.
11. Wang Z., Zhang H., Yu W. Robust exponential stability analysis of neural networks with multiple time delays
// Neurocomputing. — 2007. — 70. — P. 2534 – 2543.
12. Senan S., Arik S. New results for global robust stability of bidirectional associative memory neural networks
with multiple time delays // Chaos, Solitons and Fractals. — 2009. — 41. — P. 2106 – 2114.
13. Sheng L., Yang H. Robust stability of uncertain Markovian jumping Cohen-Grossberg neural networks with
mixed time-varying delays // Chaos, Solitons and Fractals. — 2009. — 42. — P. 2120 – 2128.
14. Wang Z., Zhang H., Yu W. Robust stability of Cohen-Grossberg neural networks via state transmission matrix
// IEEE Trans Neural Network. — 2009. — 20. — P. 169 – 174.
15. Su W., Chen Y. Global robust exponential stability analysis for stochastic interval neural networks with time-
varying delays // Commun. Nonlinear Sci. Numer. Simulat. — 2009. — 14. — P. 2293 – 2300.
16. Wang L., Zhang Y., Zhang Z., Wang Y. LMI-based approach for global exponential robust stability for
reaction-diffusion uncertain neural networks with time-varying delay // Chaos, Solitons and Fractals. — 2009.
— 41. — P. 900 – 905.
17. Zhang R., Wang L. Global exponential robust stability of interval cellular neural networks with S-type distributed delays // Math. Comput. Model. — 2009. — 50. — P. 380 – 385.
18. Zhang J., Peng S., Qiu J. Robust stability criteria for uncertain neutral system with time delay and nonlinear
uncertainties // Chaos, Solitons and Fractals. — 2008. — 38, № 1. — P. 160 – 167.
19. Chen Y., Xue A., Lu R., Zhou S. On robustly exponential stability of uncertain neutral systems with time-
varying delays and nonlinear perturbations // Nonlinear Anal. — 2008. — 68. — P. 2464 – 2470.
20. Zheng C., Jing X., Wang Z., Feng J. Further results for robust stability of cellular neural networks with linear
fractional uncertainty // Commun Nonlinear Sci. Numer. Simulat. — 2009. / doi:10.1016/j.cnsns.2009.11.007.
21. Wu H., Tao F., Qin L., Shi R., He L. Robust exponential stability for interval neural networks with delays and
non-Lipschitz activation functions // Nonlinear Dynam. — 2010. — DOI: 10.1007/s11071-010-9926-9.
22. Wu H., Xue X. Stability analysis for neural networks with inverse Lipschitzian neuron activations and impulses
// Appl. Math. Model. — 2008. — 32. — P. 2347 – 2359.
23. Wu H. Global exponential stability of Hopfield neural networks with delays and inverse Lipschitz neuron activations // Nonlinear Anal.: Real World Appl. — 2009. — 10. — P. 2297 – 2306.
24. Deimling K. Nonlinear functional analysis. — Berlin: Springer, 1985.
25. Boyd S., Ghaoui L. E., Feron E., Balakrishnan V. Linear matrix inequalities in system and control theory. —
Philadelphia, PA: SIAM, 1994.
26. Miller P., Michel A. Differential equations. — New York: Academic, 1982.
Received 06.01.11,
after revision — 25.04.11