Condensed Matter Physics, 2004, Vol. 7, No. 3(39), pp. 551–563
Pattern formation in neural dynamical systems governed by mutually Hamiltonian and gradient vector field structures

V.V. Gafiychuk ¹, A.K. Prykarpatsky ²
¹ Institute of Computer Modeling, Krakow University of Technology, 24 Warszawska Street, 31-155 Krakow, Poland
² Department of Applied Mathematics, AGH University of Science and Technology, 30 Mickiewicza Al., Bl. A4, 30-059 Krakow, Poland
Received April 29, 2004, in final form July 21, 2004
We analyze dynamical systems of general form possessing gradient (symmetric) and Hamiltonian (antisymmetric) flow parts. The relevance of such systems to self-organizing processes is discussed. Coherent structure formation and related gradient flows on matrix Grassmann type manifolds are considered. The corresponding graph model associated with the partition swap neighborhood problem is studied. A criterion for emerging gradient and Hamiltonian flows is established. As an example we consider nonlinear dynamics in a neural network system described by a simulative vector field, and derive a simple criterion establishing conditions for the formation of an oscillatory pattern in the model neural system under consideration.
Key words: dynamical system, gradient flow, Hamiltonian flow,
self-organization, neural network
PACS: 05.45.-a, 07.05.Mh, 05.65.+b
1. Introduction. Complex dynamics as an inherent property of nonlinear systems
The emergence of spatio-temporal patterns in nonlinear dynamical systems is a very well established fact, and accounts of the state of the art can be found in many recent books on the field. Over the last couple of decades, investigations of pattern formation and propagation phenomena in self-organizing systems have become especially popular (see, for example, [1–4]).
Self-organization phenomena are inherent to a wide class of nonlinear systems.
They are so widespread that a new interdisciplinary science has been established around them [1,2]. The goal of investigations in this field is to describe the evolution of nonlinear systems, to explore the general rules of pattern formation, and to reveal the structures arising in the course of this evolution. Diverse mathematical tools are used to connect the general properties of these phenomena across the specific application areas. Meanwhile, there is no strict definition of the concept of self-organization. In spite of many attempts to fix the exact meaning of the term, different people imply different things by it. The term "self-organization" is often applied to patterns in dissipative and even Hamiltonian systems, although it was originally introduced to specify certain phenomena in open dissipative systems. Such usage frequently makes some sense, because not all the properties emerging in these systems are easy to distinguish. It is striking that the evolution of a system as a whole is difficult to predict from the local interactions of its components; at the same time, the local information available to the components of the system is of major importance for self-organization and pattern formation.
Consequently, the problem of classifying the possible evolutionary dynamics is still not resolved. Whether the patterns arising in a physical system belong to self-organization phenomena or are just part of its nonlinear complex dynamics is far from simple to answer: there is no strict boundary between the phenomena emerging in the process of system evolution. A classification of self-organization phenomena can be attempted by using entropy as a criterion of the order of the system, showing that self-organization results in an entropy decrease [1]. Such an approach was realized in practice by Yu.L. Klimontovich in [5], where he showed that turbulence is much more self-organized than the corresponding laminar flow; this result was obtained by calculating the entropy production in the two regimes, laminar and turbulent. Eventually, however, Klimontovich's result remains essentially unique, because it is very difficult to calculate the entropy production for real evolutionary systems due to their essential nonlinearity. In fact, to be synergetic these systems must be nonlinear, and the behavior of the whole system is not a simple sum of the behaviors of its parts. Moreover, the dynamics of the system must be a consequence of the intrinsic properties governing it. It should naturally possess feedbacks (a feedback is an output sent back into the system as an input; it can be positive, leading to self-production, or negative, stabilizing the system). As a result, the dynamics leads to nonlinear emergent behavior. Finding exact special solutions of such a system is then a challenge, and computer simulation methods are usually used to analyze the system dynamics. To obtain solutions that have a physical meaning and a practical sense, the physical mechanisms of formation of these solutions should be clearly understood. The situation becomes much easier if it is possible to develop approximate methods for analyzing the possible solutions with special properties. In dealing with the evolution, it is beneficial to make use of general extremality principles which determine the direction of the evolution.
In seeking the extremals of a certain variational principle, such an approach often makes it possible to understand the laws of the system's organization.
In conservative systems, invariants play the role of the main constraints imposed on the system [6]. For dissipative systems some variational principles can be constructed as well; steps in this direction were made in [7,8]. In such cases, using model functions for an approximate attractor present in the system proves very productive and makes it possible to find an approximate form of the governing solutions; the evolution of the parameters characterizing the model functions is then often of gradient type.
It should be noted that the development of variational principles is also beneficial in describing specific nonlinear properties of natural systems such as living systems, market structures, etc. (see [9,10]). The point is that natural systems evolve towards increasingly adapted states, building higher and higher degrees of order starting from almost nothing. Their behavior can be understood as adaptive self-organization, and the dynamics can frequently be described as a gradient dynamics with respect to a certain potential. Below, by considering a particular example of a nonlinear system, we develop an approach which enables us to analyze the existence of stable patterns and periodic structures in the nonlinear system and which extends computer-assisted methods of its investigation. As an example of a general system we consider a dynamical system of neurons, which in reality has a very complex behavior. We only present our approach to the study of this neural system and do not claim a specific result that would explain the real neuron dynamics.
2. Physical models and related gradient and Hamiltonian flows
It is well known [11] that synaptic connections in biological neural networks are seldom symmetric, since the signals sent by neurons along their axons are sharp spikes and the relevant information is contained not in the spikes themselves but in the so-called firing rates, which depend on the magnitude of the membrane potentials governing the whole process. On the other hand, recent neurophysiological observations of extremely low firing rates [12], though casting some doubt on the general usefulness of this notion, indicate that the firing rate is indeed the relevant neural variable. Thereby one can use natural continuous variables to describe neural networks as dynamical systems of a special structure combining gradient (symmetric) and Hamiltonian (skew-symmetric) flows. This opens the way to a variety of methods and techniques for studying the structural stability of such networks and the existence of the so-called coherent temporal structures suited to the learning process.
Based on the considerations above, one can introduce a class of nonlinear dynamical systems

\[ \frac{du}{dt} = K(u), \tag{1} \]

where u ∈ M, M is a smooth finite-dimensional metrizable manifold and K : M → T(M) is a smooth vector field on M modeling the information transfer process in the biological neural network under regard.
The question is what conditions should be imposed on the flow (1) for it to be representable as

\[ K(u) = -\mathrm{grad}\,V(u) - \vartheta(u)\nabla H(u), \tag{2} \]

that is, as a mixed sum of a gradient flow and a Hamiltonian flow on M. Here V : M → R is the potential function and H : M → R is the Hamiltonian function relevant to the flow (1), grad := g⁻¹(u)∇, ∇ := {∂/∂u : u ∈ M}, g : T(M) × T(M) → R₊ is a Riemannian metric, and ϑ : T*(M) → T(M) is a Poisson structure on M.
Thus we need to find the corresponding metric and Poisson structure on M with respect to which the representation (2) holds on M. We shall dwell on this topic in more detail in the next section.
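Before turning to the general analysis, a minimal numerical sketch may help fix ideas; it is our illustration only, for the simplest case M = R² with g = 1 and a constant Poisson structure ϑ = J, and both V and H below are assumed purely for demonstration.

```python
# Minimal sketch of the decomposition (2) on M = R^2 with g = 1 and a
# constant Poisson structure J; V and H are illustrative assumptions.
import numpy as np

J = np.array([[0.0, 1.0], [-1.0, 0.0]])    # constant Poisson structure

def V(u):                                   # assumed potential function
    x, y = u
    return 0.25 * (x**2 + y**2)**2 - 0.5 * (x**2 + y**2)

def H(u):                                   # assumed Hamiltonian function
    x, y = u
    return 0.5 * (x**2 + y**2)

def num_grad(f, u, h=1e-6):                 # central-difference gradient
    g = np.zeros_like(u)
    for i in range(u.size):
        e = np.zeros_like(u); e[i] = h
        g[i] = (f(u + e) - f(u - e)) / (2.0 * h)
    return g

def K(u):                                   # mixed flow (2): -grad V - J grad H
    return -num_grad(V, u) - J @ num_grad(H, u)

u = np.array([0.7, -0.3])
gH = num_grad(H, u)
print("K(u) =", K(u))
print("<J grad H, grad H> =", gH @ (J @ gH))  # ~0: H is conserved by the
                                              # Hamiltonian part of the flow
```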
3. Poissonian structure analysis
Assume first that the representation (2) holds, that is,

\[ -\vartheta(u)\nabla H(u) = K(u) + \mathrm{grad}\,V(u) := K_V(u) \tag{3} \]

for all u ∈ M and some relevant ϑ and g structures on M. This means that the vector field (3) is exactly Hamiltonian. Thereby one has [13] the expression

\[ \vartheta^{-1}(u) = \varphi'(u) - \varphi'^{,*}(u), \tag{4} \]

where φ ∈ T*(M) is some nonsymmetric solution to the linear determining equation

\[ \frac{d\varphi}{dt} + K_V'^{\,*}\varphi = \nabla L. \tag{5} \]
Here, by definition, the flow K_V is defined as

\[ \frac{du}{d\tau} = K_V(u) \tag{6} \]

and L : M → R is a suitable smooth function chosen for convenience when solving (4). It is clear (see [13]) that the symplectic structure (4) does not depend on the choice of the function L : M → R.

As the second step, assume that the metric and Poisson structures on M are given a priori. Then, owing to (5), the following equation on an element ψ ∈ T(M), determining the potential function V : M → R, holds:

\[ \varphi'\cdot K + \varphi'\cdot\psi + K'^{,*}\cdot\varphi + \psi'^{,*}\cdot\varphi = \nabla L, \tag{7} \]

where the element φ ∈ T*(M) is now assumed to be known a priori as a solution to equation (4). The expression (7) is a linear second order partial differential equation for the potential function V : M → R. If this equation is compatible, its solution exists and the decomposition (2) holds.
As one can check, equation (7) almost everywhere possesses a solution for the vector ψ = grad V, that is, the expression

\[ \mathrm{grad}\,V = \psi = g^{-1}\nabla V \tag{8} \]

holds on M for some ψ ∈ T(M). Thereby one gets

\[ \nabla V = g\psi. \tag{9} \]

Making use now of the well-known Volterra condition (see [13]), (∇V)′* ≡ (∇V)′, we obtain the following criterion on the metric g : T(M) × T(M) → R₊:

\[ (g\psi)'^{,*} = (g\psi)'. \tag{10} \]

Since from (9) it also follows that

\[ \langle g\psi, u_x\rangle = \langle\nabla V, u_x\rangle = \frac{dV}{dx}, \tag{11} \]

the condition (11) is evidently equivalent to the following one:

\[ (g\psi)'^{,*}u_x - \frac{d}{dx}(g\psi) = 0. \tag{12} \]

Calculating the left hand side of (12), one gets the final result

\[ (g'^{,*}u_x - g'u_x)\psi = g\psi' u_x - \psi'(g u_x), \tag{13} \]

which is straightforward to check if the metric is given. Otherwise, the linear equation (13) determines a suitable metric as its solution with respect to the mapping g : T(M) × T(M) → R₊. As soon as equation (13) is compatible, its solution exists, defining a suitable metric on the manifold M.
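The Volterra condition (10) admits a direct numerical test: a field gψ is a gradient exactly when its Jacobian is symmetric. The following sketch (the test field on R² is our assumption, not taken from the paper) checks this by finite differences.

```python
# Hedged check of the Volterra condition (10) on R^2: g*psi is a gradient
# iff its Jacobian is symmetric. The test field below is an assumption.
import numpy as np

def F(u):                                     # candidate field g*psi
    x, y = u
    return np.array([2*x*y + y**2, x**2 + 2*x*y])  # = grad(x^2 y + x y^2)

def jacobian(f, u, h=1e-6):                   # finite-difference Jacobian
    n = u.size
    Jm = np.zeros((n, n))
    for j in range(n):
        e = np.zeros(n); e[j] = h
        Jm[:, j] = (f(u + e) - f(u - e)) / (2.0 * h)
    return Jm

u = np.array([0.4, -1.2])
Jm = jacobian(F, u)
print("asymmetry |J - J^T| =", np.linalg.norm(Jm - Jm.T))  # ~0 here
```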
4. Conditions for temporal pattern structure formation
Let us proceed to considering a network with two groups of neurons, {x_i ∈ R : i = 1,…,n} and {y_j ∈ R : j = 1,…,m}, connected in such a way that within both groups the synaptic strengths are symmetric, whereas between the groups they are antisymmetric. That is, neurons {x} are excitatory to {y} and neurons {y} are inhibitory to {x}. This model is expressed in the form (2), where
\[
V = \sum_{i=1}^{n}\left(-\frac{1}{2}\,\beta_1 x_i^2 + \beta_2\frac{x_i^4}{4}\right)
+ \frac{1}{2}\sum_{i,j=1}^{n}\beta^{(1)}_{i,j}x_i x_j
+ \sum_{j=1}^{m}\left(-\frac{1}{2}\,\beta_4 y_j^2 + \beta_5\frac{y_j^4}{4}\right)
+ \frac{1}{2}\sum_{i,j=1}^{m}\beta^{(2)}_{i,j}y_i y_j, \tag{14}
\]

\[
H = \frac{1}{2}\left(\sum_{i=1}^{n}x_i^2 + \sum_{j=1}^{m}y_j^2
+ \sum_{i=1}^{n}\sum_{j=1}^{m}w_{ij}x_i y_j\right) \tag{15}
\]
with the standard metric g = 1, a skew-symmetric Poisson structure ϑ = J ∈ Sp(R^{n+m}), u = {(x, y) ∈ R^n × R^m},

\[
g = \begin{pmatrix} 1 & & 0\\ & \ddots & \\ 0 & & 1 \end{pmatrix}, \qquad
J = \begin{pmatrix} 0 & I_{(n,m)}\\ -I_{(n,m)} & 0 \end{pmatrix} \tag{16}
\]

or

\[
J = \begin{pmatrix} J_{(n)} & 0\\ 0 & J_{(m)} \end{pmatrix} \tag{17}
\]

with the constants β and the elements w_{ij} being parameters, I_{(n,m)} = {δ_{ij} : i = 1,…,n, j = 1,…,m}, and J_{(n)}, J_{(m)} being some skew-symmetric matrices.
It is worth mentioning here that the representation (2) with the structures (16) is not unique, and other solutions to equation (13) can be found.

The system (14), as we shall demonstrate below, possesses a so-called coherent temporal structure important in studying learning processes in biological neural networks.
Assume for simplicity that all β-parameters are proportional to a small enough parameter ε > 0, that is, {β} ≃ {εβ}, and consider first our flow (2) at ε = 0. It is easy to see that our model then possesses a closed orbit in the space of the {x} and {y} parameters, say σ : S¹ → M = R^n × R^m, satisfying the equation

\[ \frac{d\sigma}{d\tau} = -J\nabla H(\sigma) \tag{18} \]

for all τ ∈ S¹. Moreover, the Hamiltonian function H : M → R in (15) is a conservation law of (18). Take now ε ≠ 0; then one can state [14] that there exists a function H_ε : M → R such that for some closed orbit σ_ε : S¹ → M this function is a constant of motion (not a conserved quantity), that is, for all small enough ε > 0

\[ \frac{dH_\varepsilon(\sigma_\varepsilon)}{dt} = O(\varepsilon^2) \tag{19} \]

as ε → 0. Then one can formulate the following proposition [15,16] about the existence of a limit cycle in our model at small enough ε > 0.
Proposition. Let our model possess, at small enough ε > 0, a smooth constant of motion H_ε : M → R and a closed ε-deformed orbit σ_ε : S¹ → M, and let at ε = 0 the constant of motion H_0 : M → R be a first integral of the model in the neighborhood of the orbit σ_0. Then a necessary condition for the existence of a limit cycle at small enough ε > 0 is the vanishing of the circular integral

\[ \oint_{S^1}\langle\nabla H_0(\sigma_0),\,\mathrm{grad}\,V(\sigma_0)\rangle\, dt = 0. \tag{20} \]
Having substituted expression (14) into (20), one finds numerical constraints on the parameters locating our closed orbit σ_0 : S¹ → M in the phase space M. Thereby, we can localize the possible coherent temporal patterns available in the neural network under study; a numerical sketch of this criterion follows.
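As a hedged numerical illustration of the criterion (20) (ours; every parameter value below is assumed), one can take the smallest case n = m = 1, integrate the unperturbed Hamiltonian orbit (18) over one period, and accumulate the integrand of (20) along it:

```python
# Sketch of criterion (20) for n = m = 1 (all parameter values assumed).
import numpy as np
from scipy.integrate import solve_ivp

w = 0.3                                   # assumed coupling weight w_11
b1, b2 = 1.0, 0.5                         # assumed beta-parameters in V
J = np.array([[0.0, 1.0], [-1.0, 0.0]])   # Poisson structure (16)

def gradH(u):                             # grad of H = (x^2 + y^2 + w x y)/2
    x, y = u
    return np.array([x + 0.5 * w * y, y + 0.5 * w * x])

def gradV(u):                             # grad of the quartic potential (14)
    x, y = u
    return np.array([-b1 * x + b2 * x**3, -b1 * y + b2 * y**3])

def rhs(t, z):                            # orbit (18) plus running integral
    u = z[:2]
    return np.concatenate([-J @ gradH(u), [gradH(u) @ gradV(u)]])

T = 2.0 * np.pi / np.sqrt(1.0 - 0.25 * w**2)  # period of the linear flow (18)
sol = solve_ivp(rhs, (0.0, T), [1.0, 0.0, 0.0], rtol=1e-10, atol=1e-12)
print("circular integral (20) over one period:", sol.y[2, -1])
```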
Using this approach, let us consider the equations of motion in the variables (x, y) ∈ R^{n+m}. The Lagrangian equation corresponding to the potential (14), the Hamiltonian (15) and the matrix (17) can be represented as

\[
\begin{pmatrix} \ddot{x}\\ \ddot{y} \end{pmatrix} + W\begin{pmatrix} x\\ y \end{pmatrix} = 0, \tag{21}
\]

where the columns x = (x_1,…,x_n)^T, y = (y_1,…,y_m)^T, and the matrix

\[
W = \begin{pmatrix} A_1 & B_1\\ A_2 & B_2 \end{pmatrix}, \qquad
\begin{aligned}
A_1 &= J_n^2 + J_n w J_m w^{T}, & B_1 &= J_n^2 w + J_n w J_m,\\
A_2 &= J_m^2 + J_m w^{T} J_n w, & B_2 &= J_m^2 w + J_m w^{T} J_n,
\end{aligned} \qquad w = \{w_{ik}\}.
\]
A solution to the matrix equation (21) can be represented as (x, y)^T = a exp(iλt), with λ ∈ C being nontrivial only if

\[ \left| -\lambda^2 g + W \right| = 0. \tag{22} \]

Equation (22) is of degree n + m in λ² ∈ C and determines 2(n + m) eigenfrequencies ω_r = {±ω_1, …, ±ω_{m+n}}. In this case the solution takes the form

\[
\begin{pmatrix} x_r\\ y_r \end{pmatrix} = a_r\exp(i\omega_r t) + a_r^{*}\exp(-i\omega_r t). \tag{23}
\]

The amplitudes a_r = (a_{r,1}, …, a_{r,n+m})^T should satisfy the matrix equation

\[ \left( -\omega_r^2 g + W \right) a_r = 0. \tag{24} \]

If we set, for example, the amplitudes a_{r,1} = 1 for every r ∈ {1, …, n+m} and solve (24), we obtain the coefficients K_{r,i} of the distribution of amplitudes relative to each frequency ω_r. In this case the solution (23) can be represented as

\[ x_{rj} = K_{r,j}\exp(i\omega_r t) + K_{r,j}\exp(-i\omega_r t) = 2K_{r,j}\cos(\omega_r t), \tag{25} \]
\[ y_{rj} = K_{r,j+n}\exp(i\omega_r t) + K_{r,j+n}\exp(-i\omega_r t) = 2K_{r,j+n}\cos(\omega_r t). \tag{26} \]

Here K_{r,j}, K_{r,j+n} are constants depending on the frequencies ω_r; a small computational sketch of this spectral problem follows.
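The spectral problem (22)–(24) is easy to set up numerically. The sketch below is illustrative only: we take n = m = 2 so that all blocks of W are square, and the skew-symmetric matrices J_n, J_m and the weights w are assumed at random.

```python
# Sketch of the spectral problem (21)-(24); n = m = 2 and all matrices
# below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = m = 2

def skew(k):                              # random skew-symmetric k x k matrix
    a = rng.standard_normal((k, k))
    return a - a.T

Jn, Jm = skew(n), skew(m)
w = rng.standard_normal((n, m))           # inter-group weights w_{ik}

A1 = Jn @ Jn + Jn @ w @ Jm @ w.T          # blocks of W as in the text
B1 = Jn @ Jn @ w + Jn @ w @ Jm
A2 = Jm @ Jm + Jm @ w.T @ Jn @ w
B2 = Jm @ Jm @ w + Jm @ w.T @ Jn
W = np.block([[A1, B1], [A2, B2]])

lam2, vec = np.linalg.eig(W)              # | -lambda^2 g + W | = 0, g = 1,
omega = np.sqrt(lam2.astype(complex))     # so lambda^2 are eigenvalues of W
print("eigenfrequencies omega_r:", np.round(omega, 3))

a = vec[:, 0]                             # amplitude vector of one mode (24)
print("K_{r,i} =", np.round(a / a[0], 3)) # normalized so that a_{r,1} = 1
```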
Consider now the scalar product

\[
\begin{aligned}
\langle \nabla H_0(\sigma_0), \nabla V(\sigma_0)\rangle
={}& \Big(x_1 + \sum_{j=1}^{m} w_{1j} y_j\Big)\Big(-\beta_1 x_1 + \beta_2 x_1^3 + \frac{1}{2}\sum_{j=1}^{n}\beta^{(1)}_{1,j}x_j\Big) + \dots \\
&+ \Big(x_n + \sum_{j=1}^{m} w_{nj} y_j\Big)\Big(-\beta_1 x_n + \beta_2 x_n^3 + \frac{1}{2}\sum_{j=1}^{n}\beta^{(1)}_{n,j}x_j\Big) \\
&+ \Big(y_1 + \sum_{i=1}^{n} w_{i1} x_i\Big)\Big(-\beta_1 y_1 + \beta_2 y_1^3 + \frac{1}{2}\sum_{j=1}^{m}\beta^{(2)}_{1,j}y_j\Big) + \dots \\
&+ \Big(y_m + \sum_{i=1}^{n} w_{in} x_i\Big)\Big(-\beta_1 y_m + \beta_2 y_m^3 + \frac{1}{2}\sum_{j=1}^{m}\beta^{(2)}_{n,j}y_j\Big).
\end{aligned}
\]
Having substituted the solution (25) into the last expression and integrated it over the period T = 2π/ω_r, we get a hyperplane which determines the parameters of our neural network model:

\[
\begin{aligned}
&\Big(1 + \sum_{j=1}^{m} w_{1j} K_{r,n+j}\Big)\Big(-\beta_1 + \frac{3}{4}\beta_2 + \frac{1}{2}\sum_{j=1}^{n}\beta^{(1)}_{1,j}K_{r,j}\Big) + \dots \\
&+ \Big(K_{r,n} + \sum_{j=1}^{m} w_{nj} K_{r,n+j}\Big)\Big(-\beta_1 K_{r,n} + \frac{3}{4}\beta_2 K_{r,n}^3 + \frac{1}{2}\sum_{j=1}^{n}\beta^{(1)}_{n,j}K_{r,j}\Big) + \dots \\
&+ \Big(K_{r,n+1} + \sum_{i=1}^{n} w_{i1} K_{r,i}\Big)\Big(-\beta_3 K_{r,1+n} + \frac{3}{4}\beta_4 K_{r,1+n}^3 + \frac{1}{2}\sum_{j=1}^{m}\beta^{(2)}_{1,j}K_{r,j+n}\Big) + \dots \\
&+ \Big(K_{r,n+m} + \sum_{i=1}^{n} w_{in} K_{r,i}\Big)\Big(-\beta_3 K_{r,m+n} + \frac{3}{4}\beta_4 K_{r,m+n}^3 + \frac{1}{2}\sum_{j=1}^{m}\beta^{(2)}_{n,j}K_{r,j+n}\Big) = 0.
\end{aligned} \tag{27}
\]
Thus, as the index r ranges from r = 1 to n + m, we obtain in the general case n + m constraints determining the parameters {β_1, β_2, β_3, β_4, β^{(1)}_{n,j}, β^{(2)}_{n,j}} at which the chosen oscillatory structure will persist for all t ∈ R₊, thereby realizing a stable neural network and the temporal pattern under study related to it.
5. Coherent structure formation and related gradient flows on matrix Grassmann type manifolds
The results presented above can be successfully applied to many interesting dynamical systems modeling information processes in neural networks, as mentioned in the introduction. In section 4 above we studied a many-agent neural model allowing for the persistence of some coherent structures under small enough dissipative perturbations. In order to describe all of them, one can make use of the analytical relationships (27) for some indices r ∈ {1, …, n+m} and find the corresponding constraints on the set of external parameters {β_1, β_2, β_3, β_4, β^{(1)}_{n,j}, β^{(2)}_{n,j} : j = 1, …, m} at which compact coherent structures persist. Unfortunately, this problem is computationally too complicated and cumbersome. That is why we need a more suitable approach to solving it, based on a related statistical description of the available set of individual oscillatory orbits involving the two groups of {x} and {y} neurons.
Namely, each possible oscillatory many-mode orbit with phase variables {x_i, y_j}, i = 1,…,n, j = 1,…,m, can be modeled by a directed edge with vertices at x_i and y_j, directed, for instance, from x_i to y_j. This edge is taken with a weight C_{i,j+n}, i = 1,…,n, j = 1,…,m, depending both on the corresponding mode frequencies from the spectrum of the problem (24) and on the set of β-parameters mentioned above. Briefly speaking, the weight matrix C = {C_{ij} : i, j = 1,…,n+m} represents a vector space distribution of the persisting many-mode oscillatory orbits taking part in the coherent structure formation permitted by the interacting neural network.

This weight matrix C can be effectively evaluated by means of an extremality graph problem related to the graph G constructed above from the vertices and their connecting edges, corresponding to a chosen coherent structure with prescribed oscillatory modes. To proceed, we restate our problem in the graph notions introduced above.
Let S_l represent the symmetric group on l ∈ Z₊ symbols. Denote by T_l the set of (l × l) permutation matrices, l ∈ Z₊, realizing the correspondence between the elements of S_l and T_l. By viewing T_l, l ∈ Z₊, as the set of incidence matrices for related graphs, one also obtains a one-to-one correspondence between T_l and the set G_l, l ∈ Z₊, of directed graphs with the property that every vertex is both the source and the sink of directed edges.
Proceed now to the definition of a neighborhood around an arbitrary element within the matrix space modeling the interaction of the many-agent system under regard.

Definition 1. For any l-symbol extremality graph problem G we define a k-change neighborhood of G as the set of elements from G_l, l ∈ Z₊, that can be obtained by removing k ∈ Z₊ directed edges from G_l and then attaching k ∈ Z₊ alternative directed edges to the remaining graph.
For example, for a matrix τ ∈ T_l, l ∈ Z₊, it is easy to see that the 2-change neighborhood consists of the l(l−1)/2 permutation matrices of the form τN_{ij}, where N_{ij}, i, j = 1,…,l, are the transposition matrices whose only off-diagonal nonzero entries are at the (i, j)- and (j, i)-places.
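A short sketch (our illustration) enumerating this 2-change (swap) neighborhood:

```python
# Enumerate the 2-change neighborhood of a permutation matrix tau:
# the l(l-1)/2 matrices tau @ N_ij with N_ij a transposition matrix.
import numpy as np
from itertools import combinations

def transposition(l, i, j):
    """Permutation matrix N_ij swapping symbols i and j."""
    N = np.eye(l, dtype=int)
    N[[i, j]] = N[[j, i]]
    return N

def two_change_neighborhood(tau):
    l = tau.shape[0]
    return [tau @ transposition(l, i, j)
            for i, j in combinations(range(l), 2)]

tau = np.eye(4, dtype=int)                 # start from the identity permutation
print(len(two_change_neighborhood(tau)))   # 4 * 3 / 2 = 6 neighbors
```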
Let G be a fully connected undirected graph with l ∈ Z₊ vertices and with some weights assigned to its undirected edges. The standard graph partition problem is to find a partition of the graph into subsets with p and q ∈ Z₊ elements such that the sum of the weights on the cut edges (that is, edges with endpoints in different subsets of the partition) is minimized. To formulate this problem more analytically, denote by C a cost matrix constructed in such a way that for any original weight assigned to an edge, half of it is assigned to each of the two directed edges joining the same vertices of the undirected graph. Now define [17] the matrix

\[
S_G(p,q) := \begin{bmatrix} O_{p,p} & I_{p,q}\\ I_{q,p} & O_{q,q} \end{bmatrix}, \tag{28}
\]

where q, p ∈ Z₊ are given, O_{p,p} and O_{q,q} are zero matrices and I_{p,q} denotes the (p × q)-matrix with all elements equal to 1. Then the graph partition problem with the swap neighborhood can be represented in the following way: find the infimum

\[ \inf\,\mathrm{Sp}(C^{T}\tau^{T}S_G(p,q)\tau) \tag{29} \]

over τ ∈ T_l, under which two different partitions are neighbors if one can be made identical to the other by swapping two vertices.
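For a tiny graph, the problem (29) can be solved by brute force over all permutation matrices; the weights below are assumed for illustration, and Sp denotes the trace:

```python
# Hedged brute-force sketch of the partition problem (29) for a tiny graph.
import numpy as np
from itertools import permutations

l, p, q = 4, 2, 2
SG = np.block([[np.zeros((p, p)), np.ones((p, q))],
               [np.ones((q, p)), np.zeros((q, q))]])

# symmetric cost matrix: half of each undirected weight per directed edge
Wu = np.array([[0, 4, 1, 1],
               [4, 0, 1, 1],
               [1, 1, 0, 4],
               [1, 1, 4, 0]], dtype=float)
C = Wu / 2.0

best = None
for perm in permutations(range(l)):
    tau = np.eye(l)[list(perm)]                 # permutation matrix
    cost = np.trace(C.T @ tau.T @ SG @ tau)     # objective (29)
    if best is None or cost < best[0]:
        best = (cost, perm)
print("minimal cut cost:", best[0], "via permutation", best[1])
```

For these assumed weights the optimum groups the strongly coupled vertex pairs together, with a total cut weight of 4.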
Now we proceed to embedding the set T_l into SO(l), l ∈ Z₊, which makes it possible to relate the element of T_l corresponding to (29) with an element of SO(l), l ∈ Z₊, by means of the mapping

\[
i:\ \tau \longmapsto
\begin{cases}
\tau = A \in SO(l), & \det\tau = 1,\\[4pt]
\tilde\tau := J\tau = A \in SO(l), \quad J := \mathrm{diag}(-1, 1, \dots, 1), & \det\tau = -1.
\end{cases} \tag{30}
\]
Then it is easy to see that (29) is equivalent to the following problem on SO(l), l ∈ Z₊:

\[ \inf\,\mathrm{Sp}(C^{\intercal}A^{\intercal}S_G(p,q)A), \tag{31} \]

where A ∈ SO(l), A^⊺A = 1 = AA^⊺, det A = 1.

Before proceeding to solving the problem (31), observe that the cost matrix C ∈ End E^l satisfies the projection condition C² = C, entailing the following reformulation of the problem (31):

\[ \inf\,\Psi(P), \qquad \Psi(P) = \mathrm{Sp}(PS_G(p,q)), \tag{32} \]

where, by definition, Ψ : P → R is a Lyapunov function, P := AC^⊺A^⊺ ∈ P, and P is the standard compact Grassmann manifold [18] of projection matrices satisfying the constraint

\[ P^2 = P. \tag{33} \]
Owing to the fact that the matrix S_G(p,q) is symmetric, the functional Ψ : P → R in (32) can be replaced, without changing its values, by the functional Ψ^⊺ : P → R with the Lyapunov function

\[ \Psi^{\intercal}(P) = \mathrm{Sp}(S_G(p,q)P^{\intercal}), \tag{34} \]

which satisfies the condition

\[ \Psi^{\intercal}(P) = \Psi(P) \tag{35} \]

for any P ∈ P. Thereby, the properly stated general form of the problem (32) is

\[ \frac{1}{2}\inf\,(\Psi(P) + \Psi^{\intercal}(P)) \tag{36} \]
for P ∈ P. In the particular case when C^⊺ = C ∈ End E^l, the condition P^⊺ = P follows for all P ∈ P and the problem (36) reduces to (32), which will be our main subject of study below.
It is easy to show that solutions to the problem (32) are given by the critical points P̄ ∈ P of the following gradient vector field on P:

\[ \frac{dP}{dt} = -\nabla_{\varphi}\Psi(P), \tag{37} \]

where the constraint functional φ : P × P → R is given by

\[ \varphi(X, P) := \mathrm{Sp}(X(P^2 - P)) = 0 \tag{38} \]

for all X ∈ End E^l. As a result one easily finds [19] that

\[ \nabla_{\varphi}\Psi(P) = [[S_G, P], P] \tag{39} \]

for any P ∈ P. Thus one readily checks that the inequality

\[
\frac{d\Psi(P)}{dt} = -\mathrm{Sp}(\nabla_{\varphi}\Psi(P)\,S_G)
= -\mathrm{Sp}([[S_G, P], P]\,S_G)
= -\mathrm{Sp}([S_G, P][P, S_G])
= -\mathrm{Sp}([S_G, P][S_G, P]^{\intercal}) \leqslant 0 \tag{40}
\]

holds for all P ∈ P. Thereby, the infimum in the problem (32) exists, since the Grassmann manifold P is compact and the Lyapunov functional Ψ : P → R is decreasing along the orbits of the gradient vector field (37); a minimal numerical sketch of this descent flow follows.
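A minimal numerical sketch of this descent (our illustration; the sizes and the initial projector are assumed) integrates dP/dt = −[[S_G, P], P] by an explicit Euler step and checks that Ψ(P) = Sp(P S_G) decreases while the projector constraint P² = P is approximately preserved:

```python
# Sketch of the gradient descent flow (37): dP/dt = -[[S_G, P], P].
# Sizes and the initial projector are illustrative assumptions.
import numpy as np

def comm(a, b):
    return a @ b - b @ a

p, q = 2, 2
SG = np.block([[np.zeros((p, p)), np.ones((p, q))],
               [np.ones((q, p)), np.zeros((q, q))]])

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((p + q, p)))
P = Q @ Q.T                               # random rank-p projector, P^2 = P

dt = 1e-3
for k in range(20001):
    if k % 5000 == 0:
        print(f"Psi = {np.trace(P @ SG):+.6f}   "
              f"|P^2 - P| = {np.linalg.norm(P @ P - P):.2e}")
    P = P - dt * comm(comm(SG, P), P)     # explicit Euler step of (37)
```

Because the flow acts by commutators it is isospectral, so the projector property of P is preserved by the continuous dynamics; the printed constraint norm monitors the small drift introduced by the Euler discretization.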
Now, making use of the previous interpretation, the projector matrix P ∈ P is related, owing to (30), to a corresponding permutation matrix τ ∈ T_l, l ∈ Z₊, realizing the solutions to our partition problem with the swap neighborhood, which models coherent structure formation within the multi-agent neural system described by the graph G. This coherent structure formation is now modeled by means of the gradient vector field (37) on the compact Grassmann manifold P, describing the dynamics of a virtual "cost" matrix P := AC^⊺A^⊺ ∈ P tending to the stable "cost" matrix P̄ := ĀC^⊺Ā^⊺ ∈ P at some value Ā ∈ SO(l), or, equivalently, at some permutation τ̄ ∈ T_l due to (30). This interpretation also gives rise to other interesting applications of the partition model, in particular in many-agent market theory.
Another important aspect of our partition model is related to the possibility of describing the "cost" changing process (37) as a Hamiltonian flow on the Grassmann manifold P. This aspect was recently described in [19] and is based on the fact [20] that the Grassmann manifold P is also symplectic, with the nondegenerate symplectic structure ω^{(2)} ∈ Λ²(P) given by

\[ \omega^{(2)} = \mathrm{Sp}(P\, dP \wedge dP\, P) \tag{41} \]

for all points P ∈ P, with respect to which the gradient field (37) appears to be Hamiltonian on P.
Let us also note that our gradient vector field (39) was derived above for the symmetric case P^⊺ = P ∈ P. If the condition P = P^⊺ does not hold for P ∈ P, then a new vector field expression must be derived for the resulting gradient flow (37). We plan to study these and related problems elsewhere.
6. Conclusion
Conditions for the emergence of different types of solutions in a dynamical system are an issue of practical importance. From the standpoint of a science like the theory of self-organization, one needs a reliable mathematical theory which can classify the possible solutions depending on the general form of the available nonlinearities. Coherent structure formation and the related gradient flows on matrix Grassmann type manifolds were considered, and the corresponding graph model associated with the partition swap neighborhood problem was studied. In the paper we have also derived the conditions for a system vector field to be separated into two different flows, gradient and Hamiltonian. In the general case this problem is still far from being ultimately resolved and remains under study.
References
1. Nicolis G., Prigogine I. Self-Organization in Nonequilibrium Systems. New York, Wiley, 1977.
2. Haken H. Synergetics. Heidelberg, Springer, 1978.
3. Ebeling W., Klimontovich Yu.L. Self-organization and Turbulence in Liquids. Leipzig,
Teubner-Verlag, 1984.
4. Kerner B.S., Osipov V.V. Autosolitons. Dordrecht, Kluwer, 1994.
5. Klimontovich Y.L. Turbulent Motion and the Structure of Chaos: A New Approach
to the Statistical Theory of Open Systems. Dordrecht and Boston, Kluwer Academic
Publishers, 1991.
6. Landau L.D., Lifshitz E.M. Mechanics. Course of Theoretical Physics, vol. 1. Pergamon Press, 1960.
7. Lubashevskii I.A., Gafiychuk V.V. The projection dynamics of highly dissipative system // Phys. Rev. E, 1994, vol. 50, No. 1, p. 171–181.
8. Gafiychuk V.V., Lubashevskii I.A. Variational representation of the projection dynamics and random motion of highly dissipative systems // J. Math. Phys., 1995, vol. 36, No. 10, p. 5735–5752.
9. Gafiychuk V.V., Lubashevsky I.A., Klimontovich Yu.L. Self-regulation in a simple
model of hierarchically organized market // Complex Systems, 2000, vol. 12, No. 1,
p. 103–126.
10. Lubashevsky I.A., Gafiychuk V.V. Cooperative mechanism of self-regulation in hierarchical living systems // SIAM J. Appl. Math., 2000, vol. 60, No. 2, p. 633–663.
11. Amit D.J. Modeling Brain Functions. Cambridge, University Press, 1989.
12. Abeles M., Vaadia E., Bergman H. // Networks, 1990, vol. 1, p. 13–15.
13. Prykarpatsky A.K., Mykytyuk I.V. Algebraic Integrability of Nonlinear Dynamical
Systems on Manifolds. The Netherlands, Kluwer Acad. Publ., 1998.
14. Verhulst R. Ordinary Differential Equations. NY, Springer, 2001.
15. Guckenheimer J., Holmes P. Nonlinear Oscillations, Dynamical Systems and Bifurcations of Vector Fields // Appl. Math. Sci., 1983, vol. 42, No. 1, p. 28–63.
16. Duarte T.R., Mendes V.R. Deformation of Hamiltonian dynamics and constant of
motion in dissipative systems // J. Math. Phys., 1983, vol. 24, No. 7, p. 1772–1778.
17. Wong W.S. Gradient flows for local minima and combinatorial optimization problems // Fields Institute Communications, 1994, vol. 3, p. 145–155.
18. Wells R.O. Differential Analysis on Complex Manifolds. N.Y., Prentice-Hall, 1973.
19. Gafiychuk V.V., Prykarpatsky A.K. Replicator dynamics and mathematical description of multi-agent interaction in complex systems // Journal of Nonlinear Mathematical Physics, 2004, vol. 11, No. 1, p. 113–122.
20. Blackmore D.L., Prykarpatsky A.K., Zagrodzinski J.A. Lax-type flows on Grassmann manifolds and dual momentum mappings // Reports on Math. Phys., 1997, vol. 40, p. 539–549.