A Transient-Chaotic Autoassociative Network (TCAN) Based on Lee Oscillators


1228 IEEE TRANSACTIONS ON NEURAL NETWORKS, VOL. 15, NO. 5, SEPTEMBER 2004

A Transient-Chaotic Autoassociative Network (TCAN) Based on Lee Oscillators

Raymond S. T. Lee, Member, IEEE

Abstract—In the past few decades, neural networks have been extensively adopted in various applications ranging from simple synaptic memory coding to sophisticated pattern recognition problems such as scene analysis. Moreover, current studies on neuroscience and physiology have reported that in a typical scene segmentation problem our major senses of perception (e.g., vision, olfaction, etc.) are highly involved in temporal (or what we call “transient”) nonlinear neural dynamics and oscillations. This paper is an extension of the author’s previous work on the dynamic neural model (EGDLM) of memory processing and on composite neural oscillators for scene segmentation. Moreover, it is inspired by the work of Aihara et al. and Wang on chaotic neural oscillators in pattern association. In this paper, the author proposes a new transient chaotic neural oscillator, namely the “Lee oscillator,” to provide temporal neural coding and an information processing scheme. To illustrate its capability for memory association, a chaotic autoassociative network, namely the Transient-Chaotic Auto-associative Network (TCAN), was constructed based on the Lee oscillator.

Different from classical autoassociators such as the celebrated Hopfield network, which provides a “time-independent” pattern association, the TCAN provides a remarkable progressive memory association scheme [what we call “progressive memory recalling” (PMR)] during the transient chaotic memory association. This is exactly consistent with the latest research in psychiatry and perception psychology on dynamic memory recalling schemes.

Index Terms—Lee oscillators, temporal information processing, transient chaos, transient-chaotic autoassociative network (TCAN).

I. INTRODUCTION

IN the past half century, neural networks have been adopted extensively in various areas, ranging from simple synapses as memory buffers [5], [48] to sophisticated weather forecasting systems [43] and complex pattern recognition problems, including scene analysis and figure-ground segmentation [14], [41], [61]. Moreover, the latest research in neuroscience and computational neurobiology has revealed that our major senses of perception, including vision and olfaction, work very much in a spatiotemporal manner rather than simply in a spatial manner in terms of pattern representation, processing, and cognition [20], [28]. As described in the celebrated correlation theory developed by von der Malsburg and his colleagues [49], which has subsequently been condensed into the dynamic link architecture (DLA) for memory association [50], all visual

Manuscript received June 2, 2003; revised November 12, 2003. This work was supported by The Hong Kong Polytechnic University under CERG Grants B-Q569 and CRG G-T850.

The author is with the Department of Computing, The Hong Kong Polytechnic University, Kowloon, Hong Kong (e-mail: [email protected]).

Digital Object Identifier 10.1109/TNN.2004.832729

systems for pattern association and recognition are achieved by a temporal correlation between neurons in the brain. In fact, this spatial-temporal memory coding theory has gained a considerable amount of support from the findings of various physiological experiments. The experiments have included examinations of the visual cortical areas of the cat and of the nonlinear dynamics of the olfactory system, where the system encodes and processes a stimulus (e.g., odor, visual image, etc.) by altering its neural dynamics in an oscillatory manner rather than by presenting the neural dynamics in a state pattern [18], [24], [27]. The latest developments and studies based on this correlation theory include work on coupled neural oscillators for sensory segmentation [51], face- and hand-gesture recognition [42], [56], [57], [66], and composite neural oscillators for complex scene analysis and surveillance systems [40], [41].

Moreover, regarding these spatio-temporal neuron coding information processing systems [33], [68], the latest research on brain science and physiology has reported that deterministic chaotic patterns were found in these nonlinear dynamic systems. They include chaotic electroencephalogram (EEG) patterns found in the olfactory system [23], chaotic oscillations and bifurcations in the giant axons of squid [2], [3], chaotic neurons found in the pyloric central pattern generator (CPG) of the California spiny lobster [19], [32], and various chaotic phenomena in brain functions [21], [22], [28], such as the chaotic ECoG (cortical EEG) in the human brain [52] and the chaotic dynamics of the EEGs during the perception of music [35]. Freeman [20] concluded that “Many other parts of the brain have chaotic attractors and multiple wings, and the stability of their EEGs shows that they are exceedingly robust. Chaotic dynamics provides a basal state with ideal properties…Chaos generates the disorder needed for creating new trial-and-error learning, and for creating new basins in assimilating new stimuli.”

According to this challenging theory of neuroscience, researchers have proposed various chaotic neural models to simulate these chaotic neural phenomena [13], [19], [31], [53], [63], [64]. However, most of the chaotic neural models based on the seminal models of Hodgkin and Huxley [25], [26], [29] or Wilson and Cowan [65] are either too complicated to be adopted in artificial neural networks or too simplified to reproduce satisfactory chaotic phenomena [2], [4], [12], [53], [58], [59]. As an extension of the author’s previous work on the elastic-graph dynamic-link model (EGDLM) and on the composite neural oscillator [39]–[45] for vision object recognition, and inspired by the work of Aihara et al. on chaotic neural networks [1]–[3] and of Wang on the Wang oscillator [63], [64], in this paper, the author proposes a new transient chaotic oscillator, namely, the

1045-9227/04$20.00 © 2004 IEEE

LEE: TCAN BASED ON LEE OSCILLATORS 1229

“Lee oscillator,” to provide a chaos-based temporal neural coding and information processing model.

From the point of view of application, the author implemented a Transient-Chaotic Autoassociative Network (TCAN) based on the Lee oscillator as the basic neuron structure. Different from the contemporary chaotic autoassociators proposed by Aihara et al. [1]–[3] and Wang [63], [64], the TCAN provides a progressive memory recalling (PMR) scheme that resembles the “progressive and constructive memory recalling scheme” found in the latest studies in psychiatry [16] and perception psychology [7], [72].

The paper is organized as follows. Section II gives an overview of the latest work on chaotic neural networks, in particular the chaotic neural oscillator proposed by Wang [63], [64]. Section III presents the architecture of the Lee oscillator, its major components, and its neural dynamics. Using Lee oscillators as the neural components, Section IV presents the TCAN, together with the system implementation details and key experimental results. A comparison with the chaotic autoassociator proposed by Aihara et al. [1]–[3] is discussed in Section V. Section VI discusses the major biological and psychological implications of the proposed system, followed by the related research and the conclusion.

II. CHAOTIC NEURAL OSCILLATORS—AN OVERVIEW

A. Introduction

Classical artificial neural networks (ANNs) are composed of simple artificial neurons that emulate biological neural activities. The latest studies in neuroscience and neurophysiology have examined such issues as the functional properties of the hippocampus [8], [20], the neural activities in the pyloric central pattern generator (CPG) of the lobster [32], and other brain activities [20]; these studies have provided strong evidence of chaotic neural activities underlying such complex neural behaviors [21], [70]. As a result, classical neural models have been strongly criticized as being far simpler than real neural systems.

Researchers have proposed various chaotic neural models over the past decades. The latest research includes: chaotic oscillators proposed by Falcke et al. [19] to model pyloric CPG neurons, cortical networks proposed by Hoshino et al. [31] for recalling long-term memory (LTM), the transient chaotic neural network (TCNN) proposed by Chen and Aihara [13] for handling combinatorial optimization problems, Wang oscillators [63], [64] for spatio-temporal information processing, and Zhou’s work [71] based on the chaotic annealing technique for dynamic pattern retrieval.

From the system architecture point of view, most chaotic neural network models are based on the computational neuroscience models that were developed from the theoretical work of Hodgkin and Huxley in 1952 [29]. These computational neuroscience models focus on spiking neural dynamic behavior. The latest theoretical developments include work by Fukai et al. [25], [26] and Aihara et al. [2]–[4]. Another main stream of neuroscience has focused on the behavior of neural populations. Celebrated models include the neural oscillatory model proposed by Wilson and Cowan in 1972 [65], who described the behavior of the neurons as interactive

Fig. 1. Wang oscillator.

triggering (or what we call “oscillations”) between the excitatory and inhibitory neurons. In fact, this theory has provided a vast amount of support for work that is being conducted in various fields, including neurophysiology, neuroscience, and the latest research on brain science [20], [23], [24], [27], [32], [49], [50], [52]. This theory has also formed the basis of many subsequent studies and models in the field of cognitive information processing [10], [53] and on the synchronization and desynchronization behaviors of neural oscillators [10], [18], [27], [62]. The latest applications include pattern and memory associations [1], [4], [6], [11], scene analysis, and pattern recognition [14], [40], [41], [51], [61], [67].

However, the models derived from these celebrated formulations, including the Hodgkin–Huxley model, the FitzHugh–Nagumo model, and the Wilson–Cowan model (and their derivatives), are either too simplified to simulate any “real” chaotic neural behaviors or too complicated to be applied as feasible artificial neural networks [4] in practice.

This study is an extension of the author’s previous work on neural oscillators [39], [43] and on applications in various areas, including face recognition [42], scene analysis [41] and, most recently, the implementation of an agent-based surveillance system based on a composite neural-oscillatory model (CNOM) [40]. It has also been inspired by the theoretical chaotic neural oscillator model (namely, the Wang oscillator) proposed by Wang [63], [64]. In this paper, the author proposes a new chaotic neural model, namely, the “Lee oscillator.” The Lee oscillator provides a feasible solution for some critical problems encountered in the Wang oscillator when it is adopted as a bifurcation transfer unit (BTU) [53], [63], [64] for temporal information (memory) coding.

This section gives a brief overview of the Wang oscillator, its architecture, its chaotic neural dynamics, and the idea of the BTU for temporal information coding. It also describes the major problems found in the Wang oscillator, which prevent it from acting as an effective BTU for dynamic memory encoding and pattern association.

Fig. 2. Bifurcation diagram of a 5/1/1 Wang oscillator [63], [64].

B. Wang Oscillator

Most of the contemporary neural oscillators developed theoretically from the Wilson–Cowan model are focused on the time-continuous framework. Wang, from 1991 to 1992 [63], [64], proposed a simple time-discrete neural oscillator model (namely, the “Wang oscillator”). Different from its continuous-model counterparts, the Wang oscillator provides simple but remarkable neural dynamics ranging from fixed points (through quasiperiodicity) to chaos, as revealed by its bifurcation diagram. This bifurcation behavior allows the oscillator to be used as a computational element (BTU) for temporal information processing.

In short, the Wang oscillator is a neural oscillatory model consisting of two neurons: one excitatory and one inhibitory. The neural model is given in Fig. 1, and the generalized neural dynamics are given as follows:

E(t+1) = f[a·E(t) − b·I(t) + S_E − θ_E]    (1)

I(t+1) = f[c·E(t) − d·I(t) + S_I − θ_I]    (2)

where E(t) and I(t) are the state values of the excitatory and inhibitory neurons at time t; a, b, c, and d are the weight parameters; S_E and S_I are the input stimuli; θ_E and θ_I are the thresholds corresponding to the two neurons; and f is the sigmoid function given by [63], [64]

f(x) = tanh(μx)    (3)

where μ is an important factor controlling the steepness of the sigmoid function.

As reported by Wang [63], [64], with appropriate settings of the weight parameters and thresholds, and under the condition that the external stimulus is weak, varying the sigmoid steepness μ will produce a series of period-doubling bifurcations that leads to chaos.

Fig. 3. Lee-oscillator model.
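As a concrete illustration, the discrete excitatory–inhibitory update described above can be sketched in a few lines of Python. This is a minimal sketch, not the paper’s implementation: the tanh sigmoid, the function names, and the weight and threshold values are illustrative assumptions.

```python
import math

def sig(x, mu=1.0):
    # Sigmoid transfer function; mu controls its steepness (cf. Eq. (3)).
    # The tanh form is an assumption; the paper's exact sigmoid may differ.
    return math.tanh(mu * x)

def wang_step(E, I, S_E, S_I, a=5.0, b=5.0, c=5.0, d=5.0,
              theta_E=1.0, theta_I=1.0):
    # One discrete-time update of the two-neuron (excitatory E /
    # inhibitory I) Wang oscillator, cf. Eqs. (1)-(2).
    E_next = sig(a * E - b * I + S_E - theta_E)
    I_next = sig(c * E - d * I + S_I - theta_I)
    return E_next, I_next

def wang_output(S, steps=300, transient=200):
    # Stimulate the inhibitory neuron only (S_E = 0, S_I = S) and
    # record the oscillator output W = E - I after the transient.
    E, I = 0.0, 0.0
    trace = []
    for t in range(steps):
        E, I = wang_step(E, I, 0.0, S)
        if t >= transient:
            trace.append(E - I)
    return trace
```

Sweeping S over a range and plotting the recorded W values against S reproduces a bifurcation diagram of the kind shown in Fig. 2.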

Moreover, the output of this discrete-time Wang oscillator is given by

W(t) = E(t) − I(t).    (4)

Fig. 4. Bifurcation diagram of a mode (500/5/5/1) Lee oscillator.

With substitutions from (1)–(3), the neural dynamics of the Wang oscillator can be described by

W(t+1) = f[a·E(t) − b·I(t) + S_E − θ_E] − f[c·E(t) − d·I(t) + S_I − θ_I].    (5)

In the usual case of a neural oscillator with an external stimulus applied to the inhibitory neuron only (i.e., S_E = 0 and S_I = I), the neural dynamics of the Wang oscillator are given by

W(t+1) = f[a·E(t) − b·I(t) − θ_E] − f[c·E(t) − d·I(t) + I − θ_I]    (6)

which is termed the Wang oscillator [63], [64].

One important finding regarding the Wang oscillator is its bifurcation behavior under variation of the input stimulus I. Fig. 2 depicts a typical bifurcation diagram of a 5/1/1 Wang oscillator [63], [64].

As shown in Fig. 2, the response of the Wang oscillator to an external input stimulus can be categorized into five regions [starting from the negative (Region A) to the positive (Region E)]. Their neural dynamics are summarized as follows. Regions A and E are the outermost regions; they depict a typical sigmoid growth, which corresponds to a periodic response to an input stimulus, and their ends meet the tangent bifurcation regions. Region B is a bifurcation region which ends with a reverse crisis. The bifurcation curve returns to a typical sigmoid curve in Region C, which is trapped between the crisis and reverse crisis of the attractor. Region D is the most important region in the Wang oscillator; it denotes the region of “hysteresis” that corresponds to the chaotic behavior of the Wang oscillator under a small input stimulus. A detailed analysis of the neural dynamics can be found in the work of Minai et al. [53].

C. Major Contributions and Limitations of Wang Oscillators

One important finding and contribution of the Wang oscillator is the property that its neural dynamics change according to the input stimulus. In most classical neural network models for information encoding and association (such as the Hopfield network, the self-organizing map (SOM), etc.), the network model can be generalized as a nonlinear function operator which, based on various input stimuli, alters its internal states and “fires” the output according to the nonlinear transfer function. The Wang oscillator, on the other hand, encodes information (i.e., the input stimulus) and gives its responses by altering the behavior of its neural dynamics (from chaotic states to sigmoid growth). This is consistent with the latest findings on how the brain processes information [20], [21], [49]. In a pattern association problem, when a stimulus is applied, and if the input stimulus is small, the network output is not only small, but chaotic, which indicates underlying complex and aperiodic neural activities in the neuron population. However, when the input stimulus increases to a certain level, the neural dynamics within the inhibitory and excitatory neurons become periodic, which also results in phase-locking behavior in the neural population. In other words, the information processing model using a Wang oscillator can be interpreted as the synchronization behavior of the neural population upon information encoding/pattern association. This is also consistent with information processing models of the brain proposed in other contemporary studies [10], [18], [24].

Fig. 5. Bifurcation diagram of a (mode 500/5/5/1) Lee oscillator under a linear gain from an external stimulus.

According to this remarkable feature, the Wang oscillator can be adopted in two directions: a) given its remarkable bifurcation feature, it can act as a “chaotic-growth” transfer function (namely, as a BTU) [25], [63], [64]; compared with a classical transfer function such as the sigmoid function, the chaotic-periodic behavior found in its bifurcation diagram can be used as a new option for a transfer function with chaotic-to-periodic growth under various input stimuli; b) the neural model itself can be directly adopted as a dynamic neural unit for temporal information processing, including dynamic information encoding and pattern association.

However, the Wang oscillator faces several major limitations in the bifurcation behavior of the model, preventing it from being used as an effective BTU or temporal information processing model. As mentioned previously, for the neural dynamics of the Wang oscillator in the various regions: i) an “unwanted” chaotic region exists in Region B, which not only affects the continuity of the BTU, but also violates the original function of the BTU, that of simulating the temporal information processing of the brain; in this chaotic region, the chaotic dynamics appear only when the external input stimulus is small, while the neural dynamics turn into a “stable” mode when the input stimulus becomes sufficiently strong; ii) although there is chaotic behavior in the neural dynamics when the input stimulus is small (i.e., in Region D), in order to adopt the oscillator as a realistic and effective transfer unit (i.e., a BTU), the change of the oscillator’s neural dynamics from periodic to chaotic behavior should be gradual and continuous; iii) when the Wang oscillator is used as a BTU, there is either a single state in the nonchaotic region or multiple states in the chaotic region; however, as shown in the bifurcation diagram, in a single Wang oscillator a stable rest state exists in the bifurcation dynamics, which also prevents the oscillator from being used effectively either as a BTU or for temporal information encoding.

In Section III, we discuss how the Lee oscillator works and, more importantly, how its remarkable neural dynamics allow the Lee oscillator to eliminate the problems that appear in the Wang oscillator.

III. LEE OSCILLATOR

Different from the Wang oscillator, the Lee oscillator provides a transient chaotic progressive growth in its neural dynamics, which solves the fundamental shortcomings of the Wang oscillator in temporal information encoding and in acting as a BTU.

A. Lee Oscillator Model and Its Bifurcation Behavior

Different from the Wang oscillator, the Lee oscillator consists of the neural dynamics of four constitutive neural elements: E, I, Ω, and L. Fig. 3 is a depiction of the Lee oscillator model. The neural dynamics of each of these constituent neurons are given by

E(t+1) = f[a_1·E(t) − a_2·I(t) + S(t) − θ_E]    (7)

I(t+1) = f[b_1·E(t) − b_2·I(t) − θ_I]    (8)

Ω(t+1) = f[S(t)]    (9)

L(t) = [E(t) − I(t)]·e^(−k·S²(t)) + Ω(t)    (10)

Fig. 6. Bifurcation diagram of a (mode 5/5/1) Wang oscillator under a linear gain from an external stimulus.

where E(t), I(t), Ω(t), and L(t) are the state variables of the excitatory, inhibitory, input, and output neurons, respectively; f is the sigmoid function given by (3); a_1, a_2, b_1, and b_2 are the weight parameters for these constitutive neurons; θ_E and θ_I are the thresholds for the excitatory and inhibitory neurons; S(t) is the external input stimulus; and k is the decay constant.

Similar to the Wang oscillator, the most remarkable feature of this oscillator is its bifurcation behavior under different external input stimuli. Fig. 4 shows the bifurcation diagram of a typical mode 500/5/5/1 Lee oscillator (where “500” is the value of the decay constant k; the first “5” the value of the excitatory weight parameters a_1 and a_2; the second “5” the value of the inhibitory weight parameters b_1 and b_2; and “1” the values of the thresholds θ_E and θ_I, respectively).

Different from the Wang oscillator, the bifurcation diagram of a single Lee oscillator is composed of three main regions: Regions A, B, and C. From the point of view of neural dynamics, Regions A and C denote the “sigmoid-shape” regions, which correspond to the nonchaotic neural activities of the oscillator, and Region B is the “hysteresis” region, which corresponds to the area of chaotic behavior that results when a weak external input stimulus is received.

In view of the major limitations of the Wang oscillator discussed in Section II, the bifurcation diagram of the Lee oscillator provides clear evidence of how these problems have been solved: i) as compared with the “unwanted” chaotic Region B appearing in the Wang oscillator, which affects the continuity of the neural dynamics and violates the information processing behavior (upon stimulus) known from brain science, no “unwanted” chaotic region appears in the Lee oscillator’s bifurcation diagram; ii) where there was a problem of neural dynamic continuity between the chaotic and nonchaotic regions in the Wang oscillator, the Lee oscillator provides a truly gradual change from chaotic to nonchaotic dynamics, owing to the decay factor appearing in the output neuron (10); iii) as compared with the Wang oscillator, there is no unwanted “rest” state anywhere on the bifurcation curve: the response to an external stimulus is either chaotic or nonchaotic and sigmoid-like, with single and continuous behavior.
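The four-element dynamics of (7)–(10) can be sketched as below. This is a hedged sketch under the same assumptions as before: a tanh sigmoid and parameter values in the spirit of the 500/5/5/1 mode; the exact symbol assignment is an inference from the text, not the paper’s code.

```python
import math

def lee_step(E, I, S, a1=5.0, a2=5.0, b1=5.0, b2=5.0,
             theta_E=1.0, theta_I=1.0, k=500.0):
    # One update of the four-element Lee oscillator: excitatory E,
    # inhibitory I, input neuron Omega, and output neuron L.
    E_next = math.tanh(a1 * E - a2 * I + S - theta_E)
    I_next = math.tanh(b1 * E - b2 * I - theta_I)
    omega = math.tanh(S)
    # The exp(-k * S^2) decay damps the chaotic (E - I) component as
    # |S| grows, giving a gradual chaotic-to-sigmoid transition.
    L = (E_next - I_next) * math.exp(-k * S * S) + omega
    return E_next, I_next, L

def lee_response(S, steps=300, transient=200):
    # Iterate the oscillator under a constant stimulus S and record
    # the post-transient outputs.
    E, I = 0.0, 0.0
    trace = []
    for t in range(steps):
        E, I, L = lee_step(E, I, S)
        if t >= transient:
            trace.append(L)
    return trace
```

For a strong stimulus the decay term vanishes and the output settles onto the sigmoid branch (L ≈ tanh(S)); only near S ≈ 0 can the (E − I) component survive, which corresponds to the hysteresis Region B of Fig. 4.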

B. Potential Applications of the Lee Oscillator

There are three main streams of applications for the Lee os-cillator.

1) Basic Chaotic Neural Elements for Temporal Information Processing: Basically speaking, information processing is the overall process of encoding, recognizing, and discriminating between pieces of information (e.g., images, patterns, etc.). Since a single Lee oscillator can provide a two-state attraction from the input space, it is suitable as a basic element for information processing. In addition, due to its unique chaotic features, whereby the neural dynamics change with variations in the input signal, the Lee oscillator provides a good analog for simulating the chaotic and temporal information processing behavior found in brain science [20], [21], [49].

Figs. 5 and 6 are bifurcation diagrams of the (mode 500/5/5/1) Lee oscillator and the (mode 5/5/1) Wang oscillator under a linear gain of the external stimulus, and Figs. 7 and 8 are the bifurcation diagrams for these oscillators under a sinusoidal external stimulus. It is clear that the bifurcation behavior of the Lee oscillator provides a better and more reasonable analog as a chaotic BTU for information processing. Under the linear gain of the external stimulus, the Lee oscillator provides chaotic-to-periodic oscillations upon the receipt of an external stimulus, rather than the unrealistic multiple stable states of the Wang oscillator. Under a sinusoidal stimulus, it likewise provides chaotic-to-periodic sinusoidal outputs, rather than the impractical multiple stable states that appear in the Wang oscillator. In fact, from the point of view of application, the temporal information processing of sinusoidal inputs can be adopted to detect periodic structures in the input or used for signal processing problems.

Fig. 7. Bifurcation diagram of a (mode 500/5/5/1) Lee oscillator under a sinusoidal external stimulus.

A typical information processing scenario such as pattern recognition involves the information processing work of a collection of neural units (called the neural population). It also involves the synchronization and desynchronization operations of these constituent neural units. Further research related to this area will be discussed at the end of this paper.

2) Transient Chaotic Autoassociator: As an extension and generalization of using Lee oscillators for information processing, a two-dimensional (2-D) layer of Lee oscillators can be adopted as a pattern associator. As an analog to the classical Hopfield network [30] as an autoassociator, the TCAN based on Lee oscillators can be used to provide an innovative progressive memory association and recalling scheme. Details will be discussed in Section IV.

3) Chaotic Neural Oscillatory Units for Advanced Applications: In fact, Lee oscillators can be adopted and integrated with each other to form a complex chaotic neural oscillatory model to tackle complex problems such as complex scene analysis and robot vision and navigation problems. Studies being actively carried out in these areas will be discussed at the end of this paper.

IV. TCAN

A. Introduction

An autoassociative network is one of the most important and fundamental applications in the fields of neural networks, information processing, and neuroscience. In fact, it is also the foundation of various complex information processing tasks such as face (pattern) recognition, figure-ground segmentation, scene segmentation, pattern classification, data mining, and so forth. Celebrated works include the fundamental studies conducted by Hopfield [30] and Kohonen [37]. The most recent works include the combined evolution model by Cheng and Guan [15], the studies of associative memories and nonlinear dynamics by Bibitchkov et al. [9], Karlholm [36], Morita [54], Liou and Yuan [47], and Yoshizawa et al. [69], and studies on chaotic dynamics [60].

As a direct and simple implementation of the Lee oscillators, in this section we discuss how a 2-D autoassociator is constructed using Lee oscillators as the neural framework. Based on its remarkable progressive pattern association mechanism, realized through its transient-chaotic neural dynamics, this autoassociator is named the TCAN.

I will compare the TCAN with the contemporary chaotic autoassociators proposed by Aihara and his colleagues [1], [4] and the Wang oscillators [63], [64] and will demonstrate

LEE: TCAN BASED ON LEE OSCILLATORS 1235

Fig. 8. Bifurcation diagram of a (mode 5/5/1) Wang oscillator under a sinusoidal external stimulus.

how the TCAN provides efficient and effective pattern association schemes. More importantly, we will illustrate the progressive pattern (memory) recalling capability in this direct application. As a further extension, we will demonstrate how this remarkable characteristic can be applied to a facial pattern recalling scenario.

B. System Framework of TCAN

As discussed in Section III, the direct adoption of the Lee oscillator is based on a simple 2-D single-layered neural population, in analogy to a classical Hopfield network. This time, however, we are using a collection of Lee oscillators as the basic neural element. Fig. 9 shows the schematic diagram of this autoassociative network, namely, the TCAN.

The neural dynamics are given as follows:

(11)

(12)

(13)

where the quantities appearing in (11)–(13) are, respectively, the connection weights, the total number of patterns stored in the TCAN, the output neurons of the Lee oscillators, the stored patterns in the TCAN, the total number of Lee oscillators in the TCAN, and the excitatory, inhibitory, and input neurons of each Lee oscillator.

Fig. 9. TCAN using Lee oscillators.

As shown in the above neural dynamic equations, the interactions among the constituent neurons of the Lee oscillators in this network can act as an autoassociator in the presence of query patterns, which are treated as external input stimuli, in analogy to classical Hopfield networks [30] and Kohonen networks [37], [38]. Unlike these classical models, however, the proposed TCAN exhibits a change in neural dynamics (a chaotic-to-stable state transition) when pattern association occurs. A detailed discussion and the experimental tests of the proposed model are described in Sections V–VII.
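Since the exact forms of (11)–(13) are not reproduced here, the qualitative chaotic-to-stable behavior can be sketched with a deliberately simplified stand-in: Hopfield-style Hebbian weights plus a decaying chaotic self-feedback term in the spirit of transiently chaotic networks. All function names, constants, and the self-feedback mechanism below are assumptions for illustration, not the paper's actual TCAN dynamics.

```python
import numpy as np

def store(patterns):
    """Hebbian weight matrix for bipolar (+/-1) patterns, zero diagonal."""
    P = np.asarray(patterns, dtype=float)
    W = P.T @ P / P.shape[1]
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, probe, steps=60, z0=0.8, decay=0.9, beta=5.0):
    """Iterate the network; the self-feedback strength z decays each step,
    so the state drifts from chaotic wandering toward a stable
    Hopfield-like recall (the "chaotic-to-stable" transition)."""
    x = np.asarray(probe, dtype=float)
    z = z0
    for _ in range(steps):
        u = W @ x - z * x   # z*x: transiently chaotic self-feedback (assumed)
        x = np.tanh(beta * u)
        z *= decay          # anneal the chaos away -> stable association
    return np.sign(x)
```

With z0 = 0 the sketch degenerates to a conventional gradient-like associator; with z0 > 0 the early iterations wander before the annealed feedback lets the network settle, loosely mimicking the PMR-style transition described above.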

V. SYSTEM IMPLEMENTATION AND EXPERIMENTAL RESULTS

From the implementation and system evaluation point of view, we will compare the chaotic autoassociative behaviors

1236 IEEE TRANSACTIONS ON NEURAL NETWORKS, VOL. 15, NO. 5, SEPTEMBER 2004

Fig. 10. Four stored patterns [1], [4].

TABLE I
OPTIMAL PARAMETER SET USED IN THE TCAN

of the TCAN against the chaotic autoassociative networks proposed by Aihara et al. [1], [4] and the Wang oscillators [63], [64]. In order to provide a fair and thorough comparison, the author adopted the same system environment and the same set of test patterns as described in their works [1], [4]. The testing of the whole system is categorized into the following three sections: Section V-A compares Aihara's model with the TCAN using the chaotic autoassociation of four simple patterns described in Aihara's previous works [1], [4]; Section V-B adopts the TCAN model on autoassociation for a more complex case, handwritten Chinese character recognition; Section V-C further explores the progressive memory recalling capability of the TCAN upon the recognition of a human face, for which we adopted the Yale University facial database set A.

A. Chaotic Autoassociation on Simple Patterns

In this test, we compare the chaotic autoassociative performance of the TCAN with the remarkable chaotic neural network proposed by Aihara et al. [1], [4] by using the same test patterns and system environment for system evaluation. The four stored patterns, which are encoded on 10 × 10 binary pixel grids, are shown in Fig. 10.

Parameter selection test: Before the TCAN is tested against unseen or noisy patterns, the system is fine-tuned to its optimal transient autoassociative behavior by "training" on the stored patterns. Table I shows the list of optimal parameters used in the TCAN model.

Chaotic pattern association upon stored patterns: Figs. 11 and 12 depict the network outputs of the three models in the first 50 oscillation cycles upon the external stimulus (input) of stored pattern (c).

From these figures, it is clear that all these networks achieved pattern autoassociation through chaotic neural oscillations. However, in the case of the Aihara model, it is clear that upon the external input of a known pattern (pattern (c) in this case), the network will not come to a steady state; rather, it "oscillates" between different stored patterns (and their reverse patterns as well). As indicated in their previous findings [1], [4] (and confirmed in our test), the frequency of the appearance of any particular pattern is not directly correlated to the known pattern

Fig. 11. Sample sequence of a spatio-temporal pattern association performed by the TCAN upon the external input of pattern (c).

given as the test pattern. For the case of the Wang oscillators, it is even worse. As shown in Fig. 12(b), in the presence of pattern (c) as the test pattern, the network cannot recall it but rather associates it with a wrong stored pattern. The same happens when the other stored patterns are used for testing. As discussed in Section II, the main reason may be the discontinuity and the existence of multiple stable states in its bifurcation diagram when it serves as a BTU for autoassociation.

On the other hand, as revealed in Fig. 11, upon the external "stimulus" of a known pattern (pattern (c) shown in the paper), the TCAN neural oscillator, after performing chaotic neural oscillations for a short period of time, will start to "reshape" [or what we call "progressive memory recalling" (PMR)] the stored pattern over the first few time steps. Once it "recalls" the correct stored pattern, it stabilizes in a steady state, which is the classical autoassociative behavior we have seen in other similar models.

Chaotic pattern association upon noisy test patterns: In the second test, we presented noisy stored patterns as input stimuli to test how these models behaved upon pattern association. The four "noisy" test patterns (with over 20% of the pixels corrupted) are shown in Fig. 13. Since the Wang oscillators cannot recall even the stored patterns, in this test we focus only on the comparison between the TCAN and Aihara's chaotic autoassociator. Figs. 14 and 15 show the transient autoassociative dynamics of these patterns using the TCAN and Aihara's chaotic autoassociator, respectively.

From the above figures, it is not difficult to discover that even though Aihara's autoassociator continued oscillating in various transient states to the stored patterns (and their reverse patterns) upon the input of noisy stored patterns, there is no evidence to show that this chaotic oscillator associated (or recalled) any


Fig. 12. Sample sequence of a spatio-temporal pattern association performed by the Aihara chaotic autoassociator and Wang oscillators upon external inputs of pattern (c).

Fig. 13. Four “noisy” test patterns.

particular stored pattern. However, as shown in Fig. 14, upon the stimulus of the four noisy test patterns, by using the TCAN all

Fig. 14. Sample sequence of spatio-temporal pattern association performed by the TCAN upon external inputs of four noisy stored patterns.

Fig. 15. Sample sequence of spatio-temporal pattern association performed by the Aihara chaotic autoassociator [1], [4] upon external inputs of four noisy stored patterns.

of the four stored patterns could be successfully recalled within ten time steps, taking less than 1.2 s of simulation time on a Pentium IV personal computer.

Also, as revealed from the last experiment, the transient autoassociation process performed by the TCAN (using Lee oscillators


TABLE II
STROKE RELATION TABLE

as constituent components) exhibited a "chaotic-to-stable" autoassociation behavior. Upon the presentation of a noisy known pattern, the oscillatory network first performed a chaotic pattern association in the first few time steps. Afterwards, it began to "rebuild" (or "reshape") the correct stored pattern in the temporal outputs until it successfully "recalled" the whole pattern. This kind of pattern recovery operation has a direct analog to the PMR scheme found in brain science, as discussed in Section IV. A more detailed elaboration of this PMR behavior of the TCAN will be given in Section VI.

B. Progressive Autoassociation of Handwritten Chinese Characters

In this experiment, we tested the transient chaotic autoassociation performance of the TCAN on a more complex problem: recognizing handwritten Chinese characters. Since Aihara's model cannot provide satisfactory pattern association results, we instead compared the proposed system with our previous work on the recognition of handwritten Chinese characters using the oscillatory elastic matching model [39]. A major focus of the test is the accuracy of character recognition and the efficiency of the system.

In the test, 3000 handwritten Chinese characters were used as memory patterns. To ensure the representative nature of the character library, even proportions of characters were chosen from each category within the seven-category basic stroke relation table (Table II). Fig. 16 shows a snapshot of some sample character patterns.

In the test, the author is not aiming at the invariant properties of the proposed system, but rather at its PMR capability and its efficiency upon pattern association. Table III shows the correct retrieval rate of the stored patterns of the TCAN upon the presentation of various percentages of noise in the input patterns. A comparison with the previous work on Chinese character recognition [39] is made as well.

Table III reveals that the TCAN outperforms our previous NOEGM system [39], with improvements ranging from 2% (for stored-pattern recalling) to over 24% at 30% input noise, and a correct recognition rate (averaged over these 3000 handwritten characters) of over 72%. This superior performance can be explained by the PMR capability of the system, as

Fig. 16. Sample set of handwritten Chinese characters.

explained before. Fig. 17 shows how the TCAN provides efficient PMR against noisy handwritten Chinese characters.

C. Progressive Memory Recalling for Human Faces

In the last test, we explored how the TCAN performs in recognizing human faces (in grey-level photos). In the test, we adopted the Yale University database (set A), which contains the facial images of 15 individuals in different views and facial expressions, including frontal views, side views, gimmick faces, occluded faces, etc. In this test, we compare the proposed TCAN system with two other models: the NOEGM system examined in the author's previous work on Chinese character recognition [39] and the enhanced DLA system proposed by Wiskott et al. [66]. The results are tabulated in Table IV. Figs. 18 and 19 show the sample facial patterns (adopted from the Yale University facial database) and the progressive memory recalling scheme of the TCAN upon recalling facial patterns, respectively.

In terms of the recognition rate, the TCAN provides promising results upon frontal face recognition. Owing to its progressive memory recalling nature, the TCAN also provides reasonable recognition performance upon occluded and gimmick face recognition. We must remember that the proposed TCAN system is a simple chaotic neural autoassociative network that provides limited invariant recognition capability. However, the most remarkable feature found in this experiment is the recognition speed of the proposed system. Due to its simple network structure, the chaotic autoassociative TCAN significantly outperforms the other two models (the NOEGM model [42] and the enhanced DLA model [66]) in overall recognition speed, by 127 and 307 times, respectively. It is able to do so mainly because those models require sophisticated feature vector selection and dynamic link association of these feature vectors during recognition. For a detailed discussion and exploration of the progressive pattern recalling and recognition of human faces using LEE associators, see [45].

VI. BIOLOGICAL AND PSYCHOLOGICAL IMPLICATIONS OF LEE OSCILLATORS AND TCAN

One of the major contributions of the Lee oscillators and the TCAN is their biological and neuroscience implications. The latest works on visual psychology [7] and neuroscience [20], [21] have revealed that there are strong implications of chaotic dynamics in human brain activities, including memory association, visual object recognition, scene segmentation,


TABLE III
PERFORMANCE OF THE TWO MODELS ON HANDWRITTEN CHARACTER RECOGNITION

Fig. 17. Sample sequence of a spatio-temporal handwritten Chinese character “recalling” scheme performed by TCAN.

etc. Moreover, the latest research on visual psychology [7] and psychiatry [16] has reported that our memory association and recalling operations are not a "quantum jump" (either "recalled" or "not recalled") but rather a kind of "progressive memory recalling (reconstruction) scheme," as observed in memory-deficient patients. As indicated in the bifurcation diagrams and the tests of the Lee oscillators and the TCAN system, it is clear that the Lee oscillator provides an ideal framework for modeling chaotic neural activities in progressive learning, while the TCAN (using Lee oscillators as constituent neural components) provides an ideal solution for modeling the progressive memory recalling (and association) scheme.

The progressive pattern association scheme is further illustrated by contrasting the TCAN with the "quantum jump" memory recalling scheme performed by a traditional autoassociative network such as the Hopfield network. Fig. 20 demonstrates the nonprogressive pattern association scheme of the Hopfield network against the progressive memory recalling done by the TCAN for the recalling of the triangle pattern [pattern (b) of our test A], using a highly fragmented triangle input pattern containing only the cues that show the three corners of a triangle. As shown in Fig. 20, the Hopfield network gave a "quantum jump" to an incorrect pattern (c) in about four time steps, while the TCAN provided a progressive pattern recalling to the correct
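For contrast, the all-or-nothing Hopfield behavior referred to here can be sketched in its standard textbook form (this is the generic discrete Hopfield recall rule, not code from the paper): Hebbian storage followed by synchronous sign updates, which either snaps to a stored attractor within a few steps or fails outright, with no intermediate "reshaping" of the pattern.

```python
import numpy as np

def hopfield_recall(patterns, probe, steps=10):
    """Generic discrete Hopfield recall: Hebbian weights, zero diagonal,
    synchronous sign updates (the "quantum jump" behaviour)."""
    P = np.asarray(patterns, dtype=float)
    W = P.T @ P / P.shape[1]
    np.fill_diagonal(W, 0.0)
    x = np.asarray(probe, dtype=float)
    for _ in range(steps):
        x = np.sign(W @ x)
        x[x == 0] = 1.0   # break ties toward +1
    return x
```

In this sketch the state jumps directly between corner points of the hypercube, whereas the TCAN's transient chaotic phase produces a sequence of intermediate states that gradually approach the stored pattern.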


TABLE IV
PERFORMANCE OF TCAN UPON RECOGNIZING A HUMAN FACE

Fig. 18. Sample sequence of facial images from the Yale University database.

Fig. 19. Sample sequence of a progressive facial pattern recalling scheme performed by TCAN.

pattern in ten time steps. In fact, from the visual psychology point of view, this experimental result provides an excellent analog to the Gestalt theory of visual perception [72], which states that "perception is a constructive process capable of going beyond the information given by stimulation." In other words, the features (the pixel images) that we use in object recognition are not only the features of the raw input; instead, the features that we use are the ones in our organized perception of all the


Fig. 20. Autoassociation of an occluded triangle using a) the Hopfield network and b) the TCAN.

input information. In our case, although the fragmented triangle pattern cannot provide sufficient information for "classical" autoassociation, the TCAN can provide something like a constructive process to "rebuild" the correct pattern via a transient chaotic oscillation process. In other words, the TCAN opens a new era for high-level information coding and association.

Last, but not least, Piaget, in his remarkable work The Psychology of Intelligence [73], stated that "Behavior involves a total field embracing subjects and objects, and the dynamics of this field constitutes feeling, while its structure depends on perception, effector-functions and intelligence…" This observation finds an analog in the relationship between our transient chaotic Lee oscillator and intelligence. If we say that intelligence (or, specifically, the "autoassociation" described in this paper) is the structure of the TCAN system, then the chaotic neural dynamics of the Lee oscillator are the anatomy we call "memory recalling," while the "total field" comprises the memory patterns and the external input patterns. This process, as described by developmental psychology, is a progressive learning process known as "assimilation," a continuous process of adaptation and readaptation.

Although the proposed Lee oscillators (and the TCAN) cannot give a full interpretation of these critical psychological and neuroscience behaviors, they shed new light on the search for an appropriate solution in computational neuroscience and visual psychology.

VII. RELATED WORKS

Actually, the introduction of the Lee oscillator in this paper is one of the foundation studies of our chaotic neural processing (CNP) research group. Current and future research related to Lee oscillators can be summarized into three main areas.

A. Chaotic Neural Modeling Study

This involves a fundamental study of the chaotic neural model (and its derivatives). Current research includes the study of the synchronization and desynchronization behaviors of the Lee oscillators, simulated annealing effects, network hierarchy, etc.

B. Neuroscience Implications and Theoretical Study

This involves a fundamental study of major neuroscience effects, including the interpretation of "meaning" and "knowledge," and a study of how these phenomena can be modeled and simulated in the form of chaotic neural networks. Current research includes how Lee oscillators (and their derivatives) are applied to represent, acquire, and disseminate knowledge, and the mechanisms and interactions between working memory and LTM. Current studies include the exploration of the Gestalt perception phenomenon based on the LEE associators [44] and the latest extended work on the Lee oscillator, known as "cognitron theory," an innovative theory to explore and model human cognition behaviors and the sense of experiences [46].

C. Chaotic Neural Application Study

This area involves all of the "spin-off" applications of the Lee oscillators (and their derivatives). Active research includes: i) an extension of previous work on scene analysis and pattern recognition [40]–[42] to focus on complex scene analysis, active vision, and the invariant and progressive recalling of human faces [45]; ii) an extension of previous work on weather prediction [43] to focus on the prediction of severe weather such as rainstorms; and iii) other "sidetrack" applications such as chaotic cryptosystems and related work on asynchronized data communication [17], [34], [55].

VIII. CONCLUSION

In this paper, the author proposed a chaotic neural oscillator (namely, the "Lee oscillator") with its chaotic dynamics. From the application point of view, a TCAN was constructed to test its applicability for temporal pattern associations. Compared with the chaotic autoassociative networks developed by Aihara et al. [1]–[3] and Wang [63], [64], the TCAN produces a robust progressive memory recalling scheme, which is analogous to the progressive memory recalling (reconstruction) scheme reported in the latest research on perception psychology and psychiatry.

In fact, the major aims and contributions of the paper are twofold. From the academic point of view, instead of enhancing existing neural network models, the aim of this paper is to introduce a totally new chaotic neural model (the Lee oscillator) that provides a new horizon on the neural foundation of temporal (transient) coding and information processing, and to narrow the gap between neural networks and the latest research on


chaotic phenomena being conducted in the fields of brain science and neuroscience.

From the application point of view, this paper demonstrates how chaotic neural oscillators can be applied to the autoassociative problem, a fundamental memory encoding and information processing problem in neural networks. As discussed in Section VII, current and future studies also involve how Lee oscillators (and their derivatives) can be applied to various problems, such as those in neuroscience, including the interpretation of knowledge and meaning, complex scene analysis, and active vision, or other "sidetrack" applications such as chaotic cryptosystems.

ACKNOWLEDGMENT

The author would like to thank the Chaotic Neural Processing (CNP) Research Group of The Hong Kong Polytechnic University for providing support and facilities. Finally, the author would also like to thank Yale University for providing the facial database set A for experimental purposes.

REFERENCES

[1] M. Adachi and K. Aihara, "Associative dynamics in a chaotic neural network," Neural Netw., vol. 10, no. 1, pp. 83–98, 1997.

[2] K. Aihara and G. Matsumoto, "Forced oscillations and route to chaos in the Hodgkin-Huxley axons and squid giant axons," in Chaos in Biological Systems, H. Degn, A. V. Holden, and L. F. Olsen, Eds. New York: Plenum, 1987, pp. 121–131.

[3] ——, "Chaotic oscillations and bifurcations in squid giant axons," in Chaos, A. V. Holden, Ed. Manchester, U.K.: Manchester Univ. Press, 1986, pp. 257–269.

[4] K. Aihara, "Chaos in neural networks," in The Impact of Chaos on Science and Society. Japan: United Nations Univ. Press, 1997, pp. 110–126.

[5] D. L. Alkon, K. T. Blackwell, G. S. Barbour, S. A. Werness, and T. P. Vogl, "Biological plausibility of synaptic associative memory models," Neural Netw., vol. 7, no. 6/7, pp. 1005–1017, 1994.

[6] J. A. Anderson, "A simple neural network generating interactive memory," Math. Biosci., vol. 14, pp. 197–220, 1972.

[7] A. M. S. Barry, Visual Intelligence: Perception, Image, and Manipulation in Visual Communication. State Univ. of New York Press, 1997.

[8] T. W. Berger, G. Chauvet, and R. J. Sclabassi, "A biologically based model of functional properties of the hippocampus," Neural Netw., vol. 7, no. 6/7, pp. 1031–1064, 1994.

[9] D. Bibitchkov, J. M. Herrmann, and T. Geisel, "Effects of short-time plasticity on the associative memory," Neurocomput., vol. 44–46, pp. 329–335, 2002.

[10] S. Campbell and D. Wang, "Synchronization and desynchronization in a network of locally coupled Wilson–Cowan oscillators," IEEE Trans. Neural Networks, vol. 7, pp. 541–554, May 1996.

[11] S. V. Chakravarthy and J. Ghosh, "A complex-valued associative memory for storing patterns as oscillatory states," Biol. Cybern., vol. 75, pp. 229–238, 1996.

[12] L. Chen and K. Aihara, “Chaos and asymptotical stability in discrete-time neural networks,” Phys. D, vol. 104, pp. 286–325, 1997.

[13] ——, "Chaotic simulated annealing by a neural network model with transient chaos," Neural Netw., vol. 8, no. 6, pp. 915–930, 1995.

[14] K. Chen and D. Wang, "A dynamically coupled neural oscillator network for image segmentation," Neural Netw., vol. 15, pp. 423–439, 2002.

[15] A. C. C. Cheng and L. Guan, "A combined evolution method for associative memory networks," Neural Netw., vol. 11, pp. 785–792, 1998.

[16] J. Chey, J. Lee, Y. S. Kim, S. M. Kwon, and Y. M. Shin, "Spatial working memory span, delayed response and executive function in schizophrenia," Psych. Res., vol. 110, pp. 259–271, 2002.

[17] K. M. Cuomo, A. V. Oppenheim, and S. H. Strogatz, "Synchronization of Lorenz-based chaotic circuits with applications to communications," IEEE Trans. Circuits Syst. II, vol. 40, pp. 626–633, Oct. 1993.

[18] A. K. Engel, P. Konig, A. K. Kreiter, and W. Singer, "Synchronization of oscillatory neuronal responses between striate and extrastriate visual cortical areas of the cat," in Proc. Nat. Academy Science, vol. 88, 1991, pp. 6048–6052.

[19] M. Falcke, R. Huerta, M. I. Rabinovich, H. D. I. Abarbanel, R. C. Elson, and A. I. Selverston, "Modeling observed chaotic oscillators in bursting neurons: The role of calcium dynamics and IP3," Biol. Cybern., vol. 82, pp. 517–527, 2000.

[20] W. J. Freeman, How Brains Make Up Their Minds. New York: Columbia Univ. Press, 2001.

[21] ——, "A proposed name for aperiodic brain activity: Stochastic chaos," Neural Netw., vol. 13, pp. 11–13, 2000.

[22] ——, "Tutorial on neurobiology: From single neurons to brain chaos," Int. J. Bifurcation Chaos, vol. 2, pp. 451–482, 1992.

[23] ——, "Simulation of chaotic EEG patterns with a dynamic model of the olfactory system," Biol. Cybern., vol. 56, pp. 139–150, 1987.

[24] ——, "Nonlinear dynamics of paleocortex manifested in the olfactory EEG," Biol. Cybern., vol. 35, pp. 1177–1179, 1979.

[25] H. Fukai, S. Doi, T. Nomura, and S. Sato, "Hopf bifurcations in multiple-parameter space of the Hodgkin-Huxley equations I: Global organization of bistable periodic solutions," Biol. Cybern., vol. 82, pp. 215–222, 2000.

[26] H. Fukai, T. Nomura, S. Doi, and S. Sato, "Hopf bifurcations in multiple-parameter space of the Hodgkin-Huxley equation II: Singularity theoretic approach and highly degenerate bifurcations," Biol. Cybern., vol. 82, pp. 223–229, 2000.

[27] C. M. Gray, P. Konig, A. K. Engel, and W. Singer, "Oscillatory responses in cat visual cortex exhibit intercolumnar synchronization which reflects global stimulus properties," Nature, vol. 338, pp. 334–337, 1989.

[28] H. Haken, Principles of Brain Functioning: A Synergetic Approach to Brain Activity, Behavior and Cognition. New York: Springer-Verlag, 1996.

[29] A. L. Hodgkin and A. F. Huxley, "A quantitative description of membrane current and its application to conduction and excitation in nerve," J. Physiol., vol. 117, pp. 500–544, 1952.

[30] J. J. Hopfield, "Neural networks and physical systems with emergent collective computational abilities," in Proc. Nat. Academy Science, vol. 79, 1982, pp. 2554–2558.

[31] O. Hoshino, M. Zheng, and K. Kuroiwa, "Roles of dynamic linkage of stable attractors across cortical networks in recalling long-term memory," Biol. Cybern., Jan. 2003.

[32] R. Huerta, P. Varona, M. I. Rabinovich, and H. D. I. Abarbanel, "Topology selection by chaotic neurons of a pyloric central pattern generator," Biol. Cybern., vol. 84, pp. L1–L8, 2001.

[33] S. Ishii, K. Fukumizu, and S. Watanabe, "A network of chaotic elements for information processing," Neural Netw., vol. 9, no. 1, pp. 25–40, 1996.

[34] G. Jakimoski and L. Kocarev, "Chaos and cryptography: Block encryption ciphers based on chaotic maps," IEEE Trans. Circuits Syst. I, vol. 48, no. 2, pp. 163–169, 2001.

[35] J. Jeong, M. K. Joung, and S. Y. Kim, "Quantification of emotion by nonlinear analysis of the chaotic dynamics of electroencephalograms during perception of 1/f music," Biol. Cybern., vol. 78, pp. 217–225, 1998.

[36] J. M. Karlholm, "Associative memories with short-range, higher order couplings," Neural Netw., vol. 6, pp. 409–421, 1993.

[37] T. Kohonen, Self-Organization and Associative Memory, 2nd ed. Berlin, Germany: Springer-Verlag, 1988.

[38] ——, "Correlation matrix memories," IEEE Trans. Comput., vol. C-21, pp. 353–359, 1972.

[39] R. S. T. Lee and J. N. K. Liu, Invariant Object Recognition Based on Elastic Graph Matching: Theory and Applications. The Netherlands: IOS Press, 2003.

[40] R. S. T. Lee, "iJADE surveillant—An intelligent multi-resolution composite neuro-oscillatory agent-based surveillance system," Pattern Recognit., vol. 36, pp. 1425–1444, 2003.

[41] R. S. T. Lee and J. N. K. Liu, "Scene analysis using an integrated composite neural oscillatory elastic graph matching model," Pattern Recognit., vol. 35, pp. 1835–1846, 2002.

[42] R. S. T. Lee, "Elastic face recognizer: Invariant face recognition based on elastic graph matching model," Int. J. Pattern Recognition and Artificial Intelligence (IJPRAI), vol. 16, no. 4, pp. 463–479, 2002.

[43] R. S. T. Lee and J. N. K. Liu, "Tropical cyclone identification and tracking system using integrated neural oscillatory elastic graph matching and hybrid RBF network track mining techniques," IEEE Trans. Neural Networks, vol. 11, pp. 680–689, May 2000.

[44] R. S. T. Lee, “Gestaltistic behavior of LEE-associator,” Neural Netw.,2003, submitted for publication.

[45] ——, "A transient-chaotic neural oscillatory network for progressive recognition of human faces," Pattern Recognit., 2003, submitted for publication.


[46] ——, "Cognitron theory: A philosophical perspective of the unification theory of senses and experiences," Int. J. Philosophy, 2003, submitted for publication.

[47] C. Y. Liou and S. K. Yuan, "Error tolerant associative memory," Biol. Cybern., vol. 81, pp. 331–342, 1999.

[48] W. Maass and H. Markram, “Synapses as dynamic memory buffers,”Neural Netw., vol. 15, pp. 155–161, 2002.

[49] C. von der Malsburg, "The Correlation Theory of Brain Function," Max Planck Inst. Biophys. Chem., Göttingen, Germany, Tech. Rep. 81-2, 1981.

[50] ——, "Nervous structures with dynamical links," Ber. Bunsenges. Phys. Chem., vol. 89, pp. 703–710, 1985.

[51] C. von der Malsburg and J. Buhmann, "Sensory segmentation with coupled neural oscillators," Biol. Cybern., vol. 67, pp. 233–246, 1992.

[52] V. Menon, W. J. Freeman, B. A. Cutillo, J. E. Desmond, M. F. Ward, S. L. Bressler, K. D. Laxer, N. Barbaro, and A. S. Gevins, "Spatio-temporal correlations in human gamma band electrocorticograms," Electroencephalogr. Clin. Neurophys., vol. 98, pp. 89–102, 1996.

[53] A. A. Minai and T. Anand, “Stimulus-induced bifurcations in discrete-time neural oscillators,” Biol. Cybern., vol. 79, pp. 87–96, 1998.

[54] M. Morita, "Associative memory with nonmonotone dynamics," Neural Netw., vol. 6, pp. 115–126, 1993.

[55] G. Parodi, S. Ridella, and R. Zunino, "Using chaos to generate keys for associative noise-like coding memories," Neural Netw., vol. 6, pp. 559–572, 1993.

[56] J. Triesch and C. von der Malsburg, "A system for person-independent hand posture recognition against complex backgrounds," IEEE Trans. Pattern Anal. Machine Intell., vol. 23, pp. 1449–1453, Dec. 2001.

[57] ——, "Robotic gesture recognition," in Proc. 2nd Conf. Automatic Face and Gesture Recognition, Killington, VT, 1996, pp. 170–175.

[58] I. Tsuda, "Dynamic link of memory: Chaotic memory map in nonequilibrium neural networks," Neural Netw., vol. 5, pp. 313–326, 1992.

[59] P. Varona, J. J. Torres, H. D. I. Abarbanel, M. I. Rabinovich, and R. C. Elson, "Dynamics of two electrically coupled chaotic neurons: Experimental observations and model analysis," Biol. Cybern., vol. 84, pp. 91–101, 2001.

[60] C. Wagner and J. W. Stuck, "Construction of an associative memory using unstable periodic orbits of a chaotic attractor," J. Theor. Biol., vol. 215, pp. 375–384, 2002.

[61] D. L. Wang and D. Terman, "Image segmentation based on oscillation correlation," Neural Computat., vol. 9, pp. 1623–1626, 1997.

[62] ——, "Locally excitatory globally inhibitory oscillator networks," IEEE Trans. Neural Networks, pp. 283–286, Nov. 1995.

[63] X. Wang, “Discrete-time Neural Networks as Dynamical Systems,”Ph.D. dissertation, Univ. Southern California, Los Angeles, CA, 1992.

[64] ——, "Period-doublings to chaos in a simple neural network: An analytic proof," Complex Syst., vol. 5, pp. 425–441, 1991.

[65] H. R. Wilson and J. D. Cowan, "Excitatory and inhibitory interactions in localized populations," Biophys. J., vol. 12, pp. 1–24, 1972.

[66] L. Wiskott, J. M. Fellous, N. Kruger, and C. von der Malsburg, "Face recognition by elastic bunch graph matching," IEEE Trans. Pattern Anal. Machine Intell., vol. 19, pp. 775–779, July 1997.

[67] Y. Yamaguchi and H. Shimizu, "Pattern recognition with figure-ground separation by generation of coherent oscillations," Neural Netw., vol. 7, no. 1, pp. 49–63, 1994.

[68] Y. Yao and W. J. Freeman, "Model of biological pattern recognition with spatially chaotic dynamics," Neural Netw., vol. 3, pp. 153–170, 1990.

[69] S. Yoshizawa, M. Morita, and S. Amari, "Capacity of associative memory using a nonmonotonic neuron model," Neural Netw., vol. 6, pp. 167–176, 1993.

[70] G. Yunfan, X. Jianxue, R. Wei, H. Sanjue, and W. Fuzhou, "Determining the degree of chaos from analysis of ISI time series in the nervous system: A comparison between correlation dimension and nonlinear forecasting methods," Biol. Cybern., vol. 78, pp. 159–165, 1998.

[71] C. S. Zhou and T. L. Chen, "Chaotic neural networks and chaotic annealing," Neurocomput., vol. 30, pp. 293–300, 2000.

[72] I. E. Gordon, Theory of Visual Perception. New York: Wiley, 1997.

[73] J. Piaget, The Psychology of Intelligence. Padstow, Cornwall, U.K.: TJ International, 1950.

Raymond S. T. Lee (M'98) received the B.Sc. degree from Hong Kong University, Hong Kong, in 1989 and the M.Sc. and Ph.D. degrees in information technology from Hong Kong Polytechnic University, Kowloon, in 1997 and 2000, respectively.

After graduation from Hong Kong University, he joined the Hong Kong Government in the Hong Kong Observatory as a Meteorological Scientist working on weather forecasting and developing meteorological telecommunication information systems from 1989 to 1993. Prior to joining Hong Kong Polytechnic University in

1998, he also worked as an MIS Manager and System Consultant in Hong Kong. He is now an Assistant Professor in the Department of Computing, Hong Kong Polytechnic University. His current research interests include artificial intelligence, ontology and ontological agents, chaotic neural networks, neural oscillators, intelligent agents, pattern recognition, visual perception and visual psychology, weather simulation, and forecasting.

Dr. Lee is a member of ACM.