Associative Learning presentation


Upload: mairos-kunze-bonga

Posted on 17-Aug-2015



TRANSCRIPT

Associative Learning

Since NN are modelled after the human brain, we will first explore associative learning in humans, then move on to the NN themselves.

What is associative learning?

Associative learning is the process by which an association between two stimuli, or between a behaviour and a stimulus, is learned. It is the human ability to retrieve information from applied associated stimuli.

Example 1: you can remember a relative of yours after many years of not seeing them, and even if they have changed and grown older we still recall who they are.

Example 2: Human memory is essentially associative. One thing may remind us of another, and that of another, and so on. We use a chain of mental associations to recover a lost memory. If we forget where we left an umbrella, we try to recall where we last had it, what we were doing, and who we were talking to. We attempt to establish a chain of associations, and thereby to restore a lost memory.

Association, not isolation

The brain provides the perfect environment for associations to prosper. DO THIS NOW: close your eyes and take 10 seconds to try visualizing only the nose on your mother's face. Nothing else, just her nose. Now take 10 seconds and try to visualize only the stove in your kitchen. Nothing else, just the stove. You can't do it, because as soon as you try to imagine your mom's nose, up come her eyes, cheeks, and chin. And as soon as you try to think of only your stove, up pop the countertops and cupboards that surround it.

There are three major reasons why your brain has difficulty thinking of things in isolation and why it has a powerful tendency to make associations even when you don't want it to: a memory is not held in a single cell, as the grandmother theory of learning once proposed, but instead in widely distributed groups of interconnected neurons. One neuron in a network that holds a single memory has the capacity to be part of hundreds if not thousands of other memory networks.
Neurons that "wire together fire together." This phrase means that, when a neuron that is part of one memory network is activated, it will automatically fire up all other neurons in other memory networks with which it has been previously linked.

Pavlov's dog experiment

In this experiment, at first, the conductivity of the devices was programmed so that only the sight of food (unconditioned stimulus) triggers the activation of the output neuron (salivation). Before the association is made, the hearing of the bell (neutral stimulus) does not trigger the salivation, as the conductivity of the corresponding synapse is below the threshold of the output neuron. When both inputs are active simultaneously, the conductivity of the second synapse increases until it reaches the threshold (conditioning): the association is made. From this point, the hearing of the bell alone triggers the activation of the output neuron (conditioned stimulus). In the figure we can verify that, without feedback from the output neuron, the conductivity change implied by the pre-synaptic pulses alone is not enough to create the association. Our associative learning circuit is symmetrical, which means that there is no difference between the two inputs.

Soon after he recognized that a memory was held not in a single cell but in groups of "cell assemblies," Donald O. Hebb realized that associative learning can only be created when large groups of neural networks are fired simultaneously. In his now famous Essay on Mind, he wrote: "These self-re-exciting systems (cell assemblies) could not consist of one circuit of two or three neurons, but must have a number of circuits... could assume that when a number of neurons in the cortex are excited... they tend to become interconnected, some of them at least forming a multi-circuit closed system... The idea then (1940) was that a percept consists of assemblies...
a concept of assemblies excited centrally by other assemblies."

Modern research has confirmed Hebb's findings, and today the idea that associations are made, learning created, and memory cemented when large groups of neurons fire simultaneously is often called the "Hebbian synaptic learning rule." When neurons fire in unison, memory is enhanced because the possibility is increased that a neuron will be stimulated at more than one location.

Associative learning in NN

The goal of learning is to associate known input vectors with given output vectors. Contrary to continuous mappings, the neighbourhood of a known input vector x should also be mapped to the image y of x; that is, if B(x) denotes all vectors whose distance from x (using a suitable metric) is lower than some positive constant ε, then we expect the network to map B(x) to y. Noisy input vectors can then be associated with the correct output. A learning algorithm derived from biological neurons can be used to train associative networks: it is called Hebbian learning.

There are 3 overlapping kinds of associative networks:

1. Heteroassociative networks map m input vectors x1, x2, ..., xm in n-dimensional space to m output vectors y1, y2, ..., ym in k-dimensional space, so that xi → yi. If ||x − xi|| < ε, then x → yi. This should be achieved by the learning algorithm, but becomes very hard when the number m of vectors to be learned is too high.

2. Autoassociative networks are a special subset of the heteroassociative networks, in which each vector is associated with itself, i.e., yi = xi for i = 1, ..., m. The function of such networks is to correct noisy input vectors.

3. Pattern recognition networks are also a special type of heteroassociative networks. Each vector xi is associated with the scalar i.
The goal of such a network is to identify the "name" of the input pattern.

Cont'd: associative mapping, in which the network learns to produce a particular pattern on the set of output units whenever another particular pattern is applied on the set of input units. The associative mapping can generally be broken down into two mechanisms:

- auto-association: an input pattern is associated with itself, and the states of input and output units coincide. This is used to provide pattern completion, i.e. to produce a pattern whenever a portion of it or a distorted pattern is presented.

- hetero-association: the network stores pairs of patterns, building an association between two sets of patterns. It is related to two recall mechanisms: nearest-neighbour recall, where the output pattern produced corresponds to the stored input pattern closest to the pattern presented; and interpolative recall, where the output pattern is a similarity-dependent interpolation of the stored patterns corresponding to the pattern presented.

Yet another paradigm, which is a variant of associative mapping, is classification, i.e. when there is a fixed set of categories into which the input patterns are to be classified.
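The conditioning circuit described under Pavlov's dog experiment can be sketched as a two-synapse threshold unit. This is a minimal illustrative sketch, not the circuit from the slides: the class name, weight values, threshold, and learning rate are all assumptions chosen only to show the mechanism (bell synapse below threshold, strengthened only when bell input and output neuron fire together).

```python
# Minimal sketch (all numbers are illustrative assumptions) of the Pavlov
# conditioning circuit: an output "salivation" neuron with two synapses.
# The food synapse starts at threshold, the bell synapse below it.

THRESHOLD = 1.0

class ConditioningCircuit:
    def __init__(self):
        self.w_food = 1.0   # unconditioned stimulus: already triggers output
        self.w_bell = 0.2   # neutral stimulus: below threshold

    def output(self, food, bell):
        # The output neuron fires when the total synaptic drive
        # reaches the threshold.
        drive = self.w_food * food + self.w_bell * bell
        return drive >= THRESHOLD

    def present(self, food, bell, lr=0.2):
        fired = self.output(food, bell)
        # Hebbian-style update with output feedback: the bell synapse only
        # strengthens when the bell input and the output neuron are active
        # at the same time. Pre-synaptic pulses alone change nothing.
        if fired and bell:
            self.w_bell = min(self.w_bell + lr, THRESHOLD)
        return fired

c = ConditioningCircuit()
print(c.output(food=0, bell=1))   # False: bell alone does not trigger salivation

for _ in range(5):                # pair bell with food (conditioning trials)
    c.present(food=1, bell=1)

print(c.output(food=0, bell=1))   # True: the association has been made
```

Because the update requires the output neuron to have fired, presenting the bell alone any number of times never creates the association, matching the observation attributed to the figure.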
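The Hebbian learning rule and autoassociative pattern completion discussed above can be sketched in NumPy. This is a sketch under assumptions: the bipolar patterns, their size, and the function names are illustrative, not from the presentation. The weight matrix is the sum of Hebbian outer products W = Σ xi xiᵀ (each vector associated with itself, the autoassociative case yi = xi), and recall thresholds the summed synaptic drive.

```python
import numpy as np

def hebbian_train(patterns):
    # Hebb rule: every pair of co-active units strengthens its connection,
    # summed over all stored patterns (outer products).
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for x in patterns:
        W += np.outer(x, x)
    np.fill_diagonal(W, 0)         # no self-connections
    return W

def recall(W, x):
    # Threshold the summed drive of each unit back to a bipolar pattern.
    return np.sign(W @ x)

# Two illustrative bipolar (+1/-1) patterns to store.
patterns = np.array([
    [ 1,  1,  1, -1, -1, -1,  1, -1],
    [-1, -1,  1,  1, -1,  1, -1,  1],
])
W = hebbian_train(patterns)

noisy = patterns[0].copy()
noisy[0] = -noisy[0]               # flip one bit: a distorted input
restored = recall(W, noisy)
print(np.array_equal(restored, patterns[0]))  # True: pattern completion
```

This shows the two properties named in the text: a stored pattern is a fixed point of recall (nearest-neighbour behaviour), and a distorted version of it is corrected back to the stored pattern, i.e. noisy input vectors are associated with the correct output.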