AIS Machine Learning


Posted on 24-May-2015






1. Artificial Immune Systems
   Andrew Watkins

2. Why the Immune System?
   - Recognition
   - Anomaly detection
   - Noise tolerance
   - Robustness
   - Feature extraction
   - Diversity
   - Reinforcement learning
   - Memory
   - Distributed
   - Multi-layered
   - Adaptive

3. Definition
   AIS are adaptive systems, inspired by theoretical immunology and observed immune functions, principles and models, which are applied to complex problem domains. (de Castro and Timmis)

4. Some History
   - Developed from the field of theoretical immunology in the mid 1980s, when it was suggested we might look to the immune system for inspiration
   - 1990: Bersini makes the first use of immune algorithms to solve problems
   - Mid 1990s: Forrest et al., computer security
   - Mid 1990s: Hunt et al., machine learning

5. How does it work?

6. Immune Pattern Recognition
   - Immune recognition is based on the complementarity between the binding region of the receptor and a portion of the antigen called the epitope
   - Antibodies present a single type of receptor; antigens may present several epitopes
   - This means that different antibodies can recognize a single antigen

7. Immune Responses
   [Figure: antibody concentration over time. The primary response to antigen Ag1 follows a lag; the secondary response to Ag1 (on exposure to Ag1 + Ag2) is faster and stronger, while Ag2 provokes its own lagged primary response; Ag1 + Ag3 produces a cross-reactive response.]

8. Clonal Selection

9. Immune Network Theory
   - Idiotypic network (Jerne, 1974)
   - B cells co-stimulate each other, treating each other a bit like antigens
   - This creates an immunological memory

10. Shape Space Formalism
   - The repertoire of the immune system is complete (Perelson, 1989)
   - Extensive regions of complementarity
   - Some threshold of recognition

11. Self/Non-Self Recognition
   - The immune system needs to be able to differentiate between self and non-self cells
   - Antigenic encounters may result in cell death; therefore there is some kind of positive selection and some element of negative selection

12. General Framework for AIS
   Application Domain → Representation → Affinity Measures → Immune Algorithms → Solution
13. Representation: Shape Space
   - Describes the general shape of a molecule
   - Describes interactions between molecules
   - Degree of binding between molecules
   - Complement threshold

14. Define Their Interaction: Affinity
   - Affinity is related to distance, e.g. Euclidean:
     D = sqrt( Σ_{i=1..L} (Ab_i − Ag_i)² )
   - Other distance measures: Hamming, Manhattan, etc.
   - Affinity threshold

15. Basic Immune Models and Algorithms
   - Bone Marrow Models
   - Negative Selection Algorithms
   - Clonal Selection Algorithm
   - Somatic Hypermutation
   - Immune Network Models

16. Bone Marrow Models
   - Gene libraries are used to create antibodies from the bone marrow
   - This idea is used to generate the attribute strings that represent receptors
   - Antibody production through a random concatenation from gene libraries

17. Negative Selection Algorithms
   - Forrest 1994: idea taken from the negative selection of T-cells in the thymus
   - Applied initially to computer security
   - Split into two parts: censoring and monitoring

18. Clonal Selection Algorithm (de Castro & von Zuben, 2001)
   Randomly initialise a population (P)
   Repeat
     For each pattern in Ag
       Determine affinity to each Ab in P
       Select the n highest-affinity Abs from P
       Clone and mutate in proportion to affinity with Ag
       Add new mutants to P
     EndFor
     Select the highest-affinity Ab in P to form part of M
     Replace n random ones with new Abs
   Until stopping criteria are met

19. Immune Network Models (Timmis & Neal, 2001)
   Initialise the immune network (P)
   For each pattern in Ag
     Determine affinity to each Ab in P
     Calculate network interaction
     Allocate resources to the strongest members of P
     Remove the weakest Abs in P
   EndFor
   If termination condition met
     exit
   Else
     Clone and mutate each Ab in P (based on a given probability)
     Integrate new mutants into P based on affinity
     Repeat

20. Somatic Hypermutation
   - Mutation rate in inverse proportion to affinity
   - Very controlled mutation in the natural immune system
   - The greater the antibody's affinity, the smaller its mutation rate
   - Classic trade-off between exploration and exploitation

21. How do AIS Compare?
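The Euclidean affinity of slide 14, the clonal selection loop of slide 18, and the affinity-inverse hypermutation of slide 20 fit together in a few lines. The sketch below is illustrative only: the population size, clone count, and the exp(-affinity) mutation scale are demonstration choices, not parameters from the published algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def affinity(ab, ag):
    """Affinity as inverse Euclidean distance in shape space."""
    return 1.0 / (1.0 + np.linalg.norm(ab - ag))

def clonalg_step(P, ag, n_select=5, n_clones=10):
    """One antigen presentation of a CLONALG-style step (illustrative)."""
    aff = np.array([affinity(ab, ag) for ab in P])
    best = np.argsort(aff)[-n_select:]  # the n highest-affinity antibodies
    clones = []
    for i in best:
        for _ in range(n_clones):
            # Somatic hypermutation: higher affinity -> smaller mutation rate
            rate = np.exp(-aff[i])
            clones.append(P[i] + rng.normal(0.0, rate, size=P[i].shape))
    return np.vstack([P, clones])

ag = np.array([0.5, 0.5])          # target antigen
P = rng.random((20, 2))            # random initial antibody population
P2 = clonalg_step(P, ag)           # population after cloning and mutation
best_ab = max(P2, key=lambda ab: affinity(ab, ag))
```

Running further iterations, replacing the weakest members with fresh random antibodies each round, gives the full loop of slide 18.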
   Basic Components:
   - AIS: B-cell in shape space (e.g. attribute strings); stimulation level
   - ANN: neuron; activation function
   - GA: chromosome; fitness

22. Comparing Structure (Architecture)
   - AIS and GA: fixed- or variable-sized populations; individuals are not connected in population-based AIS
   - ANN and AIS: network-based AIS do exist
   - ANN: typically a fixed structure (though not always); learning takes place in the weights

23. Comparing Memory
   - AIS: in B-cells (in network models, in the connections)
   - ANN: in the weights of the connections
   - GA: in the individual chromosome

24. Comparing Adaptation
   - Dynamics
   - Metadynamics
   - Interactions
   - Generalisation capabilities
   - ...and many more

25. Where are they used?
   - Dependable systems
   - Scheduling
   - Robotics
   - Security
   - Anomaly detection
   - Learning systems

26. Artificial Immune Recognition System (AIRS): An Immune-Inspired Supervised Learning Algorithm

27. AIRS: Immune Principles Employed
   - Clonal selection (based initially on immune networks, though this was found not to work)
   - Somatic hypermutation (eventually)
   - Recognition regions within shape space
   - Antibody/antigen binding

28. AIRS: Mapping from IS to AIS
   - Antibody → feature vector
   - Recognition Ball (RB) → combination of feature vector and vector class
   - Antigens → training data
   - Immune memory → memory cells and a set of mutated artificial RBs (ARBs)

29. Classification
   - Stimulation of an ARB is based not only on its affinity to an antigen but also on its class compared to the class of the antigen
   - Allocation of resources to the ARBs also takes into account the ARBs' classifications compared to the class of the antigen
   - Memory cell hypermutation and replacement are based primarily on classification and secondarily on affinity

30. AIRS Algorithm
   - Data normalization and initialization
   - Memory cell identification and ARB generation
   - Competition for resources in the development of a candidate memory cell
   - Potential introduction of the candidate memory cell into the set of established memory cells

31. Memory Cell Identification
   [Diagram: an antigen A is presented to the Memory Cell Pool and the ARB Pool.]

32. MCmatch Found
   [Diagram, step 1: the best-matching memory cell, MCmatch, is identified in the Memory Cell Pool.]
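Steps 1 and 2 of the diagrams above (identifying MCmatch) amount to a same-class nearest-neighbour search under the stimulation measure. A minimal sketch, assuming features normalized to [0, 1], stimulation defined as 1 minus normalized Euclidean distance, and dictionary-based cells; all of these representation choices are illustrative rather than taken from the AIRS paper:

```python
import math

def stimulation(vec_a, vec_b):
    """Stimulation = 1 - normalized Euclidean distance (features in [0, 1])."""
    d = math.sqrt(sum((a - b) ** 2 for a, b in zip(vec_a, vec_b)))
    return 1.0 - d / math.sqrt(len(vec_a))

def find_mc_match(memory_cells, antigen, antigen_class):
    """Pick the same-class memory cell most stimulated by the antigen."""
    same_class = [mc for mc in memory_cells if mc["cls"] == antigen_class]
    if not same_class:
        return None  # the antigen itself would then seed a new memory cell
    return max(same_class, key=lambda mc: stimulation(mc["vec"], antigen))

mcs = [{"vec": [0.1, 0.2], "cls": 0},
       {"vec": [0.8, 0.9], "cls": 1},
       {"vec": [0.4, 0.4], "cls": 1}]
mc_match = find_mc_match(mcs, [0.5, 0.5], antigen_class=1)
```

Note that the class filter comes first: a memory cell of the wrong class is never a candidate for MCmatch, which is what makes the stimulation measure class-aware, as described on slide 29.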
33. ARB Generation
   [Diagram, step 2: MCmatch produces mutated offspring, which join the ARB Pool.]

34. Exposure of ARBs to Antigen
   [Diagram, step 3: the ARBs in the ARB Pool are exposed to the antigen.]

35. Development of a Candidate Memory Cell
   [Diagram, step 3 continued: the ARBs compete for resources and a candidate memory cell emerges from the ARB Pool.]

36. Comparison of MCcandidate and MCmatch
   [Diagram, step 4: the candidate memory cell, MCcandidate, is compared against MCmatch.]

37. Memory Cell Introduction
   [Diagram, step 5: MCcandidate is (potentially) introduced into the Memory Cell Pool.]

38.-39. Memory Cells and Antigens
   [Plots of evolved memory cells against the antigens (training data).]

40. AIRS: Performance Evaluation
   - Fisher's Iris data set
   - Pima Indians Diabetes data set
   - Ionosphere data set
   - Sonar data set

41. Published accuracies, with AIRS ranked among them:

   | Rank | Iris | Ionosphere | Diabetes | Sonar |
   |------|------|------------|----------|-------|
   | 1 | Grobian (rough) 100% | 3-NN + simplex 98.7% | Logdisc 77.7% | TAP MFT Bayesian 92.3% |
   | 2 | SSV 98.0% | 3-NN 96.7% | IncNet 77.6% | Naive MFT Bayesian 90.4% |
   | 3 | C-MLP2LN 98.0% | IB3 96.7% | DIPOL92 77.6% | SVM 90.4% |
   | 4 | PVM 2 rules 98.0% | MLP + BP 96.0% | Linear Discr. Anal. 77.5-77.2% | Best 2-layer MLP + BP, 12 hidden 90.4% |
   | 5 | PVM 1 rule 97.3% | AIRS 94.9% | SMART 76.8% | MLP + BP, 12 hidden 84.7% |
   | 6 | AIRS 96.7% | C4.5 94.9% | GTO DT (5xCV) 76.8% | MLP + BP, 24 hidden 84.5% |
   | 7 | FuNe-I 96.7% | RIAC 94.6% | ASI 76.6% | 1-NN, Manhattan 84.2% |
   | 8 | NEFCLASS 96.7% | SVM 93.2% | Fisher discr. anal. 76.5% | AIRS 84.0% |
   | 9 | CART 96.0% | Non-linear perceptron 92.0% | MLP + BP 76.4% | MLP + BP, 6 hidden 83.5% |
   | 10 | FUNN 95.7% | FSM + rotation 92.8% | LVQ 75.8% | FSM 83.6% |
   | 11 | | 1-NN 92.1% | LFC 75.8% | 1-NN Euclidean 82.2% |
   | 12 | | DB-CART 91.3% | RBF 75.7% | DB-CART, 10xCV 81.8% |
   | 13 | | Linear perceptron 90.7% | NB 75.5-73.8% | CART, 10xCV 67.9% |
   | 14 | | OC1 DT 89.5% | kNN, k=22, Manh. 75.5% | |
   | 15 | | CART 88.9% | MML 75.5% | |
   | ... | | | ... | |
   | 22 | | | AIRS 74.1% | |
   | 23 | | | C4.5 73.0% | |

   (On Diabetes, 11 others were reported with lower scores, including Bayes, Kohonen, kNN, and ID3.)

42. AIRS: Observations
   - The ARB Pool formulation was overcomplicated
   - Crude visualization
   - Memory only needs to be maintained in the Memory Cell Pool
   - Mutation routine: a difference in quality, and some redundancy
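Steps 4 and 5 above (comparing MCcandidate with MCmatch, then introducing the candidate) reduce to two tests: does the candidate out-stimulate the match on the antigen, and if so, are the two cells so similar that the match is redundant? A minimal sketch under the same normalized-distance assumption as before; the `ats` scalar, the dictionary representation, and the concrete numbers are all illustrative:

```python
import math

def dist(a, b):
    """Normalized Euclidean distance (features assumed in [0, 1])."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b))) / math.sqrt(len(a))

def introduce_candidate(pool, mc_match, mc_candidate, antigen,
                        affinity_threshold, ats=0.2):
    """Steps 4-5: the candidate enters the pool only if it is closer to the
    antigen (more stimulated) than MCmatch; if candidate and match are
    then nearly identical, the candidate also replaces the match."""
    if dist(mc_candidate["vec"], antigen) < dist(mc_match["vec"], antigen):
        pool.append(mc_candidate)
        if dist(mc_candidate["vec"], mc_match["vec"]) < affinity_threshold * ats:
            pool.remove(mc_match)
    return pool

mc_match = {"vec": [0.4, 0.4], "cls": 1}
mc_candidate = {"vec": [0.45, 0.45], "cls": 1}
pool = [mc_match]
pool = introduce_candidate(pool, mc_match, mc_candidate, [0.5, 0.5],
                           affinity_threshold=0.5)
```

In this example the candidate is both closer to the antigen and within the replacement radius of MCmatch, so the old memory cell is discarded; this replacement rule is what keeps the memory pool compact, feeding the data-reduction figures reported later.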
43. AIRS: Revisions
   - Memory cell evolution only: only the Memory Cell Pool holds different classes; the ARB Pool is concerned solely with evolving memory cells
   - Somatic hypermutation: a cell's stimulation value indicates its range of mutation possibilities; there is no longer any need to mutate the class

44. Comparisons: Classification Accuracy
   It is important to maintain accuracy.

   | Data set | AIRS1 accuracy | AIRS2 accuracy |
   |----------|----------------|----------------|
   | Iris | 96.7 | 96.0 |
   | Ionosphere | 94.9 | 95.6 |
   | Diabetes | 74.1 | 74.2 |
   | Sonar | 84.0 | 84.9 |

   So why bother?

45. Comparisons: Data Reduction
   Increased data reduction means increased efficiency.

   | Data set | Training set size | AIRS1 memory cells | AIRS2 memory cells |
   |----------|-------------------|--------------------|--------------------|
   | Iris | 120 | 42.1 / 65% | 30.9 / 74% |
   | Ionosphere | 200 | 140.7 / 30% | 96.3 / 52% |
   | Diabetes | 691 | 470.4 / 32% | 273.4 / 60% |
   | Sonar | 192 | 144.6 / 25% | 177.7 / 7% |

46. Features of AIRS
   - No need to know the best architecture to get good results
   - Default settings come within a few percent of the best the algorithm can achieve
   - User-adjustable parameters optimize performance for a given problem set
   - Generalization and data reduction

47. More Information
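The AIRS2 revision on slide 43 ties a cell's mutation range to its stimulation value. A minimal sketch, assuming features normalized to [0, 1] and a mutation window of width 1 - stim centred on each feature value (the exact window shape used in AIRS2 may differ):

```python
import random

def mutate(cell_vec, stim, rng=random.Random(0)):
    """AIRS2-style sketch: a highly stimulated (well-matched) cell mutates
    within a narrow window around each feature; a weakly stimulated one
    roams more widely. Window width = 1 - stim, clipped to [0, 1]."""
    span = 1.0 - stim
    out = []
    for v in cell_vec:
        lo = max(0.0, v - span / 2)
        hi = min(1.0, v + span / 2)
        out.append(rng.uniform(lo, hi))
    return out

# A cell with stimulation 0.9 may only drift by at most 0.05 per feature:
child = mutate([0.3, 0.7], stim=0.9)
```

Because the class is carried unchanged alongside the feature vector, there is no longer any need to mutate it, which is exactly the simplification slide 43 describes.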

