
Post on 14-Jan-2016


LIACS Natural Computing Group Leiden University

Evolutionary Algorithms


Overview

• Introduction: Optimization and EAs

• Genetic Algorithms

• Evolution Strategies

• Theory

• Examples


Background I: Biology = Engineering (Daniel Dennett)


Introduction

• Modeling

• Simulation

• Optimization


Modeling:

– Input: known (measured)

– Output: known (measured)

– Interrelation: unknown

Simulation:

– Model: already exists

– Input: will be given

– What is the result for the input?

Optimization:

– Objective: will be given

– How (with which parameter settings) to achieve this objective?


Simulation vs. Optimization

Simulator

… what happens if?

Result

Trial & Error

Simulator

… how do I achieve the best result?

Optimal Result

Maximization / minimization; possibly multiple objectives

Optimizer


Introduction:

Optimization and Evolutionary Algorithms


Optimization

• f: objective function

– High-dimensional

– Non-linear, multimodal

– Discontinuous, noisy, dynamic

• Search space M = M1 × M2 × … × Mn, heterogeneous

• Restrictions possible over the search space

• Good local, robust optimum desired

• Realistic landscapes are like that!

[Figure: fitness landscape with the global minimum and a local, robust optimum marked]


• Illustrative Example: Optimize Efficiency

– Initial: [image]

– Evolution: [image]

• 32% Improvement in Efficiency!

Optimization Creating Innovation


Dynamic Optimization

• Dynamic Function

• 30-dimensional

• 3D-Projection


Classification of Optimization Algorithms

• Direct optimization algorithm:

Evolutionary Algorithms

• First order optimization algorithm:

e.g., gradient method

• Second order optimization algorithm:

e.g., Newton method


Iterative Optimization Methods

General description:

x^(t+1) = x^t + s_t · v_t

– x^t: actual point

– x^(t+1): new point

– v_t: directional vector

– s_t: step size (scalar)

[Figure: iteration path through points x^1, x^2, x^3]

• At every Iteration:

– Choose direction

– Determine step size

• Direction:

– Gradient

– Random

• Step size:

– 1-dim. optimization

– Random

– Self-adaptive


• Global convergence with probability one:

lim_{t→∞} Pr(x* ∈ P(t)) = 1

• General, but for practical purposes useless

• Convergence velocity:

φ = E(f_max(P(t+1)) − f_max(P(t)))

• Local analysis only, specific (convex) functions

The Fundamental Challenge


• Global convergence (with probability 1):

lim_{t→∞} Pr(x* ∈ P(t)) = 1

• General statement (holds for all functions)

• Useless for practical situations:

– Time plays a major role in practice

– Not all objective functions are relevant in practice

Theoretical Statements


[Figure: objective function f(x1, x2) with optimum (x*1, x*2) and value f(x*1, x*2)]

An Infinite Number of Pathological Cases!

• NFL-Theorem:

– All optimization algorithms perform equally well iff performance is averaged over all possible optimization problems.

• Fortunately: We are not interested in "all possible problems"


• Convergence velocity:

• Very specific statements

– Convex objective functions

– Describes convergence in local optima

– Very extensive analysis for Evolution Strategies

φ = E(f_max(P(t+1)) − f_max(P(t)))

Theoretical Statements


Evolutionary Algorithm Principles


Model-Optimization-Action

Evaluation

Optimizer

Business Process Model

Simulation

quality = Σ_{i=1}^{15} weight_i · ((calculated_i − desired_i) / scale_i)²

f_i(y) = …

– Function Model from Data

– Experiment

– Subjective Function(s)


Evolutionary Algorithms Taxonomy

Evolutionary Algorithms:

– Evolution Strategies

– Genetic Algorithms

– Other:

• Evolutionary Programming
• Differential Evolution
• GP
• PSO
• EDA
• Real-coded GAs
• …

Evolution Strategies:

• Mixed-integer capabilities

• Emphasis on mutation

• Self-adaptation

• Small population sizes

• Deterministic selection

• Developed in Germany

• Theory focused on convergence velocity

Genetic Algorithms:

• Discrete representations

• Emphasis on crossover

• Constant parameters

• Larger population sizes

• Probabilistic selection

• Developed in USA

• Theory focused on schema processing


Generalized Evolutionary Algorithm

t := 0;

initialize(P(t));

evaluate(P(t));

while not terminate do

P'(t) := mating_selection(P(t));

P''(t) := variation(P'(t));

evaluate(P''(t));

P(t+1) := environmental_selection(P''(t) ∪ Q);

t := t+1;

od
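The generalized loop above can be sketched in Python. The sphere model as fitness, Gaussian variation, binary tournament mating selection, and all function names are illustrative assumptions here; the slide leaves the operators abstract.

```python
import random

def evaluate(pop):
    # sphere model, to be minimized (illustrative choice)
    return [sum(x * x for x in ind) for ind in pop]

def mating_selection(pop, fit):
    # binary tournament: smaller fitness wins
    chosen = []
    for _ in range(len(pop)):
        i, j = random.randrange(len(pop)), random.randrange(len(pop))
        chosen.append(pop[i] if fit[i] < fit[j] else pop[j])
    return chosen

def variation(parents, sigma=0.1):
    # Gaussian mutation of every component
    return [[xi + random.gauss(0, sigma) for xi in p] for p in parents]

def environmental_selection(pop, fit, mu):
    # keep the mu best individuals
    ranked = sorted(zip(fit, pop))
    return [ind for _, ind in ranked[:mu]]

def evolve(mu=10, n=5, generations=50):
    pop = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(mu)]
    for _ in range(generations):
        fit = evaluate(pop)
        offspring = variation(mating_selection(pop, fit))
        union = pop + offspring          # P''(t) united with parents Q
        pop = environmental_selection(union, evaluate(union), mu)
    return pop
```

Because parents and offspring compete together in `environmental_selection`, the best fitness in the population never deteriorates (a plus-selection variant of the scheme).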


Genetic Algorithms


Genetic Algorithms: Mutation

• Mutation by bit inversion with probability pm.

• pm identical for all bits.

• pm small (e.g., pm = 1/n).

0 1 1 1 0 1 0 1 0 0 0 0 0 0 1

0 1 1 1 0 0 0 1 0 1 0 0 0 0 1
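A minimal sketch of bit-flip mutation with per-bit probability pm = 1/n (function name mine):

```python
import random

def mutate(bits, pm=None):
    # flip each bit independently with probability pm; default pm = 1/n
    n = len(bits)
    pm = 1.0 / n if pm is None else pm
    return [b ^ 1 if random.random() < pm else b for b in bits]
```

With the default pm = 1/n, on average one bit per individual is flipped.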


Genetic Algorithms: Crossover

• Crossover applied with probability pc.

• pc identical for all individuals.

• k-point crossover: k points chosen randomly.

• Example: 2-point crossover.
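A sketch of k-point crossover (default k = 2, as in the example; function name mine): segments between successive cut points are swapped alternately between the two parents.

```python
import random

def k_point_crossover(p1, p2, k=2):
    # choose k distinct cut points in 1..n-1, swap every second segment
    n = len(p1)
    points = sorted(random.sample(range(1, n), k))
    c1, c2 = list(p1), list(p2)
    swap = False
    prev = 0
    for cut in points + [n]:
        if swap:
            c1[prev:cut], c2[prev:cut] = c2[prev:cut], c1[prev:cut]
        swap = not swap
        prev = cut
    return c1, c2
```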


Genetic Algorithms: Selection

• Fitness proportional:

– f fitness, λ population size:

p_i = f(a_i) / Σ_{j=1}^{λ} f(a_j)

• Tournament selection:

– Randomly select q ≪ λ individuals.

– Copy best of these q into next generation.

– Repeat λ times.

– q is the tournament size (often: q = 2).
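The tournament scheme above can be sketched directly (function name mine; higher fitness is assumed better):

```python
import random

def tournament_selection(pop, fitness, q=2):
    # repeat len(pop) times: draw q distinct individuals, copy the best
    selected = []
    for _ in range(len(pop)):
        contenders = random.sample(range(len(pop)), q)
        best = max(contenders, key=lambda i: fitness[i])
        selected.append(pop[best])
    return selected
```

Larger q raises the selection pressure; with q equal to the population size, only the best individual is ever copied.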


Some Theory


Convergence Velocity Analysis

• (1+1)-GA, (1,λ)-GA, (1+λ)-GA

• For the counting ones function:

f(a) = Σ_{i=1}^{l} a_i

• Probability that mutation changes the fitness by k:

p_k(a) = Pr(f(m(a)) = f(a) + k)

• Convergence velocity:

φ = Σ_{k=k_min}^{l−f(a)} k · p_k(a)


Convergence Velocity Analysis II

• Optimum mutation rate?

p* = 1 / (2(f(a) + 1) − l)

• Absorption times from transition matrix in block form:

P = | I 0 |
    | R Q |

using N = (n_ij) = (I − Q)^(−1):

E(t_i) = Σ_j n_ij


Convergence Velocity Analysis III

• Absorption time as a function of the mutation rate p:

– p too large: exponential

– p too small: almost constant

– Optimal p: O(l ln l)


Convergence Velocity Analysis IV

• (1,λ)-GA (k_min = −f(a)), (1+λ)-GA (k_min = 0):

φ_λ = Σ_{k=k_min}^{l−f(a)} k · ( (Σ_{i=k_min}^{k} p_i)^λ − (Σ_{i=k_min}^{k−1} p_i)^λ )


Convergence Velocity Analysis V

• (1,λ)-GA, (1+λ)-GA, (1,λ)-ES, (1+λ)-ES

• Conclusion: Unifying, search-space independent theory ?


Convergence Velocity Analysis VI

• (μ,λ)-GA, (μ+λ)-GA: convergence velocity obtained via order statistics over the cumulative mutation transition probabilities p_{i,k}

• Theory vs. Experiment: [plot comparing theoretical and experimental convergence velocities]


Evolution Strategies


Evolution Strategy – Basics

• Mostly real-valued search space IR^n

– also mixed-integer, discrete spaces

• Emphasis on mutation

– n-dimensional normal distribution

– expectation zero

• Different recombination operators

• Deterministic selection

– (μ,λ)-selection: Deterioration possible

– (μ+λ)-selection: Only accepts improvements

• λ ≫ μ, i.e.: Creation of offspring surplus

• Self-adaptation of strategy parameters.


Representation of search points

a = (x_1, …, x_n)

a = ((x_1, …, x_n), σ)

• Simple ES with 1/5 success rule:

– Exogenous adaptation of step size σ

– Mutation: N(0, σ)

• Self-adaptive ES with single step size:

– One σ controls mutation for all x_i

– Mutation: N(0, σ)


Representation of search points

a = ((x_1, …, x_n), (σ_1, …, σ_n))

a = ((x_1, …, x_n), (σ_1, …, σ_n), (α_1, …, α_{n(n−1)/2}))

• Self-adaptive ES with individual step sizes:

– One individual σ_i per x_i

– Mutation: N_i(0, σ_i)

• Self-adaptive ES with correlated mutation:

– Individual step sizes σ_i

– One correlation angle α_ij per coordinate pair

– Mutation according to covariance matrix: N(0, C)


Evolution Strategy:

Algorithms: Mutation


Operators: Mutation – one σ

• Self-adaptive ES with one step size:

– One σ controls mutation for all x_i

Individual before mutation:

a = ((x_1, …, x_n), σ)

1.: Mutation of the step size:

σ' = σ · exp(τ_0 · N(0, 1))

2.: Mutation of the objective variables (here the new σ' is used!):

x_i' = x_i + σ' · N_i(0, 1)

Individual after mutation:

a' = ((x_1', …, x_n'), σ')
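A minimal Python sketch of these two mutation steps, assuming τ_0 = 1/√n per Schwefel's recommendation (function name mine):

```python
import math
import random

def mutate_one_sigma(x, sigma):
    # self-adaptive mutation with a single step size
    n = len(x)
    tau0 = 1.0 / math.sqrt(n)                        # learning rate
    sigma_new = sigma * math.exp(tau0 * random.gauss(0, 1))   # step 1
    # step 2 uses the NEW sigma'
    x_new = [xi + sigma_new * random.gauss(0, 1) for xi in x]
    return x_new, sigma_new
```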


Operators: Mutation – one σ

• Thereby τ_0 is the so-called learning rate

– Affects the speed of the σ-adaptation

– τ_0 bigger: faster, but more imprecise

– τ_0 smaller: slower, but more precise

– How to choose τ_0?

– According to recommendation of Schwefel*: τ_0 = 1 / √n

*H.-P. Schwefel: Evolution and Optimum Seeking, Wiley, NY, 1995.


Operators: Mutation – one

Position of parents (here: 5)

Offspring of a parent lie on a hypersphere (for n > 10); position is uniformly distributed

Contour lines of objective function


Pros and Cons: One σ

• Advantages:

– Simple adaptation mechanism

– Self-adaptation usually fast and precise

• Disadvantages:

– Bad adaptation in case of complicated contour lines

– Bad adaptation in case of very differently scaled object variables

• e.g., -100 < x_i < 100 and -1 < x_j < 1


Operators: Mutation – individual σ_i

• Self-adaptive ES with individual step sizes:

– One σ_i per x_i

Individual before mutation:

a = ((x_1, …, x_n), (σ_1, …, σ_n))

1.: Mutation of the individual step sizes:

σ_i' = σ_i · exp(τ' · N(0, 1) + τ · N_i(0, 1))

2.: Mutation of the objective variables (the new σ_i' are used here!):

x_i' = x_i + σ_i' · N_i(0, 1)

Individual after mutation:

a' = ((x_1', …, x_n'), (σ_1', …, σ_n'))
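The same steps with individual step sizes can be sketched as follows; the learning rates τ' = 1/√(2n) and τ = 1/√(2√n) follow Schwefel's suggestion, and the function name is mine:

```python
import math
import random

def mutate_individual_sigmas(x, sigmas):
    # self-adaptive mutation with one step size per variable
    n = len(x)
    tau_global = 1.0 / math.sqrt(2 * n)          # tau'
    tau_local = 1.0 / math.sqrt(2 * math.sqrt(n))  # tau
    g = random.gauss(0, 1)                        # only one global realisation
    new_sigmas = [s * math.exp(tau_global * g + tau_local * random.gauss(0, 1))
                  for s in sigmas]
    # objective variables use the NEW sigma_i'
    new_x = [xi + s * random.gauss(0, 1) for xi, s in zip(x, new_sigmas)]
    return new_x, new_sigmas
```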


Operators: Mutation – individual σ_i

• τ, τ' are learning rates, again

– τ': global learning rate

– N(0,1): only one realisation

– τ: local learning rate

– N_i(0,1): n realisations

– Suggested by Schwefel*:

τ' = 1 / √(2n)

τ = 1 / √(2√n)

*H.-P. Schwefel: Evolution and Optimum Seeking, Wiley, NY, 1995.


Position of parents (here: 5)

Offspring are located on the hyperellipsoid (for n > 10); position is uniformly distributed.

Contour lines

Operators: Mutation – individual i


Pros and Cons: Individual σ_i

• Advantages:

– Individual scaling of object variables

– Increased global convergence reliability

• Disadvantages:

– Slower convergence due to increased learning effort

– No rotation of the coordinate system possible

• Required for badly conditioned objective functions


Operators: Correlated Mutations

• Self-adaptive ES with correlated mutations:

– Individual step sizes σ_i

– One rotation angle α_ij for each pair of coordinates

– Mutation according to covariance matrix: N(0, C)

Individual before mutation:

a = ((x_1, …, x_n), (σ_1, …, σ_n), (α_1, …, α_{n(n−1)/2}))

1.: Mutation of the individual step sizes:

σ_i' = σ_i · exp(τ' · N(0, 1) + τ · N_i(0, 1))

2.: Mutation of the rotation angles:

α_j' = α_j + β · N_j(0, 1)

3.: Mutation of the objective variables (the new covariance matrix C' is used here!):

x' = x + N(0, C')

Individual after mutation:

a' = ((x_1', …, x_n'), (σ_1', …, σ_n'), (α_1', …, α_{n(n−1)/2}'))


Operators: Correlated Mutations

• Interpretation of the rotation angles α_ij

• Mapping onto covariances according to

c_ij = ½ (σ_i² − σ_j²) · tan(2 α_ij)

[Figure: axes x_1, x_2 with step sizes σ_1, σ_2 and rotation angle α_12]


Operators: Correlated Mutations

• τ, τ', β are again learning rates

– τ, τ' as before

– β = 0.0873 (corresponds to 5 degrees)

• Out of boundary correction:

|α_j'| > π  ⇒  α_j' := α_j' − 2π · sign(α_j')


Correlated Mutations for ES

Position of parents (here: 5)

Offspring are located on the rotatable hyperellipsoid (for n > 10); position is uniformly distributed.

Contour lines


Operators: Correlated Mutations

• How to create N(0, C)?

– Multiplication of the uncorrelated mutation vector with n(n−1)/2 rotational matrices:

σ_c = Π_{i=1}^{n−1} Π_{j=i+1}^{n} R(α_ij) · σ_u

– Generates only feasible (positive definite) correlation matrices


Operators: Correlated Mutations

• Structure of the rotation matrix R(α_ij): equal to the identity matrix, except for four entries:

(R(α_ij))_ii = (R(α_ij))_jj = cos(α_ij)

(R(α_ij))_ij = −(R(α_ij))_ji = −sin(α_ij)


Operators: Correlated Mutations

• Implementation of correlated mutations:

nq := n(n-1)/2;
for i := 1 to n do              (* generation of the uncorrelated mutation vector *)
    u[i] := sigma[i] * N_i(0,1);
od
for k := 1 to n-1 do            (* rotations *)
    n1 := n-k;
    n2 := n;
    for i := 1 to k do
        d1 := u[n1]; d2 := u[n2];
        u[n2] := d1*sin(alpha[nq]) + d2*cos(alpha[nq]);
        u[n1] := d1*cos(alpha[nq]) - d2*sin(alpha[nq]);
        n2 := n2-1;
        nq := nq-1;
    od
od
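The pseudocode above translates almost line-for-line to Python (0-based indexing; the function name is mine). With all angles zero the rotations are identities, so the result equals the uncorrelated vector.

```python
import math
import random

def correlated_mutation_vector(sigmas, alphas):
    # sigmas: n step sizes; alphas: n(n-1)/2 rotation angles
    n = len(sigmas)
    # uncorrelated mutation vector
    u = [s * random.gauss(0, 1) for s in sigmas]
    nq = n * (n - 1) // 2
    # rotations
    for k in range(1, n):
        n1 = n - k - 1          # 0-based index of u[n-k]
        n2 = n - 1
        for _ in range(k):
            d1, d2 = u[n1], u[n2]
            a = alphas[nq - 1]
            u[n2] = d1 * math.sin(a) + d2 * math.cos(a)
            u[n1] = d1 * math.cos(a) - d2 * math.sin(a)
            n2 -= 1
            nq -= 1
    return u
```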


Pros and Cons: Correlated Mutations

• Advantages:

– Individual scaling of object variables

– Rotation of coordinate system possible

– Increased global convergence reliability

• Disadvantages:

– Much slower convergence

– Effort for mutations scales quadratically

– Self-adaptation very inefficient


Operators: Mutation – Addendum

• Generating N(0,1)-distributed random numbers?

u ~ U[0,1),  u := 2u − 1

v ~ U[0,1),  v := 2v − 1

w := u² + v²

If w > 1: draw again.

x_1 := u · √(−2 ln(w) / w)

x_2 := v · √(−2 ln(w) / w)

x_1, x_2 ~ N(0,1)
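This polar method can be sketched as below (function name mine; the stdlib's `random.gauss` already exists, so the point is the method itself, not a missing facility). The extra `w > 0` guard avoids `log(0)` in the rare case u = v = 0.

```python
import math
import random

def gauss_pair():
    # polar method: rejection-sample a point in the unit disc, then transform
    while True:
        u = 2.0 * random.random() - 1.0
        v = 2.0 * random.random() - 1.0
        w = u * u + v * v
        if 0.0 < w <= 1.0:
            break
    factor = math.sqrt(-2.0 * math.log(w) / w)
    return u * factor, v * factor
```

Each call yields two independent N(0,1) variates.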


Evolution Strategy:

Algorithms: Recombination


Operators: Recombination

• Only for μ > 1

• Directly after selection

• Iteratively generates λ offspring:

for i := 1 to λ do
    choose recombinant r1 uniformly at random from parent_population;
    choose recombinant r2 ≠ r1 uniformly at random from parent_population;
    offspring := recombine(r1, r2);
    add offspring to offspring_population;
od


Operators: Recombination

• How does recombination work?

• Discrete recombination:

– Variable at position i will be copied at random (uniformly distributed) from parent 1 or parent 2, position i.

Parent 1

Parent 2

Offspring


Operators: Recombination• Intermediate recombination:

– Variable at position i is arithmetic mean ofParent 1 and Parent 2, position i.

Parent 1

Parent 2

Offspring

Offspring component i: (x_{r1,i} + x_{r2,i}) / 2


Operators: Recombination

Parent 1

Parent 2

Offspring

… Parent μ

• Global discrete recombination:

– Considers all parents


Operators: Recombination

Parent 1

Parent 2

Offspring

… Parent μ

• Global intermediary recombination:

– Considers all μ parents:

Offspring component i: (1/μ) · Σ_{j=1}^{μ} x_{r_j,i}
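The three recombination variants above can be sketched compactly (function names mine):

```python
import random

def discrete_recombination(p1, p2):
    # each component copied at random from parent 1 or parent 2
    return [random.choice(pair) for pair in zip(p1, p2)]

def intermediate_recombination(p1, p2):
    # each component is the arithmetic mean of the two parents
    return [(a + b) / 2.0 for a, b in zip(p1, p2)]

def global_intermediate(parents):
    # each component is the mean over ALL mu parents
    mu = len(parents)
    return [sum(col) / mu for col in zip(*parents)]
```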


Evolution Strategy

Algorithms: Selection


Operators: (μ+λ)-Selection

• (μ+λ)-Selection means:

– μ parents produce λ offspring by

• (Recombination +)

• Mutation

(Recombination may be left out; mutation always exists!)

– These μ + λ individuals will be considered together

– The best μ out of μ + λ will be selected ("survive")

• Deterministic selection

– This method guarantees monotony

• Deteriorations will never be accepted

(μ: actual solution candidates, λ: new solution candidates)


Operators: (μ,λ)-Selection

• (μ,λ)-Selection means:

– μ parents produce λ offspring by

• (Recombination +)

• Mutation

– The λ offspring will be considered alone

– The best μ out of λ offspring will be selected

• Deterministic selection

– The method doesn't guarantee monotony

• Deteriorations are possible


Operators: Selection

• Example: (2,3)-Selection

• Example: (2+3)-Selection

(2,3): Parents don't survive!

(2+3): Parents don't survive …

… but a worse offspring does.

… now this offspring survives.


Exception!

• Possible occurrences of selection:

– (1+1)-ES: One parent, one offspring, 1/5-Rule

– (1,λ)-ES: One parent, λ offspring

• Example: (1,10)-Strategy

• One step size / n self-adaptive step sizes

• Mutative step size control

• Derandomized strategy

– (μ,λ)-ES: μ > 1 parents, λ offspring

• Example: (2,15)-Strategy

• Includes recombination

• Can overcome local optima

– (μ+λ)-strategies: elitist strategies

Operators: Selection


Evolution Strategy:

Self-adaptation of step sizes


Self-adaptation

• No deterministic step size control!

• Rather: Evolution of step sizes

– Biology: Repair enzymes, mutator-genes

• Why should this work at all?

– Indirect coupling: step sizes – progress

– Good step sizes improve individuals

– Bad ones make them worse

– This yields an indirect step size selection


Self-adaptation: Example

• How can we test this at all?

• Need to know optimal step size …

– Only for very simple, convex objective functions

– Here: Sphere model

– Dynamic sphere model

• Optimum location changes occasionally

f(x) = Σ_{i=1}^{n} (x_i − x_i*)²

x*: optimum


Self-adaptation: Example

[Plot: objective function value, together with the largest, average, and smallest step size measured in the population; the step sizes track the theoretically optimal step sizes]


Self-adaptation

• Self-adaptation of one step size

– Perfect adaptation

– Learning time for back adaptation proportional to n

– Proofs only for convex functions

• Individual step sizes

– Experiments by Schwefel

• Correlated mutations

– Adaptation much slower


Evolution Strategy:

Derandomization


Derandomization

• Goals:

– Fast convergence speed

– Fast step size adaptation

– Precise step size adaptation

– Compromise convergence velocity – convergence reliability

• Idea: Realizations of N(0, σ) are important!

– Step sizes and realizations can be much different from each other

– Accumulates information over time


Derandomized (1,λ)-ES

• Current parent: x^g in generation g

• Mutation (k = 1, …, λ): offspring k:

x_k^g = x^g + δ^g · δ_scal^g · z_k

– δ^g: global step size in generation g

– δ_scal^g: individual step sizes in generation g

– z_k = (z_1, …, z_n), z_i ~ N(0, 1)

• Selection: Choice of best offspring:

x^{g+1} = x_sel^g

– x_sel^g: best of the λ offspring in generation g


Derandomized (1,λ)-ES

• Accumulation of selected mutations:

Z_A^g = (1 − c) · Z_A^{g−1} + c · Z_sel^g

– Z_sel^g: the particular mutation vector which created the parent!

• Also: weighted history of good mutation vectors!

• Initialization: Z_A^0 = 0

• Weight factor: c = 1/√n


Derandomized (1,λ)-ES

• Step size adaptation:

δ^{g+1} = δ^g · ( exp( ||Z_A^g|| / (√n · √(c / (2 − c))) − 1 + 1/(5n) ) )^β

δ_scal^{g+1} = δ_scal^g · ( |Z_A^g| / √(c / (2 − c)) + 0.35 )^{β_scal}

– ||Z_A^g||: norm of the vector

– |Z_A^g|: vector of absolute values

– β, β_scal regulate adaptation speed and precision


Derandomized (1,λ)-ES

• Explanations:

– Normalization of the average variations in case of missing selection (no bias):

√(c / (2 − c))

– Correction for small n: 1/(5n)

– Learning rates: β = 1/√n, β_scal = 1/n


Evolution Strategy:

Rules of thumb


Some Theory Highlights

• Convergence velocity:

φ ~ 1/n    (n: problem dimensionality)

• For (1,λ)-strategies:

φ ~ ln λ

Speedup by λ is just logarithmic – more processors are only to a limited extent useful to increase λ

• For (μ/μ,λ)-strategies (discrete and intermediary recombination):

φ ~ μ · ln(λ/μ)

Genetic Repair Effect of recombination!


• For strategies with global intermediary recombination: μ = λ/2

• Good heuristic for (1,λ): λ ≈ 4 + 3·ln n

n      μ       λ
10     5.45    10.91
20     6.49    12.99
30     7.10    14.20
40     7.53    15.07
50     7.87    15.74
60     8.14    16.28
70     8.37    16.75
80     8.57    17.15
90     8.75    17.50
100    8.91    17.82
110    9.05    18.10
120    9.18    18.36
130    9.30    18.60
140    9.41    18.82
150    9.52    19.03
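The table rows follow directly from the heuristic; a one-line sketch (function name mine):

```python
import math

def recommended_lambda(n):
    # heuristic from the table: lambda = 4 + 3*ln(n), mu = lambda/2
    lam = 4.0 + 3.0 * math.log(n)
    return lam / 2.0, lam
```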


Mixed-Integer Evolution Strategies


Mixed-Integer Evolution Strategy

• Generalized optimization problem:


Mixed-Integer ES: Mutation

Learning rates (global)

Learning rates (global)

Geometrical distribution

Mutation Probabilities


Some Application Examples


Safety Optimization – Pilot Study

• Aim: Identification of most appropriate Optimization Algorithm for realistic example!

• Optimizations for 3 test cases and 14 algorithms were performed (28 x 10 = 280 shots)

– Body MDO Crash / Statics / Dynamics

– MCO B-Pillar

– MCO Shape of Engine Mount

• NuTech’s ES performed significantly better than Monte-Carlo-scheme, GA, and Simulated Annealing

• Results confirmed by statistical hypothesis testing


MDO Crash / Statics / Dynamics

• Minimization of body mass

• Finite element mesh

– Crash ~ 130,000 elements

– NVH ~ 90,000 elements

• Independent parameters: Thickness of each unit: 109

• Constraints: 18

Algorithm     Avg. reduction (kg)   Max. reduction (kg)   Min. reduction (kg)
Best so far   -6.6                  -8.3                  -3.3
NuTech ES     -9.0                  -13.4                 -6.3


MCO B-Pillar – Side Crash

• Minimization of mass & displacement

• Finite element mesh

– ~ 300,000 elements

• Independent parameters: Thickness of 10 units

• Constraints: 0

• ES successfully developed Pareto front

[Plot: Pareto front of mass vs. intrusion]


MCO Shape of Engine Mount

• Mass-minimal shape with axial load > 90 kN

• Finite element mesh

– ~ 5000 elements

• Independent parameters: 9 geometry variables

• Dependent parameters: 7

• Constraints: 3

• ES optimized mount

– less weight than mount optimized with best so far method

– geometrically better deformation


Safety Optimization – Example of use

• Production Run!

• Minimization of body mass

• Finite element mesh

– Crash ~ 1,000,000 elements

– NVH ~ 300,000 elements

• Independent parameters:

– Thickness of each unit: 136

• Constraints: 47, resulting from various loading cases

• 180 (10 x 18) shots ~ 12 days

• No statistical evaluation due to problem complexity


Safety Optimization – Example of use

• 13.5 kg weight reduction by NuTech's ES

• ~ 2 kg more mass reduction than best-so-far method

• Typically higher convergence velocity of ES:

~ 45% less time (~ 3 days saving) needed for comparable quality

• Still potential for improvements after 180 shots.

• Reduction of development time from 5 to 2 weeks allows for process integration

[Plots: mass vs. generations, starting from the initial value, for the best-so-far method and for NuTech's Evolution Strategy]


MDO – Optimization Run

[Plot: fitness of all individuals over the run, with infeasible and feasible individuals marked; fitness improves from 107.13% via 105.05% to 100.00%]


Optimization of metal stamping process

• Objective: Minimization of defects in the produced parts.

• Variables: Geometric parameters and forces.

• ES finds very good results in short time

• Computationally expensive simulation


Bridgeman Casting Process

• Objective: Max. homogeneity of workpiece after Casting Process

• Variables: 18 continuous speed variables for Casting Schedule

• Computationally expensive simulation (up to 32h simulation time)

[Figure: turbine blade after casting – initial (DoE), GCM (commercial gradient-based method), and Evolution Strategy results]


Steel Cube Temperature Control

Minimize the deviation of the temperature at the cube's surface from 1000 °C!

Input: temperature profile (12 variables: 7 temperatures and 5 time steps)

Algorithm              100 OFE   200 OFE   1000 OFE
SQP                    148       67.7      59
Conjugate directions   135       135       135
Pattern search         157       147       48
DSCP/Rosenbrock        88        38        29
Complex strategy       152       152       111
(1,3)-DES              91        67        67
(1,5)-DES              53        48        48
(1,7)-DES              110       58        58
(1,10)-DES             152       80        28
MAES                   122       65        12


Siemens C Reactor: Growing a High-Purity Silicon Rod from Reaction Gas TCS

• Maximisation of growth speed

• Minimisation of diameter differences

• FLUENT: simulation of fluid flow; CASTS: calculation of temperature and concentration field

• Optimisation of 15 process parameters

• Production time: 35 hours → 30 hours

[Figure: flow field, 0 m/s to 3 m/s]


Multipoint Airfoil Optimization

• Objective: Find pressure profiles that are a compromise between two given target pressure distributions under two given flow conditions!

• Variables: 12 to 18 Bezier points for the airfoil

High Lift!

Low Drag!

Start

Cruise


Traffic Light Control Optimization

• Objective: Minimization of total delay / number of stops

• Variables: Green times for next switching schedule

• Dynamic optimization, depending on actual traffic

• Better results (3-5%)

• Higher flexibility than with traditional controllers


Optimization of elevator control

• Minimization of passenger waiting times.

• Better results (~10%) than with traditional controller.

• Dynamic optimization, depending on actual traffic.


Optimization of network routing

• Minimization of end-to-end-blockings under service constraints.

• Optimization of routing tables for existing, hard-wired networks.

• 10%-1000% improvement.


Automatic battery configuration

• Configuration and optimization of industry batteries.

• Based on user specifications, given to the system.

• Internet-Configurator (Baan, Hawker, NuTech).


Optimization of reactor fueling.

• Minimization of total costs.

• Creates new fuel assembly reload patterns.

• Clear improvements (1%-5%) of existing expert solutions.

• Huge cost saving.


Optical Coatings: Design Optimization

• Nonlinear mixed-integer problem, variable dimensionality.

• Minimize deviation from desired reflection behaviour.

• Excellent synthesis method; robust and reliable results.

• MOC-Problems: anti-reflection coatings, dielectric filters

• Robust design: Sharp peaks vs. Robust peaks


Example: Intravascular Ultrasound Image Analysis

• Real-time high-resolution tomographic images from the inside of coronary vessels:


Intravascular Ultrasound Image Analysis

• Detected Features in an IVUS Image:

Vessel Border

Sidebranch

Lumen

Shadow

Plaque


Experimental Results on IVUS images

• Parameters for the lumen feature detector


Intravascular Ultrasound Image Analysis: Results

• On each of 5 data sets the algorithm ran for 2804 evaluations – 19.5 h of total computing time

• Significant improvement over expert tuning

Performance of the best found MI-ES parameter solutions

• A paired two-tailed t-test was performed on the difference measurements for each image dataset using a 95% confidence interval (p=0.05)

• The null-hypothesis is that the mean difference results of the best MI-ES individual and the default parameters are equal.


Literature

• H.-P. Schwefel: Evolution and Optimum Seeking, Wiley, NY, 1995.

• I. Rechenberg: Evolutionsstrategie 94, frommann-holzboog, Stuttgart, 1994.

• Th. Bäck: Evolutionary Algorithms in Theory and Practice, Oxford University Press, NY, 1996.

• Th. Bäck, D.B. Fogel, Z. Michalewicz (eds.): Handbook of Evolutionary Computation, Vols. 1, 2, Institute of Physics Publishing, 2000.
