Evolutionary Computational Intelligence, Lecture 6a: Multimodality


TRANSCRIPT

Page 1: Evolutionary Computational Intelligence

Evolutionary Computational Intelligence

Lecture 6a: Multimodality

Page 2: Evolutionary Computational Intelligence

Multimodality

Most interesting problems have more than one locally optimal solution and our goal is to detect all of them

Page 3: Evolutionary Computational Intelligence

Multi-Objective Problems (MOPs)

A wide range of problems can be characterised by the presence of a number n of possibly conflicting objectives:
– buying a car: speed vs. price vs. reliability

Two-part problem:
– finding a set of good solutions
– choosing the best one for a particular application

Page 4: Evolutionary Computational Intelligence

MOP Car example

I want to buy a car: I would like it to be as cheap as possible (minimize f1) and as comfortable as possible (maximize f2).

If I consider the two objectives separately I obtain:

– min f1

– max f2

Page 5: Evolutionary Computational Intelligence

MOPs 1: Conventional approaches

rely on using a weighting of objective function values to give a single scalar objective function which can then be optimised:

$$f'(x) = \sum_{i=1}^{n} w_i f_i(x)$$

To find other solutions one has to re-optimise with different weights $w_i$.
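
This weighted-sum scalarisation is easy to sketch in code. The following Python snippet is illustrative only (the objective functions, weights, and car example are hypothetical, not from the slides):

```python
def weighted_sum(objectives, weights):
    """Scalarise n objectives into a single function f'(x) = sum_i w_i * f_i(x)."""
    def f_prime(x):
        return sum(w * f(x) for f, w in zip(objectives, weights))
    return f_prime

# Hypothetical car example: f1 = price (minimise), f2 = comfort (maximise, so negate it).
f1 = lambda x: x[0]          # price
f2 = lambda x: -x[1]         # negative comfort
f = weighted_sum([f1, f2], weights=[0.7, 0.3])
print(f([20000, 8.5]))       # re-optimising with different weights yields a different trade-off
```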

Page 6: Evolutionary Computational Intelligence

MOPs 2: Dominance

we say x dominates y if it is at least as good on all criteria and better on at least one

[Figure: objective space with axes f1 and f2, showing the region dominated by x and the Pareto front]
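
The dominance test itself is a one-liner. A minimal Python sketch, assuming every objective is to be minimised (this convention is an assumption, not stated on the slide):

```python
def dominates(x, y):
    """True if objective vector x dominates y: at least as good on every objective
    and strictly better on at least one. All objectives are assumed to be minimised."""
    return all(a <= b for a, b in zip(x, y)) and any(a < b for a, b in zip(x, y))

# Objective vectors (price, discomfort), both to be minimised:
print(dominates((10000, 3), (12000, 3)))   # True: cheaper and no less comfortable
print(dominates((10000, 5), (12000, 3)))   # False: neither vector dominates the other
```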

Page 7: Evolutionary Computational Intelligence

Implications for Evolutionary Optimisation

Two main approaches to diversity maintenance:

Implicit approaches (decision space):
– Impose an equivalent of geographical separation
– Impose an equivalent of speciation

Explicit approaches (fitness):
– Make similar individuals compete for resources (fitness)
– Make similar individuals compete with each other for survival

Page 8: Evolutionary Computational Intelligence

Implicit 1: “Island” Model Parallel EAs

Periodic migration of individual solutions between populations

[Figure: several EA populations arranged in a communication topology, exchanging migrants]

Page 9: Evolutionary Computational Intelligence

Island Model EAs:

Run multiple populations in parallel, in some kind of communication structure (usually a ring or a torus).

After a (usually fixed) number of generations (an Epoch), exchange individuals with neighbours

Repeat until ending criteria are met

Partially inspired by parallel/clustered systems
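
A minimal, illustrative Python sketch of an island-model EA on a ring topology follows; the fitness function, truncation selection, Gaussian mutation, and migration size are all hypothetical choices, not prescriptions from the slides:

```python
import random

def evolve_islands(fitness, n_islands=4, pop_size=20, epochs=10,
                   gens_per_epoch=25, migrants=2, genome_len=10):
    """Island-model sketch: independent populations on a ring, periodic migration."""
    islands = [[[random.uniform(-5, 5) for _ in range(genome_len)]
                for _ in range(pop_size)] for _ in range(n_islands)]

    for _ in range(epochs):
        # Evolve each island independently for one epoch (toy truncation selection + mutation).
        for isl in islands:
            for _ in range(gens_per_epoch):
                parents = sorted(isl, key=fitness)[:pop_size // 2]
                isl[:] = [[g + random.gauss(0, 0.1) for g in random.choice(parents)]
                          for _ in range(pop_size)]
        # Migration: each island sends copies of its best individuals to its ring neighbour,
        # which replaces its worst individuals with them.
        for i, isl in enumerate(islands):
            best = [list(ind) for ind in sorted(isl, key=fitness)[:migrants]]
            neighbour = islands[(i + 1) % n_islands]
            neighbour.sort(key=fitness, reverse=True)   # worst individuals first
            neighbour[:migrants] = best

    return min((ind for isl in islands for ind in isl), key=fitness)

# Hypothetical usage: minimise the sphere function.
best = evolve_islands(lambda x: sum(g * g for g in x))
```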

Page 10: Evolutionary Computational Intelligence

Island Model Parameter Setting

The idea is simple but its success is subject to proper parameter setting

The number of “islands”, i.e. the basins of attraction we are considering, must somehow be known

The population size of each separate island must be set

If some a priori information regarding the fitness landscape is given, the island model can be efficient; otherwise it is likely to fail

Page 11: Evolutionary Computational Intelligence

Implicit 2: Diffusion Model Parallel EAs

Impose a spatial structure (usually a grid) on one population

[Figure: grid of individuals, highlighting the current individual and its neighbours]

Page 12: Evolutionary Computational Intelligence

Diffusion Model EAs

Consider each individual to exist on a point on a grid

Selection (hence recombination) and replacement happen using the concept of a neighbourhood, a.k.a. a deme

Leads to different parts of the grid searching different parts of the space; good solutions diffuse across the grid over a number of generations

Page 13: Evolutionary Computational Intelligence

Diffusion Model Example

Assume a rectangular grid, so that each individual has 8 immediate neighbours

For each point we can consider a population made up of 9 individuals

One of the 8 remaining points is selected (e.g. by means of a roulette wheel)

Recombination between the starting and the selected point occurs

Following a steady-state logic, the fittest of the two survives, i.e. the offspring replaces the current individual only if it is fitter
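
A possible Python sketch of one such cell update on a toroidal grid; the crossover and mutation operators are hypothetical callables, and a positive, maximised fitness is assumed for the roulette wheel:

```python
import random

def diffusion_step(grid, fitness, i, j, crossover, mutate):
    """One cell update in a diffusion (cellular) EA: the deme of cell (i, j) is the cell
    plus its 8 neighbours on a toroidal grid. A mate is picked from the neighbours by
    roulette wheel and the offspring replaces the current individual only if fitter."""
    rows, cols = len(grid), len(grid[0])
    neighbours = [((i + di) % rows, (j + dj) % cols)
                  for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]
    # Roulette-wheel selection over the 8 neighbours (positive, maximised fitness assumed).
    weights = [fitness(grid[r][c]) for r, c in neighbours]
    r, c = random.choices(neighbours, weights=weights, k=1)[0]
    child = mutate(crossover(grid[i][j], grid[r][c]))
    if fitness(child) > fitness(grid[i][j]):      # steady-state: the fitter of the two survives
        grid[i][j] = child
```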

Page 14: Evolutionary Computational Intelligence

Implicit 3: Automatic Speciation

It restricts recombination on the basis of the genotypic structure of the solutions, so that recombination occurs only amongst individuals of the same species:
– comparing the genotypic distance between solutions against a maximum allowed distance
– adding a “tag” (genotypic enlargement) that characterizes which species each individual belongs to

In both cases the approach requires a lot of comparisons and the computational overhead can be very high
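
A small sketch of both mating-restriction rules; the genome representation (a dict with a 'tag' field) and the distance threshold are hypothetical:

```python
def same_species(a, b, genotype_distance=None, max_distance=None):
    """Mating restriction: recombination is allowed only within one species, determined
    either by a genotypic-distance threshold or by an explicit species tag that has been
    appended to the genome (genotypic enlargement)."""
    if genotype_distance is not None and max_distance is not None:
        return genotype_distance(a, b) <= max_distance   # distance-based speciation
    return a["tag"] == b["tag"]                          # tag-based speciation
```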

Page 15: Evolutionary Computational Intelligence

Explicit 1: Fitness Sharing

Restricts the number of individuals within a given niche by “sharing” their fitness, so as to allocate individuals to niches in proportion to the niche fitness

Need to set the size of the niche, σ_share, in either genotype or phenotype space

Run the EA as normal, but after each generation set:

$$f'(i) = \frac{f(i)}{\sum_{j} \mathrm{sh}\big(d(i,j)\big)}$$

$$\mathrm{sh}(d) = \begin{cases} 1 - d/\sigma_{\mathrm{share}} & \text{if } d \le \sigma_{\mathrm{share}} \\ 0 & \text{otherwise} \end{cases}$$

Meaning of the distance is representation dependent
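
A minimal Python sketch of the sharing step; the distance function and the niche radius sigma_share are left to the user, since (as noted above) the distance is representation dependent:

```python
def shared_fitness(population, raw_fitness, distance, sigma_share):
    """Fitness sharing: divide each raw fitness by the niche count sum_j sh(d(i, j))."""
    def sh(d):
        # Triangular sharing function: 1 - d/sigma inside the niche radius, 0 outside.
        return 1.0 - d / sigma_share if d <= sigma_share else 0.0
    shared = []
    for i in population:
        niche_count = sum(sh(distance(i, j)) for j in population)  # includes sh(0) = 1 for i itself
        shared.append(raw_fitness(i) / niche_count)
    return shared
```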

Page 16: Evolutionary Computational Intelligence

Explicit 2: Crowding

Attempts to distribute individuals evenly amongst niches

relies on the assumption that offspring will tend to be close to parents

Randomly select a pair of parents and produce 2 offspring

Each offspring competes in a pairwise tournament for survival with the most similar parent (steady state), i.e. the parent at minimal distance
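
A possible sketch of this scheme (essentially deterministic crowding); the crossover, mutation, and distance callables are hypothetical, crossover is assumed to return two offspring, and fitness maximisation is assumed:

```python
import random

def crowding_step(pop, fitness, distance, crossover, mutate):
    """One crowding step: two random parents produce two offspring, and each offspring
    competes for survival only against the most similar (closest) parent."""
    i, j = random.sample(range(len(pop)), 2)
    p1, p2 = pop[i], pop[j]
    c1, c2 = (mutate(c) for c in crossover(p1, p2))
    # Pair each offspring with the parent it resembles most.
    if distance(c1, p1) + distance(c2, p2) > distance(c1, p2) + distance(c2, p1):
        c1, c2 = c2, c1
    if fitness(c1) > fitness(p1):   # offspring replaces its paired parent only if fitter
        pop[i] = c1
    if fitness(c2) > fitness(p2):
        pop[j] = c2
```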

Page 17: Evolutionary Computational Intelligence

Fitness Sharing vs. Crowding

[Figure: comparison of the population distributions over the peaks produced by Fitness Sharing and by Crowding]

Page 18: Evolutionary Computational Intelligence

Multimodality and Constraints

In some cases we are not satisfied by finding all the local optima: we want only the subset of them having certain properties (e.g. fitness values)

In such cases the combination of algorithmic components can be beneficial

A rather efficient and simple option is to properly combine the algorithmic components in a cascade

Page 19: Evolutionary Computational Intelligence

Fast Evolutionary Deterministic Algorithm (2006)

FEDA is composed of:
– Quasi Genetic Algorithm (QGA, 2004)
– Fitness Sharing Selection Scheme (FSS)
– Multistart Hooke-Jeeves Algorithm (HJA)

Page 20: Evolutionary Computational Intelligence

Quasi Genetic Algorithm

Page 21: Evolutionary Computational Intelligence

FEDA

The set of solutions coming from the QGA (usually a large one) is processed by the FSS

We thus obtain a smaller set of points which have good fitness values and are spread out in the decision space

The HJA is then applied to each of those solutions
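
The cascade can be summarised as a three-stage pipeline. The sketch below is schematic: qga_run, fss_select, and hooke_jeeves are hypothetical stand-ins for the actual QGA, FSS, and HJA implementations:

```python
def feda_cascade(qga_run, fss_select, hooke_jeeves, n_keep=10):
    """Cascade sketch: global sampling (QGA) -> niche-preserving selection (FSS)
    -> local refinement of each surviving point (multistart Hooke-Jeeves)."""
    candidates = qga_run()                  # usually a large set of solutions
    seeds = fss_select(candidates, n_keep)  # a few good, well spread-out points
    return [hooke_jeeves(seed) for seed in seeds]
```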

Page 22: Evolutionary Computational Intelligence

Grounding Grid Problem 1

Page 23: Evolutionary Computational Intelligence

Grounding Grid Problem 2

Page 24: Evolutionary Computational Intelligence

Grounding System Problem

Page 25: Evolutionary Computational Intelligence

Evolutionary Computational Intelligence

Lecture 6b: Towards Parameter Control

Page 26: Evolutionary Computational Intelligence

Motivation 1

An EA has many strategy parameters, e.g.:
– mutation operator and mutation rate
– crossover operator and crossover rate
– selection mechanism and selective pressure (e.g. tournament size)
– population size

Good parameter values facilitate good performance

Q1: How to find good parameter values?

Page 27: Evolutionary Computational Intelligence

Motivation 2

EA parameters are rigid (constant during a run)

BUT

an EA is a dynamic, adaptive process

THUS

optimal parameter values may vary during a run

Q2: How to vary parameter values?

Page 28: Evolutionary Computational Intelligence

Parameter tuning

Parameter tuning: the traditional way of testing and comparing different values before the “real” run

Problems:
– user mistakes in settings can be sources of errors or sub-optimal performance
– it costs much time
– parameters interact: exhaustive search is not practicable
– good values may become bad during the run (e.g. population size)
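
As a toy illustration of why exhaustive tuning does not scale, consider a naive grid search over just two parameters (all names and values here are hypothetical):

```python
import itertools

def grid_tune(run_ea, mutation_rates, population_sizes, repetitions=5):
    """Naive parameter tuning: run the EA for every parameter combination and keep the
    setting with the best average result. The cost grows multiplicatively with every
    extra parameter, which is why exhaustive search quickly becomes impracticable."""
    best_setting, best_score = None, float("-inf")
    for pm, mu in itertools.product(mutation_rates, population_sizes):
        score = sum(run_ea(mutation_rate=pm, pop_size=mu)
                    for _ in range(repetitions)) / repetitions
        if score > best_score:
            best_setting, best_score = (pm, mu), score
    return best_setting, best_score
```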

Page 29: Evolutionary Computational Intelligence

Parameter Setting: Problems

A wrong parameter setting can lead to undesirable algorithmic behaviour, since it can cause stagnation or premature convergence

Too large a population size: stagnation; too small a population size: premature convergence

In some “moments” of the evolution I would like to have a large population size (when I need to explore and prevent premature convergence); in other “moments” I would like to have a small one (when I need to exploit the available genotypes)

Page 30: Evolutionary Computational Intelligence

Parameter control

Parameter control: setting values on-line, during the actual run. I would like the algorithm to “decide” by itself how to properly vary the parameter setting over the run

Some popular options for pursuing this aim are:
– a predetermined time-varying schedule p = p(t)
– using feedback from the search process
– encoding parameters in chromosomes and relying on natural selection (similar to ES self-adaptation)
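
To make the first option concrete, here is a minimal sketch of a predetermined schedule p = p(t) for the mutation rate; the exponential shape and the constants are illustrative assumptions, not from the slide:

```python
def mutation_rate(t, p_start=0.2, p_end=0.01, max_gens=500):
    """Predetermined schedule p = p(t): the mutation rate decays exponentially
    from p_start to p_end over the course of the run (illustrative choice)."""
    frac = min(t / max_gens, 1.0)
    return p_start * (p_end / p_start) ** frac

# Hypothetical use inside the generational loop:
# for t in range(max_gens):
#     offspring = mutate(parents, rate=mutation_rate(t))
```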

Page 31: Evolutionary Computational Intelligence

Related Problems

Problems:
– finding an optimal p is hard, finding an optimal p(t) is harder still
– user-defined feedback mechanism: how to “optimize” it?
– when would natural selection work for strategy parameters?

Provisional answer: In agreement with the No Free Lunch Theorem, an optimal control strategy does not exist. Nevertheless, there are plenty of interesting proposals that perform very well on some problems. Some of these strategies are very problem oriented, while others are much more robust and thus applicable to a fairly wide spectrum of optimization problems