Metaheuristics: a quick overview

Marc Sevaux
University of Valenciennes
CNRS, UMR 8530, LAMIH / Production systems
[email protected]

TEW – Antwerp 2003


Outline

• Neighborhood metaheuristics

• Application to a parallel machine scheduling problem, Pm|rj|∑wjUj

• Conclusion on neighborhood search methods

• Population metaheuristics

• Application to a single machine scheduling problem, 1|rj|∑wjTj

• Conclusion on population metaheuristics

Motivations for neighborhood metaheuristics

• NP-hard problems

• Obtain solutions rapidly

• Short implementation time

• Little knowledge of the problem required

• ... it's easier...

Notation:
a set of solutions, S
a function to minimize, f(s) : s ∈ S → ℝ
an optimal solution, s* ∈ S such that f(s*) ≤ f(s) ∀s ∈ S

Neighborhood and initial solution

First, an initial solution is needed. Then, to explore the solution space, we need to know how to move from one solution to another: this is the neighborhood.

• Find an initial solution

• Define a neighborhood

• Explore [exhaustively] the neighborhood

• Evaluate a neighbor

Notation:
a neighborhood of s, N(s)

Different approaches

Descent methods

• Simple descent, deepest descent

• Multi-start

Metaheuristic methods

• TS: Tabu search [Glover, 89 and 90]

• SA: Simulated annealing [Kirkpatrick, 83]

• TA: Threshold accepting [Dueck, Scheuer, 90]

• VNS: Variable neighborhood search [Hansen, Mladenović, 98]

• ILS: Iterated local search [Lourenço et al., 2000]

Practical use in scheduling

• Deepest descent, multi-start

• Simulated annealing, tabu search

Simple descent

procedure Simple Descent(initial solution s)
  repeat
    Choose s′ ∈ N(s)
    if f(s′) < f(s) then
      s ← s′
    end if
  until f(s′) ≥ f(s), ∀s′ ∈ N(s)
end
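As a minimal illustrative sketch (not from the slides), the procedure can be written in Python on a toy problem. The objective f(s) = (s − 7)², the integer neighborhood N(s) = {s − 1, s + 1}, and the iteration cap are all assumptions made here for the example.

```python
def f(s):
    # toy objective to minimize: a parabola whose optimum is at s = 7
    return (s - 7) ** 2

def neighbors(s):
    # toy neighborhood N(s): the two adjacent integers
    return [s - 1, s + 1]

def simple_descent(s, max_iter=10_000):
    """First-improvement descent, following the pseudocode above."""
    for _ in range(max_iter):
        improving = [sp for sp in neighbors(s) if f(sp) < f(s)]
        if not improving:
            return s          # f(s') >= f(s) for all s' in N(s): local optimum
        s = improving[0]      # accept the first improving neighbor found
    return s
```

On this convex toy objective the descent always stops at s = 7, whatever the starting point.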

Simple descent

[Figure: simple descent over the solution space (x-axis) and objective value (y-axis); from the initial solution, downhill moves lead to the final solution, a local optimum.]

Deepest descent

procedure Deepest Descent(initial solution s)
  repeat
    Choose s′ ∈ N(s) such that f(s′) ≤ f(s′′) ∀s′′ ∈ N(s)
    if f(s′) < f(s) then
      s ← s′
    end if
  until f(s′) ≥ f(s), ∀s′ ∈ N(s)
end
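On the same toy setup (an assumption of this sketch, not part of the slides), the best-improvement variant differs only in always moving to the best neighbor:

```python
def f(s):
    # toy objective to minimize: a parabola whose optimum is at s = 7
    return (s - 7) ** 2

def deepest_descent(s):
    """Best-improvement descent: always move to the best neighbor in N(s)."""
    while True:
        sp = min((s - 1, s + 1), key=f)   # the best neighbor of s
        if f(sp) >= f(s):
            return s                      # no neighbor improves: local optimum
        s = sp
```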

Deepest descent

[Figure: deepest descent over the solution space and objective value; the steepest downhill moves lead from the initial solution to the final solution, a local optimum.]

Multi-start descent and deepest descent

procedure Multistart Descent
  iter ← 1
  f(Best) ← +∞
  repeat
    Choose a starting solution s0 at random
    s ← Simple Descent(s0) (or Deepest Descent(s0))
    if f(s) < f(Best) then
      Best ← s
    end if
    iter ← iter + 1
  until iter = IterMax
end
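A sketch of the multi-start loop, this time on a hypothetical toy objective with two local optima (s = −5 with value 0 and s = 7 with value 1), so restarting actually matters; the objective, neighborhood, starting range, and IterMax value are all illustrative assumptions.

```python
import random

def f(s):
    # toy objective with two local optima: s = -5 (f = 0, global) and s = 7 (f = 1)
    return min((s + 5) ** 2, (s - 7) ** 2 + 1)

def deepest_descent(s):
    # best-improvement descent over N(s) = {s - 1, s + 1}
    while True:
        sp = min((s - 1, s + 1), key=f)
        if f(sp) >= f(s):
            return s
        s = sp

def multistart_descent(iter_max=20, seed=0):
    rng = random.Random(seed)
    best = None
    for _ in range(iter_max):
        s0 = rng.randint(-20, 20)          # starting solution chosen at random
        s = deepest_descent(s0)
        if best is None or f(s) < f(best):
            best = s
    return best
```

Every restart ends in one of the two local optima; keeping the best final solution is what lets multi-start escape the basin of the worse one.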

Multi-start descent and deepest descent

[Figure: multi-start descent over the solution space and objective value; several random initial solutions each descend to a final solution (a local optimum), and the best of these final solutions is retained.]

Tabu search

procedure Tabu Search(initial solution s)
  Best ← s
  repeat
    Choose s′ ∈ N(s) such that f(s′) ≤ f(s′′) ∀s′′ ∈ N(s)
    if s′ is not tabu or f(s′) < f(Best) then
      s ← s′
      Update tabu list, update Best
    end if
  until stopping conditions are met
end
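A sketch of the procedure on the two-optima toy objective, using a tabu list of recently visited solutions; the tenure, objective, neighborhood, and iteration budget are assumptions of this example, not from the slides.

```python
from collections import deque

def f(s):
    # toy objective with a local optimum at s = 7 and the global optimum at s = -5
    return min((s + 5) ** 2, (s - 7) ** 2 + 1)

def tabu_search(s, tenure=5, max_iter=200):
    best = s
    tabu = deque(maxlen=tenure)            # short-term memory of visited solutions
    for _ in range(max_iter):
        candidates = [sp for sp in (s - 1, s + 1)
                      if sp not in tabu or f(sp) < f(best)]   # aspiration criterion
        if not candidates:
            break
        s = min(candidates, key=f)         # best admissible neighbor, even if worse
        tabu.append(s)
        if f(s) < f(best):
            best = s
    return best
```

Because worsening moves are allowed and recently visited solutions are tabu, the search climbs out of the local optimum at 7 and reaches the global optimum at −5, which a plain descent from 20 would miss.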

Tabu search

[Figure: tabu search trajectory over the solution space and objective value; the search moves from the initial solution past local optima to a final solution, recording the best solution found along the way.]

Simulated annealing

procedure Simulated Annealing(initial solution s)
  Best ← s
  T ← Tinit, choose α ∈ ]0, 1[, choose IterStage
  repeat
    for i = 1 to IterStage
      Choose s′ ∈ N(s)
      ∆ ← f(s′) − f(s)
      if ∆ < 0 or with probability exp(−∆/T) then
        s ← s′
        Update Best
      end if
    end for
    T ← α × T
  until stopping conditions are met
end
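A sketch on the same toy objective; the temperature schedule, stage length, and stopping threshold are illustrative parameter choices, not values from the slides.

```python
import math
import random

def f(s):
    # toy objective with a local optimum at s = 7 and the global optimum at s = -5
    return min((s + 5) ** 2, (s - 7) ** 2 + 1)

def simulated_annealing(s, t_init=50.0, alpha=0.9, iter_stage=50, t_min=0.01, seed=1):
    rng = random.Random(seed)
    best, t = s, t_init
    while t > t_min:                        # stopping condition: frozen temperature
        for _ in range(iter_stage):
            sp = s + rng.choice((-1, 1))    # draw s' from N(s)
            delta = f(sp) - f(s)
            # accept improvements, and deteriorations with probability exp(-delta / t)
            if delta < 0 or rng.random() < math.exp(-delta / t):
                s = sp
                if f(s) < f(best):
                    best = s
        t *= alpha                          # geometric cooling
    return best
```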

Simulated annealing

[Figure: simulated annealing trajectory over the solution space and objective value, from the initial solution to the final solution, with the best solution recorded along the way.]

Threshold accepting

procedure Threshold Accepting(initial solution s)
  T ← Tinit, choose α ∈ ]0, 1[, choose IterStage
  Best ← s, moved ← true
  while moved do
    for i = 1 to IterStage
      Choose s′ ∈ N(s), ∆ ← f(s′) − f(s)
      moved ← true
      if ∆ < 0 or ∆ < T then
        s ← s′
      else
        moved ← false
      end if
      Update Best
    end for
    T ← α × T
  end while
end
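A sketch of the idea on the toy objective. The `moved` bookkeeping is restructured slightly compared with the pseudocode (a move anywhere in a stage keeps the loop alive), and the threshold schedule and floor are assumptions of this example.

```python
import random

def f(s):
    # toy objective with a local optimum at s = 7 and the global optimum at s = -5
    return min((s + 5) ** 2, (s - 7) ** 2 + 1)

def threshold_accepting(s, t_init=10.0, alpha=0.9, iter_stage=30, t_min=0.001, seed=2):
    rng = random.Random(seed)
    best, t = s, t_init
    moved = True
    while moved and t > t_min:
        moved = False
        for _ in range(iter_stage):
            sp = s + rng.choice((-1, 1))
            # accept any move whose deterioration stays below the threshold t
            if f(sp) - f(s) < t:
                s, moved = sp, True
                if f(s) < f(best):
                    best = s
        t *= alpha                          # shrink the acceptance threshold
    return best
```

The deterministic threshold replaces simulated annealing's probabilistic acceptance test, which is the whole difference between the two methods.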

Threshold accepting

[Figure: threshold accepting trajectory over the solution space and objective value, from the initial solution to the final solution, with the best solution recorded along the way.]

Variable neighborhood search

procedure VNS(initial solution s)
  Best ← s
  Choose a set of neighborhood structures Nk, k = 1 . . . kmax
  repeat
    k ← 1
    repeat
      Shaking: choose s′ ∈ Nk(s)
      s′′ ← Localsearch(s′)
      if f(s′′) < f(s) then
        s ← s′′
      else
        k ← k + 1
      end if
      Update Best
    until k = kmax
  until stopping conditions are met
end
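A sketch on the toy objective, assuming the neighborhood structures Nk(s) = {s − k, s + k} (so larger k shakes further away); kmax and the iteration budget are also assumptions of this example.

```python
import random

def f(s):
    # toy objective with a local optimum at s = 7 and the global optimum at s = -5
    return min((s + 5) ** 2, (s - 7) ** 2 + 1)

def local_search(s):
    # best-improvement descent over N(s) = {s - 1, s + 1}
    while True:
        sp = min((s - 1, s + 1), key=f)
        if f(sp) >= f(s):
            return s
        s = sp

def vns(s, k_max=8, max_iter=50, seed=3):
    """Shaking in Nk(s) = {s - k, s + k}, then local search, as in the pseudocode."""
    rng = random.Random(seed)
    for _ in range(max_iter):
        k = 1
        while k <= k_max:
            sp = s + rng.choice((-k, k))     # shaking: jump k steps away
            spp = local_search(sp)
            if f(spp) < f(s):
                s, k = spp, 1                # improvement: restart from N1
            else:
                k += 1
    return s
```

On this toy, only the larger neighborhoods can throw the search out of the basin of s = 7, which is exactly why VNS cycles through several neighborhood sizes.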

Variable neighborhood search

[Figure: VNS over the solution space and objective value; shaking draws s′ from neighborhoods N1, N2, N3 of the incumbent s, local search produces s′′, and s′′ replaces s when it improves.]

Iterated local search

procedure ILS(initial solution s0)
  Best ← s0
  s ← Localsearch(s0)
  repeat
    s′ ← Perturbation(s, history)
    s′′ ← Localsearch(s′)
    s ← AcceptanceCriterion(s, s′′, history)
    Update Best
  until stopping conditions are met
end
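A sketch on the toy objective. Two simplifying assumptions are made here: the perturbation ignores the search history (a random jump of bounded strength), and the acceptance criterion keeps only improving solutions; the slides leave both components open.

```python
import random

def f(s):
    # toy objective with a local optimum at s = 7 and the global optimum at s = -5
    return min((s + 5) ** 2, (s - 7) ** 2 + 1)

def local_search(s):
    # best-improvement descent over N(s) = {s - 1, s + 1}
    while True:
        sp = min((s - 1, s + 1), key=f)
        if f(sp) >= f(s):
            return s
        s = sp

def ils(s0, n_iter=100, strength=8, seed=4):
    s = best = local_search(s0)
    rng = random.Random(seed)
    for _ in range(n_iter):
        sp = s + rng.randint(-strength, strength)   # perturbation (history-free)
        spp = local_search(sp)
        if f(spp) < f(s):                           # acceptance: improving only
            s = spp
        if f(s) < f(best):
            best = s
    return best
```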

Iterated local search

[Figure: ILS over the solution space and objective value; local search takes the initial solution s0 to a local optimum s, a perturbation produces s′, and local search on s′ yields s′′.]

References

[Glover 89/90] Tabu Search, Parts I and II. ORSA Journal on Computing, 1989:1(3), pp. 190-206 and 1990:2(1), pp. 4-32.

[Kirkpatrick et al. 83] Optimization by Simulated Annealing. Science 220, pp. 671-680.

[Dueck and Scheuer 90] Threshold Accepting: a general purpose optimization algorithm... Journal of Computational Physics 90, pp. 161-175.

[Hansen and Mladenović 98] Variable Neighborhood Search: Principles and Applications. Report GERAD G-28-20.

[Lourenço et al. 2000] Iterated Local Search, preprint. http://www.intellektik.informatik.tu-darmstadt.de/tom/

Conclusion on neighborhood search methods

These metaheuristics

• are easy to understand and control

• give solutions rapidly

• need a short implementation time

• and little knowledge of the problem

But to be really efficient

• tricks have to be added

• we need to know the problem very well

• try many of them

• and why not combine them

Population metaheuristics

• Based on a population of solutions

• How to combine different solutions?

• How to manage the population?

• How to evaluate a solution?


Different approaches

• Genetic Algorithm, Holland 1975 – Goldberg 1989

• Memetic Algorithm, Moscato 1989

• Hybrid Genetic Algorithm

• Ant Colony Optimisation, Dorigo 1991

• Scatter search, Laguna, Glover, Martí 2000

• Application to scheduling


Description of the algorithms

Genetic algorithm

Generate an initial population

While stopping conditions are not met

Select two individuals

Apply crossover operator

Mutate offspring under probability

Insert offspring under conditions

Remove an individual under conditions

End While

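A sketch of the loop above on a toy "onemax" problem (maximize the number of ones in a bitstring). The problem, the binary-tournament selection, the one-point crossover, and the steady-state replacement rule are all illustrative choices made for this example.

```python
import random

def fitness(ind):
    # toy "onemax" objective: number of ones in the bitstring (to maximize)
    return sum(ind)

def genetic_algorithm(n_bits=20, pop_size=30, p_mut=0.05, n_gen=300, seed=5):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(n_gen):
        # select two individuals (binary tournament)
        p1 = max(rng.sample(pop, 2), key=fitness)
        p2 = max(rng.sample(pop, 2), key=fitness)
        # apply crossover operator (one-point)
        cut = rng.randrange(1, n_bits)
        child = p1[:cut] + p2[cut:]
        # mutate offspring under probability p_mut (bit flips)
        child = [b ^ 1 if rng.random() < p_mut else b for b in child]
        # insert offspring / remove an individual: replace the worst if the child beats it
        worst = min(range(pop_size), key=lambda i: fitness(pop[i]))
        if fitness(child) > fitness(pop[worst]):
            pop[worst] = child
    return max(pop, key=fitness)
```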


Description of the algorithms

Memetic algorithm

Generate an initial population

While stopping conditions are not met

Select two individuals

Apply crossover operator

Improve offspring under probability

Insert offspring under conditions

Remove an individual under conditions

End While



Description of the algorithms

Hybrid genetic algorithm

Generate an initial population

While stopping conditions are not met

Select two individuals

Apply crossover operator

Improve offspring under probability

Mutate offspring under probability

Insert offspring under conditions

Remove an individual under conditions

End While


Description of the algorithms

Ant Colony Optimisation

[Figure: successive panels of ants travelling between Nest and Food around an Obstacle; as pheromone accumulates, the colony concentrates on the shorter route.]

Description of the algorithms

Ant Colony Optimisation

Generate an initial colony

While stopping conditions are not met

For each ant of the colony

initialize ant memory

While Current state ≠ Target state

Apply ant decision under pheromone information

Move to next state

Deposit pheromone information on the transition

End While

End For

Evaporate pheromone information

End While
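As a very small sketch of the deposit/evaporation mechanics (not the authors' algorithm), consider ants choosing between a short and a long route from nest to food; every quantity here (path lengths, ant count, evaporation rate) is an assumption of the example.

```python
import random

def ant_colony_two_paths(n_ants=10, n_iter=50, rho=0.1, seed=6):
    """Ants repeatedly choose between a short and a long path; each ant deposits
    pheromone 1/length on its path, and pheromone evaporates at rate rho, so
    trails tend to concentrate on the shorter path over the iterations."""
    rng = random.Random(seed)
    length = {"short": 1.0, "long": 2.0}
    tau = {"short": 1.0, "long": 1.0}      # initial pheromone on each path
    for _ in range(n_iter):
        deposit = {"short": 0.0, "long": 0.0}
        for _ in range(n_ants):
            # ant decision under pheromone information
            p_short = tau["short"] / (tau["short"] + tau["long"])
            path = "short" if rng.random() < p_short else "long"
            deposit[path] += 1.0 / length[path]
        for p in tau:                       # evaporate, then add new deposits
            tau[p] = (1.0 - rho) * tau[p] + deposit[p]
    return max(tau, key=tau.get)            # path carrying the most pheromone
```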

Description of the algorithms

Scatter search

Generate an initial improved population

Select a diverse subset (RefSet, size b)

While stopping conditions are not met (outer loop)

A ← RefSet

While A ≠ ∅ (inner loop)

Combine solutions (B ← RefSet × A)

Improve solutions

Update RefSet (keep b best from RefSet ∪ B)

A ← B − RefSet

End While

Remove the worst b/2 solutions from RefSet

Add b/2 diverse solutions to RefSet

End While
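A loose sketch of this template on a toy subset-sum objective; the problem, the uniform-crossover combination, and the cap on the inner loop (added so the sketch always terminates) are assumptions of this example, not part of the slides.

```python
import itertools
import random

W = (3, 7, 1, 9, 4, 6, 2, 8)               # toy item weights
TARGET = 20

def err(x):
    # toy objective to minimize: distance of the selected weight sum from TARGET
    return abs(sum(w for w, b in zip(W, x) if b) - TARGET)

def improve(x):
    # first-improvement bit-flip local search
    changed = True
    while changed:
        changed = False
        for i in range(len(x)):
            y = x[:i] + (1 - x[i],) + x[i + 1:]
            if err(y) < err(x):
                x, changed = y, True
                break
    return x

def scatter_search(b=5, n_outer=3, inner_cap=3, seed=7):
    rng = random.Random(seed)
    pop = {improve(tuple(rng.randint(0, 1) for _ in W)) for _ in range(20)}
    refset = sorted(pop, key=err)[:b]
    for _ in range(n_outer):
        A = list(refset)
        for _ in range(inner_cap):          # inner loop, capped for this sketch
            if not A:
                break
            # combine RefSet x A (uniform crossover), then improve the offspring
            B = {improve(tuple(r[i] if rng.random() < 0.5 else a[i]
                               for i in range(len(W))))
                 for r, a in itertools.product(refset, A)}
            refset = sorted(set(refset) | B, key=err)[:b]    # keep the b best
            A = [x for x in B if x not in refset]            # A <- B - RefSet
        # diversification: swap the worst b//2 members for fresh improved solutions
        fresh = [improve(tuple(rng.randint(0, 1) for _ in W)) for _ in range(b // 2)]
        refset = sorted(set(refset[: b - b // 2] + fresh), key=err)[:b]
    return min(refset, key=err)
```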

Intensification vs diversification

         Intensification                                   Diversification
GA       selection, crossover, replacement                 mutation
MA       selection, crossover, local search, replacement   —
HGA      selection, crossover, local search, replacement   mutation
ACO      pheromone information                             colony, ant memory re-initialization
SS       inner loop: crossover, local search               diverse replacement

References (web sites)

Genetic Algorithms, Holland 1975 – Goldberg 1989
http://www-illigal.ge.uiuc.edu/index.php3

Ant Colony Optimisation, Dorigo 1991
http://iridia.ulb.ac.be/~mdorigo/ACO/ACO.html

Memetic Algorithms, Moscato 1989
http://www.densis.fee.unicamp.br/~moscato/memetic_home.html

Scatter search, Laguna, Glover, Martí 2000
http://www-bus.colorado.edu/faculty/laguna/scattersearch.htm

References

Genetic Algorithms, Holland 1975 – Goldberg 1989
D. E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning, Addison-Wesley, Reading, Massachusetts, 1989.
Z. Michalewicz, Genetic Algorithms + Data Structures = Evolution Programs, Springer Verlag, 1999.

Ant Colony Optimisation, Dorigo 1991
Dorigo M., V. Maniezzo and A. Colorni (1991). Positive Feedback as a Search Strategy. Technical Report No. 91-016, Politecnico di Milano, Italy.
Dorigo M. and G. Di Caro (1999). The Ant Colony Optimization Meta-Heuristic. In D. Corne, M. Dorigo and F. Glover, editors, New Ideas in Optimization, McGraw-Hill, 11-32.

References

Memetic Algorithms, Moscato 1989
P. Moscato, On Evolution, Search, Optimization, Genetic Algorithms and Martial Arts: Towards Memetic Algorithms, Caltech Concurrent Computation Program, Report 790, 1989.

Scatter search, Laguna, Glover, Martí 2000
F. Glover, M. Laguna and R. Martí, Fundamentals of Scatter Search and Path Relinking. Control and Cybernetics, vol. 39, no. 3, pp. 653-684 (2000).
F. Glover, M. Laguna and R. Martí (forthcoming), Scatter Search. In Theory and Applications of Evolutionary Computation: Recent Trends, A. Ghosh and S. Tsutsui (Eds.), Springer-Verlag.

Application to a scheduling problem

Scheduling application
Minimizing the total weighted tardiness on a single machine in the general case, 1|rj|∑wjTj

Aim of the study
Point out the differences between a Hybrid Genetic Algorithm and a Scatter Search

Implementation
Both methods are compared on the same basis (common components, identical stopping conditions, equivalent evaluations).

Application to a scheduling problem

Previous work

Complexity status
1|rj|∑wjTj is NP-hard in the strong sense
1|rj|∑Tj NP-hard in the strong sense [Garey, Johnson 77]
1| |∑wjTj NP-hard in the strong sense [Lawler 77]
1| |∑Tj NP-hard in the ordinary sense [Du, Leung 90]

Recent approaches on 1| |∑wjTj (OR-Library)
[Crauwels, Potts, van Wassenhove 98] SA, TS, GA
[Congram, Potts, van de Velde 02] Iterated dynasearch algorithm

Other approaches on 1|rj|∑wjTj
[Akturk, Ozdemir 00] Exact method (up to 20 jobs)
[Jouglet 02] Constraint-based method (up to 40 jobs)

Common components

Coding
Permutation = natural representation for single machine scheduling problems; no clones allowed

Fitness
Build semi-active schedules according to the order of the permutation

Crossover
Standard LOX crossover

Mutation
GPI: General Pairwise Interchange

Local search
Based on the GPI, a best-improvement method is applied
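These components can be sketched on a toy permutation. Exact LOX details vary by author, so the crossover below is an order-preserving crossover in the LOX spirit (copy a slice of one parent, fill the rest in the order of the other, so no job is duplicated); the fitness function builds a semi-active schedule and returns ∑wjTj. All instance data in the test are illustrative.

```python
import random

def lox_crossover(p1, p2, rng):
    """Order-based crossover in the spirit of LOX: copy a slice of p1, then fill
    the remaining positions left to right in the order of p2 (no clones)."""
    n = len(p1)
    i, j = sorted(rng.sample(range(n), 2))
    child = [None] * n
    child[i:j + 1] = p1[i:j + 1]
    rest = [job for job in p2 if job not in child]
    for k in range(n):
        if child[k] is None:
            child[k] = rest.pop(0)
    return child

def gpi_mutation(perm, rng):
    """General Pairwise Interchange: swap the jobs in two random positions."""
    perm = perm[:]
    a, b = rng.sample(range(len(perm)), 2)
    perm[a], perm[b] = perm[b], perm[a]
    return perm

def weighted_tardiness(perm, r, p, d, w):
    """Fitness: build a semi-active schedule in permutation order and
    return the total weighted tardiness sum(wj * Tj)."""
    t, total = 0, 0
    for j in perm:
        t = max(t, r[j]) + p[j]            # start as early as possible, then process
        total += w[j] * max(0, t - d[j])   # tardiness Tj = max(0, Cj - dj)
    return total
```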


Common components

Parameters (choose or tune)

Chosen according to the model

• crossover operator

• mutation operator

• local search method

• distance (for diverse replacement method)

Tuned according to the problem

• mutation rate

• local search rate

• population/RefSet size

• stopping conditions


What could happen

• Easy problems (SS): cannot generate a diverse population

• Size of the population (GA/SS)
  SS: too large → too slow
  GA: too small → not enough diversification

• Mutation rate (GA)
  Too low → cannot escape from a local optimum
  Too high → cannot converge

• Local search rate (GA)
  Too low → converges slowly
  Too high → converges too quickly to a local optimum

What could happen

Influence of mutation rate

What could happen

Influence of local search rate

What could happen

Extreme rate values

Numerical results

Several sets of instances are used for numerical evaluation.

OR-LIB instances for 1| |∑wjTj
n ∈ {40, 50, 100}, 125 instances each
Comparison to optima (when known) or best solutions

ODD instances for 1|rj|∑wjTj (initially for 1|rj|∑wjUj)
n ∈ {20, 40, 60, 80, 100}, 20 instances each
Comparison between GA and SS approaches

Numerical results

Generic parameters

Parameters are arbitrarily fixed for this study; better solutions can be found by tuning the parameters correctly.

Population size (SS): NPop = 10
RefSet size: b = 5
Population size (GA): NPop = n/2
Mutation rate: mr = 0.1
Local search rate: lsr = 0.1
Stopping conditions: 1200 sec. or 10000 iterations without improvement (for ORLIB 100 jobs: 120 sec.)

Numerical results

OR-LIB results

Parameters need to be tuned more precisely.

Instances     CPU GA    it. SS    GA = SS    GA > SS    Sol.
ORLIB 40      3.89s     4836      119        6          125
ORLIB 50      8.16s     2349      0          125        124
ORLIB 100*    58.36s    2         42         62         54

*CPU time limit is fixed to 120s

GA always stops on the limit of 10,000 iterations without improvement; SS always stops on the CPU time limit.

Numerical results

Another set of instances: ODD

A problem instance generator is used, based on a single day of 80 periods. Twenty instances are generated for each value of n (n ∈ {20, 40, 60, 80, 100}).
Orders have a high probability of arriving in the morning. Due dates are in the afternoon with high probability.

Release dates distributed according to a Gamma law Γ(0.2, 4) (mean = 20, st. dev. = 10)
Due dates distributed according to the same Gamma law (mean = 20, st. dev. = 10) but starting from the end of the horizon
Processing times uniformly distributed in [1, di − ri]
Weights uniformly distributed in [1, 10]
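A sketch of such a generator (illustrative, not the authors' code). Gamma(shape = 4, scale = 5) has mean 20 and standard deviation 10, matching the stated distributions; the rejection of inconsistent (r, d) pairs is an assumption added so every job fits its window.

```python
import random

def odd_instance(n, horizon=80, seed=0):
    """Generate an ODD-style instance on a single day of `horizon` periods:
    release dates cluster in the morning, due dates in the afternoon."""
    rng = random.Random(seed)
    jobs = []
    while len(jobs) < n:
        r = int(rng.gammavariate(4, 5))            # release date: morning-heavy
        d = horizon - int(rng.gammavariate(4, 5))  # due date: counted back from day end
        if 0 <= r < d <= horizon:                  # keep only consistent (r, d) pairs
            p = rng.randint(1, d - r)              # processing time in [1, d - r]
            w = rng.randint(1, 10)                 # weight in [1, 10]
            jobs.append((r, p, d, w))
    return jobs
```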

Numerical results

Release date and due date repartition

[Plot: frequency of release dates and due dates over the 80 time periods; release dates cluster early in the day, due dates late.]

Numerical results

ODD results

Stopping conditions: CPU time limit = 1200s and 10,000 iterations without improvement

Inst.       CPU GA     it. SS    GA best   SS best   GA = SS   GA > SS
ODD 20*     13.34s     8610      0.4s      0.0s      17        0
ODD 40      5.60s      1253      0.6s      3.1s      17        3
ODD 60      21.21s     291       4.6s      19.1s     15        5
ODD 80      62.68s     99        9.2s      248.9s    9         11
ODD 100     111.16s    43        33.6s     289.0s    9         10

*3 instances cannot be solved by SS: it cannot generate 10 diverse solutions

GA always stops on the limit of 10,000 iterations without improvement; SS always stops on the CPU time limit.

Conclusion

GA and SS are both very powerful, with similar kinds of effort.

Hybrid Genetic Algorithm

• "easy" to implement

• lots of parameters to be tuned

• different classes of instances → different parameters

• needs automatic parameter configuration procedures

Scatter Search Algorithm

• "somewhat harder" to implement

• few parameters to be tuned

• needs an improved local search procedure