Theory of Evolutionary Algorithms for Combinatorial Optimisation
Pietro S. Oliveto
University of Sheffield
Midlands Graduate School, University of Nottingham, 22-26 April 2014
Pietro S. Oliveto Theory of Evolutionary Algorithms for Combinatorial Optimisation
Theory of Randomized Search Heuristics
Mission: analyze RSHs using methods from theoretical computer science → computational complexity → upper and lower bounds on running time
Aims:
Give theoretical explanations for practical success
Predict the behavior on problem sizes handled only in the future
Perceive RSHs as a serious algorithmic approach
Give feedback to the design of RSHs
. . .
Status: started in the mid-1990s for simple EAs on toy problems; nowadays a body of results covers EAs and other approaches on combinatorial optimization problems → the focus here
Basic Randomized Search Heuristics
(1+1) EA and RLS for maximization of f : {0,1}ⁿ → ℝ
(1+1) EA
1. Choose x₀ ∈ {0,1}ⁿ uniformly at random.
2. Create y by flipping each bit of xₜ independently with probability 1/n (mutation).
3. If f(y) ≥ f(xₜ) set xₜ₊₁ := y, else xₜ₊₁ := xₜ (selection).
4. t := t + 1.
5. Repeat 2–4 until happy.
Analyze: the smallest t such that xₜ is optimal = running time. Often the focus is on the expected running time.
Standard algorithms, surprisingly efficient; generalizable to more complicated approaches.
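The (1+1) EA above can be sketched in Python; this is a minimal illustration on OneMax (the stopping test and the iteration cap are additions for runnability, not part of the algorithm as stated):

```python
import random

def one_plus_one_ea(f, n, max_iters=100_000, seed=0):
    """(1+1) EA maximising f over bit strings of length n.

    Each step flips every bit independently with probability 1/n
    and keeps the offspring if it is at least as good (elitist selection).
    """
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]   # uniform random initialisation
    for t in range(1, max_iters + 1):
        y = [b ^ 1 if rng.random() < 1.0 / n else b for b in x]  # mutation
        if f(y) >= f(x):                        # selection
            x = y
        if f(x) == n:                           # stop once the OneMax optimum is found
            return x, t
    return x, max_iters

# OneMax: number of one-bits; optimum is the all-ones string.
x, t = one_plus_one_ea(sum, 20)
```

On OneMax the expected running time is O(n log n), so for n = 20 the loop terminates after a few hundred steps in practice.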
RLS
1. Choose x₀ ∈ {0,1}ⁿ uniformly at random.
2. Create y by flipping one uniformly chosen bit of xₜ (mutation).
3. If f(y) ≥ f(xₜ) set xₜ₊₁ := y, else xₜ₊₁ := xₜ (selection).
4. t := t + 1.
5. Repeat 2–4 until happy.
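RLS differs from the (1+1) EA only in the mutation operator; a self-contained sketch (the iteration cap is an addition for runnability):

```python
import random

def rls(f, n, max_iters=100_000, seed=0):
    """Randomised Local Search maximising f over bit strings of length n.

    Unlike the (1+1) EA's global mutation, RLS flips exactly one
    uniformly chosen bit, so it only moves between Hamming neighbours.
    """
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]   # uniform random initialisation
    for t in range(1, max_iters + 1):
        i = rng.randrange(n)      # pick one position uniformly
        y = x[:]
        y[i] ^= 1                 # flip that single bit (mutation)
        if f(y) >= f(x):          # elitist selection, as in the (1+1) EA
            x = y
    return x

x = rls(sum, 20)   # RLS on OneMax
```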
State of the Art in Computational Complexity of RSHs
Disclaimer: we will not, and do not aim to, beat problem-specific algorithms.
Problem                          Algorithm      Result
OneMax                           (1+1) EA       O(n log n)
                                 (1+λ) EA       O(λn + n log n)
                                 (µ+1) EA       O(µn + n log n)
                                 1-ANT          O(n²) w.h.p.
                                 (µ+1) IA       O(µn + n log n)
Linear functions                 (1+1) EA       Θ(n log n)
                                 cGA            Θ(n^(2+ε)), ε > 0 const.
Max. matching                    (1+1) EA       e^Ω(n); PRAS
Sorting                          (1+1) EA       Θ(n² log n)
SS shortest path                 (1+1) EA       O(n³ log(n·wmax))
                                 MO (1+1) EA    O(n³)
MST                              (1+1) EA       Θ(m² log(n·wmax))
                                 (1+λ) EA       O(n log(n·wmax))
                                 1-ANT          O(mn log(n·wmax))
Max. clique (rand. planar)       (1+1) EA       Θ(n⁵)
                                 (16n+1) RLS    Θ(n^(5/3))
Eulerian cycle                   (1+1) EA       Θ(m² log m)
Partition                        (1+1) EA       4/3 approx., competitive avg.
Vertex cover                     (1+1) EA       e^Ω(n), arbitrarily bad approx.
Set cover                        (1+1) EA       e^Ω(n), arbitrarily bad approx.
                                 SEMO           Pol. O(log n)-approx.
Intersection of p ≥ 3 matroids   (1+1) EA       1/p-approximation in O(|E|^(p+2) log(|E|·wmax))
UIO/FSM conf.                    (1+1) EA       e^Ω(n)
Few results with parent populations. Why?
Few (or no) results with non-elitist EAs.
P. K. Lehre, 2008
Picked For This Talk
Minimum Spanning Trees
Given an undirected connected graph G = (V, E) on n := |V| vertices and m := |E| weighted edges (wᵢ, 1 ≤ i ≤ m), a spanning tree of that graph is a subgraph that is a tree and connects all the vertices.
Aim: find a spanning tree of minimum weight (an edge set E′ ⊆ E of minimal weight that connects all vertices).
The famous algorithms due to Kruskal and Prim have worst-case runtimes of O((n + m) log n) and O(n log n + m), respectively.
Minimum Spanning Trees: EA Application
Representation
x ∈ {0,1}ᵐ encodes a selection of edges.
Fitness Function
f(x) := (#components(x) − 1) · n³·wmax + n·wmax · |n − 1 − Σ_{i=1}^{m} xᵢ| + Σ_{i=1}^{m} wᵢ·xᵢ
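A direct transcription of this fitness function (the edge-list representation and the union-find component count are my own choices; the penalty structure follows the formula above):

```python
def mst_fitness(x, edges, n, w_max):
    """MST fitness for a bit string x selecting edges.

    Heavy penalty terms make the (1+1) EA first reduce the number of
    connected components, then the surplus edges, then the total weight.
    edges is a list of (u, v, w) triples with vertices in 0..n-1.
    """
    parent = list(range(n))                      # union-find over vertices

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]        # path halving
            a = parent[a]
        return a

    for xi, (u, v, _) in zip(x, edges):
        if xi:
            parent[find(u)] = find(v)            # union the endpoints

    components = len({find(v) for v in range(n)})
    chosen = sum(x)
    weight = sum(w for xi, (_, _, w) in zip(x, edges) if xi)
    return ((components - 1) * n**3 * w_max
            + n * w_max * abs(n - 1 - chosen)
            + weight)

edges = [(0, 1, 1), (1, 2, 2), (0, 2, 3)]        # a weighted triangle
f = mst_fitness([1, 1, 0], edges, 3, 3)          # a spanning tree: fitness = its weight
```

For a spanning tree both penalty terms vanish and the fitness is simply the total edge weight.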
Theorem (Neumann and Wegener, 2007)
The expected time until the (1+1) EA constructs a minimum spanning tree is bounded by O(m²(log n + log wmax)).
Proof Idea
Phase 1: Find a connected graph → O(m log n) (fitness levels)
Phase 2: Find a spanning tree → O(m log n) (fitness levels)
Phase 3: Find a minimum spanning tree (drift analysis)
MST: Phase 1
Lemma
The expected time until the (1+1) EA has constructed a connected graph is O(m log n).
Proof Idea
1. For each edge set leading to a graph with k connected components, there are at least k − 1 edges whose inclusion decreases the number of components by 1 (otherwise the graph G would not be connected).
2. The probability of decreasing the number of components is at least
(k − 1) · (1/m) · (1 − 1/m)^(m−1) ≥ ((k − 1)/m) · (1/e).
3. By Artificial Fitness Levels the expected runtime is
E[T] ≤ Σ_{k=2}^{n} em/(k − 1) = em · Σ_{k=2}^{n} 1/(k − 1) = O(m log n).
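The fitness-level bound above is just a weighted harmonic sum; a small numeric sketch (the values of n and m are illustrative only):

```python
import math

def phase1_bound(n, m):
    """Upper bound e*m*H_{n-1} on the expected time to connect the graph,
    obtained by summing the waiting times e*m/(k-1) over the fitness
    levels k = n, n-1, ..., 2 (k = current number of components)."""
    return sum(math.e * m / (k - 1) for k in range(2, n + 1))

# The bound equals e*m*H_{n-1}, i.e. O(m log n):
b = phase1_bound(100, 500)
```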
MST: Phase 2
Lemma
Let the current solution x be a connected graph. The expected time until the (1+1) EA creates a spanning tree is O(m log n).
Proof Idea
1. A spanning tree has n − 1 edges.
2. Let r > n − 1 be the number of edges in the current solution x (i.e., at least r − (n − 1) ≤ m − (n − 1) edges close cycles and can be removed).
3. By Artificial Fitness Levels the expected runtime is
E[T] ≤ Σ_{i=1}^{m−(n−1)} em/i = O(m log(m − (n − 1))) = O(m log n).
MST: Phase 3
Lemma (Property of spanning trees (Kano, 1987))
Let T be a minimum spanning tree and S an arbitrary spanning tree of G = (V, E). Then there exists a bijection α from T \ S to S \ T such that for every edge e ∈ T \ S, α(e) ∈ Cycle(S, e) and w(α(e)) ≥ w(e).
Lemma
Let x be a non-minimum spanning tree. Then there exist k ∈ {1, …, n − 1} different accepted 2-bit flips such that the average distance decrease of these bit flips from the MST is at least (w(x) − w(opt))/k.
Proof Idea: follows from the lemma above (Kano, 1987) by considering that the distance between x and opt is w(x) − w(opt) and that there must exist k 2-bit flips decreasing the total weight.
MST: Final Proof
Theorem (Neumann and Wegener, 2007)
The expected time until the (1+1) EA constructs a minimum spanning tree is bounded by O(m²(log n + log wmax)).
Proof Phases
Phase 1: Find a connected graph → O(m log n) (fitness levels)
Phase 2: Find a spanning tree → O(m log n) (fitness levels)
Phase 3: Find a minimum spanning tree (drift analysis)
Proof Idea: Let Xₜ := w(xₜ) − w(opt).
1. The probability of a specific 2-bit flip is (1/m)² · (1 − 1/m)^(m−2) ≥ 1/(em²).
2. By the lemma on 2-bit flips we get the following drift:
E[Xₜ − Xₜ₊₁ | xₜ = x] ≥ Σ_{i=1}^{k} (w(x) − w(yᵢ))/(em²) = (w(x) − w(opt))/(em²) = Xₜ/(em²)
(the drift is δ = 1/(em²)).
3. The statement follows by Multiplicative Drift Analysis with parameters smin = 1 and smax = m·wmax.
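Step 3 instantiates the multiplicative drift theorem, E[T] ≤ (1 + ln(smax/smin))/δ; plugging in δ = 1/(em²) and smax = m·wmax yields the O(m²(log n + log wmax)) bound. A sketch of the arithmetic (the parameter values are illustrative only):

```python
import math

def multiplicative_drift_bound(delta, s_min, s_max):
    """E[T] <= (1 + ln(s_max/s_min)) / delta (multiplicative drift theorem)."""
    return (1 + math.log(s_max / s_min)) / delta

def mst_phase3_bound(m, w_max):
    """Phase 3 bound: drift delta = 1/(e*m^2), distances in [1, m*w_max]."""
    delta = 1 / (math.e * m**2)
    return multiplicative_drift_bound(delta, s_min=1, s_max=m * w_max)

b = mst_phase3_bound(m=200, w_max=1000)   # = e*m^2*(1 + ln(m*w_max))
```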
Makespan Scheduling
Result for an NP-hard problem:
Makespan scheduling on 2 machines (PARTITION):
n jobs with weights/processing times w₁, …, wₙ (objects)
2 identical machines (bins)
Minimize the total completion time (the total weight of the fuller bin) = makespan.
Formally, find I ⊆ {1, …, n} minimizing max{ Σ_{i∈I} wᵢ, Σ_{i∉I} wᵢ }.
An approximation ratio of (1 + ε) is achievable in time O(n³/ε) (Hochbaum, 1997). (An approximation ratio of 2 is trivial.)
Partition: EA Application
Representation
x ∈ {0,1}ⁿ encodes a selection of objects (i.e., xᵢ = 1: object i in bin 1; xᵢ = 0: object i in bin 0).
Fitness Function
f(x) := max{ Σ_{i=1}^{n} xᵢ·wᵢ, Σ_{i=1}^{n} (1 − xᵢ)·wᵢ }.
Let W := Σ_{i=1}^{n} wᵢ. We want to minimise the load of the fuller bin. A trivial lower bound is W/2.
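The fitness function is just the load of the fuller bin; a minimal sketch:

```python
def makespan(x, w):
    """Makespan fitness: the load of the fuller of the two machines.

    x[i] = 1 puts job i on machine 1, x[i] = 0 on machine 0.
    """
    load1 = sum(wi for xi, wi in zip(x, w) if xi)
    load0 = sum(w) - load1
    return max(load1, load0)

m = makespan([1, 0, 1, 0], [3, 3, 2, 2])   # balanced split: loads 5 and 5
```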
Theorem (Witt, 2005)
On any instance of the makespan scheduling problem, the (1+1) EA and RLS reach a solution with approximation ratio 4/3 in expected time O(n²).
PARTITION: Critical job size
If f(x) ≥ W/2 + wᵢ/2 then the loads of the two bins differ by at least wᵢ. Hence, object i can be shifted from the fuller to the emptier bin.
More generally,
Let s(x) be the weight of the smallest object in the fuller bin. We call s(x) the critical job size with respect to x, because if f(x) ≥ W/2 + s(x)/2 then we can shift the critical object.
Only if f(x) is below this bound can the algorithm be stuck in a local optimum.
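The shift argument can be made concrete: whenever f(x) ≥ W/2 + s(x)/2, moving the smallest job of the fuller machine never increases the makespan, so RLS and the (1+1) EA accept it. A sketch (function and variable names are my own):

```python
def shift_smallest_critical(x, w):
    """If the makespan is at least W/2 + s(x)/2, move the smallest job
    of the fuller machine to the other machine and return the new
    assignment; otherwise return x unchanged (possibly a local optimum)."""
    W = sum(w)
    load1 = sum(wi for xi, wi in zip(x, w) if xi)
    fuller = 1 if load1 >= W - load1 else 0
    movable = [i for i, xi in enumerate(x) if xi == fuller]
    if not movable:
        return x
    s = min(w[i] for i in movable)                 # critical job size s(x)
    if max(load1, W - load1) < W / 2 + s / 2:
        return x                                   # shift not guaranteed to help
    i = min(movable, key=lambda j: w[j])
    y = x[:]
    y[i] ^= 1                                      # shift the critical job
    return y

y = shift_smallest_critical([1, 1, 1, 0], [4, 3, 2, 5])   # loads 9 vs 5: shift job of size 2
```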
PARTITION: Runtime
Lemma
Let x be the current search point of RLS or the (1+1) EA on an arbitrary instance of PARTITION. Suppose that the critical job size is bounded above by s* for all search points with f(x) ≥ W/2 + s*/2. Then the algorithm reaches a value of at most W/2 + s*/2 in expected time O(n²).
Proof Idea:
Let, w.l.o.g., w₁ ≥ w₂ ≥ … ≥ wₙ, and let r be the smallest i such that wᵢ ≤ s*.
1. As long as f(x) ≥ W/2 + s*/2, there must be a job among jobs r, …, n that can be moved from the fuller to the emptier bin.
2. The probability of this exchange is at least 1/(en) (waiting time: en).
3. After at most n − r + 1 exchanges the value drops to at most W/2 + s*/2 (waiting time: O(n²)).
PARTITION: Approximation
Theorem (Witt, 2005)
On any instance of the makespan scheduling problem, the (1+1) EA and RLS reach a solution with approximation ratio 4/3 in expected time O(n²).
Proof Idea:
Case 1: w₁ + w₂ > (2/3)W.
1. w₁ > W/3, because w₁ ≥ w₂ and W − w₁ − w₂ < W/3. If jobs 1 and 2 are on one machine, then job 2 can be shifted (and will remain shifted) (waiting time: O(n)).
2. Since w₃ + … + wₙ < W/3, we have wᵢ < W/3 for all i ≥ 3. Hence the critical job size is s* < W/3.
3. (L + s*/2)/L ≤ (W/2 + s*/2)/(W/2) ≤ 1 + (W/6)/(W/2) = 4/3.
Case 2: w₁ + w₂ ≤ (2/3)W.
Then w₂ ≤ W/3 because w₁ ≥ w₂, and the critical-job-size bound and the approximation ratio still hold.
Worst Case – Good Approximations by Populations
More careful analysis of worst-case behavior shows:
Theorem (Witt, 2005)
On any instance, the (1+1) EA and RLS find a (1 + ε)-approximation within O(n ln(1/ε)) steps with probability at least 2^(−c⌈1/ε⌉ ln(1/ε)).
A population of 2^(O(⌈1/ε⌉ ln(1/ε))) parallel runs finds a (1 + ε)-approximation with probability at least 0.995 in O(n ln(1/ε)) parallel steps.
E.g., 1% away from optimality in expected O(n) steps.
Populations give rise to a polynomial-time randomized approximation scheme (PRAS)!
Vertex Cover
A vertex cover of an undirected graph G = (V, E) with |V| = n and |E| = m is a subset of the nodes such that each edge has at least one of its endpoints in the subset. Aim: find a minimum vertex cover (NP-hard).
Representation
x ∈ {0,1}ⁿ encodes a selection of nodes;
u(x): the number of uncovered edges.
Fitness Function
f(x) := (n + 1) · u(x) + ∑_{i=1}^{n} x_i
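The penalty structure makes every cover beat every non-cover, since the (n + 1) factor on u(x) dominates the node count. A small runnable sketch of this fitness with a (1+1) EA on a toy path graph (a hypothetical instance, not the worst-case constructions discussed next):

```python
import random

def fitness(x, edges, n):
    """f(x) = (n+1)*u(x) + sum(x): u(x) counts uncovered edges, and the
    (n+1) factor makes one uncovered edge cost more than all n nodes."""
    u = sum(1 for a, b in edges if not (x[a] or x[b]))
    return (n + 1) * u + sum(x)

def one_plus_one_ea(edges, n, steps=5000, seed=1):
    """(1+1) EA minimising f: flip each bit independently with prob. 1/n,
    accept the offspring if its fitness is not worse."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    for _ in range(steps):
        y = [b ^ (rng.random() < 1 / n) for b in x]
        if fitness(y, edges, n) <= fitness(x, edges, n):
            x = y
    return x

# Path 0-1-2-3: the minimum vertex cover is {1, 2}
edges = [(0, 1), (1, 2), (2, 3)]
x = one_plus_one_ea(edges, 4)
assert all(x[a] or x[b] for a, b in edges)  # the result is a vertex cover
```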
19/22
Worst-case Approximation
[Figure: the (1+1) EA worst-case instance is a complete bipartite graph with |V1| = εn and |V2| = (1 − ε)n; the optimal vertex cover is V1, while V2 is a local optimum]
Theorem (Friedrich et al., 2007)
Let δ > 0 be a constant and n^(δ−1) ≤ ε < 1/2. The expected time for the (1+1) EA to produce an approximation better than a factor (1 − ε)/ε is exponential in the number of nodes n.
Proof idea: there is a high probability that all the V2 nodes are inserted into the cover before all the V1 nodes!
Theorem (Oliveto, He, Yao, 2007)
A population of c runs, for a constant c, finds the minimum vertex cover in expected O(n ln n) parallel steps!
20/22
Vertex Cover: Multiple Bipartite Graphs
[Figure: √n complete bipartite subgraphs B^(1)_{√n,ε}, B^(2)_{√n,ε}, …, B^(√n)_{√n,ε}, each with its minimum vertex cover and a local optimum]
Theorem (Oliveto, He, Yao, 2009)
Let ε > n^(−1/2+δ), with 0 < δ < 1/2 a constant. With an overwhelming probability, the (1+1) EA does not find an approximation that is better than 2 − o(1) in polynomial time.
Proof idea: the probability that all the bipartite subgraphs are optimized by the same run is exponentially small, c^(−√n).
(Parallel) populations do not help either!
Theorem (Neumann et al., 2011)
With overwhelming probability a Parallel EA using crossover during migration finds the minimum vertex cover of the graph within O(n² ln n) fitness function evaluations.
21/22
Approximation Upper Bound: Edge Representation
Representation
x ∈ {0, 1}^m encodes a selection of edges (for each chosen edge both endpoints are in the cover); a(x): adjacent edges in the cover;
Fitness Function
f′(x) := (n + 1)² · a(x) + (n + 1) · u(x) + ∑_{i=1}^{m} x_i
Theorem (Jansen, Oliveto, Zarges, 2013)
The (1+1) EA using fitness function f′(x) and the edge-based representation finds at least a 2-approximation in expected time O(n log n) regardless of the initial search point.
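The constant 2 in this bound echoes a classical fact: taking both endpoints of every edge in any maximal matching yields a vertex cover at most twice the optimum, and the edge-based fitness steers the EA toward such matchings. A plain greedy sketch of that underlying fact (illustrative, not the EA itself):

```python
def matching_cover(edges):
    """Greedily build a maximal matching and return the endpoints of its
    edges; any optimal cover must hit each matching edge separately, so
    this cover has size at most 2 * optimum."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:  # edge is still unmatched
            cover.update((u, v))
    return cover

# Star graph: the optimum cover {0} has size 1
edges = [(0, 1), (0, 2), (0, 3)]
c = matching_cover(edges)
assert all(u in c or v in c for u, v in edges)  # c is a vertex cover
assert len(c) <= 2  # within factor 2 of the optimum
```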
22/22
The End
Presented
Considered examples of RSHs in combinatorial optimization
Showed how random search finds good or optimal solutions
Showed how populations can deliver improved results
Glimpse of proof techniques
Not Presented – And Open Problems
Hybrid RSHs
A complexity theory for RSHs
Other frameworks: multi-objective optimization, dynamic problems, different input models, . . .
Stronger results concerning (more realistic) population EAs (i.e., using stochastic selection mechanisms)
Thank you!