
Fast Cross-Entropy Methods for Combinatorial Optimization, Simulation and Counting

Plenary Talk Presented at INFORMS, Seattle, Washington, 2007

Reuven Y. Rubinstein (Faculty of Industrial Engineering and Management, Technion, Israel)
Dirk P. Kroese (Department of Mathematics, The University of Queensland, Australia)


Contents

1. A Simple Tutorial.

2. Importance Sampling; Some Theory.

3. The Cross-Entropy (CE) Method in Rare-Event Simulation.

4. CE for Solving NP-hard Combinatorial Optimization Problems.

5. CE and MinxEnt for Solving NP-hard Counting Problems: Satisfiability, Hamiltonian Cycles.

6. Some Theory on CE.

7. Conclusions and Open Problems.


CE Matters

Book: R. Y. Rubinstein and D. P. Kroese. The Cross-Entropy Method: A Unified Approach to Combinatorial Optimization, Monte Carlo Simulation and Machine Learning. Springer-Verlag, New York, 2004.

Special Issue: Annals of Operations Research, 2005.

The CE home page: http://www.cemethod.org


CE Matters

Book: R. Y. Rubinstein and D. P. Kroese. Simulation and the Monte Carlo Method, Second Edition. Wiley Series in Probability and Statistics, Wiley, 2007.



Applications

Combinatorial Optimization, like TSP, Maximal Cut, Scheduling and Production Lines.

Machine Learning

Pattern Recognition, Clustering and Image Analysis

DNA Sequence Alignment

Simulation-based (noisy) Optimization, like Optimal Buffer Allocation and Optimization in Financial Engineering

Multi-extremal Continuous Optimization

NP-hard Counting Problems: Hamiltonian Cycles, SAWs, calculation of the Permanent, the Satisfiability Problem, etc.


Combinatorial Optimization: A Coloring Problem

We wish to color the nodes white and black.

How should we color so that the total number of links between the two groups is maximized? This problem is known as the Maximal Cut problem.


Continuous Optimization: A Multi-extremal Function

This is the trigonometric function.

[Surface plot of the trigonometric function over [−1, 1] × [−1, 1], with values ranging from 0 to about 25.]

What is its global maximum, and where is it attained?


Another Multi-extremal Function

[Surface plot of a multi-extremal function over roughly [−2, 6] × [−2, 6], with values ranging from 0 to about 0.2.]


Combinatorial Optimization: TSP

[Bar plots labelled ITERATION 0 through ITERATION 5: the 10 × 10 matrix of transition probabilities (rows 1–10, probability axis from 0 to 1) at successive CE iterations.]


Counting Hamiltonian Cycles

How many Hamiltonian cycles does this graph have?


Calculating the Number of HC's

[Bar plots P0 through P3: probabilities (0 to 1) over positions 1–10 at successive CE iterations.]


Hamiltonian Cycles: Large Example

Performance of the CE Algorithm for n = 100, η = 0.4 and N = 5n² = 50,000:

t              0          1          2          3          4
Average |X*|   7.42E+115  8.00E+115  7.12E+115  7.48E+115  7.84E+115
Min |X*|       6.14E+115  6.79E+115  6.55E+115  6.79E+115  6.76E+115
Max |X*|       8.21E+115  8.97E+115  7.95E+115  8.12E+115  8.70E+115
ε              0.056      0.074      0.053      0.048      0.058
ε ε*           0.005      0.001      0.007      0.003      0.005
ε*             0.173      0.151      0.116      0.092      0.137
RE             0.077      0.089      0.068      0.059      0.078



Introduction

The CE method can be used to solve two types of problems:

1. Estimation: Estimate ℓ = E[H(X)], where X is a random vector/process taking values in some set 𝒳 and H is a function on 𝒳. In particular, the estimation of rare-event probabilities ℓ = P(S(X) ≥ γ), where S is another function on 𝒳.

2. Optimization: Determine max_{x ∈ 𝒳} S(x).


Estimation via Importance Sampling

Let the expected performance of a stochastic system be

ℓ = E_f[H(X)] = ∫ H(x) f(x) dx.

Here H is the sample performance function, f is the density of X, and the subscript f in E_f[H(X)] means that the expectation is taken with respect to the density f. An example of H(X) is an indicator function

H(X) = I_{S(X) ≥ γ} = 1 if S(X) ≥ γ, and 0 otherwise.


Importance Sampling

Note that we can write ℓ as

ℓ = ∫ H(x) [f(x)/g(x)] g(x) dx = E_g[H(X) f(X)/g(X)],

where the subscript g means that the expectation is taken with respect to the IS pdf g. The (unbiased) likelihood ratio estimator of ℓ is

ℓ̂ = (1/N) Σ_{i=1}^N H(X_i) W(X_i),

where W(x) = f(x)/g(x) is the likelihood ratio (LR) and X_1, ..., X_N ∼ g. This idea is called importance sampling (IS).
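As a concrete illustration of the likelihood ratio estimator above (a sketch, not part of the original slides), here is a minimal Python version; the exponential densities and the level γ = 10 are illustrative choices:

```python
import math
import random

def is_estimate(h, f, g_pdf, g_sample, n=100_000, seed=0):
    """Likelihood ratio estimator: average of H(X_i) * f(X_i)/g(X_i), X_i ~ g."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = g_sample(rng)
        total += h(x) * f(x) / g_pdf(x)
    return total / n

# Toy rare event: X ~ Exp(1), estimate ell = P(X >= 10) = exp(-10).
gamma = 10.0
f = lambda x: math.exp(-x)                            # nominal density f
g_pdf = lambda x: math.exp(-x / gamma) / gamma        # IS density g: Exp(mean 10)
g_sample = lambda rng: rng.expovariate(1.0 / gamma)   # sampler for g
h = lambda x: 1.0 if x >= gamma else 0.0              # H(x) = indicator {x >= gamma}

est = is_estimate(h, f, g_pdf, g_sample)
```

Crude Monte Carlo with the same sample size would see only a handful of hits at this level; sampling from g puts most of the mass where the indicator is nonzero, and the likelihood ratio corrects the bias.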


Parametric IS

We consider now only the case where the IS density g belongs to the same parametric family as f. Let f(·; u) denote the density of the random vector X for some fixed "nominal" parameter u ∈ 𝒱. In this case the LR estimator ℓ̂, with g(x) = f(x; v), becomes

ℓ̂ = (1/N) Σ_{i=1}^N H(X_i) W(X_i; u, v),   with   W(X; u, v) = f(X; u) / f(X; v),

where X_1, ..., X_N is a random sample from f(·; v).


Parametric IS

The "optimal" v = v* should minimize

Var_v[H(X) W(X; u, v)],

which is the same as minimizing

V(v) = E_v[H²(X) W²(X; u, v)].

Using again the IS idea, we can write

V(v) = E_w[H²(X) W(X; u, v) W(X; u, w)],

where w is an arbitrary reference parameter.

This function may still be difficult to minimize.


Parametric IS

The stochastic counterpart of

min_v V(v) = min_v E_w[H²(X) W(X; u, v) W(X; u, w)]

is

min_v V̂(v) = min_v (1/N) Σ_{i=1}^N H²(X_i) W(X_i; u, v) W(X_i; u, w),

where X_1, ..., X_N is an i.i.d. sample from f(·; w).

Hence, we can estimate v* by minimizing V̂(v). This usually involves numerical minimization.


Cross-Entropy (CE) Method

In the Cross-Entropy method we choose g = f(·; v) such that the "distance" between the densities g* and f(·; v) is minimal.

The Kullback-Leibler or cross-entropy distance is defined as:

D(g, h) = E_g[log(g(X)/h(X))] = ∫ g(x) log g(x) dx − ∫ g(x) log h(x) dx.

Note that D(g, h) ≥ 0, and D(g, h) = 0 when g = h.

Shannon entropy:

H(X) = −E[log f(X)] = −∫ f(x) log f(x) dx.
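For intuition (an illustrative sketch, not from the slides), the Kullback-Leibler distance between two discrete distributions can be computed directly from the definition above:

```python
import math

def kl(g, h):
    """D(g, h) = sum_x g(x) * log(g(x)/h(x)) for discrete distributions
    given as dicts mapping outcomes to probabilities (same support)."""
    return sum(gx * math.log(gx / h[x]) for x, gx in g.items() if gx > 0)

g = {0: 0.5, 1: 0.5}   # fair coin
h = {0: 0.9, 1: 0.1}   # biased coin
d = kl(g, h)           # strictly positive, since g != h; kl(g, g) is 0
```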


CE Method

This is equivalent to solving

max_v D(v) = max_v E_u[H(X) log f(X; v)].

Using again IS, we can rewrite this as

max_v E_w[H(X) W(X; u, w) log f(X; v)].

We may estimate the optimal solution v* by solving the following stochastic counterpart:

max_v (1/N) Σ_{i=1}^N I_{S(X_i) ≥ γ} W(X_i; u, w) log f(X_i; v),

where X_1, ..., X_N is a random sample from f(·; w).


CE Method

Summarizing: in CE we solve

max_v D̂(v) = max_v (1/N) Σ_{i=1}^N H(X_i) W(X_i; u, w) log f(X_i; v),

instead of

min_v V̂(v) = min_v (1/N) Σ_{i=1}^N H²(X_i) W(X_i; u, w) W(X_i; u, v),

where X_i ∼ f(·; w).

The advantage: CE programs can typically be solved analytically, while the variance-minimization program can only be solved numerically.


The Simplest Example

Consider the problem of estimating

ℓ = E_u[I_{X ≥ γ}] = P(X ≥ γ),

with X ∼ Exp(1/u). Consider the family of IS densities

f(x; v) = v⁻¹ e^{−x/v},   x ≥ 0.

To find the CE-optimal parameter v* we need to maximize D(v), with

D(v) = E_w[I_{X ≥ γ} W(X; u, w) log f(X; v)] = E_w[I_{X ≥ γ} W(X; u, w) (−log v − X/v)].


The Simplest Example

Setting D′(v) = 0 we find

v* = E_u[I_{X ≥ γ} X] / E_u[I_{X ≥ γ}] = E_w[I_{X ≥ γ} W(X; u, w) X] / E_w[I_{X ≥ γ} W(X; u, w)].

Similarly, the solution to D̂′(v) = 0 is

v̂ = Σ_{i=1}^N I_{X_i ≥ γ} W(X_i; u, w) X_i / Σ_{i=1}^N I_{X_i ≥ γ} W(X_i; u, w),

with X_1, ..., X_N ∼ f(·; w).
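The two formulas above suggest the standard multilevel CE scheme: raise the level γ_t gradually, updating v with the weighted-mean formula at each step. The following Python sketch is an illustration under the talk's Exp(1/u) setup; the level γ = 32 and the sample sizes are arbitrary choices of mine, not values from the slides:

```python
import math
import random

def exp_lr(x, u, v):
    """Likelihood ratio W(x; u, v) = f(x; u)/f(x; v) for exponential
    densities f(x; m) = (1/m) exp(-x/m)."""
    return (v / u) * math.exp(x / v - x / u)

def ce_rare_event(u=1.0, gamma=32.0, n=10_000, rho=0.1, seed=1):
    """Multilevel CE estimate of ell = P(X >= gamma) for X ~ Exp(1/u)."""
    rng = random.Random(seed)
    v = u
    for _ in range(100):                                # safety cap on iterations
        xs = sorted(rng.expovariate(1.0 / v) for _ in range(n))
        gamma_t = min(xs[int((1.0 - rho) * n)], gamma)  # (1-rho)-quantile, capped at gamma
        elite = [x for x in xs if x >= gamma_t]
        # analytic CE update: the weighted-mean formula for v-hat above
        den = sum(exp_lr(x, u, v) for x in elite)
        num = sum(exp_lr(x, u, v) * x for x in elite)
        v = num / den
        if gamma_t >= gamma:
            break
    # final importance-sampling run at the tuned parameter v
    xs = [rng.expovariate(1.0 / v) for _ in range(n)]
    ell = sum(exp_lr(x, u, v) for x in xs if x >= gamma) / n
    return ell, v

ell, v = ce_rare_event()   # compare with the exact value exp(-32)
```

With u = 1 the tuned parameter settles near γ + u, and the final run estimates a probability of order 10⁻¹⁴ with a few percent relative error, which crude Monte Carlo could not touch at this sample size.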


General Algorithm

Generate γ_1, v_1, γ_2, v_2, ..., γ_T, v_T.

[Figure: the exponential densities f(x; u) at t = 0, f(x; v_1) at t = 1, f(x; v_2) at t = 2, and f(x; w) at t = 3, each of the form f(x; v) = v⁻¹ e^{−x/v}, shifting mass past the increasing levels γ_1, γ_2, and γ_3 = γ.]


Generic CE Algorithm

The CE Algorithm generates a sequence {(γ_t, v_t)}, t ≥ 0:

γ_t monotonically increases and crosses the level γ after a finite number of iterations t;

v_t converges to the optimal parameter v* of the IS density g(x) = f(x; v*).

Assuming independent components from a 1-parameter exponential family parameterized by the mean, the analytic updating formula is

v_{t,j} = Σ_{i=1}^N I_{S(X_i) ≥ γ_t} W(X_i; u, v_{t−1}) X_ij / Σ_{i=1}^N I_{S(X_i) ≥ γ_t} W(X_i; u, v_{t−1}).


Combinatorial Optimization

Let 𝒳 be a finite set of states, and let S be a real-valued sample function on 𝒳. We wish to find

S(x*) = γ* = max_{x ∈ 𝒳} S(x).

The starting point in the methodology of the CE method is to associate with the above optimization problem a meaningful estimation problem.



General CE Procedure

The CE method casts the original optimization problem for S(x) into an associated rare-event probability estimation problem, that is, estimation of

ℓ = P(S(X) ≥ γ) = E[I_{S(X) ≥ γ}],

and involves the following iterative steps:

Formulate a parameterized random mechanism to generate the objects x ∈ 𝒳.

Give the updating formulas for the parameters of the random mechanism (obtained via Cross-Entropy minimization), in order to produce a better sample in the next iteration.


Max-Cut Example

Consider a weighted graph G with node set V = {1, ..., n}. Partition the nodes of the graph into two subsets V_1 and V_2 such that the sum of the weights of the edges going from one subset to the other is maximized.

[Figure: an example graph on the nodes 1–6.]


The Max-Cut Problem

Cost matrix:

        ( 0    c12  c13  0    0    0   )
        ( c21  0    c23  c24  0    0   )
    C = ( c31  c32  0    c34  c35  0   )
        ( 0    c42  c43  0    c45  c46 )
        ( 0    0    c53  c54  0    c56 )
        ( 0    0    0    c64  c65  0   )

{V_1, V_2} = {{1, 3, 4}, {2, 5, 6}} is a possible cut. The cost of the cut is

c12 + c32 + c35 + c42 + c45 + c46.



Generation and Updating Formulas

Generation of cut vectors: The most natural and easiest way to generate the cut vectors is to let X_2, ..., X_n be independent Bernoulli random variables with success probabilities p_2, ..., p_n.

Updating formulas:

p_{t,j} = Σ_{i=1}^N I_{S(X_i) ≥ γ_t} I_{X_ij = 1} / Σ_{i=1}^N I_{S(X_i) ≥ γ_t} = Σ_{X_i ∈ E_t} X_ij / |E_t|,   j = 2, ..., n,

where |E_t| = ρN is the number of elite samples.

Note that the likelihood term is missing.



CE Optimization Algorithm

1. Start with p_0 = (1/2, ..., 1/2). Let t := 1.

2. Update γ_t: Draw X_1, ..., X_N from Ber(p_{t−1}). Let γ_t be the worst performance of the ρ × 100% best performances.

3. Update p_t: Use the same sample to calculate

p_{t,j} = Σ_{X_i ∈ E_t} X_ij / |E_t|,   j = 1, ..., n,

where X_i = (X_i1, ..., X_in).

4. If the stopping criterion is met, then stop; otherwise set t := t + 1 and reiterate from step 2.
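The steps above can be sketched directly in Python. This is an illustrative toy, not the authors' code; the unit-weight 4-cycle test graph and the parameter choices are mine, and node 1 is kept fixed in V_1 to match the slides' X_2, ..., X_n convention:

```python
import random

def cut_value(x, C):
    """S(x): total weight of edges crossing the cut encoded by 0/1 vector x."""
    n = len(x)
    return sum(C[i][j] for i in range(n) for j in range(i + 1, n) if x[i] != x[j])

def ce_maxcut(C, N=500, rho=0.1, iters=20, seed=0):
    """CE optimization loop: Bernoulli generation, elite selection of the
    rho*100% best cuts (implicitly defining gamma_t), mean update of p."""
    rng = random.Random(seed)
    n = len(C)
    p = [0.0] + [0.5] * (n - 1)          # node 1 fixed in V1, rest start at 1/2
    best, best_val = None, float("-inf")
    for _ in range(iters):
        sample = [[1 if rng.random() < pj else 0 for pj in p] for _ in range(N)]
        sample.sort(key=lambda x: cut_value(x, C), reverse=True)
        if cut_value(sample[0], C) > best_val:
            best, best_val = sample[0], cut_value(sample[0], C)
        elite = sample[: max(1, int(rho * N))]   # best rho*100% performers
        p = [sum(x[j] for x in elite) / len(elite) for j in range(n)]
    return best, best_val

# Unit-weight 4-cycle 1-2-3-4-1; the maximal cut {1,3} vs {2,4} has value 4.
C4 = [[0, 1, 0, 1],
      [1, 0, 1, 0],
      [0, 1, 0, 1],
      [1, 0, 1, 0]]
best, val = ce_maxcut(C4)
```

On this toy instance p degenerates to the optimal cut vector within a few iterations, mirroring the probability plots on the following slides.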


Max-Cut: Synthetic Problem, n = 400, m = 200

[Plot of the probability vector p_t over its 400 components (vertical axis from 0 to 1), shown at successive CE iterations.]

Fast Cross-Entropy Methods for Combinatorial Optimization, Simulation and Counting Plenary Talk Presented at

Page 47: Fast Cross-Entropy Methods for Combinatorial Optimization, Simulation ...iew3.technion.ac.il › CE › files › talk-informs07.pdf · Estimation via Importance Sampling Let the

Max-Cut: Synthetic Problem,n = 400, m = 200

0 50 100 150 200 250 300 350 400 4500

0.1

0.2

0.3

0.4

0.5

0.6

0.7

0.8

0.9

1

Fast Cross-Entropy Methods for Combinatorial Optimization, Simulation and Counting Plenary Talk Presented at

Page 48: Fast Cross-Entropy Methods for Combinatorial Optimization, Simulation ...iew3.technion.ac.il › CE › files › talk-informs07.pdf · Estimation via Importance Sampling Let the

Max-Cut: Synthetic Problem,n = 400, m = 200

0 50 100 150 200 250 300 350 400 4500

0.1

0.2

0.3

0.4

0.5

0.6

0.7

0.8

0.9

1

Fast Cross-Entropy Methods for Combinatorial Optimization, Simulation and Counting Plenary Talk Presented at

Page 49: Fast Cross-Entropy Methods for Combinatorial Optimization, Simulation ...iew3.technion.ac.il › CE › files › talk-informs07.pdf · Estimation via Importance Sampling Let the

Max-Cut: Synthetic Problem,n = 400, m = 200

0 50 100 150 200 250 300 350 400 4500

0.1

0.2

0.3

0.4

0.5

0.6

0.7

0.8

0.9

1

Fast Cross-Entropy Methods for Combinatorial Optimization, Simulation and Counting Plenary Talk Presented at

Page 50: Fast Cross-Entropy Methods for Combinatorial Optimization, Simulation ...iew3.technion.ac.il › CE › files › talk-informs07.pdf · Estimation via Importance Sampling Let the

Max-Cut: Synthetic Problem,n = 400, m = 200

0 50 100 150 200 250 300 350 400 4500

0.1

0.2

0.3

0.4

0.5

0.6

0.7

0.8

0.9

1

Fast Cross-Entropy Methods for Combinatorial Optimization, Simulation and Counting Plenary Talk Presented at

Pages 51–67: Max-Cut: Synthetic Problem, n = 400, m = 200

[Figure sequence, one plot per slide: values in [0, 1] plotted against component index 0–450, across successive iterations. Only the axis tick labels survived text extraction.]

Counting via Monte Carlo

We start with the following basic example. Assume we want to calculate the area of some “irregular” region X*. The Monte Carlo method suggests embedding the “irregular” region X* into a nice “regular” region X, as in the figure below.

[Figure: region X* contained inside the larger region X.]

X : set of objects (paths in a graph, colorings of a graph, etc.)
X* : subset of special objects (cycles in a graph, colorings of a certain type, etc.)


Counting via Monte Carlo

To calculate |X*| we apply the following sampling procedure:

(i) Generate a random sample X_1, ..., X_N, uniformly distributed over the “regular” region X.

(ii) Estimate the desired area |X*| as

\widehat{|X^*|} = \widehat{\ell}\, |X|,

where

\widehat{\ell} = \frac{N_{X^*}}{N} = \frac{1}{N} \sum_{k=1}^{N} I_{\{X_k \in X^*\}},

I_{\{X_k \in X^*\}} denotes the indicator of the event {X_k ∈ X*}, and {X_k} is a sample from f(x) over X, where f(x) = 1/|X|.
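The uniform-sampling procedure above can be sketched in a few lines of Python. This is a toy illustration, not from the talk: the disc-in-a-square example and all names are ours.

```python
import random

def crude_mc_area(n_samples=100_000, seed=0):
    """Estimate the area of an 'irregular' region X* (here: the unit disc)
    embedded in a 'regular' region X (here: the square [-1, 1] x [-1, 1])."""
    rng = random.Random(seed)
    area_X = 4.0                       # |X| for the bounding square
    hits = 0
    for _ in range(n_samples):
        x = rng.uniform(-1.0, 1.0)     # X_k uniform on X
        y = rng.uniform(-1.0, 1.0)
        if x * x + y * y <= 1.0:       # indicator I{X_k in X*}
            hits += 1
    ell_hat = hits / n_samples         # estimate of ell = |X*| / |X|
    return ell_hat * area_X            # |X*| estimate = ell_hat * |X|

print(crude_mc_area())  # close to pi ~= 3.14159
```

The same code works unchanged for discrete X: replace the uniform draw on the square by a uniform draw over the finite set and the indicator by a membership test.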


Counting via Monte Carlo

The above formula

|X*| = ℓ |X|

is also valid for counting problems, that is, where X* represents a discrete rather than a continuous set of points. For example, in the HC (Hamiltonian cycle) problem:

1. X is the entire set of tours in the graph. Note that |X| = (n − 1)!.

2. X* is the subset of tours of length n.
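For a graph small enough to enumerate, the HC setting can be illustrated by brute force: fix vertex 0 as the start, walk through all (n − 1)! tours, and count those whose edges all lie in the graph. A minimal sketch (names and the K5 example are ours, not from the talk):

```python
import itertools

def count_tours_and_hcs(adj):
    """X = all (n-1)! tours with vertex 0 fixed as the start;
    X* = the tours that only use edges of the graph (Hamiltonian cycles).
    adj is an n x n adjacency matrix of truthy/falsy entries."""
    n = len(adj)
    tours = 0
    hcs = 0
    for perm in itertools.permutations(range(1, n)):
        tours += 1
        cycle = (0,) + perm + (0,)
        if all(adj[a][b] for a, b in zip(cycle, cycle[1:])):
            hcs += 1
    return tours, hcs

# Complete graph on 5 vertices: every tour is a Hamiltonian cycle.
K5 = [[i != j for j in range(5)] for i in range(5)]
print(count_tours_and_hcs(K5))  # (24, 24), i.e. |X| = (5 - 1)! and X* = X
```

This factorial enumeration is exactly what becomes infeasible for realistic n, which is why the rare-event machinery of the next slides is needed.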


Counting via Rare-Events

Note that for counting problems

\ell = \frac{|X^*|}{|X|} = \mathbb{E}_U\!\left[ I_{\{X \in X^*\}} \right]

is typically very small, so the naive, crude Monte Carlo estimator of ℓ is useless. It is easy to show that using importance sampling we obtain

|X^*| = \mathbb{E}_g\!\left[ I_{\{X \in X^*\}} \, \frac{1}{g(X)} \right].

The IS estimate of |X*| is therefore

\widehat{|X^*|} = \frac{1}{N} \sum_{k=1}^{N} I_{\{X_k \in X^*\}} \, \frac{1}{g(X_k)} = \frac{1}{N} \sum_{X_k \in X^*} \frac{1}{g(X_k)}.
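A minimal sketch of this IS counting estimator, using as X* the set of binary strings of length n with exactly k ones (a toy stand-in for a #P counting problem; the Bernoulli proposal g and all names are our assumptions, not from the talk):

```python
import math
import random

def is_count(n=20, k=15, n_samples=50_000, seed=1):
    """IS estimate of |X*| = number of length-n binary strings with
    exactly k ones. Proposal g: independent Bernoulli(p) bits with
    p = k/n, biased toward X* so that hits are no longer rare."""
    rng = random.Random(seed)
    p = k / n
    total = 0.0
    for _ in range(n_samples):
        x = [1 if rng.random() < p else 0 for _ in range(n)]
        ones = sum(x)
        if ones == k:                              # I{X_k in X*}
            # g(x) depends only on the number of ones in x
            g_x = p ** ones * (1 - p) ** (n - ones)
            total += 1.0 / g_x                     # weight 1/g(X_k)
    return total / n_samples                       # estimate of |X*|

exact = math.comb(20, 15)   # 15504, for comparison
print(is_count(), exact)
```

Under the uniform pdf, a hit has probability 15504 / 2^20 ≈ 1.5%; under this g roughly a fifth of the samples land in X*, which is the whole point of tilting the sampling pdf.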


Choosing the IS pdf g(x)

The best choice for g is, clearly, g*(x) = 1/|X*|, x ∈ X*, which is the uniform distribution on X*. Under g* the estimator has zero variance, since each term I_{\{X_k \in X^*\}} / g^*(X_k) equals the constant |X*|, so that only one sample is required. However, sampling from such a g* is impractical, since it requires availability of our target value |X*|.

To overcome this difficulty we shall show in the following sections how to construct “good” (low-variance) IS sampling pdf's g(x) (parametric and nonparametric) for different #P-complete counting problems.
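The zero-variance property of g* is easy to verify numerically when X* happens to be simple enough to sample from directly. Here we again take X* to be the length-n binary strings with exactly k ones (a toy choice of ours, not from the talk); for this X*, uniform sampling is trivial, but of course it presupposes exactly the knowledge of |X*| that is unavailable in general.

```python
import math
import random

def zero_variance_demo(n=20, k=15, n_samples=5, seed=2):
    """Under the zero-variance pdf g*(x) = 1/|X*| (uniform on X*),
    every sample carries the identical weight 1/g*(X_k) = |X*|,
    so a single sample already gives the exact count."""
    rng = random.Random(seed)
    g_star = 1.0 / math.comb(n, k)      # uniform density on X*
    weights = []
    for _ in range(n_samples):
        _ones = rng.sample(range(n), k)  # a uniform element of X*
        weights.append(1.0 / g_star)     # constant weight = |X*|
    return weights

print(zero_variance_demo())  # every weight equals |X*| = C(20, 15) = 15504
```

All the weights coincide, so the sample variance of the estimator is zero, exactly as claimed on the slide.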


A Knapsack Problem

Performance of the CE Algorithm for the knapsack problem with the instance matrix A = (20 × 11) and N = 10,000. This problem was taken from the website http://elib.zib.de. Using full enumeration we found that the total number of multiple extrema is 612.

t   Mean    Max     Min     PV    RE     S     m   RDc
0   639.6   943.7   419.4   0.00  0.225  6.93  10  55.73
1   619.2   697.6   564.8   0.03  0.072  5.78  11  0.02
2   630.8   706.5   557.0   0.07  0.059  5.18  11  0.03
3   628.1   698.1   533.2   0.08  0.083  4.95  11  0.03
4   573.7   671.2   504.9   0.09  0.083  4.88  11  0.06
5   599.3   719.6   525.7   0.09  0.100  4.72  11  0.03
6   576.9   646.4   508.0   0.09  0.071  4.76  11  0.06


A Knapsack Problem

A typical dynamics of the CE Algorithm for the knapsack problem with the instance matrix A = (20 × 11) and N = 10,000.

[Figure: nine panels, for iterations t = 0, 1, 3, 5, 7, 9, 11, 13, 15; each plots the 20 parameter components (index 2–20 on the x-axis) with values in [0, 1].]


Calculating the Number of HC’s

[Figure: four panels P0, P1, P2, P3; each plots values in [0, 1] against index 1–10, across successive iterations.]


THANK YOU
