
Chapter 15

Dynamic Programming Lee, Hsiu-Hui

Ack: This presentation is based on the lecture slides from Hsu, Lih-Hsing, as well as various materials from the web.


Introduction

• Dynamic programming is typically applied to optimization problems.

• In such problems there can be many possible solutions. Each solution has a value, and we wish to find a solution with the optimal value.


The development of a dynamic programming algorithm can be broken into a sequence of four steps:
1. Characterize the structure of an optimal solution.
2. Recursively define the value of an optimal solution.
3. Compute the value of an optimal solution in a bottom-up fashion.
4. Construct an optimal solution from computed information.


15.1 Assembly-line scheduling


A manufacturing problem: find the fastest way through a factory.

ai,j : the assembly time required at station Si,j

ti,j : the time to transfer a chassis away from assembly line i after having gone through station Si,j


An instance of the assembly-line problem with costs


Step 1: The structure of the fastest way through the factory


• Optimal substructure: an optimal solution to a problem (a fastest way through S1,j) contains within it an optimal solution to subproblems (a fastest way through S1,j-1 or S2,j-1).

• Use the optimal substructure to construct an optimal solution to the problem from optimal solutions to subproblems.

To solve the problem of finding a fastest way through S1,j and S2,j, solve the subproblems of finding a fastest way through S1,j-1 and S2,j-1.


Step 2: A recursive solution

fi[j]: the fastest possible time to get a chassis from the starting point through station Si,j

f*: the fastest time to get a chassis all the way through the factory.

With ei the entry time onto line i and xi the exit time from line i:

f1[j] = e1 + a1,1                                               if j = 1
f1[j] = min( f1[j-1] + a1,j , f2[j-1] + t2,j-1 + a1,j )         if j ≥ 2

f2[j] = e2 + a2,1                                               if j = 1
f2[j] = min( f2[j-1] + a2,j , f1[j-1] + t1,j-1 + a2,j )         if j ≥ 2

f* = min( f1[n] + x1 , f2[n] + x2 )


• li[j] = the line number (1 or 2) whose station j-1 is used in a fastest way through Si,j.
• Sli[j],j-1 precedes Si,j.
• Defined for i = 1, 2 and j = 2, ..., n.
• l* = the line number whose station n is used.


Step 3: Computing an optimal solution

• Let ri(j) be the number of references made to fi[j] in a recursive algorithm.
  r1(n) = r2(n) = 1
  r1(j) = r2(j) = r1(j+1) + r2(j+1)
• The total number of references to all fi[j] values is Θ(2^n).
• We can do much better if we compute the fi[j] values in a different order from the recursive way. Observe that for j ≥ 2, each value of fi[j] depends only on the values of f1[j-1] and f2[j-1].


FASTEST-WAY procedure

FASTEST-WAY(a, t, e, x, n)
1  f1[1] ← e1 + a1,1
2  f2[1] ← e2 + a2,1
3  for j ← 2 to n
4      do if f1[j-1] + a1,j ≤ f2[j-1] + t2,j-1 + a1,j
5             then f1[j] ← f1[j-1] + a1,j
6                  l1[j] ← 1
7             else f1[j] ← f2[j-1] + t2,j-1 + a1,j
8                  l1[j] ← 2
9          if f2[j-1] + a2,j ≤ f1[j-1] + t1,j-1 + a2,j
10             then f2[j] ← f2[j-1] + a2,j
11                  l2[j] ← 2
12             else f2[j] ← f1[j-1] + t1,j-1 + a2,j
13                  l2[j] ← 1
14 if f1[n] + x1 ≤ f2[n] + x2
15    then f* ← f1[n] + x1
16         l* ← 1
17    else f* ← f2[n] + x2
18         l* ← 2


Step 4: Constructing the fastest way through the factory

PRINT-STATIONS(l, l*, n)
1 i ← l*
2 print "line" i ", station" n
3 for j ← n downto 2
4     do i ← li[j]
5        print "line" i ", station" j - 1

Output:
line 1, station 6
line 2, station 5
line 2, station 4
line 1, station 3
line 2, station 2
line 1, station 1
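As a complement to the pseudocode, the following is a minimal Python sketch of the same bottom-up computation and traceback, using 0-based lists; the instance data in the usage lines are the figure's values quoted from memory, so treat them as illustrative rather than authoritative.

def fastest_way(a, t, e, x, n):
    """Bottom-up assembly-line scheduling (sketch).
    a[i][j]: assembly time at station j of line i; t[i][j]: transfer time after
    station j of line i; e[i], x[i]: entry/exit times of line i (all 0-based)."""
    f = [[0] * n for _ in range(2)]   # f[i][j]: fastest time through station j on line i
    l = [[0] * n for _ in range(2)]   # l[i][j]: line whose station j-1 precedes station j on line i
    f[0][0] = e[0] + a[0][0]
    f[1][0] = e[1] + a[1][0]
    for j in range(1, n):
        for i in range(2):
            other = 1 - i
            stay = f[i][j - 1] + a[i][j]
            switch = f[other][j - 1] + t[other][j - 1] + a[i][j]
            if stay <= switch:
                f[i][j], l[i][j] = stay, i
            else:
                f[i][j], l[i][j] = switch, other
    if f[0][n - 1] + x[0] <= f[1][n - 1] + x[1]:
        return f[0][n - 1] + x[0], 0, l
    return f[1][n - 1] + x[1], 1, l

def print_stations(l, l_star, n):
    """Trace the chosen stations backwards, as PRINT-STATIONS does."""
    i = l_star
    print("line", i + 1, ", station", n)
    for j in range(n, 1, -1):         # 1-based station numbers n down to 2
        i = l[i][j - 1]
        print("line", i + 1, ", station", j - 1)

# Instance from the earlier figure (values quoted from memory):
e, x = [2, 4], [3, 2]
a = [[7, 9, 3, 4, 8, 4], [8, 5, 6, 4, 5, 7]]
t = [[2, 3, 1, 3, 4], [2, 1, 2, 2, 1]]
fstar, lstar, l = fastest_way(a, t, e, x, 6)
print(fstar)                  # 38
print_stations(l, lstar, 6)   # reproduces the station listing shown above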


15.2 Matrix-chain multiplication

• A product of matrices is fully parenthesized if it is either a single matrix or a product of two fully parenthesized matrix products, surrounded by parentheses.


• How to compute the product A1 A2 ... An, where Ai is a matrix for every i?

• Example: A1 A2 A3 A4 can be fully parenthesized in five ways:

(A1 (A2 (A3 A4)))
(A1 ((A2 A3) A4))
((A1 A2) (A3 A4))
((A1 (A2 A3)) A4)
(((A1 A2) A3) A4)


MATRIX-MULTIPLY(A, B)
1 if columns[A] ≠ rows[B]
2    then error "incompatible dimensions"
3    else for i ← 1 to rows[A]
4             do for j ← 1 to columns[B]
5                    do c[i, j] ← 0
6                       for k ← 1 to columns[A]
7                           do c[i, j] ← c[i, j] + A[i, k] · B[k, j]
8 return C
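As a concrete counterpart, here is a small Python sketch of the same triple loop (0-based indices; written from the pseudocode above, so the names are illustrative):

def matrix_multiply(A, B):
    """Standard triple-loop matrix product, mirroring MATRIX-MULTIPLY."""
    if len(A[0]) != len(B):
        raise ValueError("incompatible dimensions")
    rows, cols, inner = len(A), len(B[0]), len(B)
    C = [[0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            for k in range(inner):
                C[i][j] += A[i][k] * B[k][j]
    return C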


Complexity:

Let A be a p × q matrix and B be a q × r matrix. Then the complexity of MATRIX-MULTIPLY(A, B) is Θ(p·q·r).


Example:

A1 is a 10 × 100 matrix, A2 is a 100 × 5 matrix, and A3 is a 5 × 50 matrix.

Then ((A1 A2) A3) takes 10·100·5 + 10·5·50 = 7500 scalar multiplications, whereas (A1 (A2 A3)) takes 100·5·50 + 10·100·50 = 75000.


The matrix-chain multiplication problem:

• Given a chain <A1, A2, ..., An> of n matrices, where for i = 1, 2, ..., n, matrix Ai has dimension pi-1 × pi, fully parenthesize the product A1 A2 ... An in a way that minimizes the number of scalar multiplications.


Counting the number of parenthesizations:

P(n) = 1                                    if n = 1
P(n) = Σ_{k=1..n-1} P(k) · P(n-k)           if n ≥ 2

P(n) = C(n-1), where C(n) = (1/(n+1)) · (2n choose n) = Ω(4^n / n^(3/2)) is the nth Catalan number.
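To see how quickly the number of parenthesizations grows, here is a tiny Python sketch that evaluates the recurrence directly (memoized so it runs instantly). The values are the Catalan numbers shifted by one; in particular P(4) = 5, matching the five parenthesizations listed earlier.

from functools import lru_cache

@lru_cache(maxsize=None)
def parenthesizations(n):
    """P(n) from the recurrence above: ways to fully parenthesize A1..An."""
    if n == 1:
        return 1
    return sum(parenthesizations(k) * parenthesizations(n - k) for k in range(1, n))

print([parenthesizations(n) for n in range(1, 11)])
# [1, 1, 2, 5, 14, 42, 132, 429, 1430, 4862]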


Step 1: The structure of an optimal parenthesization

(A1 A2 ... Ak)(Ak+1 Ak+2 ... An)

An optimal parenthesization splits the product between Ak and Ak+1 for some k; the parenthesizations of the two subchains must themselves be optimal, and combining them gives the optimal solution to the whole chain.


Step 2: A recursive solution

• Define m[i, j] = the minimum number of scalar multiplications needed to compute the matrix Ai..j = Ai Ai+1 ... Aj.
• Goal: m[1, n].

m[i, j] = 0                                                     if i = j
m[i, j] = min_{i ≤ k < j} { m[i, k] + m[k+1, j] + pi-1 pk pj }  if i < j


Step 3: Computing the optimal costs


MATRIX-CHAIN-ORDER(p)
1  n ← length[p] - 1
2  for i ← 1 to n
3      do m[i, i] ← 0
4  for l ← 2 to n
5      do for i ← 1 to n - l + 1
6             do j ← i + l - 1
7                m[i, j] ← ∞
8                for k ← i to j - 1
9                    do q ← m[i, k] + m[k+1, j] + pi-1 pk pj
10                      if q < m[i, j]
11                         then m[i, j] ← q
12                              s[i, j] ← k
13 return m and s

Complexity: O(n^3)
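For reference, a direct Python transcription of this procedure might look as follows (1-based tables padded with an unused row and column so the indices match the pseudocode; a sketch, not the textbook's code):

import math

def matrix_chain_order(p):
    """Bottom-up matrix-chain DP. p[0..n] are the dimensions; Ai is p[i-1] x p[i].
    Returns (m, s): m[i][j] = minimal scalar multiplications for Ai..Aj,
    s[i][j] = index k of the optimal split."""
    n = len(p) - 1
    m = [[0] * (n + 1) for _ in range(n + 1)]
    s = [[0] * (n + 1) for _ in range(n + 1)]
    for l in range(2, n + 1):                  # l = chain length
        for i in range(1, n - l + 2):
            j = i + l - 1
            m[i][j] = math.inf
            for k in range(i, j):
                q = m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j]
                if q < m[i][j]:
                    m[i][j] = q
                    s[i][j] = k
    return m, s

# The six-matrix instance on the following slides: p = <30, 35, 15, 5, 10, 20, 25>
m, s = matrix_chain_order([30, 35, 15, 5, 10, 20, 25])
print(m[1][6])   # 15125, the minimum number of scalar multiplications
print(m[2][5])   # 7125, as computed below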


Example:

matrix   dimension
A1       30 × 35    (p0 × p1)
A2       35 × 15    (p1 × p2)
A3       15 × 5     (p2 × p3)
A4       5 × 10     (p3 × p4)
A5       10 × 20    (p4 × p5)
A6       20 × 25    (p5 × p6)


The m and s tables computed by MATRIX-CHAIN-ORDER for n = 6.


m[2,5] = min{
    m[2,2] + m[3,5] + p1 p2 p5 = 0 + 2500 + 35·15·20 = 13000,
    m[2,3] + m[4,5] + p1 p3 p5 = 2625 + 1000 + 35·5·20 = 7125,
    m[2,4] + m[5,5] + p1 p4 p5 = 4375 + 0 + 35·10·20 = 11375
} = 7125


Step 4: Constructing an optimal solution


PRINT-OPTIMAL-PARENS(s, i, j)
1 if i = j
2    then print "A"i
3    else print "("
4         PRINT-OPTIMAL-PARENS(s, i, s[i, j])
5         PRINT-OPTIMAL-PARENS(s, s[i, j] + 1, j)
6         print ")"


Example: PRINT-OPTIMAL-PARENS(s, 1, 6)

Output: ((A1 (A2 A3)) ((A4 A5) A6))

[Figure: the recursion tree of the call, with [1,6] splitting into [1,3] and [4,6], [1,3] into [1,1] and [2,3], [2,3] into [2,2] and [3,3], [4,6] into [4,5] and [6,6], and [4,5] into [4,4] and [5,5].]
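A recursive reconstruction of this string in Python, reusing the s table from the matrix_chain_order sketch above (again an illustrative sketch rather than the textbook's code):

def print_optimal_parens(s, i, j):
    """Return the optimal parenthesization of Ai..Aj as a string, from the split table s."""
    if i == j:
        return f"A{i}"
    k = s[i][j]
    return "(" + print_optimal_parens(s, i, k) + print_optimal_parens(s, k + 1, j) + ")"

# With m, s computed above for p = [30, 35, 15, 5, 10, 20, 25]:
print(print_optimal_parens(s, 1, 6))   # ((A1(A2A3))((A4A5)A6))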


15.3 Elements of dynamic programming


Optimal substructure

• We say that a problem exhibits optimal substructure if an optimal solution to the problem contains within it optimal solutions to subproblems.

• Example: the matrix-chain multiplication problem.


1. You show that a solution to the problem consists of making a choice. Making this choice leaves one or more subproblems to be solved.

2. You suppose that for a given problem, you are given the choice that leads to an optimal solution.

3. Given this choice, you determine which subproblems ensue and how to best characterize the resulting space of subproblems.

4. You show that the solutions to the subproblems used within the optimal solution to the problem must themselves be optimal by using a “cut-and-paste” technique.


Optimal substructure varies across problem domains in two ways:

1. how many subproblems are used in an optimal solution to the original problem, and

2. how many choices we have in determining which subproblem(s) to use in an optimal solution.


Overlapping subproblems:

Example: MATRIX-CHAIN-ORDER


RECURSIVE-MATRIX-CHAIN(p, i, j)
1 if i = j
2    then return 0
3 m[i, j] ← ∞
4 for k ← i to j - 1
5     do q ← RECURSIVE-MATRIX-CHAIN(p, i, k) + RECURSIVE-MATRIX-CHAIN(p, k+1, j) + pi-1 pk pj
6        if q < m[i, j]
7           then m[i, j] ← q
8 return m[i, j]


The recursion tree for the computation of RECURSIVE-MATRIX-CHAIN(p, 1, 4).


• We can prove that T(n) = Ω(2^n) using the substitution method.

T(1) ≥ 1
T(n) ≥ 1 + Σ_{k=1..n-1} ( T(k) + T(n-k) + 1 )    for n > 1
     = 2 Σ_{i=1..n-1} T(i) + n


Solution:
1. bottom up
2. memoization (memoize the natural, but inefficient, recursive algorithm)

Substituting the guess T(i) ≥ 2^(i-1) into the recurrence:

T(1) ≥ 1 = 2^0
T(n) ≥ 2 Σ_{i=1..n-1} 2^(i-1) + n
     = 2 (2^(n-1) - 1) + n
     = 2^n - 2 + n
     ≥ 2^(n-1)


MEMOIZED-MATRIX-CHAIN(p)
1 n ← length[p] - 1
2 for i ← 1 to n
3     do for j ← 1 to n
4            do m[i, j] ← ∞
5 return LOOKUP-CHAIN(p, 1, n)


LOOKUP-CHAIN(p, i, j)
1 if m[i, j] < ∞
2    then return m[i, j]
3 if i = j
4    then m[i, j] ← 0
5    else for k ← i to j - 1
6             do q ← LOOKUP-CHAIN(p, i, k) + LOOKUP-CHAIN(p, k+1, j) + pi-1 pk pj
7                if q < m[i, j]
8                   then m[i, j] ← q
9 return m[i, j]

Time complexity: O(n^3)
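The same top-down memoization is idiomatic in Python with a dictionary; the following sketch mirrors LOOKUP-CHAIN and should produce the same values as the bottom-up table:

import math

def memoized_matrix_chain(p):
    """Top-down matrix-chain DP with memoization, mirroring MEMOIZED-MATRIX-CHAIN."""
    memo = {}                      # memo[(i, j)] plays the role of m[i, j]

    def lookup_chain(i, j):
        if (i, j) in memo:
            return memo[(i, j)]
        if i == j:
            best = 0
        else:
            best = math.inf
            for k in range(i, j):
                q = lookup_chain(i, k) + lookup_chain(k + 1, j) + p[i - 1] * p[k] * p[j]
                best = min(best, q)
        memo[(i, j)] = best
        return best

    return lookup_chain(1, len(p) - 1)

print(memoized_matrix_chain([30, 35, 15, 5, 10, 20, 25]))   # 15125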


15.4 Longest common subsequence

X = <A, B, C, B, D, A, B>
Y = <B, D, C, A, B, A>

• <B, C, A> is a common subsequence of both X and Y.
• <B, C, B, A> and <B, C, A, B> are longest common subsequences of X and Y.


Longest-common-subsequence problem:

• Given two sequences X = <x1, x2, ..., xm> and Y = <y1, y2, ..., yn>, find a maximum-length common subsequence (LCS) of X and Y.

• Define Xi: the ith prefix of X, Xi = <x1, x2, ..., xi>.
  e.g. X = <A, B, C, B, D, A, B>, X4 = <A, B, C, B>, X0 = the empty sequence.


Theorem 15.1 (Optimal substructure of an LCS)

Let X = <x1, x2, ..., xm> and Y = <y1, y2, ..., yn> be sequences, and let Z = <z1, z2, ..., zk> be any LCS of X and Y.

1. If xm = yn, then zk = xm = yn and Zk-1 is an LCS of Xm-1 and Yn-1.
2. If xm ≠ yn, then zk ≠ xm implies that Z is an LCS of Xm-1 and Y.
3. If xm ≠ yn, then zk ≠ yn implies that Z is an LCS of X and Yn-1.


A recursive solution to subproblems

• Define c[i, j] = the length of an LCS of Xi and Yj.

c[i, j] = 0                                if i = 0 or j = 0
c[i, j] = c[i-1, j-1] + 1                  if i, j > 0 and xi = yj
c[i, j] = max{ c[i, j-1], c[i-1, j] }      if i, j > 0 and xi ≠ yj


Computing the length of an LCS

LCS-LENGTH(X, Y)
1 m ← length[X]
2 n ← length[Y]
3 for i ← 1 to m
4     do c[i, 0] ← 0
5 for j ← 0 to n
6     do c[0, j] ← 0
7 for i ← 1 to m
8     do for j ← 1 to n


9          do if xi = yj
10                then c[i, j] ← c[i-1, j-1] + 1
11                     b[i, j] ← "↖"
12                else if c[i-1, j] ≥ c[i, j-1]
13                        then c[i, j] ← c[i-1, j]
14                             b[i, j] ← "↑"
15                        else c[i, j] ← c[i, j-1]
16                             b[i, j] ← "←"
17 return c and b

Complexity: O(mn)


PRINT-LCS(b, X, i, j)
1 if i = 0 or j = 0
2    then return
3 if b[i, j] = "↖"
4    then PRINT-LCS(b, X, i-1, j-1)
5         print xi
6 elseif b[i, j] = "↑"
7    then PRINT-LCS(b, X, i-1, j)
8 else PRINT-LCS(b, X, i, j-1)

Complexity: O(m + n)
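Both procedures fit naturally into a few lines of Python. The sketch below builds only the c table and recovers an LCS by walking it backwards, which avoids storing the b table (a standard variation, not the slide's exact pseudocode):

def lcs(X, Y):
    """Return one longest common subsequence of X and Y via the c-table DP."""
    m, n = len(X), len(Y)
    c = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if X[i - 1] == Y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1
            else:
                c[i][j] = max(c[i - 1][j], c[i][j - 1])
    # Walk back from c[m][n] to reconstruct one LCS.
    out = []
    i, j = m, n
    while i > 0 and j > 0:
        if X[i - 1] == Y[j - 1]:
            out.append(X[i - 1])
            i, j = i - 1, j - 1
        elif c[i - 1][j] >= c[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))

print(lcs("ABCBDAB", "BDCABA"))   # an LCS of X and Y, of length 4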


15.5 Optimal binary search trees

[Figure: two binary search trees over the same keys; the first has expected search cost 2.80, the second 2.75 and is optimal.]


Expected cost

• The expected cost of a search in T is

E[search cost in T] = Σ_{i=1..n} (depth_T(ki) + 1)·pi + Σ_{i=0..n} (depth_T(di) + 1)·qi
                    = 1 + Σ_{i=1..n} depth_T(ki)·pi + Σ_{i=0..n} depth_T(di)·qi

since Σ_{i=1..n} pi + Σ_{i=0..n} qi = 1.


• For a given set of probabilities, our goal is to construct a binary search tree whose expected search cost is smallest. We call such a tree an optimal binary search tree.


Step 1: The structure of an optimal binary search tree

• Consider any subtree of a binary search tree. It must contain keys in a contiguous range ki, ..., kj, for some 1 ≤ i ≤ j ≤ n. In addition, a subtree that contains keys ki, ..., kj must also have as its leaves the dummy keys di-1, ..., dj.

• If an optimal binary search tree T has a subtree T' containing keys ki, ..., kj, then this subtree T' must be optimal as well for the subproblem with keys ki, ..., kj and dummy keys di-1, ..., dj.


Step 2: A recursive solution

e[i, j]: the expected cost of searching an optimal binary search tree containing the keys ki, ..., kj.

w(i, j) = Σ_{l=i..j} pl + Σ_{l=i-1..j} ql

e[i, j] = qi-1                                                   if j = i - 1
e[i, j] = min_{i ≤ r ≤ j} { e[i, r-1] + e[r+1, j] + w(i, j) }    if i ≤ j


Step 3: Computing the expected search cost of an OBST

OPTIMAL-BST(p, q, n)
1 for i ← 1 to n + 1
2     do e[i, i-1] ← qi-1
3        w[i, i-1] ← qi-1
4 for l ← 1 to n
5     do for i ← 1 to n - l + 1
6            do j ← i + l - 1
7               e[i, j] ← ∞
8               w[i, j] ← w[i, j-1] + pj + qj


9               for r ← i to j
10                  do t ← e[i, r-1] + e[r+1, j] + w[i, j]
11                     if t < e[i, j]
12                        then e[i, j] ← t
13                             root[i, j] ← r
14 return e and root

The OPTIMAL-BST procedure takes Θ(n^3) time, just like MATRIX-CHAIN-ORDER.
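A Python sketch of OPTIMAL-BST, using 1-based tables so the indices line up with the pseudocode. The probabilities in the usage lines are assumed to be the ones behind the 2.80 / 2.75 example quoted earlier, so the printed optimal cost should come out as 2.75:

import math

def optimal_bst(p, q, n):
    """Expected-cost DP for an optimal BST, mirroring OPTIMAL-BST.
    p[1..n]: key probabilities (p[0] unused); q[0..n]: dummy-key probabilities.
    Returns (e, root): e[i][j] = optimal expected cost for keys ki..kj."""
    e = [[0.0] * (n + 1) for _ in range(n + 2)]
    w = [[0.0] * (n + 1) for _ in range(n + 2)]
    root = [[0] * (n + 1) for _ in range(n + 1)]
    for i in range(1, n + 2):
        e[i][i - 1] = q[i - 1]
        w[i][i - 1] = q[i - 1]
    for l in range(1, n + 1):
        for i in range(1, n - l + 2):
            j = i + l - 1
            e[i][j] = math.inf
            w[i][j] = w[i][j - 1] + p[j] + q[j]
            for r in range(i, j + 1):
                t = e[i][r - 1] + e[r + 1][j] + w[i][j]
                if t < e[i][j]:
                    e[i][j] = t
                    root[i][j] = r
    return e, root

# Probabilities from the running example (assumed values):
p = [0, 0.15, 0.10, 0.05, 0.10, 0.20]
q = [0.05, 0.10, 0.05, 0.05, 0.05, 0.10]
e, root = optimal_bst(p, q, 5)
print(e[1][5])   # expected cost of the optimal tree (2.75 for this data)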


The tables e[i, j], w[i, j], and root[i, j] computed by OPTIMAL-BST on the key distribution.


• Knuth has shown that there are always roots of optimal subtrees such that root[i, j-1] ≤ root[i+1, j] for all 1 ≤ i < j ≤ n.

• We can use this fact to modify the OPTIMAL-BST procedure to run in Θ(n^2) time.
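A sketch of that modification (a variant of the optimal_bst sketch above, not the textbook's code): the only change is the bounds of the inner loop over r, which now run from root[i][j-1] to root[i+1][j], so the work along each diagonal of the table telescopes and the whole computation costs Θ(n^2).

import math

def optimal_bst_knuth(p, q, n):
    """optimal_bst with Knuth's restriction on the candidate roots."""
    e = [[0.0] * (n + 1) for _ in range(n + 2)]
    w = [[0.0] * (n + 1) for _ in range(n + 2)]
    root = [[0] * (n + 2) for _ in range(n + 2)]
    for i in range(1, n + 2):
        e[i][i - 1] = w[i][i - 1] = q[i - 1]
    for l in range(1, n + 1):
        for i in range(1, n - l + 2):
            j = i + l - 1
            e[i][j] = math.inf
            w[i][j] = w[i][j - 1] + p[j] + q[j]
            # Knuth's bound: only try roots between root[i][j-1] and root[i+1][j];
            # for single-key subtrees (l = 1) fall back to the full range i..j.
            lo = root[i][j - 1] if l > 1 else i
            hi = root[i + 1][j] if l > 1 else j
            for r in range(lo, hi + 1):
                t = e[i][r - 1] + e[r + 1][j] + w[i][j]
                if t < e[i][j]:
                    e[i][j] = t
                    root[i][j] = r
    return e, root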