Divide & Conquer Strategy and Merge Sort (cse.msu.eduhuding/331material/notes3.pdf)
Outline
1 Insertion Sort
2 Divide & Conquer Strategy and Merge Sort
3 Master Theorem
© Hu Ding (Michigan State University), CSE 331 Algorithm and Data Structures
Binary Search
Input: A sorted array A[1..n] and a number x.
Output: An index i such that A[i] = x; if no such i exists, output "no".

We use a function BinarySearch(A, p, r, x) that searches for x in A[p..r].

BinarySearch(A, p, r, x)
1: if p ≥ r then
2:     if p = r and A[p] = x then return p
3:     return "no"
4: else
5:     q = ⌊(p + r)/2⌋
6:     if A[q] = x then return q
7:     if A[q] > x then return BinarySearch(A, p, q − 1, x)
8:     if A[q] < x then return BinarySearch(A, q + 1, r, x)
9: end if

Note: the base case tests p ≥ r rather than p = r, because the call BinarySearch(A, p, q − 1, x) can produce an empty range (r < p) when x is absent.
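The recursion above can be sketched iteratively in Python (0-indexed; a minimal sketch, not code from the slides):

```python
def binary_search(A, x):
    """Return an index i with A[i] == x in sorted A, or None (the slides' "no")."""
    p, r = 0, len(A) - 1
    while p <= r:                 # non-empty range A[p..r]
        q = (p + r) // 2          # midpoint, rounding down
        if A[q] == x:
            return q
        if A[q] > x:
            r = q - 1             # continue in the left half A[p..q-1]
        else:
            p = q + 1             # continue in the right half A[q+1..r]
    return None                   # range became empty: x is not in A
```

The loop mirrors the tail-recursive calls; each iteration halves the range, which gives the Θ(log n) bound derived later.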
Insertion Sort
Sorting
Input: An array A[1..n].
Output: A reordered so that A[i] ≤ A[i + 1] for every 1 ≤ i ≤ n − 1.
Insertion Sort
InsertionSort(A[1..n])
1: for i = 1 to n − 1
2:     if A[i] > A[i + 1] then
3:         if i = 1 then
4:             swap A[i] and A[i + 1]
5:         else
6:             if A[1] > A[i + 1] then
7:                 insert A[i + 1] before A[1]
8:             else
9:                 find i′ such that 1 ≤ i′ < i and A[i′] ≤ A[i + 1] < A[i′ + 1],
                   and insert A[i + 1] between A[i′] and A[i′ + 1]
10:            end if
11:        end if
12:    end if
13: end for
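The same procedure, written as runnable Python with the usual shift-based insertion (a sketch equivalent to the pseudocode above):

```python
def insertion_sort(A):
    """Sort A in place; the invariant is that A[0..i-1] is sorted before step i."""
    for i in range(1, len(A)):
        key = A[i]                     # the element to insert (A[i+1] in the slides)
        j = i - 1
        while j >= 0 and A[j] > key:   # shift larger elements one slot right
            A[j + 1] = A[j]
            j -= 1
        A[j + 1] = key                 # drop key into its correct position
    return A
```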
Analysis for Insertion Sort
There are at most Θ(∑_{i=1}^{n} i) = Θ(n²) movements and comparisons, so T(n) = Θ(n²).
Using binary search, we can reduce the number of comparisons to Θ(∑_{i=1}^{n} log i) = Θ(n log n). Sometimes comparisons are more expensive than movements.
Only O(1) extra space is used.
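The comparison-saving variant can be sketched with Python's standard `bisect` module: each insertion point is found with O(log i) comparisons, while the Θ(i) element movements remain, so the total time stays Θ(n²) (a sketch illustrating the point above):

```python
import bisect

def binary_insertion_sort(A):
    """Insertion sort that locates each insertion point by binary search."""
    for i in range(1, len(A)):
        key = A[i]
        # O(log i) comparisons to find where key belongs in sorted A[0..i-1]
        pos = bisect.bisect_right(A, key, 0, i)
        # Theta(i - pos) movements to open a slot; movements stay Theta(n^2)
        A[pos + 1:i + 1] = A[pos:i]
        A[pos] = key
    return A
```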
Divide and Conquer Strategy
Algorithm design is more an art than a science: there are a few useful strategies, but none is guaranteed to succeed. We will discuss three of them: Divide and Conquer, Greedy, and Dynamic Programming. For each, we will discuss a few examples and try to identify common schemes.

Divide and Conquer
1. Divide the problem into smaller subproblems (of the same type).
2. Solve each subproblem (usually by recursive calls).
3. Combine the solutions of the subproblems into a solution of the original problem.
Merge Sort
MergeSort
Input: An array A[1..n].
Output: A sorted into increasing order.

We use a recursive function MergeSort(A, p, r), which sorts A[p..r]. In the main program, we call MergeSort(A, 1, n).
Merge Sort
Merge(A, p, q, r)
1: i = p, j = q + 1, B = zeros(r − p + 1), flag = 0
2: while (i ≤ q or j ≤ r)
3:     flag = flag + 1
4:     if i ≤ q and j ≤ r then
5:         if A[i] ≤ A[j] then
6:             B[flag] = A[i], i = i + 1
7:         else
8:             B[flag] = A[j], j = j + 1
9:         end if
10:    else if i = q + 1 then
11:        B[flag : r − p + 1] = A[j : r], j = r + 1
12:    else
13:        B[flag : r − p + 1] = A[i : q], i = q + 1
14:    end if
15: end while
16: A[p : r] = B
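In Python (0-indexed, inclusive bounds), the same merge can be sketched as:

```python
def merge(A, p, q, r):
    """Merge sorted runs A[p..q] and A[q+1..r] (inclusive) back into A in place."""
    B = []                        # plays the role of the buffer zeros(r - p + 1)
    i, j = p, q + 1
    while i <= q and j <= r:      # both runs non-empty: take the smaller head
        if A[i] <= A[j]:
            B.append(A[i]); i += 1
        else:
            B.append(A[j]); j += 1
    B.extend(A[i:q + 1])          # at most one of these two tails is non-empty
    B.extend(A[j:r + 1])
    A[p:r + 1] = B                # copy back: Theta(r - p + 1) work in total
```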
Merge Sort
MergeSort(A, p, r)
1: if p < r then
2:     q = ⌊(p + r)/2⌋
3:     MergeSort(A, p, q)
4:     MergeSort(A, q + 1, r)
5:     Merge(A, p, q, r)
6: else
7:     do nothing
8: end if

Divide A[p..r] into two sub-arrays of (roughly) equal size.
Sort each sub-array by a recursive call.
Merge(A, p, q, r) is a procedure that, assuming A[p..q] and A[q + 1..r] are sorted, merges them into a sorted A[p..r].
It runs in Θ(k) time, where k = r − p + 1 is the number of elements being merged.
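A self-contained Python sketch of the whole routine (0-indexed, with the merge step inlined so the example runs on its own):

```python
def merge_sort(A, p=0, r=None):
    """Sort A[p..r] (inclusive) recursively; call as merge_sort(A)."""
    if r is None:
        r = len(A) - 1
    if p < r:
        q = (p + r) // 2          # divide into A[p..q] and A[q+1..r]
        merge_sort(A, p, q)       # conquer each half recursively
        merge_sort(A, q + 1, r)
        i, j, B = p, q + 1, []    # combine: merge in Theta(r - p + 1) time
        while i <= q and j <= r:
            if A[i] <= A[j]:
                B.append(A[i]); i += 1
            else:
                B.append(A[j]); j += 1
        A[p:r + 1] = B + A[i:q + 1] + A[j:r + 1]
    return A
```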
Analysis of MergeSort
Let T(n) be the running time of MergeSort on an array of n elements. Then:

    T(n) = O(1)               if n = 1,
    T(n) = 2T(n/2) + Θ(n)     if n > 1.

If n = 1, MergeSort does nothing, hence O(1) time.
Otherwise, we make 2 recursive calls, each on an input of size n/2; hence the term 2T(n/2).
The Θ(n) term is the time needed by Merge(A, p, q, r) and all other processing.
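With illustrative unit costs (T(1) = 1 and merge cost exactly n; these constants are assumptions, not from the slides), the recurrence solves exactly to n log₂ n + n for n a power of two, which a direct check confirms:

```python
from math import log2

def T(n):
    """MergeSort recurrence with illustrative costs: T(1) = 1, T(n) = 2T(n/2) + n."""
    return 1 if n == 1 else 2 * T(n // 2) + n

# For n = 2^k this unrolls to T(n) = n*log2(n) + n, i.e. Theta(n log n):
# each of the k levels of recursion contributes cost n, plus n base cases.
for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n * log2(n) + n
```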
Master Theorem
For DaC (divide-and-conquer) algorithms, the runtime function often satisfies:

    T(n) = O(1)                    if n ≤ n₀,
    T(n) = aT(n/b) + Θ(f(n))       if n > n₀.

If n ≤ n₀ (where n₀ is a small constant), we solve the problem directly, without recursive calls. Since the input size is bounded by n₀, this takes O(1) time.
We make a recursive calls, each on an input of size n/b; hence the term aT(n/b).
The Θ(f(n)) term is the time needed by all other processing.
What is T(n)?
Master Theorem
Master Theorem (Theorem 4.1 in Cormen et al.)
1. If f(n) = O(n^(log_b a − ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) log n).
3. If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).

Example: MergeSort
We have a = 2 and b = 2, hence log_b a = log₂ 2 = 1, so f(n) = Θ(n¹) = Θ(n^(log_b a)).
By case 2, T(n) = Θ(n^(log₂ 2) log n) = Θ(n log n).
Analysis of Binary Search
If n = r − p + 1 = 1, it takes O(1) time.
Otherwise, we make at most one recursive call, on an input of size n/2.
All other processing takes f(n) = Θ(1) time.
So a = 1, b = 2, and f(n) = Θ(n⁰).
Since log_b a = log₂ 1 = 0, we have f(n) = Θ(n^(log_b a)).
Hence, by case 2, T(n) = Θ(n^(log_b a) log n) = Θ(log n).
Example
A function makes 4 recursive calls, each on an input of size n/2. Other processing takes f(n) = Θ(n³) time:

    T(n) = 4T(n/2) + Θ(n³)

We have a = 4 and b = 2, so log_b a = log₂ 4 = 2.
f(n) = n³ = Θ(n^(log_b a + 1)) = Ω(n^(log_b a + 0.5)).
This is case 3 of the Master Theorem, so we must check the second condition:

    a · f(n/b) = 4 · (n/2)³ = (4/8) · n³ = (1/2) · f(n)

If we let c = 1/2 < 1, we have a · f(n/b) ≤ c · f(n). Hence, by case 3, T(n) = Θ(f(n)) = Θ(n³).
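With the illustrative base case T(1) = 1 and f(n) = n³ exactly (assumptions for the check, not from the slides), the recurrence solves to 2n³ − n² for n a power of two, consistent with the Θ(n³) answer:

```python
def T(n):
    """Recurrence T(n) = 4T(n/2) + n^3 with illustrative base case T(1) = 1."""
    return 1 if n == 1 else 4 * T(n // 2) + n ** 3

# Case 3 predicts T(n) = Theta(n^3); summing n^3 / 2^i over the levels plus
# n^2 base cases gives the closed form 2n^3 - n^2 for n = 2^k.
for k in range(1, 12):
    n = 2 ** k
    assert T(n) == 2 * n ** 3 - n ** 2
```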
Master Theorem
If f(n) has the form f(n) = Θ(n^k) for some k ≥ 0, we have the following:

A simpler version of the Master Theorem

    T(n) = O(1)                  if n ≤ n₀,
    T(n) = aT(n/b) + Θ(n^k)      if n > n₀.

1. If k < log_b a, then T(n) = Θ(n^(log_b a)).
2. If k = log_b a, then T(n) = Θ(n^k log n).
3. If k > log_b a, then T(n) = Θ(n^k).

Only case 3 differs from the general statement: its second condition still has to hold, and here it holds automatically. Because k > log_b a, we have b^k > a, so a/b^k < 1 and

    a · f(n/b) = a · (n/b)^k = (a/b^k) · f(n) = c · f(n),

where c = a/b^k < 1, as needed.
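The three cases of this simpler version are easy to mechanize. The helper below (a hypothetical utility, not part of the course material) compares k with log_b a exactly by comparing b^k with a, avoiding floating-point logarithm issues:

```python
import math

def master(a, b, k):
    """Asymptotic solution of T(n) = a*T(n/b) + Theta(n^k), per the simple form."""
    if b ** k < a:                          # case 1: recursion dominates
        return f"Theta(n^{math.log(a, b):g})"
    if b ** k == a:                         # case 2: balanced, extra log factor
        return f"Theta(n^{k:g} log n)"
    return f"Theta(n^{k:g})"                # case 3: top-level work dominates
```

For instance, master(2, 2, 1) reproduces the MergeSort bound (case 2), master(1, 2, 0) gives "Theta(n^0 log n)", i.e. Θ(log n) for binary search, and master(4, 2, 3) gives Θ(n³) as in the example above.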
Master Theorem
How should one understand (and memorize) the Master Theorem?

The cost of a DaC algorithm can be divided into two parts:
1. The total cost of all recursive calls, which is Θ(n^(log_b a)).
2. The total cost of all other processing, which is Θ(f(n)).

If (1) > (2), then (1) dominates the total cost: T(n) = Θ(n^(log_b a)).
If (1) < (2), then (2) dominates the total cost: T(n) = Θ(f(n)).
If (1) = (2), the two parts cost about the same, and we pick up an extra factor of log n.
The proof of the Master Theorem is given in the textbook; we will illustrate two examples in class.
Example
For some simple cases, the Master Theorem does not apply.

Example
    T(n) = 2T(n/2) + Θ(n log n)

Here a = 2, b = 2, and log_b a = log₂ 2 = 1, so f(n) = n¹ log n = n^(log_b a) log n.
f(n) = Ω(n), but f(n) ≠ Ω(n^(1+ε)) for any constant ε > 0, so the Master Theorem does not apply.

Theorem
If T(n) = aT(n/b) + f(n), where f(n) = Θ(n^(log_b a) (log n)^k) for some k ≥ 0, then T(n) = Θ(n^(log_b a) (log n)^(k+1)).

In the example above, T(n) = Θ(n log² n).
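Taking T(1) = 0 and f(n) = n log₂ n exactly (illustrative constants, not from the slides), the recurrence unrolls to n · k(k + 1)/2 for n = 2^k, i.e. Θ(n log² n), matching the theorem:

```python
from math import log2

def T(n):
    """Recurrence T(n) = 2T(n/2) + n*log2(n) with illustrative base case T(1) = 0."""
    return 0 if n == 1 else 2 * T(n // 2) + n * log2(n)

# Each level i contributes n * (k - i); summing over i = 0..k-1 for n = 2^k
# gives n * (k + (k-1) + ... + 1) = n * k * (k + 1) / 2 = Theta(n log^2 n).
for k in range(1, 12):
    n = 2 ** k
    assert T(n) == n * k * (k + 1) / 2
```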