TRANSCRIPT
Growth Rates of Functions
3/26/12
Asymptotic Equivalence
• Def: f(n) ∼ g(n) iff lim_{n→∞} f(n)/g(n) = 1
• Note that n^2+1 is being used to name the function f such that f(n) = n^2+1 for every n
• For example, n^2+1 ∼ n^2 (think: 2, 5, 10, 17, ... vs. 1, 4, 9, 16, ...)
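A quick numeric sanity check (illustrative Python, not from the slides) shows the ratio approaching 1:

```python
# Check that (n^2 + 1) / n^2 approaches 1 as n grows,
# illustrating the asymptotic equivalence n^2 + 1 ~ n^2.
def ratio(n):
    return (n**2 + 1) / n**2

for n in [10, 100, 1000, 10000]:
    print(n, ratio(n))
```

By n = 10000 the ratio is within 10^-8 of 1.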
An example: Stirling’s formula
n! ∼ (n/e)^n · √(2πn)   ("Stirling's approximation")
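The formula can be checked numerically; this sketch (names are illustrative) compares n! against the approximation and watches the ratio head to 1:

```python
import math

# Compare n! with Stirling's approximation (n/e)^n * sqrt(2*pi*n);
# the ratio of the two tends to 1 as n grows.
def stirling(n):
    return (n / math.e) ** n * math.sqrt(2 * math.pi * n)

for n in [5, 10, 50, 100]:
    print(n, math.factorial(n) / stirling(n))
```

Even at n = 100 the two agree to within about 0.1%.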
Little-Oh: f = o(g)
• Def: f(n) = o(g(n)) iff lim_{n→∞} f(n)/g(n) = 0
• For example, n^2 = o(n^3) since lim_{n→∞} n^2/n^3 = lim_{n→∞} 1/n = 0
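The same ratio test works numerically (illustrative Python):

```python
# n^2 = o(n^3): the ratio n^2 / n^3 = 1/n shrinks toward 0 as n grows.
for n in [10, 100, 1000, 10000]:
    print(n, n**2 / n**3)
```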
= o( ∙ ) is “all one symbol”
• “f = o(g)” is really a strict partial order on functions
• NEVER write “o(g) = f”, etc.
Big-Oh: O(∙)
• Asymptotic Order of Growth:
• “f grows no faster than g”
• A Weak Partial Order

f(n) = O(g(n)) iff lim_{n→∞} f(n)/g(n) < ∞
Growth Order
3n^2 + n + 2 = O(n^2) because
lim_{n→∞} (3n^2 + n + 2)/n^2 = 3 < ∞
f = o(g) implies f = O(g)
because if lim_{n→∞} f(n)/g(n) = 0
then lim_{n→∞} f(n)/g(n) < ∞
So for example, n+1 = O(n^2)
Big-Omega
• f = Ω(g) means g = O(f)
• “f grows at least as quickly as g”
Big-Theta: Θ(∙)
“Same order of growth”
f(n) = Θ(g(n)) iff
f(n) = O(g(n)) and g(n) = O(f(n))
or equivalently
f(n) = O(g(n)) and f(n) = Ω(g(n))
So, for example, 3n^2 + 2 = Θ(n^2)
Rough Paraphrase
• f ∼ g: f and g grow to be roughly equal
• f = o(g): f grows more slowly than g
• f = O(g): f grows at most as quickly as g
• f = Ω(g): f grows at least as quickly as g
• f = Θ(g): f and g grow at the same rate
Equivalent Defn of O(∙)
f(n) = O(g(n)) iff ∃ c, n₀ such that ∀ n ≥ n₀: f(n) ≤ c·g(n)
“From some point on, the value of f is at most a constant multiple of the value of g”
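The existential definition can be checked directly once witnesses are chosen. In this sketch the constants c = 4 and n₀ = 2 are one valid choice (chosen for illustration, not stated on the slides):

```python
# The "there exist c and n0" definition of O: verify that
# f(n) = 3n^2 + n + 2 satisfies f(n) <= c * g(n) with g(n) = n^2
# for every n >= n0, using the witnesses c = 4, n0 = 2.
def f(n):
    return 3 * n**2 + n + 2

c, n0 = 4, 2
assert all(f(n) <= c * n**2 for n in range(n0, 10000))
print("f(n) <= 4*n^2 holds for all checked n >= 2")
```

Algebraically, f(n) ≤ 4n^2 reduces to n + 2 ≤ n^2, which holds for all n ≥ 2, so the finite check agrees with the proof.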
Three Concrete Examples
• Polynomials
• Logarithmic functions
• Exponential functions
Polynomials
• A (univariate) polynomial is a function such as f(n) = 3n^5 + 2n^2 − n + 2 (for all natural numbers n)
• This is a polynomial of degree 5 (the largest exponent)
• Or in general, f(n) = Σ_{i=0}^{d} c_i n^i
• Theorem:
– If a < b then any polynomial of degree a is o(any polynomial of degree b)
– If a ≤ b then any polynomial of degree a is O(any polynomial of degree b)
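A numeric illustration of the first part of the theorem (illustrative Python): a degree-2 polynomial over a degree-3 one shrinks toward 0.

```python
# Degree decides asymptotic growth: the ratio of a degree-2
# polynomial to n^3 tends to 0, so 3n^2 + n + 2 = o(n^3).
def p(n):  # degree-2 polynomial
    return 3 * n**2 + n + 2

for n in [10, 100, 1000, 10000]:
    print(n, p(n) / n**3)
```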
Logarithmic Functions
• A function f is logarithmic if it is Θ(log_b n) for some constant b.
• Theorem: All logarithmic functions are Θ() of each other, and are Θ(any logarithmic function of a polynomial)
• Theorem: Any logarithmic function is o(any polynomial)
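The first theorem rests on the change-of-base identity: log_2 n / log_10 n is the same constant, log_2 10, for every n > 1, so any two logarithms are Θ() of each other. A quick check (illustrative Python):

```python
import math

# All logarithmic functions differ only by a constant factor:
# log_2(n) / log_10(n) equals log_2(10) for every n > 1.
for n in [10, 1000, 10**6]:
    print(n, math.log2(n) / math.log10(n))
```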
Exponential Functions
• A function is exponential if it is Θ(c^n) for some constant c > 1.
• Theorem: Any polynomial is o(any exponential)
• If c<d then cn=o(dn).
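The polynomial-versus-exponential theorem is easy to see numerically: n^5 dominates 2^n at first, but the ratio eventually collapses toward 0 (illustrative Python):

```python
# Any polynomial is o(any exponential): n^5 / 2^n starts above 1
# for small n but heads to 0 once n is large enough.
for n in [10, 30, 60, 100]:
    print(n, n**5 / 2**n)
```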
Growth Rates and Analysis of Algorithms
• Let f(n) measure the amount of time taken by an algorithm to solve a problem of size n.
• Most practical algorithms have polynomial running times
• E.g. sorting algorithms generally have running times that are quadratic (polynomial of degree 2) or less (for example, O(n log n)).
• Exhaustive search over an exponentially growing set of possible answers requires exponential time.
Another way to look at it
• Suppose an algorithm can solve a problem of size S in time T and you give it twice as much time.
• If the running time is f(n) = n^2, so that T = S^2, then in time 2T you can solve a problem of size 2^{1/2}·S
• If the running time is f(n)=2n, so that T=2S, then in time 2T you can solve a problem of size S+1.
• In general doubling the time available to a polynomial algorithm results in a MULTIPLICATIVE increase in the size of the problem that can be solved
• But doubling the time available to an exponential algorithm results in an ADDITIVE increase to the size of the problem that can be solved.
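The multiplicative-versus-additive contrast can be computed directly; the starting sizes here (S = 100 for the quadratic case, S = 20 for the exponential case) are arbitrary illustrations:

```python
import math

# Doubling the time budget: multiplicative gain for a quadratic
# algorithm vs. additive gain for an exponential one.
S = 100
T_quad = S**2                      # time for size S when f(n) = n^2
new_quad = math.sqrt(2 * T_quad)   # largest size solvable in time 2T
print("quadratic:", S, "->", new_quad)

S = 20
T_exp = 2**S                       # time for size S when f(n) = 2^n
new_exp = math.log2(2 * T_exp)     # largest size solvable in time 2T
print("exponential:", S, "->", new_exp)
```

The quadratic algorithm's solvable size grows by a factor of √2 ≈ 1.414; the exponential algorithm's grows by exactly 1.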
FINIS