
Algorithms

Growth of Functions

Some Notation

N   natural numbers
R   real numbers
N+  positive natural numbers
R+  positive real numbers
R*  non-negative real numbers
B   Boolean constants {true, false}

Growth of Functions

• Asymptotic efficiency of algorithms
  – How does the running time of an algorithm increase with the size of the input, in the limit as the input increases in size without bound?
• Asymptotic notation ("the order of")
  – Define sets of functions that satisfy certain criteria and use these to characterize the time and space complexity of algorithms.

Big O

Definition: For a given function g(n), O(g(n)) is the set of functions
O(g(n)) = {f(n): there exist positive constants c and n₀ such that
0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀}

c is the multiplicative constant; n₀ is the threshold.

Big O

• Big O is an upper bound on a function to within a constant factor.
• O(g(n)) is a set of functions.
• Commonly used notation: f(n) = O(g(n))
• Correct notation: f(n) ∈ O(g(n))
• Meaningless statement: O(g(n)) = f(n)

[Figure: plot of c·g(n) and f(n) against n; for all n ≥ n₀ the curve f(n) lies below c·g(n), illustrating f(n) ∈ O(g(n)).]

• Question: How do you demonstrate that f(n) ∈ O(g(n))?
• Answer: Show that you can find values for c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀.

[Figure: plot of c·g(n) and f(n) against n with threshold n₀.]

Note: f(n) may be negative or undefined for some values of n.
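To make the question concrete, here is a minimal Python sketch (not from the original slides) that tests a candidate witness pair (c, n₀) for a specific f and g over a finite range of n. The function names and the chosen constants are illustrative, and a finite check only lends evidence; it does not prove the bound for all n ≥ n₀.

```python
# Minimal sketch: numerically check a candidate Big-O witness (c, n0)
# for the illustrative choice f(n) = 3n + 4 and g(n) = n, over a finite range.
# A finite check is only evidence, not a proof for all n >= n0.

def f(n):
    return 3 * n + 4

def g(n):
    return n

def check_big_o_witness(f, g, c, n0, n_max=10_000):
    """Return True if 0 <= f(n) <= c*g(n) holds for all n0 <= n <= n_max."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

# With c = 4 and n0 = 4 we have 3n + 4 <= 4n whenever n >= 4.
print(check_big_o_witness(f, g, c=4, n0=4))   # True
print(check_big_o_witness(f, g, c=3, n0=1))   # False: 3n + 4 > 3n for every n
```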

Big O and Algorithms

• Principle of Invariance
  If some implementation of an algorithm never takes more than t(n) seconds to solve an instance of size n, then any other implementation of the same algorithm takes a time in the order of t(n) seconds.
• Therefore the algorithm takes time in the order of f(n) for any function f: N → R* such that t(n) ∈ O(f(n)).

Example

Suppose t(n) = 20n³ + 45n² − 3n + 1 s. Then
t(n) ∈ O(20n³ + 45n² − 3n + 1), since it is always the case that t(n) ∈ O(t(n)), with c = 1 and n₀ = 0.

But it is also the case that t(n) ∈ O(20n³) and t(n) ∈ O(n³).

Why not just use t(n)?

• Using simple functions for the order simplifies comparison of algorithms

• t(n) may be very difficult to determine exactly

• In general, we try to express the order of an algorithm's running time using the simplest possible function f such that t(n) ∈ O(f(n)).
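As a quick illustration (not part of the slides), the ratio t(n)/n³ for the example above settles toward the leading constant 20, which is why the simple function n³ captures the order of t(n). A minimal Python sketch:

```python
# Sketch: the ratio t(n) / n^3 for the slides' example
# t(n) = 20n^3 + 45n^2 - 3n + 1 settles toward the leading constant 20,
# which is why the simple function n^3 captures the order of t(n).

def t(n):
    return 20 * n**3 + 45 * n**2 - 3 * n + 1

for n in (1, 10, 100, 1000, 10_000):
    print(n, t(n) / n**3)
# The ratio tends to 20, so t(n) <= c * n^3 holds for, e.g., c = 66 and n0 = 1.
```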

True or False?

• 3n + 4 ∈ O(n)? True
• n²/2 ∈ O(n)? False (the upper bound would need to be n² or larger)
• 100n² + 100n − 6 ∈ O(n)? False
• n ∈ O(20n)? True
• n ∈ O(n²)? True
• 6·2ⁿ + n² ∈ O(n²)? False
• 6·2ⁿ + n² ∈ O(2ⁿ)? True
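For the last two items, a small sketch (illustrative, not from the slides) shows why no constant works against n² while a constant a little above 6 works against 2ⁿ:

```python
# Sketch: compare 6*2^n + n^2 against the candidate bounds n^2 and 2^n.
# The ratio against n^2 grows without bound (so no constant c works),
# while the ratio against 2^n settles near 6 (so, for example, c = 8
# with n0 = 1 is a witness).

def h(n):
    return 6 * 2**n + n**2

for n in (1, 5, 10, 20, 30):
    print(n, h(n) / n**2, h(n) / 2**n)
```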

Common Terminology

Complexity       Term
O(1)             constant
O(log n)         logarithmic
O(n)             linear
O(n lg n)        n log n
O(nᵇ)            polynomial
O(bⁿ), b > 1     exponential
O(n!)            factorial
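A minimal Python sketch (added for illustration) that evaluates these common growth functions for a few input sizes gives a feel for how quickly they separate:

```python
import math

# Sketch: evaluate the common growth functions from the table above
# for a few input sizes, to give a feel for their relative growth.
sizes = [1, 2, 4, 8, 16, 32]
rows = {
    "1":        lambda n: 1,
    "log n":    lambda n: math.log2(n),
    "n":        lambda n: n,
    "n log n":  lambda n: n * math.log2(n),
    "n^2":      lambda n: n**2,            # polynomial with b = 2
    "2^n":      lambda n: 2**n,            # exponential with b = 2
    "n!":       lambda n: math.factorial(n),
}
for name, fn in rows.items():
    print(f"{name:8s}", [fn(n) for n in sizes])
```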

Big Omega

• Definition: For a given function g(n), Ω(g(n)) is the set of functions
  Ω(g(n)) = {f(n): there exist positive constants c and n₀ such that
  0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀}

• Omega provides a lower bound for a function to within a constant factor

[Figure: plot of c·g(n) and f(n) against n; for all n ≥ n₀ the curve f(n) lies above c·g(n), illustrating f(n) ∈ Ω(g(n)).]

Alternative Definition

f(n) ∈ Ω(g(n)) iff g(n) ∈ O(f(n))

True or False?

• n ∈ O(n)        n ∈ Ω(n)
• n/2 ∈ O(n)      n/2 ∈ Ω(n)
• n + 1 ∈ O(n)    n + 1 ∈ Ω(n)
• n ∈ O(1)        n ∈ Ω(1)
• 3 ∈ O(n)        3 ∈ Ω(n)
• n ∈ O(n²)       n ∈ Ω(n²)
• n² ∈ O(n)       n² ∈ Ω(n)

Big Theta

• Definition: For a given function g(n), Θ(g(n)) is the set of functions
  Θ(g(n)) = {f(n): there exist positive constants c₁, c₂ and n₀ such that
  0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀}

[Figure: plot of c₁·g(n), f(n) and c₂·g(n) against n; for all n ≥ n₀ the curve f(n) lies between c₁·g(n) and c₂·g(n), illustrating f(n) ∈ Θ(g(n)).]

Alternative Definition

Θ(g(n)) = O(g(n)) ∩ Ω(g(n))

• If we can find a simple function that gives both an upper and lower bound on the growth of the function, this is very useful. It is not always simple.
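As an illustration (not from the slides), a Θ witness must supply both constants at once. The sketch below checks c₁·g(n) ≤ t(n) ≤ c₂·g(n) for the earlier example t(n) = 20n³ + 45n² − 3n + 1 with g(n) = n³, using constants chosen here for illustration (c₁ = 20, c₂ = 66, n₀ = 1), and only over a finite range.

```python
# Sketch: a Theta witness needs both bounds at once. Using the earlier
# example t(n) = 20n^3 + 45n^2 - 3n + 1 with g(n) = n^3, the constants
# c1 = 20, c2 = 66, n0 = 1 (chosen here for illustration) are checked
# over a finite range: evidence for t(n) in Theta(n^3), not a proof.

def t(n):
    return 20 * n**3 + 45 * n**2 - 3 * n + 1

def g(n):
    return n**3

def check_theta_witness(f, g, c1, c2, n0, n_max=10_000):
    """Return True if 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n0 <= n <= n_max."""
    return all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max + 1))

print(check_theta_witness(t, g, c1=20, c2=66, n0=1))   # True
```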

Little o

• Definition: For a given function g(n), o(g(n)) is the set of functions
  o(g(n)) = {f(n): for any positive constant c, there exists a constant n₀ such that
  0 ≤ f(n) < c·g(n) for all n ≥ n₀}
  (i.e., for every positive constant c, c·g(n) eventually strictly exceeds f(n))
• Denotes an upper bound that is not asymptotically tight.
• Examples: 2n ∈ o(n²), but 2n² ∉ o(n²)

Alternative Definition for little o

lim n→∞ f(n)/g(n) = 0

The function f(n) becomes insignificant relative to g(n) as n approaches infinity (if the limit exists).

Little omega

• Definition: For a given function g(n), ω(g(n)) is the set of functions
  ω(g(n)) = {f(n): for any positive constant c, there exists a constant n₀ such that
  0 ≤ c·g(n) < f(n) for all n ≥ n₀}

• Denotes a lower bound that is not asymptotically tight

• Examples: n ∉ ω(n²), n ∈ ω(sqrt(n)), n ∈ ω(lg n)

Alternative Definition for little omega

lim n→∞ f(n)/g(n) = ∞

The function g(n) becomes insignificant relative to f(n) as n approaches infinity (if the limit exists).
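Assuming the limits exist, the two limit-based definitions suggest a simple classifier. The sketch below (illustrative, not from the slides) uses the SymPy library to compute lim f/g and report little o, little omega, or Θ accordingly:

```python
from sympy import symbols, limit, oo, log, sqrt

n = symbols("n", positive=True)

def classify(f, g):
    """Classify f against g via lim_{n->oo} f/g, assuming the limit exists."""
    L = limit(f / g, n, oo)
    if L == 0:
        return "f in o(g)"                      # f strictly slower than g
    if L == oo:
        return "f in omega(g) (little omega)"   # f strictly faster than g
    return "f in Theta(g)"                      # limit is a positive constant

print(classify(2 * n, n**2))                               # f in o(g)
print(classify(n, sqrt(n)))                                # f in omega(g)
print(classify(20 * n**3 + 45 * n**2 - 3 * n + 1, n**3))   # f in Theta(g)
print(classify(n, log(n)))                                 # f in omega(g)
```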

Binary Relations

• Each of these notations can be viewed as a binary relation on a set of functions {t: N → R*}
• Properties of relations on a set
  – transitivity: R is transitive iff for all a, b, c, whenever a R b and b R c, then a R c
  – reflexivity: R is reflexive iff for all a, a R a
  – symmetry: R is symmetric iff for all a and b, whenever a R b, then b R a

Binary Relations cont.

• Property of two relations
  – transpose symmetry: R1 and R2 exhibit transpose symmetry iff for all a and b, whenever a R1 b, then b R2 a

Transitivity

f(n) ∈ Θ(g(n)) and g(n) ∈ Θ(h(n)) ⇒ f(n) ∈ Θ(h(n))
f(n) ∈ O(g(n)) and g(n) ∈ O(h(n)) ⇒ f(n) ∈ O(h(n))
f(n) ∈ Ω(g(n)) and g(n) ∈ Ω(h(n)) ⇒ f(n) ∈ Ω(h(n))
f(n) ∈ o(g(n)) and g(n) ∈ o(h(n)) ⇒ f(n) ∈ o(h(n))
f(n) ∈ ω(g(n)) and g(n) ∈ ω(h(n)) ⇒ f(n) ∈ ω(h(n))

Reflexivity

f(n) ∈ Θ(f(n))
f(n) ∈ O(f(n))
f(n) ∈ Ω(f(n))

Symmetry

f(n) ∈ Θ(g(n)) ⇔ g(n) ∈ Θ(f(n))

Transpose Symmetry

f(n) ∈ O(g(n)) ⇔ g(n) ∈ Ω(f(n))
f(n) ∈ o(g(n)) ⇔ g(n) ∈ ω(f(n))

ORDER NOTATION

Say that f is         Mean that f is        Write                 if
small oh of g         slower than g         f(n) ∈ o(g(n))        lim n→∞ f(n)/g(n) = 0
big oh of g           no faster than g      f(n) ∈ O(g(n))        there exist c, n₀ > 0 such that f(n) ≤ c·g(n) for all n ≥ n₀
theta of g            about as fast as g    f(n) ∈ Θ(g(n))        f(n) ∈ O(g(n)) and g(n) ∈ O(f(n))
omega of g            no slower than g      f(n) ∈ Ω(g(n))        g(n) ∈ O(f(n))
little omega of g     faster than g         f(n) ∈ ω(g(n))        g(n) ∈ o(f(n))

Equivalence Relation

• An equivalence relation is a relation that is reflexive, symmetric and transitive.

• Which of the relations are equivalence relations?

• An equivalence relation partitions a set into equivalence classes. Describe the equivalence classes.