Object-Oriented Design
CSC 212

Upload: morgan-ball
Post on 01-Jan-2016


Announcements

Ask more questions! Your fellow students have the same questions.
(Remember, I grade the daily quizzes.)
A different explanation can often help clarify the matter.

Homework #1 is on the web:
Due before class on Thursday.
No user I/O required; I explicitly state when it is required.

Recursion

A recursive function uses old results to define new values.

E.g., the Fibonacci sequence: 1, 1, 2, 3, 5, 8, 13, 21, ...
The value of each new term is the sum of the two preceding ones.

n! = 1,            if n = 1
n! = n * (n - 1)!, if n > 1

Recursion

Recursive definitions have two parts:

Base Case: solved non-recursively (often with constant definitions)
  n! = 1, if n = 1
  First two terms of the Fibonacci sequence are defined as 1

Recursive Case: solved using the function being defined
  n! = n * (n - 1)!, if n > 1

Recursion Continued

Using recursion can simplify code:

public static int factorial(int num) {
    if (num == 1) return 1;
    else return num * factorial(num - 1);
}

But recursion can also bring problems. What does factorial(-2) return? It never reaches the base case, so it recurses until the stack overflows.

Recursion can also be very slow.
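One way to guard against calls like factorial(-2) is to validate the argument before recursing. A minimal sketch (the class name Factorial is mine, not from the slides):

```java
public class Factorial {
    // Recursive factorial with a guard, so factorial(-2) fails fast
    // instead of recursing past the base case until the stack overflows.
    public static int factorial(int num) {
        if (num < 1) {
            throw new IllegalArgumentException("factorial requires num >= 1");
        }
        if (num == 1) return 1;            // base case
        return num * factorial(num - 1);   // recursive case
    }

    public static void main(String[] args) {
        System.out.println(factorial(5)); // prints 120
    }
}
```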

Another Recursion Example

Can have multiple base cases, and recursion can be done multiple times:

public int fibonacci(int n) {
    if (n < 0) { return 0; }
    else if (n < 2) { return 1; }
    else {
        return fibonacci(n - 1) + fibonacci(n - 2);
    }
}
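The slowness mentioned earlier comes from recomputing the same terms: fibonacci(n - 1) and fibonacci(n - 2) each recompute everything below them. A memoized sketch (the memo array and class name are mine) caches each result so it is computed once:

```java
public class Fib {
    // Caches fibonacci(k) in memo[k]; 0 marks "not yet computed",
    // which is safe because every term in this definition is >= 1.
    public static long fibonacci(int n, long[] memo) {
        if (n < 0) return 0;
        if (n < 2) return 1;
        if (memo[n] != 0) return memo[n];
        memo[n] = fibonacci(n - 1, memo) + fibonacci(n - 2, memo);
        return memo[n];
    }

    public static long fibonacci(int n) {
        return fibonacci(n, new long[Math.max(n + 1, 2)]);
    }

    public static void main(String[] args) {
        System.out.println(fibonacci(7)); // 1, 1, 2, 3, 5, 8, 13, 21 -> prints 21
    }
}
```

With the cache, each term is computed exactly once, so the call runs in O(n) time instead of exponential time.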

Mutual Recursion

Recursion can also occur across methods:

public static int factEven(int n) {
    return n * factOdd(n - 1);
}

public static int factOdd(int n) {
    if (n == 1) return 1;
    else return n * factEven(n - 1);
}

We will see more complex methods of recursion later
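A simpler mutual-recursion pair (my own example, not from the slides) that is easy to trace, since each method supplies the base case the other relies on:

```java
public class Parity {
    // isEven and isOdd call each other; every call shrinks n by 1,
    // so the chain always bottoms out at n == 0.
    public static boolean isEven(int n) {
        if (n == 0) return true;
        return isOdd(n - 1);
    }

    public static boolean isOdd(int n) {
        if (n == 0) return false;
        return isEven(n - 1);
    }

    public static void main(String[] args) {
        System.out.println(isEven(10)); // prints true
        System.out.println(isOdd(7));   // prints true
    }
}
```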

Aside: Logarithms

If B^K = N, then log_B N = K; B is the base of the logarithm.

Unless stated otherwise, in CSC the logarithm base is 2, so log N really means log2 N.

log N = K if and only if 2^K = N
  log 16 = 4
  log 1,024 = 10
  log 1,000,000,000 ≈ 30

Logarithmic functions grow very slowly

Logarithm Examples

The number of bits required to store a binary number is logarithmic:
  8 bits store 256 values; log 256 = 8

Maximum value of a Java int is 2,147,483,647 (just under 2^31); log 2,147,483,648 = 31
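The bits-versus-values relationship can be checked directly: halving a number strips one binary digit, so counting halvings counts bits. A small sketch (the method name bitsNeeded is mine):

```java
public class Bits {
    // Number of binary digits needed to write n: floor(log2 n) + 1.
    // Each division by 2 removes one bit, so the loop runs ~log2 n times.
    public static int bitsNeeded(int n) {
        int bits = 0;
        while (n > 0) {
            bits++;
            n /= 2;
        }
        return bits;
    }

    public static void main(String[] args) {
        System.out.println(bitsNeeded(255)); // prints 8
        System.out.println(bitsNeeded(256)); // prints 9
    }
}
```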

Logarithm Examples

The inventor of chess asked the Emperor to be paid like this: 1 grain of rice on the first square, 2 on the next, and so on, so each square holds twice the grain of the previous one.

The function grows exponentially, i.e. 2^n, the inverse of a logarithm.

Emperors like clever games, but not always the game designers: the chess inventor was beheaded.

Analysis Techniques

Running time is important when coding:
  Obviously true for real-time systems
  But also holds for most other systems

But it is not always possible to compare the times of all algorithms:
  Lots of ways to solve a single problem
  Many different implementations possible for each solution

Analysis Techniques

Want a way of examining an algorithm that ignores the effect of the compiler, hardware, etc.

Consider algorithm across different inputs, including (especially) worst possible case

How to do this analysis without dealing with implementation issues?

The Pseudo-Code Answer

Analysis is only for human eyes:
  Don't bother with details needed to make code compile
  Instead use "pseudo-code"

Pseudo-code isn't real:
  Name used when writing an algorithm in a computer-language-like manner

The Pseudo-Code Answer

Pseudo-code includes all important code:
  E.g., loops, assignments, method calls, etc.
  Helps better analyze the algorithm

Pseudo-code isn't formal; it is only used to understand the algorithm:
  Ignore unimportant punctuation and formalisms
  Write pseudo-code so people can understand and analyze it

Pseudo-code Example

What is this function computing?

int exampleFunction(int n)    // assumes n > 0
    returnVariable <- 1
    while (n > 0)
        returnVariable <- returnVariable * n
        n <- n - 1
    return returnVariable
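One possible direct Java translation of the pseudo-code (a sketch; it assumes n > 0, as the pseudo-code's precondition states):

```java
public class ExampleFunction {
    // Line-by-line Java version of the pseudo-code above; assumes n > 0.
    public static int exampleFunction(int n) {
        int returnVariable = 1;
        while (n > 0) {
            returnVariable = returnVariable * n;
            n = n - 1;
        }
        return returnVariable;
    }

    public static void main(String[] args) {
        System.out.println(exampleFunction(4)); // prints 24
    }
}
```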

Algorithm Analysis

When comparing algorithms, we do not want to measure exact times:
  Do not want to do all the coding and testing

Instead we want back-of-the-envelope measures:
  Provide quick and easy evaluation and comparison
  Implementation often affects execution times, anyway!

Big-Oh Notation

Big-Oh computes code complexity:
  Provides worst-case analysis of performance
  Execution time is related to code complexity
  Enables comparison between algorithms

Can use a pseudo-code description of the algorithm:
  Do not need to implement all approaches
  Avoids comparing details not related to the algorithms
    E.g., compiler, CPU, user's typing speed

Algorithmic Analysis

[Chart: growth of 1, log n, n, n^2, n^3, and 2^n for n = 2 to 1024, plotted on a logarithmic scale from 1 to 10^9]

Algorithm Analysis

Approximate time to run a program with n inputs on a 1 GHz machine:

            | n = 10   | n = 50    | n = 100       | n = 1000  | n = 10^6
O(n log n)  | 35 ns    | 200 ns    | 700 ns        | 10,000 ns | 20 ms
O(n^2)      | 100 ns   | 2,500 ns  | 10,000 ns     | 1 ms      | 17 min
O(n^5)      | 0.1 ms   | 0.3 s     | 10.8 s        | 11.6 days | 3x10^13 years
O(2^n)      | 1,000 ns | 13 days   | 4x10^14 years | Too long! | Too long!
O(n!)       | 4 ms     | Too long! | Too long!     | Too long! | Too long!

Big-Oh Notation

Want correct results for any data set, so only consider details affecting large data sets:

Ignore multipliers: O(5n) = O(2n) = O(n)
  Constant multipliers are affected by implementation
  Coding tricks can often reduce these factors, anyway

Use only the dominating term: O(n^5 + n^2) = O(n^5)
  Does an extra 17 minutes matter after 3x10^13 years?

Analysis of Algorithms

Individual statements:
  E.g., method calls, assignments, arithmetic...
  O(1) complexity, also called "constant time"

Also holds for a sequence of statements:
  Provided the statements (including loops) execute a constant number of times for all input sizes
  Remember: we only want a rough estimate, so we ignore constant multipliers

Analysis of Algorithms

Simple loops:

for (int i = 0; i < n; i++) { S }

The for statement executes n times. If S is a simple sequence of statements (e.g., complexity of O(1)), the total complexity is n * O(1) = O(n).

Analysis of Algorithms, cont.

Slightly more complicated loops:

for (int i = 0; i < n; i += 2) { S }

i takes values 0, 2, 4, ... until it reaches n, so the for loop executes n/2 times. If S executes in O(1) time, the loop complexity is n/2 * O(1) = O(n/2) = O(n).

Analysis of Algorithms, cont.

Nested loops:

for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++) { S }
}

If S executes in constant time (i.e., O(1)), the complexity of the j loop is n * O(1) = O(n). The i loop's complexity = n * (j loop's complexity) = n * O(n) = O(n^2).

Analysis of Algorithms, cont.

Complex nested loops:

for (int m = 0; m < n; m++) {
    for (int i = 0; i < m; i++) { S }
}

The outer loop executes n times; the inner loop executes m times. Assume S executes in constant time.

Analysis of Algorithms, cont.

Total number of executions is:
  = 1 + 2 + 3 + ... + (n-3) + (n-2) + (n-1)
  = (1 + (n-1)) + (2 + (n-2)) + ... + ((n-1)/2 + (n+1)/2)
  = n + n + n + ... + n
  ≈ n * n/2
  = 0.5 * n^2
  = O(n^2)
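The sum above can be verified empirically by counting inner-loop executions (a sketch; the class and method names are mine):

```java
public class NestedCount {
    // Counts body executions of:
    //   for (m = 0; m < n; m++) for (i = 0; i < m; i++) { S }
    // The total is 0 + 1 + ... + (n-1) = n*(n-1)/2, i.e. O(n^2).
    public static int countExecutions(int n) {
        int count = 0;
        for (int m = 0; m < n; m++) {
            for (int i = 0; i < m; i++) {
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(countExecutions(100)); // prints 4950
        System.out.println(100 * 99 / 2);         // prints 4950
    }
}
```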

Analysis of Algorithms, cont.

Complex nested loops: the Big-Oh notation matches the previous nested loop. The minor improvement in the inner loop didn't change the big picture, but execution time may be half as much.

Big-Oh cannot be used to measure such small improvements.

Analysis of Algorithms, cont.

Loops with 'jumps':

for (int i = 1; i < n; i *= 2) { S }

i equals 1, 2, 4, ... until it exceeds n, so the for loop executes 1 + log2 n times.

If S executes in O(1) time, the loop complexity is (log2 n + 1) * O(1) = O(log n + 1) = O(log n).
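The iteration count can be checked by counting directly (a sketch with names of my own; note the counter variable starts at 1, since doubling 0 would loop forever):

```java
public class DoublingLoop {
    // Counts iterations of a doubling loop: i = 1, 2, 4, ..., up to n.
    // The count is floor(log2 n) + 1, i.e. O(log n).
    public static int countIterations(int n) {
        int count = 0;
        for (int i = 1; i <= n; i *= 2) {
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(countIterations(1024)); // prints 11
        System.out.println(countIterations(1000)); // prints 10
    }
}
```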

Quick Analysis Tricks

Analyzing nested loops: complexity is the product of the loops' complexities. What is the complexity of the following code?

for (int i = 0; i < n; i++)
    for (int j = 0; j < i; j++)
        for (int k = 0; k < n; k++)
            for (int m = 0; m < k; m += 2)
                for (int q = 0; q < j; q++) { S }

Quick Analysis Tricks

Analyzing consecutive loops: complexity is the longest loop's complexity. What is the complexity of 5 consecutive loops: 4 loops from 1 to n, and 1 loop from 1 to n^2?

Experimental Verification

Occasionally, you may want to verify that you have determined the correct complexity. To do this verification:
  Implement the algorithm in a real programming language
  Pick an initial number of inputs (i.e., n)
  Measure the time needed to run the program

Experimental Verification

Verifying Big-Oh analysis: increase n by a factor of 10 and run it again.
  If logarithmic (O(log n)), it takes ~3x longer
  If linear (O(n)), it takes 10x longer
  If O(n log n), it takes ~13x longer
  If quadratic (O(n^2)), it takes 100x longer
  If exponential (O(2^n)), it takes over 1,000x longer
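The verification procedure can be sketched as a small timing harness (everything here, including the toy quadratic method, is my own illustration):

```java
public class TimingCheck {
    // Deliberately O(n^2): counts all ordered pairs (i, j) with i, j < n.
    public static long quadratic(int n) {
        long total = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                total++;
            }
        }
        return total;
    }

    public static void main(String[] args) {
        // Grow n by 10x each round; for an O(n^2) algorithm the measured
        // time should grow roughly 100x (after JIT warm-up).
        for (int n = 1000; n <= 100000; n *= 10) {
            long start = System.nanoTime();
            long result = quadratic(n);
            long elapsed = System.nanoTime() - start;
            System.out.println("n = " + n + ": " + elapsed + " ns (result " + result + ")");
        }
    }
}
```

In practice, run several rounds and ignore the first few measurements, since JIT compilation distorts early timings.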

Limitations of Big-Oh Analysis

Constants can make a difference: for n < 8, 3n is larger than n log n.

Big-Oh ignores differences in the time needed to access data in main memory versus on a disk; disks can take thousands of times longer to read.

The worst case rarely happens, so Big-Oh notation often overestimates total time.

Daily Quiz #1

Finish writing the findMin and findMax methods without using any loops:

public void printMinAndMax(int[] a) {
    if (a.length < 1) return;
    System.out.println("Min entry value: " + Integer.toString(findMin(a, 0)));
    System.out.println("Max entry value: " + Integer.toString(findMax(a, 0)));
}
public int findMin(int[] a, int n) { ... }
public int findMax(int[] a, int n) { ... }

Hint: use n to determine when to stop the recursion.

Daily Quiz #2

GIVEN:

public abstract class Person { ... }
public interface Worker { ... }
public class Student extends Person { ... }
public class Employee extends Person implements Worker { ... }
public class StudentEmp extends Student implements Worker { ... }

WHICH OF THESE ARE ILLEGAL? OK? NEED CASTING?

Person p1 = new Person();
Person p2 = new Student();
Person p3 = new Employee();
Person p4 = new StudentEmp();
Worker w2 = p2;
Worker w3 = p3;
Worker w4 = p4;
Student s2 = p2;
Student s3 = p3;
Student s4 = p4;
Employee e3 = p3;
Employee e4 = p4;
StudentEmp se3 = p3;
StudentEmp se4 = p4;

Daily Quiz #2

Not using these slides, write each of the following:
  A method executing in O(1) time
  A method executing in O(n) time
  A method executing in O(n log n) time
  A method executing in O(n^2) time
  A method executing in O(n^4 log n) time