
Page 1:

ICOM 6115: Computer Systems Performance Measurement and Evaluation

August 11, 2006

Page 2:

Question

Describe a performance study you have done
- Work or School or …

Describe a performance study you have recently read about
- Research paper
- Newspaper article
- Scientific journal

Page 3:

Outline

- Objectives (next)
- The Art
- Common Mistakes
- Systematic Approach

Page 4:

Objectives (1)

Select appropriate evaluation techniques, performance metrics, and workloads for a system.
- Techniques: measurement, simulation, analytic modeling
- Metrics: criteria to study performance (ex: response time)
- Workloads: requests made by users/applications to the system

Page 5:

Objectives (2)

Conduct performance measurements correctly.
- Need two tools: a load generator and a monitor (sketched below)
- System: can it be observed?
- Software: can the actions be observed?
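To make the two tools concrete, here is a minimal Python sketch, assuming a hypothetical send_request() standing in for the system under test: the loop acts as the load generator, and the timing code acts as the monitor.

```python
import time
import statistics

def send_request():
    # Hypothetical stand-in for one request to the system under test
    time.sleep(0.01)

def run_load(num_requests):
    # Load generator: issue requests; monitor: record each response time
    response_times = []
    for _ in range(num_requests):
        start = time.perf_counter()
        send_request()
        response_times.append(time.perf_counter() - start)
    return response_times

times = run_load(100)
print(f"mean response time: {statistics.mean(times) * 1000:.1f} ms")
```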

Page 6:

Objectives (3)

Use proper statistical techniques to compare several alternatives.
- One run of a workload is often not sufficient: many non-deterministic computer events affect performance
- Comparing the averages of several runs may also not lead to correct results, especially if the variance is high (illustrated below)

[Tables: execution times for a series of runs, illustrating run-to-run variability.]
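As a sketch of why a single run, or even a bare average, can mislead, consider hypothetical execution times with high run-to-run variance:

```python
import statistics

# Hypothetical execution times (seconds) from repeated runs of one workload
runs = [5.2, 4.8, 7.1, 5.0, 9.3, 5.1]

mean = statistics.mean(runs)
stdev = statistics.stdev(runs)   # sample standard deviation
print(f"mean={mean:.2f}s stdev={stdev:.2f}s cv={stdev / mean:.2f}")

# A high coefficient of variation (cv) signals that any single run, and
# even the mean alone, is a poor summary of the system's behavior.
```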

Page 7:

Objectives (4)

Design measurement and simulation experiments to provide the most information with the least effort.
- Often there are many factors that affect performance; separate out the effects that individually matter
- How many experiments are needed?
- How can the performance effect of each factor be estimated?

Page 8:

Objectives (5)

Perform simulations correctly.
- Select the correct language, seeds for random numbers, length of the simulation run, and analysis (seed selection is sketched below)
- Before all of that, you may need to validate the simulator
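A minimal sketch of the seed-selection point: run independent replications with distinct seeds rather than one run with a single fixed seed. The exponential service-time model here is only an illustrative assumption.

```python
import random
import statistics

def simulate(seed, num_jobs=1000):
    # Toy simulation: average of exponentially distributed service demands
    rng = random.Random(seed)   # independent random stream per replication
    return statistics.mean(rng.expovariate(1.0) for _ in range(num_jobs))

# Independent replications with distinct seeds support variance estimates;
# reusing one seed would just repeat the same "random" run
results = [simulate(seed) for seed in (1, 2, 3, 4, 5)]
print([round(r, 3) for r in results])
```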

Page 9:

Outline

- Objectives (done)
- The Art (next)
- Common Mistakes
- Systematic Approach

Page 10:

The Art of Performance Evaluation

- Evaluation cannot be produced mechanically: it requires intimate knowledge of the system and careful selection of methodology, workload, and tools
- There is no one correct answer, as two performance analysts may choose different metrics or workloads
- Like art, there are techniques: learn how to use them and when to apply them

Page 11:

Example: Comparing Two Systems

Two systems, two workloads; measure transactions per second.

System   Workload 1   Workload 2   Average
  A          20           10          15
  B          10           20          15

They are equally good! … but is A better than B?

Page 12:

The Ratio Game

Take system B as the base.

System   Workload 1   Workload 2   Average
  A           2           0.5         1.25
  B           1           1           1

A is better! … but is B better than A? (See the sketch below.)
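A short sketch of the ratio game using the numbers above: raw averages call the systems equal, while normalizing to either base makes the other system look 25% better.

```python
# Transactions/sec from the tables above (workload 1, workload 2)
A = [20, 10]
B = [10, 20]

def avg(xs):
    return sum(xs) / len(xs)

print(avg(A), avg(B))                        # 15.0 15.0 -> equally good
print(avg([a / b for a, b in zip(A, B)]))    # 1.25 -> "A is 25% better than B"
print(avg([b / a for a, b in zip(A, B)]))    # 1.25 -> "B is 25% better than A"
```

Whichever system is chosen as the base loses: averaging ratios rewards the system in the numerator.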

Page 13:

Outline

- Objectives (done)
- The Art (done)
- Common Mistakes (next)
- Systematic Approach

Page 14:

Common Mistakes (1)

- Undefined goals: there is no such thing as a general model. Describe the goals and then design the experiments (don't shoot and then draw the target)
- Biased goals: don't set out to show YOUR system is better than HERS (performance analysis is like a jury)
- Unrepresentative workload: should be representative of how the system will work "in the wild". Ex: large and small packets? Don't test with only large or only small

Page 15:

Common Mistakes (2)

- Wrong evaluation technique: use the most appropriate of model, simulation, or measurement (don't have a hammer and see everything as a nail)
- Inappropriate level of detail: can have too much (ex: modeling the disk) or too little (ex: an analytic model for a congested router)
- No sensitivity analysis: analysis is evidence, not fact; need to determine how sensitive the results are to the settings

Page 16:

Common Mistakes (3)

- Improper presentation of results: what matters is not the number of graphs, but the number of graphs that help make decisions
- Omitting assumptions and limitations: ex: may assume most traffic is TCP, whereas some links may have significant UDP traffic. May lead to applying results where the assumptions do not hold

Page 17:

Outline

- Objectives (done)
- The Art (done)
- Common Mistakes (done)
- Systematic Approach (next)

Page 18:

A Systematic Approach

1. State goals and define boundaries
2. Select performance metrics
3. List system and workload parameters
4. Select factors and values
5. Select evaluation techniques
6. Select workload
7. Design experiments
8. Analyze and interpret the data
9. Present the results. Repeat.

Page 19:

State Goals and Define Boundaries

- Just "measuring performance" or "seeing how it works" is too broad. Ex: the goal is to decide which ISP provides better throughput
- The definition of the system may depend upon the goals. Ex: if measuring CPU instruction speed, the system may include CPU + cache. Ex: if measuring response time, the system may include CPU + memory + … + OS + user workload

Page 20:

Select Metrics

- Criteria to compare performance; in general, related to the speed, accuracy, and/or availability of system services
- Ex: network performance (sketched below). Speed: throughput and delay. Accuracy: error rate. Availability: whether data packets sent actually arrive
- Ex: processor performance. Speed: time to execute instructions
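A sketch of the network example, computing the three kinds of metrics from hypothetical counters:

```python
# Hypothetical counters collected over a measurement interval
packets_sent = 10_000
packets_received = 9_950
bytes_received = packets_received * 1500   # assume 1500-byte packets
elapsed_seconds = 60.0

throughput = bytes_received / elapsed_seconds                   # speed
error_rate = (packets_sent - packets_received) / packets_sent   # accuracy
delivery = packets_received / packets_sent                      # availability

print(f"throughput={throughput:.0f} B/s  errors={error_rate:.2%}  delivered={delivery:.2%}")
```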

Page 21:

List Parameters

- List all parameters that affect performance
- System parameters (hardware and software). Ex: CPU type, OS type, …
- Workload parameters. Ex: number of users, type of requests
- The list may not be complete initially, so keep a working list and let it grow as you progress

Page 22:

Select Factors to Study

- Divide parameters into those that will be studied and those that will not. Ex: may vary the CPU type but fix the OS type. Ex: may fix the packet size but vary the number of connections
- Select appropriate levels for each factor: want typical values and ones with potentially high impact
- For workload factors, often a smaller (1/2 or 1/10th) and larger (2x or 10x) range
- Start small, or the number of experiments can quickly exhaust the available resources!

Page 23:

Select Evaluation Technique

Depends upon time, resources, and the desired level of accuracy.
- Analytic modeling: quick, less accurate
- Simulation: medium effort, medium accuracy
- Measurement: typically the most effort, most accurate

Note: the above are typical trade-offs, but they can be reversed in some cases!

Page 24:

Select Workload

- A set of service requests to the system
- Depends upon the evaluation technique: an analytic model may use probabilities of the various requests; a simulation may use a trace of requests from a real system; a measurement may use scripts to impose transactions
- Should be representative of real life (a sketch of generating a request mix follows)
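As a sketch, a synthetic workload can be drawn from an assumed request mix; the request types and probabilities here are hypothetical.

```python
import random

rng = random.Random(42)   # fixed seed so the workload is reproducible

# Hypothetical request mix: type -> probability
request_mix = {"read": 0.70, "write": 0.25, "admin": 0.05}

# Draw 1000 synthetic requests matching the assumed mix
workload = rng.choices(list(request_mix),
                       weights=list(request_mix.values()), k=1000)
print({t: workload.count(t) for t in request_mix})
```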

Page 25:

Design Experiments

Want to maximize the results with minimal effort.
- Phase 1: many factors, few levels; see which factors matter (sketched below)
- Phase 2: few factors, more levels; see where the range of impact for those factors lies
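A minimal sketch of a phase-1 design: enumerate all combinations of a few hypothetical factors at two levels each (a 2^k factorial design).

```python
from itertools import product

# Phase 1: many factors, few levels each (here, 2 levels -> a 2^k design)
factors = {
    "cpu":        ["typeA", "typeB"],
    "num_users":  [10, 100],
    "packet_len": [64, 1500],
}

experiments = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(f"{len(experiments)} runs needed")   # 2**3 = 8
for exp in experiments:
    print(exp)
```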

Page 26:

Analyze and Interpret Data

- Compare alternatives
- Take into account the variability of the results: statistical techniques (sketched below)
- Interpret the results: the analysis does not provide a conclusion, and different analysts may come to different conclusions
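A sketch of comparing two alternatives while accounting for variability, using hypothetical response times and a normal-approximation 95% confidence interval (a t-quantile would be more appropriate for samples this small):

```python
import statistics

# Hypothetical response times (ms), several runs per alternative
alt1 = [102, 98, 110, 105, 99, 101, 104, 97, 103, 100]
alt2 = [96, 108, 94, 99, 101, 93, 97, 95, 100, 98]

diff = statistics.mean(alt1) - statistics.mean(alt2)
se = (statistics.variance(alt1) / len(alt1) +
      statistics.variance(alt2) / len(alt2)) ** 0.5

lo, hi = diff - 1.96 * se, diff + 1.96 * se   # rough 95% interval
print(f"difference: {diff:.1f} ms, 95% CI ({lo:.1f}, {hi:.1f})")
# If the interval contains zero, the data do not show a significant
# difference between the alternatives.
```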

Page 27:

Present Results

- Make the results easily understood: graphs
- Disseminate (the entire methodology!)

"The job of a scientist is not merely to see: it is to see, understand, and communicate. Leave out any of these phases, and you're not doing science. If you don't see, but you do understand and communicate, you're a prophet, not a scientist. If you don't understand, but you do see and communicate, you're a reporter, not a scientist. If you don't communicate, but you do see and understand, you're a mystic, not a scientist."