ICOM 6115: Computer Systems Performance Measurement and Evaluation
August 11, 2006
Question
- Describe a performance study you have done
  - Work or School or …
- Describe a performance study you have recently read about
  - Research paper
  - Newspaper article
  - Scientific journal
Outline
- Objectives (next)
- The Art
- Common Mistakes
- Systematic Approach
Objectives (1)
- Select appropriate evaluation techniques, performance metrics, and workloads for a system
  - Techniques: measurement, simulation, analytic modeling
  - Metrics: criteria to study performance (ex: response time)
  - Workloads: requests by users/applications to the system
Objectives (2)
- Conduct performance measurements correctly
- Need two tools: load generator and monitor (see the sketch below)
  - System: can it be observed?
  - Software: can the actions be observed?
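A minimal load-generator/monitor sketch, assuming a hypothetical service() function that stands in for the system under test; the timing loop plays the monitor role.

```python
# Minimal load-generator/monitor sketch. service() is a hypothetical
# stand-in for the real system under test.
import time
import random

def service(req):
    # Placeholder: sleep a random amount to mimic variable service time.
    time.sleep(random.uniform(0.001, 0.005))

def run_load(n_requests=100):
    response_times = []
    for i in range(n_requests):
        start = time.perf_counter()   # monitor: observe each request
        service(i)                    # load generator: issue the request
        response_times.append(time.perf_counter() - start)
    return response_times

times = run_load()
print(f"mean response time: {sum(times) / len(times):.4f} s")
```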
Objectives (3)
- Use proper statistical techniques to compare several alternatives
- One run of a workload is often not sufficient
  - Many non-deterministic computer events affect performance
- Comparing the average of several runs may also not lead to correct results
  - Especially if variance is high
[Two tables: run number vs. execution time for each system; the values were not preserved in the transcript]
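A small sketch of why averages alone can mislead; the run times below are made-up illustrative numbers, not measurements from any real system.

```python
# Sketch: one run, or a bare average, can mislead when variance is high.
from statistics import mean, stdev

system_a = [10.1, 10.2, 9.9, 10.0, 10.3]   # low variance
system_b = [6.0, 14.5, 7.2, 13.8, 8.7]     # similar mean, high variance

for name, runs in [("A", system_a), ("B", system_b)]:
    print(f"{name}: mean={mean(runs):.2f}, stdev={stdev(runs):.2f}")
# Nearly identical means, very different variability: the averages alone
# would hide how unpredictable system B is.
```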
Objectives (4)
- Design measurement and simulation experiments to provide the most information with the least effort
- Often there are many factors that affect performance; separate out the effects that individually matter
- How many experiments are needed?
- How can the performance of each factor be estimated?
Objectives (5)
- Perform simulations correctly (see the seeding sketch below)
- Select the correct language, seeds for random numbers, length of simulation run, and analysis
- Before all of that, may need to validate the simulator
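A minimal sketch of explicit seeding and independent replications, assuming a toy simulator that just averages exponential samples; the point is the private, per-replication random stream, not the model itself.

```python
# Sketch: independent replications with explicit seeds, so simulation
# results are reproducible and runs don't share a random stream.
import random

def simulate(seed, n_events=10_000):
    rng = random.Random(seed)          # one private stream per replication
    total = sum(rng.expovariate(1.0) for _ in range(n_events))
    return total / n_events            # mean simulated service time

replications = [simulate(seed) for seed in (1, 2, 3, 4, 5)]
print(replications)                    # rerunning reproduces these exactly
```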
Outline
- Objectives (done)
- The Art (next)
- Common Mistakes
- Systematic Approach
The Art of Performance Evaluation
- Evaluation cannot be produced mechanically
- Requires intimate knowledge of the system
- Careful selection of methodology, workload, and tools
- No one correct answer, as two performance analysts may choose different metrics or workloads
- Like art, there are techniques: learn how to use them and when to apply them
Example: Comparing Two Systems
- Two systems, two workloads, measure transactions per second

  System   Workload 1   Workload 2   Average
  A            20           10          15
  B            10           20          15

- They are equally good! … but is A better than B?
The Ratio Game
- Take system B as the base:

  System   Workload 1   Workload 2   Average
  A             2          0.5         1.25
  B             1          1           1

- A is better! … but is B better than A? (Take A as the base instead, and B's ratios become 0.5 and 2, averaging 1.25, so B "wins" too; see the sketch below.)
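A short sketch of the ratio game, using the throughput numbers from the tables above: normalizing to either system as the base makes the other system look better on average.

```python
# Sketch of the ratio game with the numbers from the example above.
throughput = {"A": (20, 10), "B": (10, 20)}  # (workload 1, workload 2), tx/sec

def avg_ratio(system, base):
    ratios = [s / b for s, b in zip(throughput[system], throughput[base])]
    return sum(ratios) / len(ratios)

print("base B:", avg_ratio("A", "B"))   # 1.25 -> "A is better"
print("base A:", avg_ratio("B", "A"))   # 1.25 -> "B is better"
```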
Outline
- Objectives (done)
- The Art (done)
- Common Mistakes (next)
- Systematic Approach
Common Mistakes (1)
- Undefined Goals
  - There is no such thing as a general model
  - Describe goals, then design experiments (don't shoot and then draw the target)
- Biased Goals
  - Don't set out to show YOUR system is better than HERS (performance analysis is like a jury)
- Unrepresentative Workload
  - Should be representative of how the system will be used "in the wild"
  - Ex: large and small packets? Don't test with only large or only small
Common Mistakes (2)
- Wrong Evaluation Technique
  - Use the most appropriate: model, simulation, measurement (don't have a hammer and see everything as a nail)
- Inappropriate Level of Detail
  - Can have too much! Ex: modeling the disk
  - Can have too little! Ex: analytic model for a congested router
- No Sensitivity Analysis
  - Analysis is evidence, not fact
  - Need to determine how sensitive the results are to the settings (see the sketch below)
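A sensitivity sketch using the standard M/M/1 mean response-time formula T = 1/(mu - lambda), chosen here (as an assumption, echoing the congested-router example) because near saturation the result swings wildly with tiny input changes.

```python
# Sensitivity sketch: M/M/1 mean response time, T = 1/(mu - lambda).
# Near saturation, small changes in arrival rate swing the result
# enormously, so conclusions drawn at one setting may not hold at another.
MU = 1.0                                    # service rate (assumed)

def response_time(lam, mu=MU):
    return 1.0 / (mu - lam)

for lam in (0.5, 0.8, 0.9, 0.95, 0.99):
    print(f"lambda={lam:.2f}: T={response_time(lam):8.2f}")
```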
Common Mistakes (3)
- Improper Presentation of Results
  - It is not the number of graphs that matters, but the number of graphs that help make decisions
- Omitting Assumptions and Limitations
  - Ex: may assume most traffic is TCP, whereas some links may carry significant UDP traffic
  - May lead to applying results where the assumptions do not hold
Outline
- Objectives (done)
- The Art (done)
- Common Mistakes (done)
- Systematic Approach (next)
A Systematic Approach
1. State goals and define boundaries
2. Select performance metrics
3. List system and workload parameters
4. Select factors and values
5. Select evaluation techniques
6. Select workload
7. Design experiments
8. Analyze and interpret the data
9. Present the results. Repeat.
State Goals and Define Boundaries
- Just "measuring performance" or "seeing how it works" is too broad
  - Ex: the goal is to decide which ISP provides better throughput
- The definition of the system may depend upon the goals
  - Ex: if measuring CPU instruction speed, the system may include CPU + cache
  - Ex: if measuring response time, the system may include CPU + memory + … + OS + user workload
Select Metrics
- Criteria to compare performance
- In general, related to the speed, accuracy, and/or availability of system services
- Ex: network performance (see the sketch below)
  - Speed: throughput and delay
  - Accuracy: error rate
  - Availability: whether data packets sent actually arrive
- Ex: processor performance
  - Speed: time to execute instructions
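A sketch of computing one metric from each family for a network, assuming a hypothetical packet log of (send time, receive time or None if lost, size in bytes) tuples.

```python
# Sketch: speed, accuracy, and availability metrics from a (hypothetical)
# packet log of (send_time, recv_time_or_None, size_bytes) tuples.
log = [(0.0, 0.020, 1500), (0.1, 0.115, 1500), (0.2, None, 1500),
       (0.3, 0.330, 1500)]

delivered = [(s, r, b) for s, r, b in log if r is not None]
duration = max(r for _, r, _ in delivered) - min(s for s, _, _ in log)

throughput = sum(b for _, _, b in delivered) / duration        # speed
delay = sum(r - s for s, r, _ in delivered) / len(delivered)   # speed
loss_rate = 1 - len(delivered) / len(log)                      # accuracy

print(f"throughput={throughput:.0f} B/s, delay={delay*1000:.1f} ms, "
      f"loss={loss_rate:.0%}")
```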
List Parameters
- List all parameters that affect performance
- System parameters (hardware and software)
  - Ex: CPU type, OS type, …
- Workload parameters
  - Ex: number of users, type of requests
- The list may not be complete initially, so keep a working list and let it grow as you progress
Select Factors to Study
- Divide parameters into those that are to be studied and those that are not
  - Ex: may vary CPU type but fix OS type
  - Ex: may fix packet size but vary the number of connections
- Select appropriate levels for each factor
  - Want typical values and ones with potentially high impact
  - For workload, often a smaller (1/2 or 1/10th) and larger (2x or 10x) range
- Start small, or the number of experiments can quickly overwhelm available resources! (see the sketch below)
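A sketch of how quickly a full-factorial experiment count grows; the factor names and levels are illustrative, and run_experiment is a hypothetical placeholder.

```python
# Sketch: full-factorial designs multiply, so factor/level counts
# overwhelm resources fast. Factors and levels here are illustrative.
from itertools import product
from math import prod

factors = {
    "cpu":       ["x86", "arm"],
    "users":     [10, 100, 1000],
    "packet_kb": [0.5, 1, 2, 10],
}

levels = [len(v) for v in factors.values()]
print("experiments needed:", prod(levels))   # 2 * 3 * 4 = 24

for combo in product(*factors.values()):     # one experiment per combination
    pass  # a hypothetical run_experiment(combo) would go here
```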
Select Evaluation Technique
- Depends upon time, resources, and the desired level of accuracy
- Analytic modeling: quick, less accurate
- Simulation: medium effort, medium accuracy
- Measurement: typically the most effort, most accurate
- Note: the above are typical, but can be reversed in some cases!
Select Workload
- Set of service requests to the system
- Depends upon the evaluation technique
  - An analytic model may use probabilities of various requests
  - A simulation may use a trace of requests from a real system
  - A measurement may use scripts that impose transactions
- Should be representative of real life (see the generator sketch below)
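A sketch of a synthetic workload generator with Poisson arrivals and a weighted request mix; the rate, duration, and request types are assumptions for illustration.

```python
# Sketch: synthetic workload -- Poisson arrivals with a weighted request
# mix. Rate, duration, and request types are illustrative assumptions.
import random

rng = random.Random(42)
REQUEST_MIX = {"read": 0.7, "write": 0.2, "scan": 0.1}   # probabilities

def generate_workload(rate=100.0, duration=1.0):
    """Yield (arrival_time, request_type) pairs for `duration` seconds."""
    t = 0.0
    kinds, weights = zip(*REQUEST_MIX.items())
    while True:
        t += rng.expovariate(rate)         # exponential interarrival times
        if t > duration:
            break
        yield t, rng.choices(kinds, weights)[0]

for arrival, kind in list(generate_workload())[:5]:
    print(f"{arrival:.4f}s  {kind}")
```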
Design Experiments
- Want to maximize results with minimal effort
- Phase 1: many factors, few levels
  - See which factors matter (see the screening sketch below)
- Phase 2: few factors, more levels
  - See where the range of impact of the factors lies
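A sketch of a phase-1 screening pass as a two-level (2^k) factorial design with simple main-effect estimates; the factor names and measured responses are made up for illustration.

```python
# Sketch: phase-1 screening via a 2^k design (every factor at two levels,
# coded -1/+1) and simple main-effect estimates. Data are made up.
from itertools import product

factors = ["cache", "cpu_ghz", "disk"]                  # k = 3 factors
design = list(product([-1, +1], repeat=len(factors)))   # 2^3 = 8 runs

# Hypothetical measured response (e.g., throughput) for each run:
y = [15, 45, 25, 75, 18, 48, 28, 78]

for i, name in enumerate(factors):
    effect = sum(lv[i] * resp for lv, resp in zip(design, y)) / (len(y) / 2)
    print(f"main effect of {name}: {effect:+.1f}")
# Factors with small effects can be dropped before detailed phase-2 runs.
```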
Analyze and Interpret Data
- Compare alternatives, taking into account the variability of the results
  - Statistical techniques (see the sketch below)
- Interpret the results
  - The analysis does not provide a conclusion
  - Different analysts may come to different conclusions
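A sketch of comparing two alternatives with a confidence interval on the difference of means rather than raw averages; the run times are made up, and the t quantile is a hard-coded approximation.

```python
# Sketch: compare two alternatives via a confidence interval on the
# difference of means. Run times are made-up illustrative numbers.
from statistics import mean, stdev
from math import sqrt

a = [5.4, 5.6, 5.1, 5.9, 5.3, 5.5, 5.2]
b = [5.8, 6.1, 5.7, 6.3, 6.0, 5.9, 6.2]

diff = mean(a) - mean(b)
se = sqrt(stdev(a)**2 / len(a) + stdev(b)**2 / len(b))
t = 2.18                       # approx. 95% t quantile for ~12 d.f.
lo, hi = diff - t * se, diff + t * se
print(f"difference: {diff:.3f}, 95% CI: ({lo:.3f}, {hi:.3f})")
# If the interval excludes zero, the systems differ at this level.
```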
Present Results
- Make them easily understood
- Use graphs
- Disseminate (the entire methodology!)
"The job of a scientist is not merely to see: it is to see, understand, and communicate. Leave out any of these phases, and you're not doing science. If
you don't see, but you do understand and communicate, you're a prophet, not a scientist. If
you don't understand, but you do see and communicate, you're a reporter, not a scientist. If
you don't communicate, but you do see and understand, you're a mystic, not a scientist."