TRANSCRIPT
The Pursuit of Quality: Chasing
Tornadoes or Just Hot Air?
Gerrard Consulting Limited PO Box 347
Maidenhead
Berkshire
SL6 2GU
Tel: +44 (0) 1628 639173
Fax: +44 (0) 1628 630398
Web: gerrardconsulting.com
Slide 1
Paul Gerrard
Slide 2
Paul is a consultant, teacher, author, webmaster, programmer, tester,
conference speaker, rowing coach and a publisher. He has conducted
consulting assignments in all aspects of software testing and quality
assurance, specialising in test assurance. He has presented keynote talks and
tutorials at testing conferences across Europe, the USA, Australia, South
Africa and occasionally won awards for them. In 2010 he won the Eurostar
European Testing Excellence Award.
Paul designed and built the Gerrard Consulting story platform on which the
maelscrum.com and businessstorymanager.com products are based.
Gerrard Consulting Limited hosts the UK Test Management Forum.
Agenda
• What is Quality?
• Models for quality and testing
• Examples of models
• Models and stakeholders
• Failures of systems, failures of models
• Close
Slide 3
Weather
• Rain is great for farmers and their crops, but
terrible for tourists
• Wind is essential for sailors and windmills but
bad for the rest of us
• Quality, like weather, can be good or bad and
that depends on who you are.
Slide 4
That’s Fantastic!
That’s Terrible!
Slide 5
Quality is a relationship
• Quality is not an attribute of a system
• It is a relationship between systems and stakeholders who take different views
• The model of Quality that prevails has more to do with stakeholders than the system itself
Slide 6
The concepts of quality, risk, comfort, intuitiveness…
• Concepts that most people understand, but few can explain
• But it's a lot worse than that
• Quality is an all-encompassing, collective term for these and many other difficult concepts
• A term that means all things to all people
• (I try to avoid the Q-word).
Slide 7
Models for Quality and
Testing
Slide 8
Models
Slide 9
Models are everywhere
Models and reality
• In our minds we build mental models of everything we experience (and also many things we don't experience)
• When we pick up a glass of water, we build models
– The 3-dimensional location and relationship between the glass, the water, the table it sits on and our body
– As we reach for the glass, our brain processes the signals from our eyes, our muscles and the feelings in our fingertips
– It continuously compares experience with the model and adjusts/rebuilds the model many times
• … just to lift a cup of water – incredible!
Slide 10
Some familiar models
• The project plan is a model
– The resources, activities, effort, costs, risks and future decision making
• System requirements are a model
– The “what and how” of the system
– What: the features and functionality
– How: how the system works (fast, secure, reliable)
• User personas (16-year-old gamer, 30-year-old security hacker, 50-year-old Man United fan).
Slide 11
Where quality comes from
• Quality is the outcome of a comparison:
– Our mental model of perfection
– Our experience of reality
• Mental models are internal, personal and unique to us
• We could share them using some kind of Vulcan mind meld
• But usually, we can write them down or we can talk about them
• However we communicate, there is noise and information gets corrupted/lost in translation.
Slide 12
A quality model?
• The requirements and design describe the
behaviour of a system
• Functional
– Mapping test cases to requirements is all we need
• Non-Functional
– All technical attributes are defined and measured
• Quality and therefore testing assumes a model
– Often undocumented, the model may not be shared,
understood, complete, consistent, correct…
Slide 13
Test design is based on models
• Models describe the environment, system, usage, users, goals, risks
• They simplify the context of the test - irrelevant or negligible details are ignored in the model
• Focus attention on a particular aspect of the behaviour of the system
• Generate a set of unique and diverse tests (within the context of the model)
• Enable the testing to be estimated, planned, monitored and evaluated for its completeness (coverage).
• Models help us to select tests in a systematic way.
Slide 14
Examples of test models
• A checklist or sets of criteria
– Goals, risks, process paths, interfaces, message type…
• Diagrams from requirements or design documents
• Analyses of narrative text or tables
• Some models are documented, many models are never committed to paper
– They can be mental models constructed specifically to guide the tester as they explore the system under test and choose their next action.
Slide 15
Sources of models
• Test basis
– We analyse the text, diagrams or information that describe required behaviour (or use past experience and knowledge)
• System architecture
– We identify testable items in its user-interface, structure or internal design
• Modes of failure (product risks)
– We identify potential ways in which the system might fail that are of concern to stakeholders
• Usage patterns
– We focus on the way the system will be used, operated and interacted with in a business context, using personas
• Everything looks fine – doesn't it?
Slide 16
But all models (over-)simplify
• But requirements are never perfect, not all attributes can be meaningfully measured
• Models incorporate implicit assumptions and are approximate representations
• All test models are heuristic, useful in some situations, always incomplete and fallible
• Before we adopt a model, we need to know:
– What aspects of the behaviour, design, modes of failure or usage the model helps us to identify
– What assumptions and simplifications it includes (explicitly or implicitly).
Slide 17
Formality
• Formal test models
– Derived from analyses of requirements or code
– Quantitative coverage measures can (mostly) be obtained from a formal test model
• Informal test models
– E.g. some models are just lists of modes of failure, risks or vulnerabilities
– Informal models cannot be used to define quantitative coverage measures
• Ad-hoc models
– Some models are ad-hoc, invented by the tester just before or even during testing
– Can be formal or informal.
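The coverage idea behind formal models can be sketched in a few lines of Python. This is an illustrative sketch, not from the talk: with a finite set of model items, coverage is simply the fraction of items exercised.

```python
# A formal model makes coverage quantifiable: given a finite set of model
# items (partitions, branches, states...), coverage is the fraction of
# items the tests exercise. The partition labels here are illustrative.

model_items = {"m < 1", "1 <= m <= 12", "m > 12"}
items_exercised = {"1 <= m <= 12", "m > 12"}

coverage = len(items_exercised & model_items) / len(model_items)
print(f"Partition coverage: {coverage:.0%}")  # prints "Partition coverage: 67%"
```

An informal model (a list of worries or risks) has no such denominator, which is exactly why it cannot yield a quantitative coverage measure.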
Slide 18
Examples of Models
Slide 19
Basic test design techniques are
based on the simplest models
• Equivalence partitions and boundary values:
– Presume single input, single output responses
– All values in partitions are equivalent, but the
boundaries are the most important
• These techniques are useful, but they date
from the 'green-screen' era.
Slide 20
“Green Screen” equivalence model
• Single input, single output
• All input is classified and
partitioned with rules
• One test per rule is
enough!
• But we don't consider:
– The state of the system
– Combinations of values.
Slide 21
If m < 1 then "Error"
Else if m > 12 then "Error"
Else "OK"
Single Input
Single Output
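The rule above can be written as a few lines of Python together with its boundary-value tests; the function name `validate_month` is a hypothetical label for the slide's single-input, single-output rule.

```python
# The month-validation rule from the slide: one input, one output.
def validate_month(m: int) -> str:
    """Return "Error" for m < 1 or m > 12, otherwise "OK"."""
    if m < 1:
        return "Error"
    if m > 12:
        return "Error"
    return "OK"

# Boundary values sit on either side of each partition edge;
# the technique says these four values are the most important tests.
boundary_tests = {0: "Error", 1: "OK", 12: "OK", 13: "Error"}

for value, expected in boundary_tests.items():
    assert validate_month(value) == expected, value
```

Note what the model ignores: the state of the system and combinations of values, exactly as the next slide points out.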
State Transition Testing
Slide 22
[State-transition diagram: hotel room booking]
States: Start State, Room Requested, Room Booked, On Waiting List, Overnight Stay, Booking Cancelled
Transitions (event / action):
– Room request / none
– Room available / decrement room count
– No room available / add to waiting list
– Customer arrives / none
– Customer pays / increment room count
– Customer cancels / remove from waiting list
– Customer cancels / increment room count
– Checkout / room available, decrement room count
But the number of states is infinite!
• State-Transition considers:
– The states of the system and
– The valid/invalid transitions between states
• Some systems have many, many states
– A real-time system, e.g. a telecoms switch, may have 25,000 distinct states
– State may depend on many variables that can have infinite values in combination
• How confident can we be in this model?
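One way to make a state-transition model explicit is to encode the valid transitions and check test paths against them. This is a minimal sketch based on the booking diagram; the dictionary layout and event names are illustrative assumptions, not the talk's notation.

```python
# A toy state-transition model of the hotel-booking example. Only the
# transitions listed here are valid; anything else is rejected.
VALID_TRANSITIONS = {
    ("Start", "room request"): "Room Requested",
    ("Room Requested", "room available"): "Room Booked",
    ("Room Requested", "no room available"): "On Waiting List",
    ("On Waiting List", "room available"): "Room Booked",
    ("On Waiting List", "customer cancels"): "Start",
    ("Room Booked", "customer cancels"): "Start",
    ("Room Booked", "customer arrives"): "Overnight Stay",
    ("Overnight Stay", "checkout"): "Start",
}

def next_state(state: str, event: str) -> str:
    """Follow one transition, raising on an invalid (state, event) pair."""
    try:
        return VALID_TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError(f"invalid transition: {event!r} in state {state!r}")

# A valid path through the model returns to the start state...
state = "Start"
for event in ["room request", "room available", "customer arrives", "checkout"]:
    state = next_state(state, event)
assert state == "Start"

# ...and an invalid transition is rejected.
try:
    next_state("Start", "checkout")
    raise AssertionError("expected an invalid transition")
except ValueError:
    pass
```

The slide's warning applies directly: this table has a handful of states, but the real system's state also depends on variables (room counts, waiting-list contents) whose combinations the model deliberately ignores.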
Slide 23
End-to-end/transaction-flow tests
• End-to-end tests can follow a path through a
process or a user journey
• The mechanics of the experience are
simulated but…
Slide 24
Bad experience leads to attrition
Slide 25
• Typical form-filling on government sites intended to allow citizens to 'apply online'

[Funnel: a seven-page journey]
Conversion by page: 45%  72%  48%  21%  85%  80%
Cumulative:         45%  32%  16%   3%   3%   2%

• Every page 'works' but the user experience is so poor that only 2% finish the journey
• Modelling the journey is good, but not enough…
• We need to model the experience too.
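The cumulative figures on the slide are just the running product of the per-page conversion rates, which a few lines of Python can reproduce (rates taken from the slide, rounded to whole percentages):

```python
# Cumulative conversion through a multi-page journey is the running
# product of the per-page conversion rates.
page_conversion = [0.45, 0.72, 0.48, 0.21, 0.85, 0.80]

cumulative = []
running = 1.0
for rate in page_conversion:
    running *= rate
    cumulative.append(running)

print([f"{c:.0%}" for c in cumulative])
# ['45%', '32%', '16%', '3%', '3%', '2%']
```

This is why a journey of individually "working" pages can still lose 98% of its users: the small losses multiply.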
Models and
Stakeholders
Slide 26
Stakeholders and test models
• Stakeholders may not tell testers to use
specific test models; testers need to explain the
models so that stakeholders understand them
• The challenge(s):
– Stakeholders may be of the opinion that the
models you propose generate too few tests to be
meaningful or too many to be economic
– We need to engage stakeholders.
Slide 27
'Measuring quality' feels good but…
• Measurable quality attributes make techies feel
good, but they don't help stakeholders if they
can't be related to experience
• If statistics don't inform the stakeholders'
vision or model of quality
– We think we do a good job
– They think we waste their time and money.
Slide 28
Relevance
• Documented or not, testers need and use models
to identify what is important and what to test
• A control flow graph has meaning (and value) to a
programmer but not to an end-user
• An equivalence partition may have meaning to
users but not the CEO of the company
• Control flow, equivalence partitions are models
that have value in some, but never all, contexts.
Slide 29
Helping stakeholders to make
better decisions is the tester's goal
• We need models that
– Do more than identify tests
– Take account of the stakeholders' perspective and
have meaning in the context of their decision-
making
• If we 'measure quality' using technical models
– We delude both our stakeholders and ourselves
into thinking we are in control of Quality
– We're not.
Slide 30
Failures of Systems,
Failures of Models
Slide 31
F-16 bug (found in flight)
• One of the early problems was that you could
flip the plane over and the computer would
gladly let you drop a bomb or fuel tank. It
would drop, dent the wing, and then roll off.
• http://catless.ncl.ac.uk/Risks/3.44.html#subj1.1
Slide 32
Poor test model
Slide 33
Poor test model
Slide 34
Poor test model
Scope of testing for E-Commerce
[Diagram: four nested test scopes]
1. Application (objects) sub-system
2. Web sub-system
3. Order Processing sub-system
4. Full E-Business system
Components shown: Web Server, Database Server, Banking System (Credit Card Processor), Legacy System(s)
Surrounding context: People, Process, Training, Environment
Slide 35
Test strategy
Slide 36
• Our test strategy must align with our model
of quality and our risk-assessment
Test Phase: Focus
Requirements, design etc.: Relevance, correctness, completeness, ambiguity etc.
Component: Input validation, correct behaviour, output validation, statement and branch coverage
Integration: Correct, authorised transfer of control, exchange of data, consistency of use and reconciliations
System: End-to-end accuracy, consistency, security, performance and reliability
Acceptance: Alignment to business goals, end-to-end ease of use and experience, successful outcomes, user personas
Every focus area requires test
model(s)
Failure of testing is usually a failure
in a test model
• If the right models are selected, and commitment
is made to cover them
– The testing usually gets done
• But often, no model is explicitly selected at all
• Where a model fails, it is usually wrong because:
– The model does not represent reality
– The scope of the model is too narrow
– The model ignores critical aspects (context, people,
process, environment or training/capability).
Slide 37
Close
• We need to understand what quality is before we can pursue and achieve it
• Testing often fails because test models are not used or understood
• Testers need models to test but the 'standard' quality models are too simple
• We need to take stakeholder views into account to create relevant testing models
• Using models sounds techy, but it's completely natural – it's part of what makes us human.
Slide 38
The Pursuit of Quality: Chasing
Tornadoes or Just Hot Air?
Slide 39
gerrardconsulting.com
testela.com
test-axioms.com
uktmf.com
http://catless.ncl.ac.uk/Risks/10.10#subj6.1 (manufacturer response in RED)
• BA flight from New York to Fairbanks in the US
• Co-pilot entered new navigational data into the system and mis-typed a PIN code
• System response:
– "Invalid PIN number selected"
– "Access violation, contact your credit institution if you believe there is an error."
• All the plane's controls froze and it refused to respond to commands
• SOS call to manufacturer at Aerospatiale in France…
• "The pasty little Englishman probably had too many meat pies and Guinness."
Slide 40
Problem traced to the ATM-6000 INS
computer: a modified ATM product
• "The system will automatically remove the restrictions
at the start of the next banking day”
• Apparently, manual control could be reset if a crewmember went to the back of the plane and operated the elevators manually
• “There is nothing wrong with ze plane, that a little pinch in the rear will not cure. Just like a woman. If these English knew anything about women, they would never have had to call us.”
• "The plane was able to safely land at Denver's Stapleton airport, where the craft was repaired and all crewmembers' credit histories reviewed."
Slide 41