Continuous Improvement (GroupOn, Palo Alto 2013)
TRANSCRIPT
Noah Sussman · noahsussman.com · @noahsussman
How Continuous Delivery is Changing Quality Assurance
Continuous Improvement, GroupOn, Palo Alto, January 15, 2013
film still from The Lord of the Rings
The canonical Agile release cycle
Cocento Tecnologia on Flickr
Sprints of two or more weeks in length
TheMasonDixon on Etsy
Start deployment once the sprint is over
SisterDimension on Flickr
QA is part of the release process
film still from The Lord of the Rings
QA sign-off is required before going live
evoo73 on Flickr
The Continuous release cycle
Travis S on Flickr
Minimum viable feature set
Releasing a feature is decoupled from deploying code
David E Smith on Flickr
"An airport without an air traffic controller"
— Chad Dickerson
Etsy
Real-time data on how releases impact revenue
Default to open access
Constant tweaks to live features
dogpose on Flickr
Large features are deployed piecemeal over time
NASA
Every feature is part of an A/B campaign
Joy and Jon
Dark launches
NASA
Opt-in experiments
NASA
Partial rollouts
Wikipedia
Config Flags
Joe Thomissen on Flickr
Wire-Offs
```php
$cfg = array(
    'checkout'   => true,
    'homepage'   => true,
    'profiles'   => true,
    'new_search' => false,
);

if ($cfg['new_search']) {
    // new hotness
    $resp = search_solr();
} else {
    // old busted
    $resp = search_grep();
}
```
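The same wire-off pattern extends naturally to partial rollouts: instead of a boolean, a flag can hold a rollout percentage, and each user is bucketed deterministically. Here is a minimal sketch in Python; the function name, the config values, and the hash-based bucketing scheme are illustrative assumptions, not Etsy's actual implementation:

```python
import hashlib

def is_enabled(flag_percent, user_id):
    """Deterministically bucket a user into a percentage rollout.

    Hashing the user id means the same user always gets the same
    answer, so their experience stays stable across requests.
    """
    digest = hashlib.md5(str(user_id).encode()).hexdigest()
    bucket = int(digest, 16) % 100  # a stable number in 0..99
    return bucket < flag_percent

# Hypothetical config: new_search is live for 10% of users,
# everything else is fully on.
cfg = {"checkout": 100, "homepage": 100, "profiles": 100, "new_search": 10}

# Pick the search backend for one (hypothetical) user.
resp_source = "solr" if is_enabled(cfg["new_search"], user_id=42) else "grep"
```

Ramping a feature up is then just a config change (10 → 50 → 100), with no code deploy in between.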
NASA
There is no "done done"
NASA
Observed Behavior Of Complex Systems
NASA
Emergent behaviors require unplanned responses
NASA
Improvements are discovered rather than designed
NASA
Users of the system have complex expectations
NASA
Complex systems are never "complete"
NASA
QA Happens When
NASA
First of all, what is "Quality Assurance"?
NASA
QA ≠ assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyone's job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth: there are a finite number of bugs
niscratz on Flickr
Myth: there are a finite number of detectable bugs
NASA
Myth: all severity-one bugs can be found before release
Fred Brooks at Etsy
Myth: software is built to specifications
Myth: at some point, software is finished
Myth: most bugs have complex, unpredictable causes
"The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmer's own creation."
— Edsger Dijkstra
"The whole time I'm programming, I'm constantly checking my assumptions."
— Rasmus Lerdorf
loriabys
"As you're about to add a comment, ask yourself, 'How can I improve the code so that this comment isn't needed?' Improve the code and then document it to make it even clearer."
— Steve McConnell
"Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it."
— Brian Kernighan
No blame
Many Small Anomalies Combined
"An organization's defenses against failure are a series of barriers, represented as slices of Swiss cheese. The holes in the cheese represent weaknesses in individual parts of the system. Failures occur when a hazard passes through all of the holes in all of the defenses."
— Wikipedia
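The arithmetic behind the Swiss cheese model is easy to sketch: if each defensive layer independently lets a hazard through with some probability, the chance of a hazard slipping through every layer shrinks multiplicatively. A toy illustration, with invented per-layer probabilities:

```python
# Probability that a hazard passes each independent defensive layer
# (e.g. code review, unit tests, staging, monitoring).
# These numbers are made up for illustration.
hole_probs = [0.3, 0.2, 0.25, 0.1]

# A failure requires passing through ALL the holes, so the
# probabilities multiply.
p_through_all = 1.0
for p in hole_probs:
    p_through_all *= p

print(f"{p_through_all:.4f}")  # 0.3 * 0.2 * 0.25 * 0.1 ≈ 0.0015
```

No single layer is anywhere near perfect, yet the stack of imperfect layers is strong; which is why the talk argues for many cheap defenses over one heroic gate.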
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience, Not "Quality"
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual Testing (but probably not the kind you're thinking of)
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012, Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
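Human eyes on graphs can still be backed up by automated first-pass checks that flag the obvious outliers before anyone looks. One common, simple technique (not from the talk, just a sketch) is comparing each point against the mean and standard deviation of a trailing window:

```python
from statistics import mean, stdev

def anomalies(series, window=5, k=3.0):
    """Flag indices whose value is more than k standard deviations
    from the mean of the preceding window. A crude first-pass
    detector; real monitoring systems account for trends and
    seasonality too."""
    flagged = []
    for i in range(window, len(series)):
        prior = series[i - window:i]
        mu, sigma = mean(prior), stdev(prior)
        if sigma > 0 and abs(series[i] - mu) > k * sigma:
            flagged.append(i)
    return flagged

# A hypothetical metric stream with one obvious spike.
metrics = [100, 102, 99, 101, 100, 98, 300, 101, 100]
print(anomalies(metrics))  # the spike at index 6
```

Anything the detector flags still ends up in front of a human; deciding whether an anomaly matters remains, as the slide says, a human problem.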
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
"Assurance" is a terrible word. Let's discard it.
NASA
Quality exists, but it's tricky to assure or prove that
NASA
There's no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
"Customer Experience" is a better term of art than "Quality"
NASA
Customer Experience (though there's no formal proof for that either)
NASA
Exploratory Testing addresses areas that Developer Testing doesn't
NASA
Developer Testing validates assumptions
NASA
The Independent Tester's job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
"Most bugs, most of the time, are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level."
— Eric S. Raymond
NASA
Source diffs, logs: if your QA Analysts don't look at these, teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interfaces with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Efficiency-Thoroughness Trade-Off
NASA
Rapid release cycles have different risks than slower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state; always roll forward. When it's desirable to revert a previous change, do that as part of a new commit.
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of "last stable release"
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further Reading
"How Google Tests Software", James Whittaker
"Look At Your Data", John Rauser
"Optimizing For Developer Happiness", Chad Dickerson
"Outages, Postmortems and Human Error", John Allspaw
http://en.wikipedia.org/wiki/Swiss_cheese_model
"What Is Exploratory Testing?", James Bach
Questions?
noahsussman · noahsussman.com · infiniteundo.com
![Page 2: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/2.jpg)
film still from The Lord of the Rings
The canonical Agile release cycle
Cocento Tecnologia on Flickr
Sprints of two or more weeks in length
TheMasonDixon on Etsy
Start deployment once the sprint is over
SisterDimension on Flickr
QA is part of the release process
film still from The Lord of the Rings
QA sign-off is required before going live
evoo73 on Flickr
The Continuous release cycle
Travis S on Flickr
Minimum viable feature set
Releasing a feature is decoupled from deploying code
David E Smith on Flickr
An airport without an air traffic controller
mdashChad Dickerson
Etsy
Real-time data on how releases impact revenue
Default to open access
Constant tweaks to live features
dogpose on Flickr
Large features are deployed piecemeal over time
NASA
Every feature is part of an AB campaign
Joy and Jon
Dark launches
NASA
Opt-in experiments
NASA
Partial rollouts
Wikipedia
Config Flags
Joe Thomissen on Flickr
Wire-Offs
if ($cfg[new_search]) new hotness$resp = search_solr()
else old busted$resp = search_grep()
$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)
NASA
There is no ldquodone donerdquo
NASA
Observed Behavior Of Complex Systems
NASA
Emergent behaviors require unplanned responses
NASA
Improvements are discovered rather than designed
NASA
Users of the system have complex expectations
NASA
Complex systems are never ldquocompleterdquo
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 3: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/3.jpg)
Cocento Tecnologia on Flickr
Sprints of two or more weeks in length
TheMasonDixon on Etsy
Start deployment once the sprint is over
SisterDimension on Flickr
QA is part of the release process
film still from The Lord of the Rings
QA sign-off is required before going live
evoo73 on Flickr
The Continuous release cycle
Travis S on Flickr
Minimum viable feature set
Releasing a feature is decoupled from deploying code
David E Smith on Flickr
An airport without an air traffic controller
mdashChad Dickerson
Etsy
Real-time data on how releases impact revenue
Default to open access
Constant tweaks to live features
dogpose on Flickr
Large features are deployed piecemeal over time
NASA
Every feature is part of an AB campaign
Joy and Jon
Dark launches
NASA
Opt-in experiments
NASA
Partial rollouts
Wikipedia
Config Flags
Joe Thomissen on Flickr
Wire-Offs
if ($cfg[new_search]) new hotness$resp = search_solr()
else old busted$resp = search_grep()
$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)
NASA
There is no ldquodone donerdquo
NASA
Observed Behavior Of Complex Systems
NASA
Emergent behaviors require unplanned responses
NASA
Improvements are discovered rather than designed
NASA
Users of the system have complex expectations
NASA
Complex systems are never ldquocompleterdquo
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 4: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/4.jpg)
TheMasonDixon on Etsy
Start deployment once the sprint is over
SisterDimension on Flickr
QA is part of the release process
film still from The Lord of the Rings
QA sign-off is required before going live
evoo73 on Flickr
The Continuous release cycle
Travis S on Flickr
Minimum viable feature set
Releasing a feature is decoupled from deploying code
David E Smith on Flickr
An airport without an air traffic controller
mdashChad Dickerson
Etsy
Real-time data on how releases impact revenue
Default to open access
Constant tweaks to live features
dogpose on Flickr
Large features are deployed piecemeal over time
NASA
Every feature is part of an AB campaign
Joy and Jon
Dark launches
NASA
Opt-in experiments
NASA
Partial rollouts
Wikipedia
Config Flags
Joe Thomissen on Flickr
Wire-Offs
if ($cfg[new_search]) new hotness$resp = search_solr()
else old busted$resp = search_grep()
$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)
NASA
There is no ldquodone donerdquo
NASA
Observed Behavior Of Complex Systems
NASA
Emergent behaviors require unplanned responses
NASA
Improvements are discovered rather than designed
NASA
Users of the system have complex expectations
NASA
Complex systems are never ldquocompleterdquo
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 5: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/5.jpg)
SisterDimension on Flickr
QA is part of the release process
film still from The Lord of the Rings
QA sign-off is required before going live
evoo73 on Flickr
The Continuous release cycle
Travis S on Flickr
Minimum viable feature set
Releasing a feature is decoupled from deploying code
David E Smith on Flickr
An airport without an air traffic controller
mdashChad Dickerson
Etsy
Real-time data on how releases impact revenue
Default to open access
Constant tweaks to live features
dogpose on Flickr
Large features are deployed piecemeal over time
NASA
Every feature is part of an AB campaign
Joy and Jon
Dark launches
NASA
Opt-in experiments
NASA
Partial rollouts
Wikipedia
Config Flags
Joe Thomissen on Flickr
Wire-Offs
if ($cfg[new_search]) new hotness$resp = search_solr()
else old busted$resp = search_grep()
$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)
NASA
There is no ldquodone donerdquo
NASA
Observed Behavior Of Complex Systems
NASA
Emergent behaviors require unplanned responses
NASA
Improvements are discovered rather than designed
NASA
Users of the system have complex expectations
NASA
Complex systems are never ldquocompleterdquo
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
"An organization's defenses against failure are a series of barriers, represented as slices of Swiss cheese. The holes in the cheese represent weaknesses in individual parts of the system. Failures occur when a hazard passes through all of the holes in all of the defenses."
— Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience, Not "Quality"
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual Testing
But probably not the kind you're thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012, Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
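One crude way to automate a first pass over the graphs is to flag points that sit far outside a trailing window, leaving the judgment calls to humans. An illustrative Python sketch, not from the talk (the window size and threshold are arbitrary assumptions):

```python
from statistics import mean, stdev

def anomalies(series, window=10, threshold=3.0):
    """Return indices of points more than `threshold` standard
    deviations from the trailing window's mean -- a crude stand-in
    for the human eye scanning a dashboard."""
    flagged = []
    for i in range(window, len(series)):
        recent = series[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma and abs(series[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged
```

Real anomaly detection has to cope with seasonality, deploys, and noisy metrics, which is exactly why the slides call it hard and keep people in the loop.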
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
"Assurance" is a terrible word. Let's discard it.
NASA
Quality exists, but it's tricky to assure or prove that
NASA
There's no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
"Customer Experience" is a better term of art than "Quality"
NASA
Customer Experience
Though there's no formal proof for that, either
NASA
Exploratory Testing addresses areas that Developer Testing doesn't
NASA
Developer Testing validates assumptions
NASA
The Independent Tester's job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
"Most bugs, most of the time, are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level."
— Eric S. Raymond
NASA
Source diffs, logs. If your QA Analysts don't look at these, teach them.
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interfaces with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Efficiency-Thoroughness Trade-Off
NASA
Rapid release cycles have different risks than slower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state. Always roll forward. When it's desirable to revert a previous change, do that as part of a new commit.
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of "last stable release"
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further Reading
"How Google Tests Software," James Whittaker
"Look At Your Data," John Rauser
"Optimizing For Developer Happiness," Chad Dickerson
"Outages, Postmortems, and Human Error," John Allspaw
http://en.wikipedia.org/wiki/Swiss_cheese_model
"What Is Exploratory Testing?," James Bach
Questions?
noahsussman, ns@noahsussman.com, infiniteundo.com
![Page 6: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/6.jpg)
film still from The Lord of the Rings
QA sign-off is required before going live
evoo73 on Flickr
The Continuous release cycle
Travis S on Flickr
Minimum viable feature set
Releasing a feature is decoupled from deploying code
David E Smith on Flickr
An airport without an air traffic controller
mdashChad Dickerson
Etsy
Real-time data on how releases impact revenue
Default to open access
Constant tweaks to live features
dogpose on Flickr
Large features are deployed piecemeal over time
NASA
Every feature is part of an AB campaign
Joy and Jon
Dark launches
NASA
Opt-in experiments
NASA
Partial rollouts
Wikipedia
Config Flags
Joe Thomissen on Flickr
Wire-Offs
if ($cfg[new_search]) new hotness$resp = search_solr()
else old busted$resp = search_grep()
$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)
NASA
There is no ldquodone donerdquo
NASA
Observed Behavior Of Complex Systems
NASA
Emergent behaviors require unplanned responses
NASA
Improvements are discovered rather than designed
NASA
Users of the system have complex expectations
NASA
Complex systems are never ldquocompleterdquo
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 7: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/7.jpg)
evoo73 on Flickr
The Continuous release cycle
Travis S on Flickr
Minimum viable feature set
Releasing a feature is decoupled from deploying code
David E Smith on Flickr
An airport without an air traffic controller
mdashChad Dickerson
Etsy
Real-time data on how releases impact revenue
Default to open access
Constant tweaks to live features
dogpose on Flickr
Large features are deployed piecemeal over time
NASA
Every feature is part of an AB campaign
Joy and Jon
Dark launches
NASA
Opt-in experiments
NASA
Partial rollouts
Wikipedia
Config Flags
Joe Thomissen on Flickr
Wire-Offs
if ($cfg[new_search]) new hotness$resp = search_solr()
else old busted$resp = search_grep()
$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)
NASA
There is no ldquodone donerdquo
NASA
Observed Behavior Of Complex Systems
NASA
Emergent behaviors require unplanned responses
NASA
Improvements are discovered rather than designed
NASA
Users of the system have complex expectations
NASA
Complex systems are never ldquocompleterdquo
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 8: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/8.jpg)
Travis S on Flickr
Minimum viable feature set
Releasing a feature is decoupled from deploying code
David E Smith on Flickr
An airport without an air traffic controller
mdashChad Dickerson
Etsy
Real-time data on how releases impact revenue
Default to open access
Constant tweaks to live features
dogpose on Flickr
Large features are deployed piecemeal over time
NASA
Every feature is part of an AB campaign
Joy and Jon
Dark launches
NASA
Opt-in experiments
NASA
Partial rollouts
Wikipedia
Config Flags
Joe Thomissen on Flickr
Wire-Offs
if ($cfg[new_search]) new hotness$resp = search_solr()
else old busted$resp = search_grep()
$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)
NASA
There is no ldquodone donerdquo
NASA
Observed Behavior Of Complex Systems
NASA
Emergent behaviors require unplanned responses
NASA
Improvements are discovered rather than designed
NASA
Users of the system have complex expectations
NASA
Complex systems are never ldquocompleterdquo
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 9: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/9.jpg)
Releasing a feature is decoupled from deploying code
David E Smith on Flickr
An airport without an air traffic controller
mdashChad Dickerson
Etsy
Real-time data on how releases impact revenue
Default to open access
Constant tweaks to live features
dogpose on Flickr
Large features are deployed piecemeal over time
NASA
Every feature is part of an AB campaign
Joy and Jon
Dark launches
NASA
Opt-in experiments
NASA
Partial rollouts
Wikipedia
Config Flags
Joe Thomissen on Flickr
Wire-Offs
if ($cfg[new_search]) new hotness$resp = search_solr()
else old busted$resp = search_grep()
$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)
NASA
There is no ldquodone donerdquo
NASA
Observed Behavior Of Complex Systems
NASA
Emergent behaviors require unplanned responses
NASA
Improvements are discovered rather than designed
NASA
Users of the system have complex expectations
NASA
Complex systems are never ldquocompleterdquo
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further Reading
"How Google Tests Software," James Whittaker
"Look At Your Data," John Rauser
"Optimizing For Developer Happiness," Chad Dickerson
"Outages, Postmortems, and Human Error," John Allspaw
http://en.wikipedia.org/wiki/Swiss_cheese_model
"What Is Exploratory Testing?," James Bach
Questions?
noahsussman · ns@noahsussman.com · infiniteundo.com
![Page 10: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/10.jpg)
![Page 11: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/11.jpg)
![Page 12: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/12.jpg)
![Page 13: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/13.jpg)
![Page 14: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/14.jpg)
dogpose on Flickr
Large features are deployed piecemeal over time
NASA
Every feature is part of an AB campaign
Joy and Jon
Dark launches
NASA
Opt-in experiments
NASA
Partial rollouts
Wikipedia
Config Flags
Joe Thomissen on Flickr
Wire-Offs
if ($cfg[new_search]) new hotness$resp = search_solr()
else old busted$resp = search_grep()
$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)
NASA
There is no ldquodone donerdquo
NASA
Observed Behavior Of Complex Systems
NASA
Emergent behaviors require unplanned responses
NASA
Improvements are discovered rather than designed
NASA
Users of the system have complex expectations
NASA
Complex systems are never ldquocompleterdquo
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs, logs. If your QA Analysts don't look at these — teach them.
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interfaces with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Efficiency-Thoroughness Trade-Off
NASA
Rapid release cycles have different risks than slower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks, services, and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state. Always roll forward. When it's desirable to revert a previous change, do that as part of a new commit.
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of a "last stable release"
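"An SCM revert is a changeset" is the whole fail-forward idea in one sentence: history only ever grows, and undoing a bad change means appending its inverse, never rewinding to an old state. A toy model — the data structures are mine, with config-style values standing in for deployed code:

```python
history = []          # append-only deploy log, like commits on trunk

def deploy(change):
    """change is (key, old_value, new_value)."""
    history.append(change)

def revert(index):
    """Roll forward: append a new change that undoes history[index]."""
    key, old, new = history[index]
    deploy((key, new, old))          # the inverse, as a brand-new entry

def current_state():
    """Replay the log from the beginning to get the live state."""
    state = {}
    for key, _old, new in history:
        state[key] = new
    return state

deploy(("search", "grep", "solr"))   # ship the new hotness
deploy(("checkout", "v1", "v2"))
revert(0)                            # solr misbehaves; fail forward

print(current_state())   # search is back on grep, checkout keeps v2...
print(len(history))      # ...but history has 3 entries, not 1
```

Nothing was rewound: the bad change and its undo are both on the record, which is exactly what makes small changesets easy to debug after the fact.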
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further Reading
"How Google Tests Software", James Whittaker
"Look At Your Data", John Rauser
"Optimizing For Developer Happiness", Chad Dickerson
"Outages, Postmortems, and Human Error", John Allspaw
http://en.wikipedia.org/wiki/Swiss_cheese_model
"What Is Exploratory Testing?", James Bach
Questions?
noahsussman | ns@noahsussman.com | infiniteundo.com
![Page 15: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/15.jpg)
NASA
Every feature is part of an AB campaign
Joy and Jon
Dark launches
NASA
Opt-in experiments
NASA
Partial rollouts
Wikipedia
Config Flags
Joe Thomissen on Flickr
Wire-Offs
if ($cfg[new_search]) new hotness$resp = search_solr()
else old busted$resp = search_grep()
$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)
NASA
There is no ldquodone donerdquo
NASA
Observed Behavior Of Complex Systems
NASA
Emergent behaviors require unplanned responses
NASA
Improvements are discovered rather than designed
NASA
Users of the system have complex expectations
NASA
Complex systems are never ldquocompleterdquo
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 16: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/16.jpg)
Joy and Jon
Dark launches
NASA
Opt-in experiments
NASA
Partial rollouts
Wikipedia
Config Flags
Joe Thomissen on Flickr
Wire-Offs
if ($cfg[new_search]) new hotness$resp = search_solr()
else old busted$resp = search_grep()
$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)
NASA
There is no ldquodone donerdquo
NASA
Observed Behavior Of Complex Systems
NASA
Emergent behaviors require unplanned responses
NASA
Improvements are discovered rather than designed
NASA
Users of the system have complex expectations
NASA
Complex systems are never ldquocompleterdquo
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 17: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/17.jpg)
NASA
Opt-in experiments
NASA
Partial rollouts
Wikipedia
Config Flags
Joe Thomissen on Flickr
Wire-Offs
if ($cfg[new_search]) new hotness$resp = search_solr()
else old busted$resp = search_grep()
$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)
NASA
There is no ldquodone donerdquo
NASA
Observed Behavior Of Complex Systems
NASA
Emergent behaviors require unplanned responses
NASA
Improvements are discovered rather than designed
NASA
Users of the system have complex expectations
NASA
Complex systems are never ldquocompleterdquo
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 18: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/18.jpg)
NASA
Partial rollouts
Wikipedia
Config Flags
Joe Thomissen on Flickr
Wire-Offs
if ($cfg[new_search]) new hotness$resp = search_solr()
else old busted$resp = search_grep()
$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)
NASA
There is no ldquodone donerdquo
NASA
Observed Behavior Of Complex Systems
NASA
Emergent behaviors require unplanned responses
NASA
Improvements are discovered rather than designed
NASA
Users of the system have complex expectations
NASA
Complex systems are never ldquocompleterdquo
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 19: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/19.jpg)
Wikipedia
Config Flags
Joe Thomissen on Flickr
Wire-Offs
if ($cfg[new_search]) new hotness$resp = search_solr()
else old busted$resp = search_grep()
$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)
NASA
There is no ldquodone donerdquo
NASA
Observed Behavior Of Complex Systems
NASA
Emergent behaviors require unplanned responses
NASA
Improvements are discovered rather than designed
NASA
Users of the system have complex expectations
NASA
Complex systems are never ldquocompleterdquo
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Tester's job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
"Most bugs, most of the time, are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level."
— Eric S. Raymond
NASA
Source diffs and logs: if your QA Analysts don't look at these, teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interfaces with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Efficiency-Thoroughness Trade-Off
NASA
Rapid release cycles have different risks than slower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state. Always roll forward. When it's desirable to revert a previous change, do that as part of a new commit.
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of "last stable release"
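"An SCM revert is a changeset" can be sketched as a toy model: the deploy history is append-only, and undoing a bad change means appending its inverse rather than rolling back. This is only a conceptual sketch in Python — the function names and config-flag example are invented, not anyone's actual deploy tooling:

```python
# Fail forward: history is append-only. "Reverting" a bad change means
# appending a new changeset that undoes it; HEAD moves forward, never back.

def apply_changeset(config, changeset):
    """Apply a dict of key -> new value, returning a new config."""
    new = dict(config)
    new.update(changeset)
    return new

def invert(changeset, old_config):
    """Build a new changeset restoring each touched key's previous value."""
    return {key: old_config[key] for key in changeset}

history = []                        # append-only: no rollbacks
config = {"new_search": False}

bad = {"new_search": True}          # ship it...
undo = invert(bad, config)          # ...capturing the inverse first
config = apply_changeset(config, bad)
history.append(bad)

config = apply_changeset(config, undo)   # roll forward with the fix
history.append(undo)

print(config, len(history))  # {'new_search': False} 2 -- two deploys, zero rollbacks
```

The end state matches the pre-deploy state, but the record shows two small, debuggable changesets instead of a rewound history — the same effect `git revert` achieves by creating a new commit.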
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further Reading
"How Google Tests Software", James Whittaker
"Look at Your Data", John Rauser
"Optimizing for Developer Happiness", Chad Dickerson
"Outages, Postmortems, and Human Error", John Allspaw
http://en.wikipedia.org/wiki/Swiss_cheese_model
"What Is Exploratory Testing?", James Bach
Questions
noahsussman · ns@noahsussman.com · infiniteundo.com
![Page 20: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/20.jpg)
Joe Thomissen on Flickr
Wire-Offs
if ($cfg[new_search]) new hotness$resp = search_solr()
else old busted$resp = search_grep()
$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)
NASA
There is no ldquodone donerdquo
NASA
Observed Behavior Of Complex Systems
NASA
Emergent behaviors require unplanned responses
NASA
Improvements are discovered rather than designed
NASA
Users of the system have complex expectations
NASA
Complex systems are never ldquocompleterdquo
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 21: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/21.jpg)
if ($cfg[new_search]) new hotness$resp = search_solr()
else old busted$resp = search_grep()
$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)
NASA
There is no ldquodone donerdquo
NASA
Observed Behavior Of Complex Systems
NASA
Emergent behaviors require unplanned responses
NASA
Improvements are discovered rather than designed
NASA
Users of the system have complex expectations
NASA
Complex systems are never ldquocompleterdquo
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 22: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/22.jpg)
$cfg = array( checkout =gt true homepage =gt true profiles =gt true new_search =gt false)
NASA
There is no ldquodone donerdquo
NASA
Observed Behavior Of Complex Systems
NASA
Emergent behaviors require unplanned responses
NASA
Improvements are discovered rather than designed
NASA
Users of the system have complex expectations
NASA
Complex systems are never ldquocompleterdquo
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 23: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/23.jpg)
NASA
There is no ldquodone donerdquo
NASA
Observed Behavior Of Complex Systems
NASA
Emergent behaviors require unplanned responses
NASA
Improvements are discovered rather than designed
NASA
Users of the system have complex expectations
NASA
Complex systems are never ldquocompleterdquo
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 24: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/24.jpg)
NASA
Observed Behavior Of Complex Systems
NASA
Emergent behaviors require unplanned responses
NASA
Improvements are discovered rather than designed
NASA
Users of the system have complex expectations
NASA
Complex systems are never ldquocompleterdquo
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
"Most bugs, most of the time, are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level."
— Eric S. Raymond
NASA
Source diffs, logs: if your QA Analysts don't look at these, teach them.
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interfaces with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Efficiency-Thoroughness Trade-Off
NASA
Rapid release cycles have different risks than slower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state. Always roll forward. When it's desirable to revert a previous change, do that as part of a new commit.
NASA
Instead of rolling back fix the problem and move on
NASA
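A toy model of failing forward (invented names, not any particular SCM's API): history only ever grows, a revert is just another changeset appended to it, and what you deploy is always the state at HEAD.

```python
# Toy model of fail-forward: history only ever grows; a revert is just
# another changeset appended, and you always deploy the HEAD state.
def apply(state, changeset):
    new = dict(state)
    new.update(changeset)
    return new

def head(history, base=None):
    """Replay every changeset in order to compute the deployable state."""
    state = dict(base or {})
    for cs in history:
        state = apply(state, cs)
    return state

history = [{"search": "grep"}]        # old busted
history.append({"search": "solr"})    # new hotness -- found broken in prod
history.append({"search": "grep"})    # roll FORWARD: the revert is a changeset
print(head(history))                  # prints {'search': 'grep'}
print(len(history))                   # prints 3 -- nothing was discarded
```

The last line is the point: there is no "last stable release" to return to, only an ever-growing history whose tip is what runs in production.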
Let go of the idea of "last stable release"
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further Reading
"How Google Tests Software," James Whittaker
"Look At Your Data," John Rauser
"Optimizing For Developer Happiness," Chad Dickerson
"Outages, Postmortems, and Human Error," John Allspaw
http://en.wikipedia.org/wiki/Swiss_cheese_model
"What Is Exploratory Testing?," James Bach
Questions?
Noah Sussman · noahsussman.com · infiniteundo.com
![Page 25: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/25.jpg)
NASA
Emergent behaviors require unplanned responses
NASA
Improvements are discovered rather than designed
NASA
Users of the system have complex expectations
NASA
Complex systems are never ldquocompleterdquo
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 26: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/26.jpg)
NASA
Improvements are discovered rather than designed
NASA
Users of the system have complex expectations
NASA
Complex systems are never ldquocompleterdquo
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 27: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/27.jpg)
NASA
Users of the system have complex expectations
NASA
Complex systems are never ldquocompleterdquo
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 28: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/28.jpg)
NASA
Complex systems are never ldquocompleterdquo
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 29: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/29.jpg)
NASA
QA Happens When
NASA
First of all what is ldquoQuality Assurancerdquo
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further Reading
“How Google Tests Software”, James Whittaker
“Look At Your Data”, John Rauser
“Optimizing For Developer Happiness”, Chad Dickerson
“Outages, Postmortems and Human Error”, John Allspaw
http://en.wikipedia.org/wiki/Swiss_cheese_model
“What Is Exploratory Testing?”, James Bach
Questions
Noah Sussman · @noahsussman · noahsussman.com · infiniteundo.com
![Page 30: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/30.jpg)
Library of Congress
The Jargon File
Myths About Bug Detection
Myth: there are a finite number of bugs
niscratz on Flickr
Myth: there are a finite number of detectable bugs
NASA
Myth: all severity-one bugs can be found before release
Fred Brooks at Etsy
Myth: software is built to specifications
Myth: at some point, software is finished
Myth: most bugs have complex, unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmer’s own creation.
— Edsger Dijkstra
The whole time I’m programming, I’m constantly checking my assumptions.
— Rasmus Lerdorf
loriabys
As you’re about to add a comment, ask yourself, “How can I improve the code so that this comment isn’t needed?” Improve the code and then document it to make it even clearer.
— Steve McConnell
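McConnell’s advice can be made concrete: instead of propping up unclear code with a comment, restructure the code so it states the intent itself. A minimal sketch; `User`, `Item`, and `can_edit` are hypothetical names for illustration:

```python
# Before: a comment props up an unclear condition.
#   # admins, or the item's owner, may edit
#   if user.role == "admin" or item.owner_id == user.id: ...

# After: a well-named function states the intent, and the docstring
# adds context instead of repeating the logic.
class User:
    def __init__(self, id, role):
        self.id, self.role = id, role

class Item:
    def __init__(self, owner_id):
        self.owner_id = owner_id

def can_edit(user, item):
    """Editing is allowed for admins and for the item's owner."""
    return user.role == "admin" or item.owner_id == user.id

print(can_edit(User(1, "member"), Item(owner_id=1)))  # True: owner
print(can_edit(User(2, "member"), Item(owner_id=9)))  # False: neither
```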
Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.
— Brian Kernighan
No blame
Many Small Anomalies Combined
An organization’s defenses against failure are a series of barriers, represented as slices of Swiss cheese. The holes in the cheese represent weaknesses in individual parts of the system. Failures occur when a hazard passes through all of the holes in all of the defenses.
— Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience, Not “Quality”
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual Testing. But probably not the kind you’re thinking of.
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012, Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
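Part of why anomaly detection is hard, and why the deck leans on human eyes: even the textbook approach of flagging points several standard deviations from a rolling baseline misses slow drifts and trips over regime changes. A minimal rolling z-score sketch; the function name, window size, and threshold are arbitrary choices for illustration:

```python
from collections import deque
from statistics import mean, stdev

def zscore_anomalies(series, window=10, threshold=3.0):
    """Flag points more than `threshold` standard deviations away from
    the mean of the preceding `window` points."""
    history = deque(maxlen=window)
    flagged = []
    for i, x in enumerate(series):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(x - mu) / sigma > threshold:
                flagged.append(i)
        history.append(x)
    return flagged

# A steady metric with one sudden spike: the spike stands out.
metric = [100, 101, 99, 100, 102, 98, 100, 101, 99, 100, 500, 100]
print(zscore_anomalies(metric))  # [10]
```

The sudden spike is caught, but a slow drift of the same total size would stay under the threshold at every step, and a permanent level shift keeps firing until the window refills: the judgment calls land back on whoever is watching the graph.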
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
“Assurance” is a terrible word. Let’s discard it.
NASA
Quality exists, but it’s tricky to assure or prove that
NASA
There’s no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
“Customer Experience” is a better term of art than “Quality”
NASA
Customer Experience. Though there’s no formal proof for that, either.
NASA
Exploratory Testing addresses areas that Developer Testing doesn’t
NASA
Developer Testing validates assumptions
NASA
The Independent Tester’s job is to invalidate assumptions
NASA
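The split between the two roles can be shown with a toy example; `parse_quantity` is a hypothetical function. The developer’s test validates the stated assumption ("users type plain digits"), while exploratory probes feed it inputs outside that assumption and watch what happens:

```python
def parse_quantity(text):
    """Parse a user-entered quantity. The developer assumed plain digits."""
    return int(text)

# Developer Testing: validates the stated assumption.
assert parse_quantity("3") == 3

# Independent/exploratory probes: inputs the assumption never covered.
for probe in ["", "3.5", "-1", " 3 "]:
    try:
        print(repr(probe), "->", parse_quantity(probe))
    except ValueError:
        print(repr(probe), "-> ValueError")
```

Two probes fail loudly, but "-1" and " 3 " succeed in ways the developer never intended (a negative quantity, silently stripped whitespace): exactly the kind of unstated assumption an independent tester exists to invalidate.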
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 31: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/31.jpg)
NASA
QA assuring that there are no defects
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 32: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/32.jpg)
NASA
It is impossible to prove the absence of defects
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 33: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/33.jpg)
Lukjonis
There will always be bugs in production
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 34: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/34.jpg)
Testing is everyonersquos job
Library of Congress
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 35: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/35.jpg)
The Jargon File
Myths About Bug Detection
Myth there are a finite number of bugs
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
“Assurance” is a terrible word. Let's discard it.
NASA
Quality exists, but it's tricky to assure or prove that.
NASA
There's no such thing as a formal proof of quality.
NASA
Most of us would agree that quality exists
NASA
“Customer Experience” is a better term of art than “Quality”
NASA
Customer Experience: though there's no formal proof for that, either
NASA
Exploratory Testing addresses areas that Developer Testing doesn't
NASA
Developer Testing validates assumptions
NASA
The Independent Tester's job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
— Eric S. Raymond
Most bugs, most of the time, are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level.
NASA
Source diffs, logs. If your QA Analysts don't look at these, teach them.
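When a graph goes sideways after a deploy, the diff and the logs are the first places to look. A self-contained sketch, assuming git as the SCM; the file names, commit messages, and `app.log` contents are all invented for the demo:

```shell
# Triage sketch: read the deploy diff, then scan the logs.
repo=$(mktemp -d); cd "$repo"
git init -q . && git config user.email qa@example.com && git config user.name QA
echo 'search_grep' > search.php
git add . && git commit -qm "old busted"
echo 'search_solr' > search.php
git add . && git commit -qm "new hotness"

git log --oneline HEAD~1..HEAD      # what just shipped
git diff HEAD~1 HEAD                # exactly what changed in that deploy

printf 'INFO ok\nERROR solr timeout\nINFO ok\n' > app.log   # stand-in log
grep -c ERROR app.log               # → 1
```

The point is not the specific commands but the habit: a QA analyst who can correlate an error line with the changeset that introduced it closes the loop much faster.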
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interfaces with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Efficiency-Thoroughness Trade-Off
NASA
Rapid release cycles have different risks than slower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks, services, and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state. Always roll forward. When it's desirable to revert a previous change, do that as part of a new commit.
NASA
Instead of rolling back fix the problem and move on
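The “an SCM revert is a changeset” idea is easy to see with git (assumed here as the SCM; the file and commit names are invented). `git revert` undoes an earlier change by creating a *new* commit, so history only ever moves forward:

```shell
# A revert is itself a changeset: history moves forward, never back.
repo=$(mktemp -d); cd "$repo"
git init -q . && git config user.email deploy@example.com && git config user.name Deploy
git commit -q --allow-empty -m "base"
echo "new search" > search.txt
git add search.txt && git commit -qm "enable new search"
# Undo the change as a new commit instead of resetting HEAD backwards.
git revert --no-edit HEAD >/dev/null
git rev-list --count HEAD   # → 3 (base, change, revert)
```

HEAD now points at a third commit whose tree matches the base state; nothing was rewritten or discarded, so the bad change stays in history where it can be studied.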
NASA
Let go of the idea of a “last stable release”
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further Reading
“How Google Tests Software,” James Whittaker
“Look At Your Data,” John Rauser
“Optimizing For Developer Happiness,” Chad Dickerson
“Outages, Postmortems, and Human Error,” John Allspaw
http://en.wikipedia.org/wiki/Swiss_cheese_model
“What Is Exploratory Testing?,” James Bach
Questions?
Noah Sussman · noahsussman.com · infiniteundo.com
Myth: there are a finite number of bugs
niscratz on Flickr
Myth: there are a finite number of detectable bugs
NASA
Myth: all severity-one bugs can be found before release
Fred Brooks at Etsy
Myth: software is built to specifications
Myth: at some point, software is finished
Myth: most bugs have complex, unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 37: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/37.jpg)
niscratz on Flickr
Myth here are a finite number of detectable bugs
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 38: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/38.jpg)
NASA
Myth all severity one bugs can be found before release
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 39: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/39.jpg)
Fred Brooks at Etsy
Myth software is built to specifications
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 40: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/40.jpg)
Myth at some point software is finished
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 41: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/41.jpg)
Myth most bugs have complex unpredictable causes
The animistic metaphor of the bug that maliciously sneaked in while the programmer was not looking disguises that the error is the programmers own creation
mdash Edsger Dijkstra
The whole time Irsquom programming Irsquom constantly checking my assumptions
mdashRasmus Lerdorf
loriabys
As youre about to add a comment ask yourself ldquoHow can I improve the code so that this comment isnt neededrdquo Improve the code and then document it to make it even clearer
mdash Steve McConnell
Debugging is twice as hard as writing the code in the first place Therefore if you write the code as cleverly as possible you are by definition not smart enough to debug it
mdashBrian Kernighan
No blame
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience, Not “Quality”
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual Testing (but probably not the kind you’re thinking of)
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012, Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
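Human-watched graphs can also be backed by simple automated checks. As a minimal sketch (the class and metric names here are my own, not Etsy’s actual tooling), a rolling-window detector flags a sample that lands far outside recent history:

```python
from collections import deque
from statistics import mean, stdev

class RollingAnomalyDetector:
    """Flag samples more than `threshold` standard deviations
    away from a rolling window of recent samples."""

    def __init__(self, window=60, threshold=3.0):
        self.samples = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, value):
        """Return True if `value` looks anomalous against recent history."""
        anomalous = False
        if len(self.samples) >= 10:  # wait for a little history first
            mu, sigma = mean(self.samples), stdev(self.samples)
            if sigma > 0 and abs(value - mu) > self.threshold * sigma:
                anomalous = True
        self.samples.append(value)
        return anomalous

detector = RollingAnomalyDetector()
checkouts_per_minute = [100, 102, 99, 101, 98, 103, 100, 97, 101, 99, 100, 12]
flags = [detector.observe(v) for v in checkouts_per_minute]
print(flags)  # the sudden drop to 12 is flagged
```

A crude threshold like this surfaces candidates; deciding which flagged graphs actually matter is still the human problem the next slides describe.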
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
“Assurance” is a terrible word. Let’s discard it.
NASA
Quality exists, but it’s tricky to assure or prove that it does
NASA
There’s no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
“Customer Experience” is a better term of art than “Quality”
NASA
Customer Experience (though there’s no formal proof for that either)
NASA
Exploratory Testing addresses areas that Developer Testing doesn’t
NASA
Developer Testing validates assumptions
NASA
The Independent Tester’s job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
Most bugs, most of the time, are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level.
— Eric S. Raymond
NASA
Source diffs, logs. If your QA Analysts don’t look at these — teach them.
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interfaces with users as individuals, rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Efficiency-Thoroughness Trade-Off
NASA
Rapid release cycles have different risks than slower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks, services, and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state. Always roll forward. When it’s desirable to revert a previous change, do that as part of a new commit.
NASA
Instead of rolling back, fix the problem and move on
NASA
Let go of the idea of a “last stable release”
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further Reading
“How Google Tests Software” — James Whittaker
“Look At Your Data” — John Rauser
“Optimizing For Developer Happiness” — Chad Dickerson
“Outages, Postmortems and Human Error” — John Allspaw
http://en.wikipedia.org/wiki/Swiss_cheese_model
“What Is Exploratory Testing?” — James Bach
Questions?
noahsussman · ns@noahsussman.com · infiniteundo.com
![Page 42: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/42.jpg)
![Page 43: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/43.jpg)
![Page 44: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/44.jpg)
![Page 45: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/45.jpg)
![Page 46: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/46.jpg)
![Page 47: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/47.jpg)
Many Small Anomalies Combined
An organizations defenses against failure are a series of barriers represented as slices of swiss cheese The holes in the cheese represent weaknesses in individual parts of the system Failures occur when a hazard passes through all of the holes in all of the defenses
mdash Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience Not ldquoQualityrdquo
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Efficiency-Thoroughness Trade-Off
NASA
Rapid release cycles have different risks than slower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks, services, and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state. Always roll forward. When it's desirable to revert a previous change, do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of "last stable release"
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further Reading
"How Google Tests Software," James Whittaker
"Look at Your Data," John Rauser
"Optimizing for Developer Happiness," Chad Dickerson
"Outages, Postmortems, and Human Error," John Allspaw
http://en.wikipedia.org/wiki/Swiss_cheese_model
"What Is Exploratory Testing?," James Bach
Questions?
noahsussman · noahsussman.com · infiniteundo.com
![Page 48: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/48.jpg)
An organization's defenses against failure are a series of barriers, represented as slices of Swiss cheese. The holes in the cheese represent weaknesses in individual parts of the system. Failures occur when a hazard passes through all of the holes in all of the defenses.
— Wikipedia
John Allspaw
Prioritize the elimination of small errors
John Allspaw
Focus less on mitigation of large catastrophic failures
Optimize for recovery rather than failure prevention
Failure is inevitable
Richard Avedon
Unit testing is great for preventing small errors
John Allspaw
Resilience, Not "Quality"
John Allspaw
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
![Page 49: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/49.jpg)
![Page 50: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/50.jpg)
![Page 51: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/51.jpg)
![Page 52: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/52.jpg)
![Page 53: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/53.jpg)
![Page 54: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/54.jpg)
![Page 55: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/55.jpg)
NASA
Readable code
NASA
Reasonable test coverage
NASA
Sane architecture
NASA
Good debugging tools
NASA
An engineering culture that values refactoring
NASA
Measurable goals
NASA
Manual TestingBut probably not the kind yoursquore thinking of
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
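Human eyes on dashboards can be complemented by simple automated checks. As a minimal illustrative sketch (not Etsy’s actual monitoring pipeline, which the slides don’t detail), the following flags any metric sample that deviates from the rolling mean of recent samples by several standard deviations:

```python
from collections import deque

def detect_anomalies(values, window=20, threshold=3.0):
    """Flag indices whose value deviates from the rolling mean of the
    previous `window` samples by more than `threshold` standard deviations."""
    history = deque(maxlen=window)
    flagged = []
    for i, v in enumerate(values):
        if len(history) == window:
            mean = sum(history) / window
            std = (sum((x - mean) ** 2 for x in history) / window) ** 0.5
            if std > 0 and abs(v - mean) > threshold * std:
                flagged.append(i)
        history.append(v)
    return flagged

# A steady requests-per-second metric with one injected spike at index 25:
metric = [100.0 + (i % 2) for i in range(30)]
metric[25] = 500.0
print(detect_anomalies(metric))  # [25]
```

A check like this catches the obvious spikes automatically; the human judgment the slides emphasize is still needed to decide which metrics deserve a detector at all.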
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
“Assurance” is a terrible word. Let’s discard it
NASA
Quality exists, but it’s tricky to assure or prove that
NASA
There’s no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
“Customer Experience” is a better term of art than “Quality”
NASA
Customer Experience (though there’s no formal proof for that either)
NASA
Exploratory Testing addresses areas that Developer Testing doesn’t
NASA
Developer Testing validates assumptions
NASA
The Independent Tester’s job is to invalidate assumptions
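That distinction can be read in code. The `slugify` function below and both checks are hypothetical, purely for illustration: the developer’s test confirms the input its author imagined, while an exploratory-style fuzz loop hunts for inputs that invalidate the stated property.

```python
import random

def slugify(title):
    """Hypothetical function under test: turn a title into a URL slug."""
    return title.lower().replace(" ", "-")

# Developer Testing: validates the author's assumption with a friendly input.
assert slugify("Hello World") == "hello-world"

# Independent testing: try to invalidate the assumption that every slug
# is URL-safe, using inputs the author never imagined.
def is_url_safe(slug):
    return all(c.isalnum() or c == "-" for c in slug)

random.seed(0)
alphabet = "aB !@/?_"
counterexamples = [s for s in ("".join(random.choice(alphabet) for _ in range(8))
                               for _ in range(500))
                   if not is_url_safe(slugify(s))]
print(len(counterexamples) > 0)  # True: the fuzz loop finds what the unit test missed
```

The unit test passes forever; the fuzz loop surfaces counterexamples in milliseconds. Both are worth having, but they answer different questions.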
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
“Most bugs, most of the time, are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level.” (Eric S. Raymond)
NASA
Source diffs, logs: if your QA Analysts don’t look at these, teach them
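Raymond’s point is practical: even a crude scrape of the error log often characterizes a bug well enough to pin it to a recent changeset. A toy illustration (the log format and the component and exception names are invented here):

```python
import re

LOG = """\
2013-01-15 10:02:11 INFO  search: query=boots backend=solr hits=124
2013-01-15 10:02:12 ERROR search: SolrConnectionError: timed out after 250ms
2013-01-15 10:02:13 INFO  search: falling back to cached results
"""

# Extract (component, exception) signatures from ERROR lines: often all a
# tester needs to connect a production symptom to a recent source diff.
signatures = re.findall(r"ERROR\s+(\w+): (\w+):", LOG)
print(signatures)  # [('search', 'SolrConnectionError')]
```

A signature like `('search', 'SolrConnectionError')` is an “incomplete but suggestive characterization”: it points straight at whichever recent diff touched the search backend.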
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interfaces with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Efficiency-Thoroughness Trade-Off
NASA
Rapid release cycles have different risks than slower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks, services, and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state. Always roll forward. When it’s desirable to revert a previous change, do that as part of a new commit
NASA
Instead of rolling back, fix the problem and move on
NASA
Let go of the idea of “last stable release”
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further Reading: “How Google Tests Software,” James Whittaker
“Look at Your Data,” John Rauser
“Optimizing for Developer Happiness,” Chad Dickerson
“Outages, Postmortems, and Human Error,” John Allspaw
http://en.wikipedia.org/wiki/Swiss_cheese_model
“What Is Exploratory Testing?” James Bach
Questions
noahsussman | ns@noahsussman.com | infiniteundo.com
NASA
Real-Time Monitoring is the new face of testing
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 63: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/63.jpg)
Etsy
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
"Most bugs, most of the time, are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level."
– Eric S. Raymond
NASA
Source diffs, logs: if your QA Analysts don't look at these, teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interfaces with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Efficiency-Thoroughness Trade-Off
NASA
Rapid release cycles have different risks than slower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state. Always roll forward. When it's desirable to revert a previous change, do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
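In Git terms, failing forward means `git revert` rather than resetting to an old state: the revert is itself a new commit, so HEAD keeps moving forward. A self-contained demonstration in a throwaway repository (assumes `git` is installed; the repo and file names are illustrative):

```python
# Sketch of "fail forward": a revert is just another commit, not a rollback.
import os
import subprocess
import tempfile

def run(*args, cwd):
    subprocess.run(args, cwd=cwd, check=True,
                   stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)

repo = tempfile.mkdtemp()
git = lambda *a: run("git", "-c", "user.email=demo@example.com",
                     "-c", "user.name=demo", *a, cwd=repo)

git("init", ".")
git("commit", "--allow-empty", "-m", "good state")

# Land a bad change:
with open(os.path.join(repo, "feature.txt"), "w") as f:
    f.write("bad change\n")
git("add", "feature.txt")
git("commit", "-m", "bad change")

# Instead of rolling back to "good state", revert the bad commit as a NEW commit:
git("revert", "--no-edit", "HEAD")

count = subprocess.run(["git", "rev-list", "--count", "HEAD"], cwd=repo,
                       capture_output=True, text=True).stdout.strip()
print(count)                                               # 3: history only grew
print(os.path.exists(os.path.join(repo, "feature.txt")))   # False: change undone
```

The bad change is gone, yet every commit, including the mistake and its correction, remains in history, which is exactly what makes small changesets easy to debug later.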
Let go of the idea of "last stable release"
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further Reading:
"How Google Tests Software" – James Whittaker
"Look At Your Data" – John Rauser
"Optimizing For Developer Happiness" – Chad Dickerson
"Outages, Postmortems, and Human Error" – John Allspaw
http://en.wikipedia.org/wiki/Swiss_cheese_model
"What Is Exploratory Testing?" – James Bach
Questions?
Noah Sussman · ns@noahsussman.com · infiniteundo.com
![Page 64: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/64.jpg)
Etsy
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 65: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/65.jpg)
Etsy
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 66: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/66.jpg)
Anomaly detection is hard
Greg and Tim Hildebrandt
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 67: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/67.jpg)
NASA
Watching the graphs
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 68: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/68.jpg)
NASA
As of 2012 Etsy collected well over a quarter million real-time metrics
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 69: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/69.jpg)
NASA
Deciding which metrics matter is a human problem
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 70: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/70.jpg)
NASA
Everyone watches some subset of the graphs
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 71: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/71.jpg)
NASA
Human vision is an excellent tool for anomaly detection
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 72: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/72.jpg)
NASA
QA happens when
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
“Most bugs, most of the time, are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level.”
— Eric S. Raymond
NASA
Source diffs, logs. If your QA Analysts don't look at these — teach them
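The kind of check this enables can be sketched in a few lines. The log format, timestamps, and `error_delta` helper below are all invented for illustration: given raw log lines and a deploy time, compare error counts before and after.

```python
# Sketch only, with an invented "HH:MM LEVEL message" log format.
def error_delta(log_lines, deploy_ts):
    """Count ERROR lines before vs. after a deploy timestamp."""
    before = after = 0
    for line in log_lines:
        ts, level = line.split(" ", 2)[:2]
        if level != "ERROR":
            continue
        if ts < deploy_ts:
            before += 1
        else:
            after += 1
    return before, after

log = [
    "09:58 INFO checkout ok",
    "09:59 ERROR timeout in search",
    "10:01 ERROR timeout in search",
    "10:02 ERROR timeout in search",
]
# deploy went out at 10:00 — errors went from 1 before to 2 after,
# which points the analyst straight at that deploy's diff
```

An analyst who can run this against the logs, then open the corresponding source diff, has exactly the "suggestive characterization" Raymond describes.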
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interfaces with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Efficiency-Thoroughness Trade-Off
NASA
Rapid release cycles have different risks than slower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
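A minimal sketch of what "working it out in production" can look like: a health check over live traffic. The function and its 1% error budget are assumptions invented for this example, not something from the talk.

```python
# Illustrative production health check: judge the live error rate
# against an error budget. Threshold and names are invented here.
def healthy(error_count, request_count, max_error_rate=0.01):
    """True if the observed error rate stays within budget."""
    if request_count == 0:
        return True  # no traffic yet, nothing to judge
    return error_count / request_count <= max_error_rate

# Pre-release testing caught nothing, but live traffic shows
# 3 errors in 100 requests: 3% blows the 1% budget.
healthy(3, 100)
```

Checks like this are how errors that were undetectable in dev become visible, and actionable, once real users hit the system.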
NASA
Software exists in context
NASA
Networks, services, and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
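Why small changesets help can be shown with a sketch of the idea behind tools like `git bisect`: binary search over history finds the first bad change in O(log n) checks, and the smaller each changeset, the less code that one answer implicates. The scenario below is invented.

```python
# Sketch: binary search over an ordered history for the first bad change.
def first_bad(commits, is_bad):
    """Return the earliest commit for which is_bad() is true,
    assuming every commit after the first bad one is also bad."""
    lo, hi = 0, len(commits) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if is_bad(commits[mid]):
            hi = mid           # bad appears at mid or earlier
        else:
            lo = mid + 1       # still good; the bad change is later
    return commits[lo]

commits = list(range(1, 17))   # 16 small changesets
is_bad = lambda c: c >= 11     # everything from change 11 on is broken
# first_bad pinpoints change 11 in four checks instead of sixteen
```

With sixteen one-line changesets, the search ends at a single small diff; with one sixteen-file changeset, it ends at a haystack.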
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state. Always roll forward. When it's desirable to revert a previous change, do that as part of a new commit.
NASA
Instead of rolling back fix the problem and move on
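One way to picture this: deploy history as an append-only log, where a revert adds an inverse changeset rather than truncating anything. The data model below is a hypothetical sketch, not the talk's tooling.

```python
# Hypothetical sketch: "fail forward" as an append-only history.
# A revert never deletes the bad change; it appends a new one that
# undoes it, so the full record of what shipped is preserved.
def revert(history, bad_id):
    """Append an inverse changeset for the commit with id bad_id."""
    bad = next(c for c in history if c["id"] == bad_id)
    inverse = {"id": f"revert-{bad_id}", "undoes": bad_id,
               "delta": -bad["delta"]}
    history.append(inverse)    # history only ever grows
    return history

history = [
    {"id": "c1", "delta": +3},
    {"id": "c2", "delta": +5},  # turns out to be the bad change
]
revert(history, "c2")
state = sum(c["delta"] for c in history)  # c2's effect cancelled: 3
```

The net state is as if c2 never shipped, but the history still shows that it did, and when it was undone.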
NASA
Let go of the idea of “last stable release”
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
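"Watch the graphs" can be partially automated. A minimal sketch, with an invented metric and threshold: flag any sample that jumps well above its trailing-window baseline.

```python
# Illustrative only: flag samples that exceed `factor` times the mean
# of the preceding `window` samples. Names and numbers are invented.
def spikes(samples, window=3, factor=2.0):
    flagged = []
    for i in range(window, len(samples)):
        baseline = sum(samples[i - window:i]) / window
        if samples[i] > factor * baseline:
            flagged.append(i)
    return flagged

errors_per_min = [10, 12, 11, 13, 40, 12]
# index 4 (the 40) stands out against the ~12/min baseline
```

A check like this doesn't replace a human watching the dashboards, but it shortens the time between a bad deploy and someone noticing.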
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further Reading
“How Google Tests Software”, James Whittaker
“Look At Your Data”, John Rauser
“Optimizing For Developer Happiness”, Chad Dickerson
“Outages, Postmortems and Human Error”, John Allspaw
http://en.wikipedia.org/wiki/Swiss_cheese_model
“What Is Exploratory Testing?”, James Bach
Questions
noahsussman · ns@noahsussman.com · infiniteundo.com
![Page 73: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/73.jpg)
NASA
Exploratory testing can be performed at any time
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 74: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/74.jpg)
NASA
Rigorous scientific approach
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 75: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/75.jpg)
NASA
Focus on customer satisfaction
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 76: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/76.jpg)
NASA
Less focus on product specifications
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 77: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/77.jpg)
NASA
Exploratory Testing is equally useful before or after a release
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 78: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/78.jpg)
NASA
Just Quality
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 79: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/79.jpg)
NASA
ldquoAssurancerdquo is a terrible word Letrsquos discard it
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 80: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/80.jpg)
NASA
Quality exists but itrsquos tricky to assure or prove that
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 81: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/81.jpg)
NASA
Therersquos no such thing as a formal proof of quality
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 82: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/82.jpg)
NASA
Most of us would agree that quality exists
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Efficiency-Thoroughness Trade-Off
NASA
Rapid release cycles have different risks than slower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
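One way to read the dev/prod split above: a detectable error is deterministic, so a dev-side test can catch it before the code ever ships. A minimal sketch — the `parse_price` function and its checks are hypothetical illustrations, not code from the talk:

```python
# A "detectable" error is deterministic: same input, same failure, every
# run -- so a test in dev can catch it before deploy.
# parse_price is a hypothetical example function.

def parse_price(text: str) -> int:
    """Parse a display price like '$1,299.99' into integer cents."""
    return round(float(text.replace("$", "").replace(",", "")) * 100)

# Deterministic checks belong in dev:
assert parse_price("$1,299.99") == 129999
assert parse_price("$0.05") == 5

# Undetectable errors (load spikes, emergent user behavior, flaky
# upstream services) have no fixed input to assert on -- they can only
# be observed and worked out in production.
```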
NASA
Software exists in context
NASA
Networks, services, and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state. Always roll forward. When it's desirable to revert a previous change, do that as part of a new commit
NASA
Instead of rolling back, fix the problem and move on
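The fail-forward workflow above can be sketched with plain git, in a throwaway repo. The file name, commit messages, and config values are illustrative; the git commands are real:

```shell
# Fail forward: a bad change is undone by a NEW commit that inverts it.
# History only moves forward, and the HEAD of trunk is always deployable.
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email dev@example.com && git config user.name dev

echo "search: grep" > config.yml
git add config.yml && git commit -qm "baseline"

echo "search: solr" > config.yml
git commit -qam "enable new search"    # the change that broke production

git revert --no-edit HEAD              # roll forward: a new commit restores the old behavior
cat config.yml                         # back to the grep-based search
```

Nothing is rewritten or rolled back: the repository now has three commits, and the newest one is itself an ordinary changeset that can be reviewed, diffed, and deployed like any other.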
NASA
Let go of the idea of "last stable release"
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
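"Watch the graphs" can also be automated as a post-deploy guard. A hedged sketch — the threshold and sample values are assumptions, not a specific monitoring product's API:

```python
# Post-deploy guard: compare the error rate after a deploy to the
# pre-deploy baseline. The 2x tolerance and samples are illustrative.

def error_rate_spiked(baseline: float, current: float,
                      tolerance: float = 2.0) -> bool:
    """True when the post-deploy error rate exceeds tolerance x baseline."""
    if baseline == 0:
        return current > 0
    return current / baseline > tolerance

# These values would normally come from your metrics store
# (e.g. a Graphite-style time series, sampled around the deploy).
before = 0.4   # errors/sec averaged over the hour before deploy
after = 1.5    # errors/sec in the minutes after deploy

if error_rate_spiked(before, after):
    print("error rate spiked -- investigate and fix forward")
```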
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further Reading
"How Google Tests Software," James Whittaker
"Look At Your Data," John Rauser
"Optimizing For Developer Happiness," Chad Dickerson
"Outages, Postmortems and Human Error," John Allspaw
http://en.wikipedia.org/wiki/Swiss_cheese_model
"What Is Exploratory Testing?" James Bach
Questions
noahsussman / ns@noahsussman.com / infiniteundo.com
![Page 83: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/83.jpg)
NASA
ldquoCustomer Experiencerdquo is a better term of art than ldquoQualityrdquo
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 84: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/84.jpg)
NASA
Customer ExperienceThough therersquos no formal proof for that either
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 85: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/85.jpg)
NASA
Exploratory Testing addresses areas that Developer Testing doesnrsquot
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 86: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/86.jpg)
NASA
Developer Testing validates assumptions
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 87: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/87.jpg)
NASA
The Independent Testerrsquos job is to invalidate assumptions
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 88: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/88.jpg)
NASA
Technology Informs Customer Experience
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 89: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/89.jpg)
NASA
Exploratory Testing requires an understanding of the whole system
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 90: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/90.jpg)
NASA
Exploratory Testing requires understanding how the system serves a community of users
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 91: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/91.jpg)
NASA
Customer Experience is as much about technology as it is about product requirements
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 92: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/92.jpg)
mdash Eric S Raymond
Most bugs most of the time are easily nailed given even an incomplete but suggestive characterization of their error conditions at source-code level
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 93: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/93.jpg)
NASA
Source diffs logsIf your QA Analysts donrsquot look at these mdash teach them
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 94: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/94.jpg)
NASA
Customer Support
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state Always roll forward When its desireable to revert a previous change do that as part of a new commit
NASA
Instead of rolling back fix the problem and move on
NASA
Let go of the idea of ldquolast stable releaserdquo
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further ReadingldquoHow Google Tests Softwarerdquo James Whittaker
ldquoLook At Your Datardquo John Rausser
ldquoOptimizing For Developer Happinessrdquo Chad Dickerson
ldquoOutages Postmortems and Human Errorrdquo John Allspaw
httpenwikipediaorgwikiSwiss_cheese_model
ldquoWhat Is Exploratory Testingrdquo James Bach
Questions
noahsussmannsnoahsussmancominfiniteundocom
![Page 95: Continuous Improvement (GroupOn, Palo Alto 2013)](https://reader038.vdocuments.net/reader038/viewer/2022110218/5873cb211a28ab9d168b4f55/html5/thumbnails/95.jpg)
NASA
Your customer support operators spend more time talking to your users than anyone else
NASA
Customer Support interface with users as individuals rather than as aggregate data
Keep the feedback loop short
Manage Your Culture
NASA
Effeciency To Thoroughness Trade-Off
NASA
Rapid release cycles have different risks thanslower release cycles
NASA
Continuous Delivery does not alter the fundamental nature of risk
NASA
Test in both dev and prod
NASA
Detectable errors should be caught in dev
NASA
Undetectable errors must be worked out in production
NASA
Software exists in context
NASA
Networks services and people are always in flux
Small changesets are easier to debug
NASA
An SCM revert is a changeset
NASA
Large changesets are riskier and harder to debug
NASA
Fail Forward
scrapnow on Etsy
Always deploy the HEAD revision of trunk
Never roll back to an earlier state. Always roll forward. When it's desirable to revert a previous change, do that as part of a new commit
NASA
Instead of rolling back, fix the problem and move on
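A minimal sketch of rolling forward with git (not from the talk; assumes git is available and the file and commit names are hypothetical): instead of resetting trunk to an older revision, revert the bad change as a new commit, so HEAD is always what gets deployed and history is never rewritten.

```shell
# Toy repo: a good commit, a bad commit, then a roll-forward revert.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo
echo "old busted" > search.txt
git add search.txt
git commit -qm "grep-based search"
echo "new hotness" > search.txt
git commit -qam "solr-based search"    # the change that broke production
git revert --no-edit HEAD >/dev/null   # roll forward: a NEW commit on trunk
cat search.txt                         # prints "old busted"; history intact
```

Trunk now has three commits and its tip behaves like the last good state, without anyone ever deploying anything but HEAD.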
NASA
Let go of the idea of “last stable release”
Scott Holloway
Focus less on satisfying the requirements
NASA
Watch the graphs
NASA
Listen to your customers
Kirsten Dunst on the set of Marie Antoinette
Build a culture of shared responsibility
Kirsten Dunst on the set of Marie Antoinette
Low-Ceremony Process
WSHS Science blog
Iteratively improve your product
Further Reading
“How Google Tests Software”, James Whittaker
“Look At Your Data”, John Rauser
“Optimizing For Developer Happiness”, Chad Dickerson
“Outages, Postmortems, and Human Error”, John Allspaw
http://en.wikipedia.org/wiki/Swiss_cheese_model
“What Is Exploratory Testing?”, James Bach
Questions?
noahsussman · ns@noahsussman.com · infiniteundo.com