Executable Specifications (Agile Palooza)
TRANSCRIPT
Executable Specifications: Documents for Software that Execute
Chris Sterling, Partner at Sterling Barton, Certified Scrum Trainer, Agile Consultant
Thursday, January 14, 2010
Copyright © 2010 Sterling Barton. All rights reserved.
Executable Specifications: Topics to Discuss
• What are Executable Specifications?
• Why Create Them?
• Acceptance Tests
• When to Use
• An Approach with Example
• Available Tools
WHAT ARE EXECUTABLE SPECIFICATIONS?
The second-most detailed specification of the customer's request
Executable Specifications
• Unambiguous definition of desired software behavior
• Executable documentation
• Repeatable and specific tests
• Regression tests to validate existing software behavior
• The second-most detailed specification of the customer's request
WHY CREATE EXECUTABLE SPECIFICATIONS?
Building the "right" software
Increasing Cost of Release Stabilization Periods
[Bar chart: cost per release, on a scale from 0 to 500,000, for Releases 1 through 6, split into "Cost of Fixing Defects" and "Cost for Feature Dev".]
Costly Regression Test Phases
Managing Software Debt: an Overview
Effect of Project Constraints on Quality
ACCEPTANCE TESTS
How does the Team know they have implemented desired behavior successfully?
Agile testing happens at two levels
• Acceptance Tests tell us whether the system does what the customer expects (“building the right code”)
• Programmer Tests define whether the system does what the developers expect (“building the code right”)
Acceptance Tests
• Are also called customer tests or functional tests
• Tell us whether the system does what the customer expects
• Enable developers to know they've satisfied requirements
• Can be run automatically at any time by anyone
• Help us build the "right" software
• "Running, Tested Features"*
* "A Metric Leading to Agility" – Ron Jeffries, http://www.xprogramming.com/xpmag/jatRtsMetric.htm
User Stories
Story 14: As a customer, I want to check my order status online so that I can know when to expect my package.
A user story is a small piece of business value that can be delivered in an iteration.
What is a User Story?*
• CARD: a token representing the requirement, used in planning. Notes are written on it, reflecting priority and cost.
• CONVERSATION: the requirement itself is communicated from customer to programmers through conversation. (The conversation is largely verbal, but is often supplemented with documents.)
• CONFIRMATION: the confirmation provided by the acceptance test is what makes the simple approach of card and conversation possible, once the conversation about a card gets down to the details.
* "Essential XP: Card, Conversation, Confirmation" – Ron Jeffries, http://www.xprogramming.com/xpmag/EXPCardConversationConfirmation.htm
Confirmation through Acceptance Tests
• Product Owner makes the first pass at Acceptance Criteria
• Acceptance Criteria are discussed during Iteration Planning
• Final Acceptance Criteria for each User Story are the product of negotiation between the Delivery Team and the Product Manager
• Should be short, easy-to-understand statements
Story 14: As a customer, I want to check my order status online so that I can know when to expect my package.
Acceptance Criteria:
• View status as "waiting for pickup", "en route", or "delivered"
• Date of each step in route
• Estimated time of delivery
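Acceptance criteria like these can be turned directly into repeatable, executable checks. A minimal sketch in Python, where the `order_status` function and the shape of the order record are invented for illustration (they are not from the talk):

```python
# Hypothetical order-status lookup; names and data shapes are illustrative.
VALID_STATUSES = {"waiting for pickup", "en route", "delivered"}

def order_status(order):
    """Return the customer-visible status record for an order dict."""
    status = order["status"]
    if status not in VALID_STATUSES:
        raise ValueError("unknown status: %s" % status)
    return {
        "status": status,
        "route_dates": list(order.get("route_dates", [])),
        "estimated_delivery": order.get("estimated_delivery"),
    }

# The three acceptance criteria, phrased as executable assertions:
def test_acceptance_criteria():
    order = {
        "status": "en route",
        "route_dates": ["2010-01-12", "2010-01-13"],
        "estimated_delivery": "2010-01-15",
    }
    result = order_status(order)
    assert result["status"] in VALID_STATUSES          # criterion 1: known status
    assert len(result["route_dates"]) == 2             # criterion 2: date per step
    assert result["estimated_delivery"] is not None    # criterion 3: ETA present
```

Because the criteria are assertions rather than prose, anyone can re-run them at any time, which is exactly what makes the specification "executable".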
Take on the Role of a User: "What should the software do next for me?"
This question helps you decide what the next acceptance test should model.
Acceptance Test-Driven Development
WHEN TO USE EXECUTABLE SPECIFICATIONS?
Legacy software, software test automation, and commercial off-the-shelf (COTS) products
Traditional balance of tests (originally discussed by Mike Cohn)
• Manual GUI Acceptance Tests: easy to create; very familiar (what we always do); typically tedious; how do we know coverage?
• Automated GUI Tests: need automation specialists; automation good for performance; seems like we always rewrite; sometimes fragile
• Unit Tests: what is development testing? How do we know what these are? How do we know when they fail?
Mike Cohn's Testing Pyramid (also "borrowed" from Mike Cohn)
• GUI Acceptance Tests: small number; automate most
• Executable Specifications: drive development and acceptance
• Unit Tests: do the most; create Test-Driven Design
Working with Legacy Software
Workflow: Define Acceptance Criteria for Requirement → Analyze What Might Be Affected by Requirement → Write Executable Specifications for How it Works Now → Write Failing Executable Specifications for How it Should Work → Carefully Modify Source Code to Implement Requirement → Execute Executable Specifications to Verify Change
Working with Legacy Software
• Focus on writing Executable Specifications for new requirements and functionality that may be affected
• Stabilize legacy software with black-box tests (Executable Specifications)
• Incrementally improve the internal software design
• Try to identify where Programmer Tests can be added
• Execute a regression test suite that includes Executable Specifications frequently (more than once per day if possible)
Software Test Automation
Manual Test Script:
Step 1: Log in as Administrator
Step 2: Create Order with 1 Item
Step 3: Click "Check Out" Button
Step 4: Verify Item in Shopping Cart
Copyright © 2010 Sterling Barton. All rights reserved.
So7ware Test Automa/on
23
Manual Test ScriptManual Test Script
Step 1 Login as Administrato
rStep 2 Create
Order with 1 Item
Step 3 Click “Check Out” Button
Step 4 Verify Item in Shopping
Cart
Automate
23Thursday, January 14, 2010
Copyright © 2010 Sterling Barton. All rights reserved.
Commercial Off-the-Shelf (COTS)
Workflow: Version 1: COTS Software → Create Executable Specifications for Configurations & Customizations → Version 2: COTS Software → Execute Existing Executable Specifications to Find Upgrade Path
AN EXAMPLE APPROACH
Walk through the creation of executable specifications and how to execute them effectively
A simple example of a FIT test
*Source: http://fit.c2.com/
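The example on fit.c2.com is a division table: input columns `numerator` and `denominator`, and a checked output column `quotient()`. Here is a self-contained sketch of the same idea without the Fit library (an assumption: the real tool parses HTML tables, so the inline tuples below stand in for table rows):

```python
# Fixture-style class: input columns become attributes, output columns
# become methods, mirroring Fit's ColumnFixture convention.
class Division:
    def quotient(self):
        return self.numerator / self.denominator

TABLE = [
    # numerator, denominator, expected quotient
    (10, 2, 5.0),
    (12.6, 3, 4.2),
    (100, 4, 25.0),
]

def run_table(rows):
    """Check each row the way Fit colors cells green (pass) or red (fail)."""
    results = []
    for numerator, denominator, expected in rows:
        fixture = Division()
        fixture.numerator = numerator
        fixture.denominator = denominator
        actual = fixture.quotient()
        results.append("pass" if abs(actual - expected) < 1e-9 else "fail")
    return results
```

Each row is one concrete example; a mismatched expected value would come back as "fail", just as Fit would color that cell red.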
Fit Workflow*
• Starts with a whiteboard conversation
• Customer (SME, business person, product manager, etc.) informally describes the new feature
• Programmers and testers ask questions
* Source: http://fit.c2.com/
Fit Workflow*
• Customer, working with the delivery team, refines examples into tables
• Use business-friendly tools, such as Microsoft Excel and Word, to capture test tables
* Source: http://fit.c2.com/
Fit Workflow*
• Delivery team suggests additional areas to cover
* Source: http://fit.c2.com/
Fit Workflow*
• Delivery team formats tables for use with Fit
* Source: http://fit.c2.com/
Fit Overview
*Source: http://fit.c2.com/
Fit Workflow*
• Delivery team creates Fit "fixtures" (small pieces of code that translate test tables into tests executed against the software)
* Source: http://fit.c2.com/
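To make the fixture idea concrete, here is a toy version of that translation step. It reads a table, sets input columns as attributes, and calls columns ending in `()` as methods, much like Fit's ColumnFixture. The parsing rules are simplified assumptions (pipe-delimited text instead of HTML, numeric cells only), and the `Discount` fixture is invented:

```python
# Hypothetical fixture: 10% discount on orders of $100 or more.
class Discount:
    def discounted_total(self):
        return self.amount * 0.9 if self.amount >= 100 else self.amount

def run_fixture(fixture_class, table_text):
    """Translate a pipe-delimited table into checks against the fixture."""
    lines = [l.strip() for l in table_text.strip().splitlines()]
    header = [c.strip() for c in lines[0].strip("|").split("|")]
    outcomes = []
    for line in lines[1:]:
        cells = [c.strip() for c in line.strip("|").split("|")]
        fixture = fixture_class()
        row_ok = True
        for name, cell in zip(header, cells):
            if name.endswith("()"):
                # Output column: call the method, compare to expected value.
                actual = getattr(fixture, name[:-2])()
                row_ok = row_ok and abs(actual - float(cell)) < 1e-9
            else:
                # Input column: set the value on the fixture.
                setattr(fixture, name, float(cell))
        outcomes.append("right" if row_ok else "wrong")
    return outcomes

TABLE = """
| amount | discounted_total() |
| 50     | 50                 |
| 100    | 90                 |
| 200    | 180                |
"""
```

The fixture is the only code the delivery team writes per table type; the customer keeps authoring rows in a document they own.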
Fit Workflow*
• Execute the Fit document against the software
• At first, some tests should be failing (red)
* Source: http://fit.c2.com/
Fit Workflow*
• Delivery team collaborates with the customer to incrementally enhance test tables
• Delivery team implements software until the tests pass (green)
* Source: http://fit.c2.com/
Fit Workflow*
• The document is kept for regression testing
• The document is included in the automated build to ensure everything keeps working
• As questions arise about functionality, an example is added and Fit reports the answer
* Source: http://fit.c2.com/
Continuous Integration
Organizing FIT tests
• Maintain a suite of regression tests from past iterations that always pass
• Run "regression" tests with the build
• Maintain a suite of "in progress" tests for the current iteration: begin the iteration with all tests failing; end the iteration with most tests passing
• At the end of the iteration, move newly passing tests into the regression suite (beware the FitNesse "Refactor/Move" command)
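The two-suite split can be sketched as a pair of registries: "regression" must stay green and runs with every build, while "in progress" starts the iteration red. All names here are invented for illustration:

```python
# Two suites: regression (always passes, runs with the build) and
# in-progress (current iteration's specs, failing until implemented).
regression_suite = []
in_progress_suite = []

def spec(suite):
    """Decorator that registers a test function into a suite."""
    def register(fn):
        suite.append(fn)
        return fn
    return register

@spec(regression_suite)
def order_total_includes_tax():
    assert round(100 * 1.095, 2) == 109.5  # passing spec from a past iteration

@spec(in_progress_suite)
def order_status_page_shows_route_dates():
    raise AssertionError("not implemented yet")  # red until the story is done

def run(suite):
    """Run every spec in a suite; map each name to pass/fail."""
    results = {}
    for fn in suite:
        try:
            fn()
            results[fn.__name__] = True
        except AssertionError:
            results[fn.__name__] = False
    return results
```

At iteration end, a newly green spec simply moves from one registry to the other, which is the same bookkeeping the FitNesse suites perform.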
Automated Regression Testing
[Diagram: a New Feature and Design Changes flow into the Application Under Development, followed by an Automated Regression Test Run.]
A FIT CASE STUDY
Cost reduction using Fit for test automation and data conversion
Manual Regression Testing
• Testing was taking 75 person-hours across 2 full test runs, consisting of:
  – Comprehensive manual regression testing
  – Data conversion and validation
• Cost for testing was $17,000 each iteration
Introducing Fit into the Testing Process
• After 8 iterations, the team had introduced a healthy amount of Fit fixtures and automated tests
• Reduced the 70+ hour test runtime down to 6 hours, which now included:
  – Fit automated regression testing
  – Data conversion and validation automated with Fit fixtures
• Reduced the cost of testing each iteration from $17,000 to $7,000
AVAILABLE TOOLS
Tools that Teams can use and the contexts they are best used in
Table-based Acceptance Testing Tools
• Fit – http://fit.c2.com/
• FitNesse – http://www.fitnesse.org/
• Robot Framework – http://code.google.com/p/robotframework/
Programmer Acceptance Testing Tools
• Cucumber – http://cukes.info/
• RSpec – http://rspec.info/
• JBehave – http://jbehave.org/
• JMeter – http://jakarta.apache.org/jmeter/
• xUnit (JUnit/NUnit/CppUnit/etc.)
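The xUnit family shares one pattern: test classes group related cases, each method makes assertions, and a runner reports results. Python's `unittest` is that family's Python member, so a sketch here stands in for JUnit or NUnit equally well; the `ShoppingCart` class is invented for illustration:

```python
import unittest

# Invented class under test.
class ShoppingCart:
    def __init__(self):
        self.items = []

    def add(self, name, price):
        self.items.append((name, price))

    def total(self):
        return sum(price for _, price in self.items)

# xUnit-style test case: one class per unit, one method per behavior.
class ShoppingCartTest(unittest.TestCase):
    def test_new_cart_is_empty(self):
        self.assertEqual(ShoppingCart().total(), 0)

    def test_total_sums_item_prices(self):
        cart = ShoppingCart()
        cart.add("book", 25)
        cart.add("pen", 5)
        self.assertEqual(cart.total(), 30)
```

Run with `python -m unittest` from the module's directory; JUnit and NUnit discover and run their test classes the same way.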
Web UI Agile Testing Tools
• Open source Web UI testing tools: StoryTestIQ, WATiR, Selenium, Canoo WebTest
• GUI tests tend to be more brittle
• Tests are written and maintained incrementally
• Most of these tools have code integration
What about traditional testing tools?
• Why don't we use commercial products such as SilkTest, TestDirector, QuickTest Pro, etc. for Agile testing?
  – Expensive
  – Automated tools are record-and-playback, and brittle
  – They tie our tests to the UI implementation
  – Manual tools (TestDirector) take too long to run
  – They work fine as an interim strategy (especially if you already have the licenses)
  – Consider adding an open source Agile testing tool as a component of your testing strategy
THANK YOU
Questions and Answers
Presenter: Chris Sterling, Partner at Sterling Barton
• Technology Consultant, Certified Scrum Trainer, and Agile Coach
• Consults on software technology across a spectrum of industries
• Advises organizations on Agile development, management, and enterprise practices
• Founder of the International Association of Software Architects (IASA) Puget Sound chapter
• University of Washington Lecturer: Agile Developer Certificate Program
Email: [email protected]
Blog: http://www.gettingagile.com
Follow me on Twitter: @csterwa