Avoiding Test Hell


1

Avoiding Test Hell
Yun Ki Lee

yunki.lee@gmail.com

2

Overview
• Why Test?
• Sample Application
• BDD and FitNesse
• What would you consider hell to be like?
• How to get into hell
• And how to make it less painful

3

Sample Price Reporting App

• We are a financial service provider
• Loads security prices from a client
• Produces internally calculated security prices
• Compares the two prices and selects a price based on a rule
• Produces a valuation report

[Diagram: Client Reporting Module, in which a Client Price Loader and an Internal Price Calculator feed Client Prices into a Price Comparator, and a Valuation Generator produces a Valuations file]

4

Testing Basics
• Why
  • To prove what you've done works and is what they asked for
  • Reduce risk when deploying to production
• How
  • Manual vs Automated
  • Different types of tests: unit tests, component tests, integration tests and so on
• When
  • After development is complete? (Waterfall)
  • Before you do any development? (TDD)
• Who
  • Dedicated team of testers?
  • Developers, BAs, users?

5

BDD: Behaviour Driven Development
• Focus on the behaviour
• As a … I want … so that …
• Given … when … then …
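To make the Given/When/Then shape concrete, here is a minimal sketch (not from the slides) of a behaviour-style JUnit 4 test for the sample app's price selection rule. The Price and PriceComparator classes and the tolerance rule are hypothetical stand-ins, defined inline so the example is self-contained.

import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class PriceComparatorBehaviourTest {

    // Hypothetical stand-ins for the sample application's domain classes.
    static class Price {
        final String security;
        final double value;
        Price(String security, double value) { this.security = security; this.value = value; }
    }

    // Assumed rule: prefer the client price unless it deviates from the
    // internally calculated price by more than a percentage tolerance.
    static class PriceComparator {
        private final double tolerancePercent;
        PriceComparator(double tolerancePercent) { this.tolerancePercent = tolerancePercent; }
        Price select(Price client, Price internal) {
            double deviation = Math.abs(client.value - internal.value) / internal.value * 100.0;
            return deviation <= tolerancePercent ? client : internal;
        }
    }

    @Test
    public void selectsClientPriceWhenItAgreesWithTheInternalPrice() {
        // Given a client price and an internal price that broadly agree
        Price clientPrice = new Price("XYZ", 100.10);
        Price internalPrice = new Price("XYZ", 100.05);
        PriceComparator comparator = new PriceComparator(0.5);

        // When the comparator applies its selection rule
        Price selected = comparator.select(clientPrice, internalPrice);

        // Then the client price is the one that goes on the valuation report
        assertEquals(clientPrice, selected);
    }
}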

6

FitNesse

[Diagram: Wiki (tables) → Fixtures → System Under Test (Java code)]
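As a rough illustration of how those pieces connect (my example, assuming the Slim test system; the fixture class and table are hypothetical): a wiki decision table drives a small Java fixture, which in turn calls the system under test.

// Wiki page (decision table) might look like:
// |compare prices                              |
// |client price|internal price|selected price? |
// |100.10      |100.05        |100.10          |
// |95.00       |100.00        |100.00          |

// The matching Slim fixture: input columns map to setters, and the
// "selected price?" output column maps to selectedPrice().
public class ComparePrices {
    private double clientPrice;
    private double internalPrice;

    public void setClientPrice(double clientPrice) { this.clientPrice = clientPrice; }
    public void setInternalPrice(double internalPrice) { this.internalPrice = internalPrice; }

    public double selectedPrice() {
        // In a real fixture this would delegate to the system under test;
        // a simple tolerance rule stands in for it here.
        double deviationPercent = Math.abs(clientPrice - internalPrice) / internalPrice * 100.0;
        return deviationPercent <= 0.5 ? clientPrice : internalPrice;
    }
}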

7

How to know whether you’re in Test Hell

• Your tests cost you more than they save you
• How long until you can get the next release into production?
• If you don't do any manual testing, are you good to go?
• If a test fails, do you believe that it has really failed?
• If a test passes, do you trust it?
• How long until a new joiner can get up and running writing tests?
• A function needs to be changed: how long until the tests can be updated?

8

Holding up Releases due to Testing

• How long does it take to arrive at a go / no-go decision?
  • Manual testing
  • Release process not tested and automated
• Solution: automate environment deployments
  • Reduce manual steps
  • Co-locate services for faster deployments

9

When a test fails

• Did it really fail?
• How long does it take to verify it's a genuine break?
• Was the failure due to system state?
• Have you got some wait condition so that an action can complete?
• Are you running tests in parallel but have some shared data?

• Solutions:
  • Do not hide testing data (next slide)
  • Isolate tests that exhibit random behavior (next slide)
  • Retry?
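For the wait-condition point above, one simple approach (my sketch, not the presenter's code) is a small polling helper that a test can use instead of fixed sleeps; the valuationStore in the usage comment is a hypothetical example.

import java.util.function.BooleanSupplier;

// Hypothetical test helper: polls a condition until it holds or a timeout
// expires, instead of relying on fixed Thread.sleep() calls.
public final class WaitFor {

    private WaitFor() {}

    public static void condition(BooleanSupplier condition, long timeoutMillis) throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (!condition.getAsBoolean()) {
            if (System.currentTimeMillis() > deadline) {
                throw new AssertionError("Condition not met within " + timeoutMillis + " ms");
            }
            Thread.sleep(50); // small poll interval keeps the test responsive
        }
    }
}

// Usage in a test, e.g. waiting for an asynchronous valuation to be persisted:
//   WaitFor.condition(() -> valuationStore.contains("XYZ"), 5_000);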

10–11

[Example slides; content not captured in the transcript]

12

Testing Boundaries

[Diagram: PriceComparisonEngineThread with Price Loader and Price Rules Persister; methods: extractPrices(source), loadRules(rule), comparePrices(), persistPrices()]

• Do you want to test the whole engine?

• Or just a couple of features?
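A hedged sketch of what those boundaries might look like in code (interface and method names are loosely modelled on the diagram, not the real system): when the engine receives its collaborators through the constructor, the test gets to pick its boundary, wiring in real implementations to exercise the whole engine or substitutes to exercise a single feature.

import java.util.List;
import java.util.Map;

// Hypothetical boundary interfaces based on the diagram labels.
interface PriceLoader {
    Map<String, Double> extractPrices(String source);
}

interface PriceRules {
    List<String> loadRules(String rule);
}

interface Persister {
    void persistPrices(Map<String, Double> selectedPrices);
}

// The engine depends only on the interfaces, so tests choose the boundary.
class PriceComparisonEngine {
    private final PriceLoader loader;
    private final PriceRules rules;
    private final Persister persister;

    PriceComparisonEngine(PriceLoader loader, PriceRules rules, Persister persister) {
        this.loader = loader;
        this.rules = rules;
        this.persister = persister;
    }

    void run(String source, String rule) {
        Map<String, Double> clientPrices = loader.extractPrices(source);
        rules.loadRules(rule);
        // The comparison of client and internal prices would happen here;
        // this sketch simply persists what was loaded.
        persister.persistPrices(clientPrices);
    }
}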

13

When a test passes

• Have you got enough coverage to be confident that the functionality works?
• Vertical vs horizontal coverage

• Coverage tools

14

How easy is it to maintain tests?
• Adding new tests
• How easy is it for these people?
  • Experienced? New? Haven't touched the system in a while?
• Modifying tests for new functionality?
  • How many tests do they have to update?
• Solution
  • Documentation and libraries
  • DRY principle

15

Technical Debt in Tests

• Packaging of test code isn't maintained as well as production code

• Less time spent refactoring and reducing duplication

• Often you will find similar methods that differ by one or two parameters

• Solution
  • Refactor and remove duplicate methods
  • Create libraries or APIs that create test objects
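As an illustration of that last point (mine, not the deck's), a family of near-duplicate creation helpers that differ only by a parameter or two can usually be collapsed into a single test data builder. The Price type here is a minimal stand-in so the sketch compiles.

// Minimal stand-in domain type for the sketch.
class Price {
    final String security;
    final double value;
    final String source;
    Price(String security, double value, String source) {
        this.security = security; this.value = value; this.source = source;
    }
}

// Hypothetical test data builder: one fluent class replaces a pile of
// near-duplicate createPrice(...) helpers. Defaults cover the common case;
// overrides keep each test explicit about what matters to it.
class PriceBuilder {
    private String security = "XYZ";
    private double value = 100.0;
    private String source = "CLIENT";

    static PriceBuilder aPrice() { return new PriceBuilder(); }

    PriceBuilder forSecurity(String security) { this.security = security; return this; }
    PriceBuilder withValue(double value) { this.value = value; return this; }
    PriceBuilder fromSource(String source) { this.source = source; return this; }

    Price build() { return new Price(security, value, source); }
}

// In a test, only the details that matter to the scenario are spelled out:
//   Price stale = PriceBuilder.aPrice().forSecurity("ABC").withValue(0.0).build();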

16

Mocking

• Simulate behavior of components and interfaces so that we can isolate that one component we want to test

• Faster than connecting to a file system or a database

• Can be brittle and a nightmare to maintain

• Can alleviate some issues if you create a test library that hides some of the unnecessary details

[Diagram: the same PriceComparisonEngineThread boundary diagram as on the Testing Boundaries slide]
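A hedged example using jMock (one of the tools listed at the end), written against the hypothetical boundary interfaces from the Testing Boundaries sketch: the collaborators are mocked so the test pins down the engine's interactions without touching a file system or a database.

import java.util.List;
import java.util.Map;

import org.jmock.Expectations;
import org.jmock.Mockery;
import org.junit.Test;

public class PriceComparisonEngineTest {

    private final Mockery context = new Mockery();

    @Test
    public void loadsComparesAndPersistsPrices() {
        // Mocks stand in for the real loader, rules and persister.
        final PriceLoader loader = context.mock(PriceLoader.class);
        final PriceRules rules = context.mock(PriceRules.class);
        final Persister persister = context.mock(Persister.class);
        final Map<String, Double> clientPrices = Map.of("XYZ", 100.10);

        context.checking(new Expectations() {{
            oneOf(loader).extractPrices("client-feed"); will(returnValue(clientPrices));
            oneOf(rules).loadRules("tolerance"); will(returnValue(List.of("tolerance")));
            oneOf(persister).persistPrices(clientPrices);
        }});

        new PriceComparisonEngine(loader, rules, persister).run("client-feed", "tolerance");

        // Fails the test if any expected call did not happen.
        context.assertIsSatisfied();
    }
}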

17

Test Objects
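The body of this slide isn't in the transcript, so here is a hedged sketch of the "Object Mothers vs Test Builders" idea listed at the end: an Object Mother exposes named, fully formed canonical objects (reusing the stand-in Price type from the builder sketch above), whereas a builder tends to scale better once tests need many small variations.

// Hypothetical Object Mother: named factory methods for canonical test objects.
public final class TestPrices {

    private TestPrices() {}   // static factories only, no instances

    public static Price clientPriceForIbm() {
        return new Price("IBM", 123.45, "CLIENT");
    }

    public static Price priceWithMissingValue() {
        return new Price("IBM", 0.0, "CLIENT");
    }
}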

18

Vendor Software and External Data Providers

• Cannot control when the API and interfaces change
• Complex data sets
• Potentially quite difficult to mock out their interfaces
• Pitfalls of the following approaches
  • Capture and Replay: changes in message format
  • Table dumps: what if the table structure changes due to a version upgrade?
• Solution
  • Use the Vendor API where possible
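One way to soften that dependency (my suggestion, not from the slides) is to call the vendor only through a thin internal interface: tests substitute a fake, and a change in the vendor's API or message format is absorbed by a single adapter. All names below are placeholders.

import java.util.Map;

// Hypothetical internal boundary around an external price feed.
interface MarketPriceFeed {
    Map<String, Double> closingPrices(String portfolioId);
}

// Adapter that is the only place allowed to touch the vendor API.
class VendorPriceFeedAdapter implements MarketPriceFeed {
    private final VendorClient vendorClient;   // placeholder for the vendor's SDK type

    VendorPriceFeedAdapter(VendorClient vendorClient) {
        this.vendorClient = vendorClient;
    }

    @Override
    public Map<String, Double> closingPrices(String portfolioId) {
        // Translate the vendor's response into our own types in one place,
        // so a vendor upgrade only ever breaks this adapter and its tests.
        return vendorClient.fetchClosingPrices(portfolioId);
    }
}

// Placeholder standing in for the vendor SDK so the sketch is self-contained.
class VendorClient {
    Map<String, Double> fetchClosingPrices(String portfolioId) {
        throw new UnsupportedOperationException("real vendor call goes here");
    }
}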

19

People Issues

• The largest contributor to Test Hell: the people in your team
• What are their motivations? What do they get from doing more frequent releases?
• Why do they want to do anything different? Why replace their manual tasks with automation?
• Structure of your team
  • Who is good at what role? Not everyone is the same
  • Personality types such as MBTI
• What if they don't believe in testing?
  • Gamification?
• Who are management focusing their attention on? The poor performers at the cost of good performers?

20

Things to Google …

• Unit Testing
• BDD
• TDD
• jMock
• EasyMock
• PowerMock
• JUnit
• Continuous Integration
• Object Mothers vs Test Builders
• Test Coverage
• DevOps
• FitNesse
• Concordion
• Cucumber
• JBehave
