TRANSCRIPT
P R E S E N T A T I O N

International Conference On Software Testing, Analysis & Review
Nov 8-12, 1999 • Barcelona, Spain
Thursday, Nov 11, 1999

Ed Kit

The New Frontier…
Third Generation Software Testing Automation

Edward Kit
www.sdtcorp.com
[email protected]
© 1999 Software Development Technologies
Essential Testing Challenges
1. How to design and document inspectable tests?
2. What is an effective test automation architecture?
3. How to integrate test design and automation?
Slide 2
Key Benefits:
- Better, more maintainable tests
- Automation achieved with fewer technical testers
- Reduced regression, function, system testing costs
- Higher motivation for participants
Test Automation: Serious Problems
• Lack of an effective test automation architecture
• Lack of required competencies:
– Test Design
– Technical Automation
– Application
• Using capture/playback at the wrong time
• Lack of sufficient resources:
– Not enough time for automation implementation
– Ratio of testers to developers
– Dedicated capital equipment
Slide 3
Test Architecture: Key Recommendations
• Create interfaces between key framework components
• Separate, yet integrate test design and automation
• Recognize that the proper use of several tools is essential
• Allow for capture/playback tool independence
• Provide infrastructure for effective capture/playback
• Create one engine that can process all automated tests
• Customize framework components for the organization
• Evolve the architecture as test technology matures
Slide 4
Proven Test Architecture
• TestFrame is an example of a proven test architecture
• Created in 1994 by Hans Buwalda at CMG
• The TestFrame approach has been used successfully by hundreds of projects in Europe
• SDT successfully used TestFrame to test ReviewPro, a Web-based Enterprise Verification Application that brings automation to Technical Reviews and Inspections
• SDT has extended TestFrame to include the SDT Test Design Templates
• SDT and CMG are working together to evolve TestFrame
• Has been used for On-line, Web, Batch, API, and Embedded applications, for function, system, and acceptance testing
• Reference: [Buwalda, 1998]
Slide 5
Automation Recommendation #1: Just Say No to C/P!
Capture/Playback
Slide 6
Automation Recommendation #2: Separate and Bridge
Test Design: Create Test Cases using design techniques and spreadsheets
The Bridge: Create User Scenarios from test cases using action words
Test Automation: Process User Scenarios
Slide 7
Automation Architecture Overview
Framework components: Test Plan, Feature Hierarchy, Test Architecture, Test Design, Test Cases, User Scenarios, Action Word Dictionary, Automation Design, Automation Script, Execute Tests, Test Effectiveness
Slide 8
Automation Architecture - Tools That Can Help
• Test Plan: Word, Visio
• Feature Hierarchy: Excel
• Test Architecture: TestFrame, Visio, Word; for Architecture Review: ReviewPro
• Test Design: Excel; for Design Review: ReviewPro
• Test Cases: AETGWeb
• User Scenarios: Excel
• Action Word Dictionary: Word
• Automation Design: Word
• Automation Script: TestFrame, Capture/Playback Tool
• Execute Tests: TestFrame
• Test Effectiveness: Word
Slide 9
Common Design Techniques Summary
1. Equivalence Partitioning - single input conditions
2. Equivalence Partitioning - combinations
3. Boundary Value Analysis
4. Output Forcing
5. State Models
6. Error Guessing
References: [Jorgensen, 1995], [Kit, 1996]
Slide 10
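Technique 3 above, Boundary Value Analysis, is mechanical enough to sketch in a few lines of Python. The helper below is this transcript's illustration only; the function name and the mid-range "nominal" pick are assumptions, not part of any tool named in the talk.

```python
def boundary_values(lo, hi):
    """Classic BVA picks for an integer field with limits lo..hi:
    the limits themselves, just inside them, a nominal mid value,
    and the two just-outside (invalid) values."""
    return {
        "valid": [lo, lo + 1, (lo + hi) // 2, hi - 1, hi],
        "invalid": [lo - 1, hi + 1],
    }

# e.g., for a field limited to 1..200:
print(boundary_values(1, 200))
# → {'valid': [1, 2, 100, 199, 200], 'invalid': [0, 201]}
```

The same picks reappear in the boundary value test design later in the talk.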
Feature Hierarchy Spreadsheet

ReviewPro Key Features - Primary Forms: Reviewer's Log Form

Feature                      ID      Worksheet Priority
Drop Down Lists
  Disposition Code           DDDC    High
  Severity                   DDSE    High
  Status                     DDST    High
  Entry Type                 DDET    High
Edit Fields
  Document Under Review      DDDU    High
  Location                   EFLO    Med
  Summary                    EFSU    Med
Link Fields
  Hot Doc Link               LFHD    Med
  General Link               LFGL    High
GUI Attributes

Slide 11
Test Design Spreadsheet Template

Matrix ID:            Matrix Summary:
Author:               Date:              Version:
Feature Hierarchy:
Test Design:          Technique:         Feature Combination:
Risk Analysis:        Impact:            Likelihood:
Test Case ID:
Test Case Validity:
Priority:
Test Condition:
Expected Results:

Slide 12
Test Design Template Choices

Test Design:
• Technique:
  – Equivalence Class
  – Boundary Value
  – Output Condition
  – Special Value
  – State Transition
• Feature Combination:
  – Yes
  – No

Risk Analysis:
• Impact:
  – High
  – Medium
  – Low
  – Unknown
• Likelihood:
  – High
  – Medium
  – Low
  – Unknown

Test Case Validity:
  – Valid
  – Invalid
Priority:
  – High
  – Medium
  – Low

Reference: [Kit, 1999]
Slide 13
Risk Management: Failure Impact & Fault Likelihood

Failure Impact:
How significant is the impact if the features addressed in this test design fail? For example: system goes down, someone dies, basic application fails, money is lost, penalty is applied, users sue.

Fault Likelihood:
How likely is it that people will make mistakes for the features addressed in this test design that will lead to software faults that will lead to product failures?
Examples of indicators that contribute to an increased likelihood of faults getting into the system: weakness in this part of the system architecture, inexperienced team member, geographically distributed team, aggressive schedule.

Slide 14
Test Design Example - The Classic Triangle Routine
• The routine accepts three integer values as input; these are interpreted as representing the lengths of the sides of a triangle. All sides must be at least 1 and have an upper limit of 200. The sides are entered as a comma-delimited list, e.g., "3, 4, 5", by the operator using a keyboard.
• The routine outputs a message that states whether the triangle is scalene (no sides equal), isosceles (two sides equal), or equilateral (all sides equal). The output message is one of the following strings: "Equilateral", "Isosceles", "Scalene", "NoTriangle", "IllegalInput".
Slide 15
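The spec above is concrete enough to sketch directly. The following is not from the talk; it is a minimal reference implementation (the function name `classify_triangle` is hypothetical) against which the expected results in the test matrices on the next slides can be checked.

```python
def classify_triangle(text):
    """Classify a comma-delimited list of three side lengths,
    returning one of the five output strings from the spec."""
    try:
        sides = [int(part.strip()) for part in text.split(",")]
    except ValueError:
        return "IllegalInput"        # non-integer input, e.g. "79, 24, null"
    if len(sides) != 3 or not all(1 <= s <= 200 for s in sides):
        return "IllegalInput"        # wrong arity, or outside the 1..200 limits
    a, b, c = sorted(sides)
    if a + b <= c:
        return "NoTriangle"          # fails the triangle inequality
    if a == b == c:
        return "Equilateral"
    if a == b or b == c:
        return "Isosceles"
    return "Scalene"

print(classify_triangle("3, 4, 5"))  # → Scalene
```

Note the order of the checks: input legality first, then the triangle inequality, then the equality classification.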
Triangle Test Design

Matrix ID: Triangle/Baseline    Matrix Summary: Triangle Baseline Fundamental Cases
Author: Ed Kit                  Date: 4/10/99    Version: 2.5
Test Design:    Technique: Equiv. Class    Feature Combination: No
Risk Analysis:  Impact: High               Likelihood: Low
Test Description:

ID     Test Condition     Side A  Side B  Side C  Validity  Priority  Expected Results
BFC01  Typical Equilat.   100     100     100     Valid     High      Output = Equilateral
BFC02  Typical Isosceles  100     100     10      Valid     High      Output = Isosceles
BFC03  Typical Scalene    30      40      50      Valid     High      Output = Scalene
BFC04  Typical Illegal    100     68      190     Invalid   High      Output = NoTriangle
BFC05  Only Two Sides     79      24      null    Invalid   High      Output = IllegalInput
BFC06  One Side Too Big   201     190     60      Invalid   High      Output = IllegalInput

Slide 16
Triangle Boundary Value Analysis Test Design

ID     Test Condition         Side A  Side B  Side C  Validity  Priority  Expected Results
BVA01  All sides min          1       1       1       Valid     Hi        Output = Equilateral
BVA02  All sides max          200     200     200     Valid     Hi        Output = Equilateral
BVA03  Extreme valid Isosc.   1       200     200     Valid     Hi        Output = Isosceles
BVA04  Small valid Isosceles  1       2       2       Valid     Hi        Output = Isosceles
BVA05  Isosceles near limit   199     199     2       Valid     Hi        Output = Isosceles
BVA06  1 side max+1; legal    201     190     1       Invalid   Hi        Output = IllegalInput
BVA07  All sides max+1        201     201     201     Invalid   Hi        Output = IllegalInput
BVA08  2 sides min, 1 max     1       1       200     Invalid   Med       Output = NoTriangle
BVA09  Nearly minimum         1       2       1       Invalid   Med       Output = NoTriangle
Slide 17
User Scenario

A User Scenario is formed by stringing together test cases previously defined in the test design matrix:

TC023 → TC098 → TC135 → TC257

Slide 18
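The stringing-together idea can be sketched in a few lines of Python. Everything below is hypothetical scaffolding (the case IDs match the slide, but the step bodies are stand-ins): each test case is a callable acting on one shared context, and a scenario is just an ordered list of case IDs.

```python
# Hypothetical stand-in bodies: each test case acts on a shared context
# and returns True (pass) or False (fail).
test_cases = {
    "TC023": lambda ctx: ctx.update(form="open") or True,
    "TC098": lambda ctx: ctx.get("form") == "open",
    "TC135": lambda ctx: ctx.update(entry="logged") or True,
    "TC257": lambda ctx: ctx.get("entry") == "logged",
}

def run_scenario(case_ids, cases):
    """String the named test cases together against one shared context."""
    ctx, results = {}, []
    for cid in case_ids:
        results.append((cid, cases[cid](ctx)))
    return results

US01 = ["TC023", "TC098", "TC135", "TC257"]   # the scenario on this slide
```

Because every step sees the same context, reordering the cases (as the scenario matrix on the next slide does) genuinely produces a different thread through the application.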
Document User Scenarios

User Scenario ID:     US01      US02      US03      US04      US05      TBD
                      Typical   Typical   Illegal   Typical   Illegal
                      Thread 1  Thread 2  Thread 1  Thread 3  Thread 2
Scenario Validity:    Valid     Valid     Invalid   Valid     Invalid
Priority:             High      High      High      Med.      High
Test Case (order within scenario):
  TC023               1         1         2         3         2
  TC098               2         3         1         2         1
  TC135               3         2         3         1         --
  TC257               4         4         4         --        3
Expected Results:     Expected  Expected  Expected  Expected  Expected
                      Result 01 Result 02 Result 03 Result 04 Result 05

Slide 19
Action Words: Key to the Bridge

Action Words:
• Establish a high-level application usage abstraction
• Standardize application actions
• Enable communication between Test Design and Test Case Processor

Tips for designing action words:
• Keep the abstraction at a high level
• Determine what set of user actions the test tool should perform with a specific Action Word
• Scope of the test determines the Action Word level
• Create an Action Word Dictionary

Translate each user scenario into an Action Word spreadsheet

Slide 20
Action Word Spreadsheet Example

User Scenario    US0129
version          1.1
date             2/6/99
author           Hans Buwalda

Section 1   Test Case CL02   Product codes must be unique

                 product code   product   colour   type   weight
enter product    p2             nail      black    AAX    1
expect message   Transaction executed correctly

                 product code   product   colour   type   weight
insert product   p2             nut       grey     AAX    1
expect message   Value in field product code not allowed

A product with another code is allowed
                 product code   product   colour   type   weight
enter product    p3             nail      black    AAX    1

Check for presence of product
                 product code   product   colour   type   weight
check product    p3             nail      black    AAX    1
delete button

Section 2   Test Case CL13   All fields need to be filled.

Slide 21
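The real TestFrame engine is far richer than this, but the dispatch idea behind an action-word spreadsheet can be sketched as follows. The handler bodies and the in-memory catalogue are this transcript's hypothetical stand-ins for real navigation code, not TestFrame's API.

```python
catalogue = {}   # hypothetical stand-in for the application under test

def enter_product(code, product, colour, type_, weight):
    catalogue[code] = (product, colour, type_, weight)
    return "Transaction executed correctly"

def insert_product(code, product, colour, type_, weight):
    if code in catalogue:                       # product codes must be unique
        return "Value in field product code not allowed"
    return enter_product(code, product, colour, type_, weight)

def check_product(code, product, colour, type_, weight):
    return catalogue.get(code) == (product, colour, type_, weight)

# The Action Word Dictionary: spreadsheet action words -> navigation handlers.
actions = {
    "enter product": enter_product,
    "insert product": insert_product,
    "check product": check_product,
}

def run_rows(rows):
    """Each row is (action word, *arguments), as read from the spreadsheet."""
    return [actions[row[0]](*row[1:]) for row in rows]
```

Feeding `run_rows` the Section 1 rows above yields exactly the two messages the spreadsheet expects; test designers edit rows, and only the handlers need technical automation skill.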
More Uses For Spreadsheet Test Design
• Configuration Design
• Performance Testing:
– Load and Stress
– Context Switching
– Client / Server
• Design and re-use test design components:
Test cases -> User Scenarios -> Stress Scenarios
Slide 22
Typical Web Configuration Combinations

A partial set of configuration choices for ReviewPro, a Web-based Review and Inspection application, includes:

Client Browsers (5):
• Netscape (3.0, 4.0, 4.5), Internet Explorer (3.0, 4.0)
Client Operating Systems (5):
• Windows (3.x, 95, 98, NT), Sun (Solaris 2.51)
Application Server Operating Systems (3):
• NT (3.51, 4.0 with Service Pack 3, 4.0 with Service Pack 4)
Database Types (4):
• Sybase, Oracle, Informix, Microsoft
Web Server Software (4):
• MS IIS, Apache, Netscape (Enterprise, FastTrack)
Slide 23
The Combination Mess
The goal: re-run a core set of tests in the right mix of configurations to achieve effective coverage and find defects.
• From the previous slide, there are:
5 * 5 * 3 * 4 * 4 = 1200
possible configurations.
• Assume there are 500 core tests.
• This results in the need to run:
1200 * 500 = 600,000
tests!
• Using a real-world web application, ReviewPro, the calculation resulted in 4,800,000 tests!
• There must be a more practical solution! (There is.)
Slide 24
Dealing with the Combination Mess

Reductions must be made, yet sufficient coverage is required. Choices include:
• Use a spreadsheet to manually select a reasonable subset.
• Use a tool which automatically selects a small number.

Slide 25
Design Configurations Using Spreadsheets

ReviewPro Configuration Test Categories:

Combination ID:        RCT01      RCT02       RCT03      RCT04      RCT05
Combination Validity:  Valid      Valid       Valid      Invalid    Valid
Priority:              High       High        High       High       High
Client Browser:        NS 4.5     IE 4.0      NS 4.0     IE 3.0     NS 3.0
Client OS:             Win98      WinNT       Win95      Win3.x     Solaris 2.5
Appl. Server OS:       NT 4 SP 3  NT 4 SP 4   NT 3.51    NT 4 SP 3  NT 4 SP 3
Database:              Oracle     Sybase      Microsoft  Informix   Oracle
Web Server SW:         MS IIS     Enterprise  FastTrack  Apache     MS IIS
Expected Results:      Expected Result RCT01 … Expected Result RCT05

Slide 26
Reducing Test Cases - Reducing Testing Costs
• Combinatorial design theory can be used to reduce the number of tests when an astronomical number of test scenarios are possible.
• Bellcore developed a system called AETG (Automatic Efficient Test Generator) to generate tests for unit, system, and interoperability testing.
• AETGWeb is commercially available as a service.
• Customers interact with the AETG software over a secured Internet connection.
• Users enter test specifications, and test cases are returned.
• The goal is to substantially reduce testing costs.
• References: [Dalal, 1996], [Sherwood, 1994]
Slide 27
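AETG's actual algorithm is proprietary, but the covering-array idea it relies on can be illustrated with a naive greedy pairwise generator. Everything below is a sketch of the general technique, not AETG: repeatedly pick the full configuration that covers the most not-yet-covered value pairs.

```python
from itertools import combinations, product

def pairwise(factors):
    """Greedy sketch of pairwise (2-way) covering-array generation."""
    names = list(factors)
    # Every (factor pair, value pair) that some chosen configuration must hit.
    uncovered = {((i, a), (j, b))
                 for i, j in combinations(range(len(names)), 2)
                 for a in factors[names[i]]
                 for b in factors[names[j]]}
    suite = []
    while uncovered:
        best, best_hits = None, set()
        for config in product(*factors.values()):
            hits = {((i, config[i]), (j, config[j]))
                    for i, j in combinations(range(len(config)), 2)} & uncovered
            if len(hits) > len(best_hits):
                best, best_hits = config, hits
        suite.append(dict(zip(names, best)))
        uncovered -= best_hits
    return suite

# The five factors from Slide 23 (1200 exhaustive combinations).
factors = {
    "Browser":  ["NS3", "NS4", "NS4.5", "IE3", "IE4"],
    "ClientOS": ["Win3.x", "Win95", "Win98", "WinNT", "Solaris2.51"],
    "ServerOS": ["NT3.51", "NT4SP3", "NT4SP4"],
    "Database": ["Sybase", "Oracle", "Informix", "Microsoft"],
    "WebSW":    ["MSIIS", "Apache", "Enterprise", "FastTrack"],
}
suite = pairwise(factors)
print(len(suite), "configurations cover every value pair")
```

This brute-force greedy pass lands in the same ballpark as the 25-row AETG table on the next slide, versus 1200 configurations exhaustively; production tools use smarter search than enumerating all configurations each round.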
AETG Configuration  Browser  Client_OS     Server_OS  Database   Web_Server_SW
 1                  NS4.5    Win3.x        NT3.51     Informix   FastTrack
 2                  NS4      Solaris 2.51  NT4SP4     Microsoft  FastTrack
 3                  IE3      Win3.x        NT4SP4     Sybase     Apache
 4                  NS4.5    Win95         NT3.51     Oracle     Apache
 5                  IE4      WinNT         NT3.51     Microsoft  Enterprise
 6                  NS4      Win98         NT4SP4     Oracle     FastTrack
 7                  NS3      Win98         NT3.51     Sybase     Enterprise
 8                  IE3      WinNT         NT4SP4     Oracle     FastTrack
 9                  IE4      Solaris 2.51  NT4SP4     Informix   Apache
10                  NS4.5    Win98         NT4SP4     Sybase     FastTrack
11                  IE3      Solaris 2.51  NT4SP3     Informix   Enterprise
12                  IE3      Win95         NT4SP3     Microsoft  MSIIS
13                  IE4      Win3.x        NT4SP4     Oracle     Enterprise
14                  NS3      WinNT         NT4SP3     Sybase     Apache
15                  …        …             …          …          MSIIS
16                  …        …             …          …          Apache
17                  …        …             …          …          Enterprise
18                  …        …             …          …          MSIIS
19                  …        …             …          …          Apache
20                  …        …             …          …          MSIIS
21                  …        …             …          …          Enterprise
22                  …        …             …          …          FastTrack
23                  …        …             …          …          MSIIS
24                  NS3      Win3.x        NT4SP3     Microsoft  FastTrack
25                  NS4      WinNT         NT3.51     Informix   Enterprise
Nearly 50:1 Reduction:
• 25 configurations instead of 1200
• 12,500 tests instead of 600,000
Slide 27a
A Spreadsheet for Load & Stress Tests

Matrix ID: MLS    Matrix Summary: Master Load & Stress
Feature Hierarchy: Ensure that representative user scenarios scale
Risk Analysis:  Impact: High    Likelihood: High

Stress Test ID:    MLS01        MLS02          MLS03       MLS04
                   Entry Level  Normal Loaded  Max Valid   Too Many
                   System       System         US001       Users
Validity:          Valid        Valid          Valid       Invalid
Priority:          High         High           High        High
User Scenario (# of Users):
  US001            10           100            170         200
  US009            15           150            100         150
  US012            10           100            80          200
Expected Results:  Expected Result MLS01 … Expected Result MLS04

Slide 28
TestFrame Engine Context Switching

Sets of Action Words:
  …
  switch context   Mortgages
  enter client     John Jones   200000   30
  switch context   Loans
  enter client     John Jones   15000
  …

The engine's navigation layer routes each action word to the current context: target system 1, target system 2, …, target system n.

Slide 29
TestFrame Engine Client Server

  …
  begin cluster    Mortgages
  enter account    John Jones   200000   30
  end cluster
  start server     James
  …

Client navigation drives multiple server navigation layers against the target system (Mortgages).

Slide 30
TestFrame Roles and Responsibilities
• Test Architect -- Creates the overall approach to verification and validation, including an integrated approach to test process and automation
• Test Planner/Manager -- Provides test planning, schedule, scope, resources, etc.
• Test Automation Engineer -- Creates Test Case Processor script
• Test Designer -- Creates and documents test design, participates in test design inspection
• Test Executor -- Runs and evaluates tests
Slide 31
Automation Architecture Roles
(AE = Automation Engineer)
• Test Plan: Manager
• Feature Hierarchy: Designer
• Test Architecture: Architect
• Test Design: Designer
• Test Cases: Designer
• User Scenarios: Designer
• Action Word Dictionary: Designer
• Automation Design: AE
• Automation Script: AE
• Execute Tests: Executor
• Test Effectiveness: Manager
Slide 32
Case Study: Key Test Tool Usage

• Test Case Processor: Application Specific Code plus a Test Case Processor Engine, e.g., TestFrame
• Capture/Playback Tool: e.g., WebTest/WinRunner, Robot, SilkTest, QAPlayback
• Requirements Management: e.g., DOORS, Requisite Pro, RTM (feeding a Requirements Repository)
• Technical Review Management: e.g., ReviewPro (feeding a Review Repository)
• Test Design Spreadsheet Template: e.g., Excel (feeding a Test Design Repository)
• The tests run against the Software Under Test and produce a Test Results Report

Slide 33
Benefits of an Effective Test Architecture
• Better, more maintainable tests
• Improved test design and development
• Reduced costs, especially for regression testing
• Higher motivation for participants
• Fewer highly technical testers required
• Less sensitive to target system changes
• Better organizational approach:
– Clearer separation of tasks
– Tangible products
– Accountability
Slide 34
Summary
• Create an effective Test Automation Architecture
• Focus on Test Design
• Build a bridge between Test Design and Automation
• Use spreadsheets for:
– Feature Decomposition
– Basic Test Case Design
– Configuration Combination Design
– Load and Stress Design
• Verify - Don't forget Technical Reviews and Inspections
Slide 35
References
• Buwalda, Hans, "Testing with Action Words," STAR, May 1998
• Dalal, Siddhartha R. (with Cohen, Parelius, and Patton), "The Combinatorial Design Approach to Automatic Test Generation," International Symposium on Software Engineering, 1996
• Jorgensen, Paul C., Software Testing - A Craftsman's Approach, CRC Press, 1995
• Kit, Edward, Software Testing in the Real World, Addison-Wesley Longman, 1996
• Kit, Edward, "Integrated, Effective Test Design and Automation," Software Development Magazine, February 1999
• Sherwood, George B., "Effective Testing of Factor Combinations," STAR, 1994
Slide 36
The End

The New Frontier…
Third Generation Software Testing Automation

Edward Kit
www.sdtcorp.com
[email protected]
Ed Kit
Edward Kit, founder and president of Software Development Technologies, is well known as a test expert, author, and keynote speaker at testing conferences. His best-selling book, Software Testing in the Real World: Improving the Process, has been adopted as a standard by companies around the world such as Sun Microsystems, Exxon, Chase Manhattan Bank, and Cadence Design Systems. His feature articles in Software Development Magazine have outlined new state-of-the-practice test automation models that are currently being adopted around the world. Mr. Kit continues to advise clients on bringing practical and proven software quality practices to their development efforts.