1
Model-Based Testing
Jan Tretmans
Embedded Systems Institute, Eindhoven, NL
Radboud University, Nijmegen, NL
2
Overview
Software testing
Model-based testing
Model-based testing
with labelled transition systems
Testing of component-based systems
Model-based testing of components
3
Software Testing:
What? How? Who? Sorts?
4 © Jan Tretmans
Paradox of Software Testing
Testing is: important, much practiced, 30-50% of project effort, expensive, time critical, not constructive (but sadistic?)
But also: ad-hoc, manual, error-prone, hardly any theory / research, no attention in curricula, not cool:
"if you're a bad programmer you might be a tester"
Attitude is changing: more awareness, more professional
5 © Jan Tretmans
The Triangle Program [Myers]
“A program reads three integer values. The three values
are interpreted as representing the lengths of the sides
of a triangle. The program prints a message that states
whether the triangle is scalene, isosceles, or equilateral.”
Write a set of test cases to test this program
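Before reading on, it helps to have a concrete program in mind. The following is a minimal sketch of one possible triangle classifier in Python (an illustrative implementation, not Myers' original program); the boundary cases in the lists below (zero, negative, degenerate sides, non-integer input) are exactly where naive implementations fail.

```python
# A minimal sketch of one possible triangle classifier (illustrative,
# not Myers' original program).
def classify(a, b, c):
    sides = (a, b, c)
    if any(not isinstance(s, int) for s in sides):
        return "invalid"            # non-integer input
    if any(s <= 0 for s in sides):
        return "invalid"            # zero or negative side
    x, y, z = sorted(sides)
    if x + y <= z:
        return "invalid"            # violates the triangle inequality
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"
```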
6 © Jan Tretmans
Test cases for: The Triangle Program [Myers]
1. valid scalene triangle ?
2. valid equilateral triangle ?
3. valid isosceles triangle ?
4. 3 permutations of previous ?
5. side = 0 ?
6. negative side ?
7. one side is sum of others ?
8. 3 permutations of previous ?
9. one side larger than sum of others ?
10. 3 permutations of previous ?
11. all sides = 0 ?
12. non-integer input ?
13. wrong number of values ?
14. for each test case: is expected output specified ?
15. check behaviour after output was produced ?
7 © Jan Tretmans
Test cases for: The Triangle Program [Myers]
1. is it fast enough ?
2. doesn't it use too much memory ?
3. is it learnable ?
4. is it usable for intended users ?
5. is it secure ?
6. does it run on different platforms ?
7. is it portable ?
8. is it easily modifiable ?
9. is the availability sufficient ?
10. is it reliable ?
11. does it comply with relevant laws ?
12. doesn't it do harm to other applications ?
13. . . . . .
8 © Jan Tretmans
Testing: A Definition
Software testing is:
• a technical process,
• performed by executing / experimenting with a product,
• in a controlled environment, following a specified procedure,
• with the intent of measuring one or more characteristics / quality of the software product
• by demonstrating the deviation of the actual status of the product from the required status / specification.
9 © Jan Tretmans
Testing and Software Quality
Quality : totality of characteristics of an entity (ISO 9126) that bear on its ability to satisfy stated and implied needs
Testing: measuring the quality of a product; obtaining confidence in the quality of a product
How to specify quality ?
♦ explicit / implicit / legal requirements
How to classify quality ?
How to quantify quality ?
How to measure quality ?
How to obtain good tests ?
10 © Jan Tretmans
Quality: There is more than Testing
♦ static analysis
♦ reviewing, inspection, walk-through
♦ debugging
♦ certification, CMM
♦ quality control, requirement management
♦ software process
♦ verification
♦ testing
11 © Jan Tretmans
Sorts of Testing
Many different types and sorts of testing:
♦ functional testing, acceptance testing, duration testing,
♦ performance testing, interoperability testing, unit testing,
♦ black-box testing, white-box testing,
♦ regression testing, reliability testing, usability testing,
♦ portability testing, security testing, compliance testing,
♦ recovery testing, integration testing, factory test,
♦ robustness testing, stress testing, conformance testing,
♦ developer testing, acceptance testing, production testing,
♦ module testing, system testing, alpha test, beta test,
♦ third-party testing, specification-based testing, . . .
12 © Jan Tretmans
Sorts of Testing
[Diagram: the "testing cube" : Level of detail: unit, module, integration, system; Accessibility: white box, black box; Characteristics: functionality, reliability, usability, efficiency, maintainability, portability]
Other dimensions: phases in development, who does it, goal of testing, . . .
13
Model-Based Testing
14 © Jan Tretmans
Towards Model-Based Testing: Trends
Increase in complexity, and quest for higher quality software
More abstraction
♦ less detail
♦ model-based development; OMG's UML, MDA
Checking quality
♦ practice: testing - ad hoc, too late, expensive, lot of time
♦ research: formal verification - proofs, model checking, . . . with disappointing practical impact
Software bugs / errors cost the US economy yearly: $59,500,000,000 (~ €50 billion) (www.nist.gov); $22 billion of this could be eliminated…
15 © Jan Tretmans
x : [0..9]  →  10 combinations of inputs to check, 10 ways that it can go wrong
x, y : [0..9]  →  100 combinations of inputs to check, 100 ways that it can go wrong
x, y, z : [0..9]  →  1000 combinations of inputs to check, 1000 ways that it can go wrong
Towards Model-Based Testing: Trends
Testing effort grows exponentially with system size: testing cannot keep pace with the development of complexity and size of systems.
16 © Jan Tretmans
Model-Based Testing
Model-based testing has the potential to combine
♦ practice - testing
♦ theory - formal methods
Model-Based Testing:
♦ testing with respect to a (formal) model / specification: state model, pre/post, CSP, Promela, UML, Spec#, . . .
♦ promises better, faster, cheaper testing:
• algorithmic generation of tests and test oracles: tools
• formal and unambiguous basis for testing
• measuring the completeness of tests
• maintenance of tests through model modification
17 © Jan Tretmans
Types of Testing
[Diagram: the testing cube again : level of detail (unit, module, integration, system), accessibility (white box, black box), characteristics (functionality, reliability, usability, efficiency, maintainability, portability)]
18 © Jan Tretmans
Model-Based Testing: Formal Models
Use mathematics to model the relevant parts of the software:
♦ precise, formal semantics: no room for ambiguity or misinterpretation
♦ allows formal validation and reasoning about systems
♦ amenable to tools: automation
Examples: Z, Temporal Logic, First-order logic, SDL, LOTOS, Promela, Labelled Transition Systems, Finite State Machines, Process algebra, UML, . . .
19 © Jan Tretmans
Automated Model-Based Testing
[Diagram: a test generation tool derives (TTCN) test cases from the model; a test execution tool runs them against the IUT, yielding pass / fail]
IUT passes tests  ⇔  IUT conf-to model
( ⇐ : soundness ; ⇒ : exhaustiveness )
20 © Jan Tretmans
A Model-Based Development Process
[Diagram: informal ideas (informal world) are formalized, via validation, into specification, design, and code (world of models); formal verification relates the models; model-based testing relates the models to the realization (real world)]
21 © Jan Tretmans
Model-Based Testing = Formal, Specification-Based, Functional Testing
Testing the functional behaviour of a black-box implementation under test (IUT), with respect to a model in a well-defined language, based on a formal definition of correctness.
The specification / model is the basis for testing and is assumed to be correct.
22 © Jan Tretmans
Approaches to Model-Based Testing
Several modelling paradigms:
Finite State Machine
Pre/post-conditions
Labelled Transition Systems
Programs as Functions
Abstract Data Type testing
. . . . . . .
23
Model-Based Testingwith Labelled Transition Systems
24 © Jan Tretmans
Model-Based Testing with Transition Systems
[Diagram: the same scheme, now instantiated: specification s ∈ LTS ; implementation i ∈ IOTS ; correctness: i ioco s ; test generation: gen : LTS → ℘(TTS) ; test execution: t || i → pass / fail]
i passes all tests in gen(s)  ⇔  i ioco s
( ⇐ : soundness ; ⇒ : exhaustiveness )
25 © Jan Tretmans
Model-Based Testing with LTS
[Diagram: test generation derives tests from the model; test execution runs them against the IUT, yielding pass / fail]
Involves:
• specification: model
• implementation: IUT + models of IUTs
• correctness: implementation relation imp
• test cases
• test generation
• test execution
• test result analysis
26 © Jan Tretmans
Model-Based Testing with LTS (recap)
27 © Jan Tretmans
Labelled Transition Systems
A labelled transition system is a 4-tuple ⟨ S, L, T, s0 ⟩ :
• states S
• actions: labels in L (in the example: ConReq, ConConf, Data, Discon)
• transitions T ⊆ S × (L∪{τ}) × S
• initial state s0 ∈ S
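The 4-tuple above can be sketched directly as a Python value (an illustrative structure, not from any particular tool; the example is a small coffee machine in the style of these slides):

```python
# Sketch of the 4-tuple <S, L, T, s0> as a plain Python value.
from typing import NamedTuple

class LTS(NamedTuple):
    states: frozenset        # S
    labels: frozenset        # observable actions L (tau kept separate)
    transitions: frozenset   # T as (state, action, state) triples
    initial: str             # s0, an element of S

coffee_machine = LTS(
    states=frozenset({"S0", "S1", "S2", "S3"}),
    labels=frozenset({"dub", "coffee", "tea"}),
    transitions=frozenset({("S0", "dub", "S1"),
                           ("S1", "coffee", "S2"),
                           ("S1", "tea", "S3")}),
    initial="S0",
)
```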
28 © Jan Tretmans
Labelled Transition Systems
[Diagram: example transition systems over actions a, b, c and internal action τ]
29 © Jan Tretmans
Labelled Transition Systems
[Diagram: an LTS s with states S0 . . S5 over L = { dub, kwart, coffee, tea, soup }]
• transition: S0 --dub--> S1
• composition of transitions: S0 =dub·coffee⇒ S3
• executable sequence: S0 =kwart·tea⇒
• non-executable sequence: S0 =kwart·soup⇒
• LTS(L) : all transition systems over L
30 © Jan Tretmans
Labelled Transition Systems
[Diagram: s with S0 --dub--> S1 --coffee--> S3 and S0 --dub--> S2 --tea--> S4]
Sequences of observable actions: traces(s) = { σ ∈ L* | s =σ⇒ }
Reachable states: s after σ = { s' | s =σ⇒ s' }
Examples:
traces(s) = { ε, dub, dub·coffee, dub·tea }
s after dub = { S1, S2 }
s after dub·tea = { S4 }
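For finite systems, `traces` and `after` can be computed directly; the sketch below (illustrative code, not from any tool) treats "tau" as unobservable and bounds the trace length, and the trace preorder `≤tr` introduced later is then just a subset check on the two trace sets.

```python
# Sketch: `s after sigma` and bounded traces(s) for a finite LTS given
# as a set of (state, action, state) triples; "tau" is unobservable.
TAU = "tau"

def tau_closure(trans, states):
    """All states reachable via zero or more tau steps."""
    closed, todo = set(states), list(states)
    while todo:
        s = todo.pop()
        for (p, a, q) in trans:
            if p == s and a == TAU and q not in closed:
                closed.add(q)
                todo.append(q)
    return closed

def step(trans, states, label):
    """One observable step: tau* . label . tau*"""
    pre = tau_closure(trans, states)
    post = {q for (p, a, q) in trans if p in pre and a == label}
    return tau_closure(trans, post)

def after(trans, s0, sigma):
    """s after sigma = states reachable by the observable trace sigma."""
    states = {s0}
    for a in sigma:
        states = step(trans, states, a)
    return states

def traces(trans, s0, labels, depth):
    """All observable traces of length <= depth."""
    result, frontier = {()}, {()}
    for _ in range(depth):
        frontier = {t + (a,) for t in frontier for a in labels
                    if step(trans, after(trans, s0, t), a)}
        result |= frontier
    return result
```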
31 © Jan Tretmans
Representation of LTS
Explicit:
⟨ {S0,S1,S2,S3}, {dub,coffee,tea}, { (S0,dub,S1), (S1,coffee,S2), (S1,tea,S3) }, S0 ⟩
Transition tree / graph:
[Diagram: S0 --dub--> S1, S1 --coffee--> S2, S1 --tea--> S3]
Language / behaviour expression:
S := dub ; ( coffee ; stop [] tea ; stop )
32 © Jan Tretmans
Labelled Transition Systems
[Diagram: behaviour expressions and their transition systems]
• stop
• a ; b ; stop
• a ; stop [] b ; stop
• P where P := a ; P
33 © Jan Tretmans
Labelled Transition Systems
[Diagram: behaviour expressions and their transition systems]
• a ; stop ||| b ; stop
• a ; b ; stop [] a ; c ; stop
• a ; stop [] τ ; c ; stop
• Q where Q := a ; ( b ; stop ||| Q )
34 © Jan Tretmans
Example : Parallel Composition
M || U =dub·coffee⇒ M' || U'
[Diagram: machine M (dub → coffee, kwart → tea) composed in parallel with user U (dub, then coffee or tea)]
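Parallel composition can be sketched as a product construction; the version below synchronises on shared labels and interleaves the rest, which is one common choice (illustrative code and names, not the slides' exact semantics):

```python
# Sketch: parallel composition of two LTSs given as transition triples,
# synchronising on labels that occur in both, interleaving the others.
def compose(trans_m, trans_u, start):
    labels_m = {a for (_, a, _) in trans_m}
    labels_u = {a for (_, a, _) in trans_u}
    shared = labels_m & labels_u
    product, seen, todo = set(), {start}, [start]
    while todo:
        m, u = todo.pop()
        steps = []
        for (p, a, q) in trans_m:
            if p != m:
                continue
            if a in shared:   # both sides must agree on shared actions
                steps += [(a, (q, t)) for (r, b, t) in trans_u
                          if r == u and b == a]
            else:             # M moves alone
                steps.append((a, (q, u)))
        for (r, b, t) in trans_u:
            if r == u and b not in shared:
                steps.append((b, (m, t)))   # U moves alone
        for (a, nxt) in steps:
            product.add(((m, u), a, nxt))
            if nxt not in seen:
                seen.add(nxt)
                todo.append(nxt)
    return product
```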
35 © Jan Tretmans
Model-Based Testing with LTS
test execution
testgeneration
tests
model
IUT
imp
Involves:• specification model
• implementation IUT + models of IUTs• correctness imp• test cases• test generation• test execution• test result analysis
pass / fail
=
36 © Jan Tretmans
Equivalences on Labelled Transition Systems: Observable Behaviour
[Diagram: pairs of transition systems over a and τ : are they equivalent (≈)?]
"Some transition systems are more equal than others"
37 © Jan Tretmans
Comparing Transition Systems
Suppose an environment interacts with the systems S1 and S2:
♦ the environment tests the system as a black box, by observing and actively controlling it;
♦ the environment acts as a tester.
Two systems are equivalent if they pass the same tests.
38 © Jan Tretmans
Comparing Transition Systems
S1 ≈ S2 ⇔ ∀ e ∈ E . obs( e, S1 ) = obs( e, S2 )
39 © Jan Tretmans
Trace Equivalence
s1 ≈tr s2 ⇔ traces( s1 ) = traces( s2 )
Traces: traces(s) = { σ ∈ L* | s =σ⇒ }
40 © Jan Tretmans
(Completed) Trace Equivalence
[Diagram: example systems over a, b and τ that are trace equivalent (≈tr) but not completed-trace equivalent (≈ctr)]
41 © Jan Tretmans
Equivalences on Transition Systems (weak and strong variants; from coarse to fine):
• trace : observing sequences of actions
• completed trace : observing sequences of actions and their end
• failures = testing : test an LTS with another LTS
• failure trace = refusal : test an LTS with another LTS, and try again (continue) after failure
• bisimulation ( weak ) : test an LTS with another LTS, and undo, copy, repeat as often as you like
• isomorphism : now you need to observe τ's . . .
42 © Jan Tretmans
Equivalences : Examples
[Diagram: four example systems p, q, r, s over actions a, b, c, d]
43 © Jan Tretmans
Preorders on Transition Systems
Suppose an environment interacts with the black-box implementation i and with the specification s:
♦ i correctly implements s if all observations of i can be related to observations of s.
44 © Jan Tretmans
Preorders on Transition Systems
i ≤ s ⇔ ∀ e ∈ E . obs( e, i ) ⊆ obs( e, s )
with implementation i ∈ LTS and specification s ∈ LTS.
45 © Jan Tretmans
Trace Preorder
i ≤tr s = traces(i) ⊆ traces(s)
[Diagram: example coffee machines ordered (and not ordered) by ≤tr]
46 © Jan Tretmans
Model-Based Testing with LTS (recap)
47 © Jan Tretmans
Input-Output Transition Systems
[Diagram: coffee machine with inputs ?dub, ?kwart and outputs !coffee, !tea]
input LI = { ?dub, ?kwart } : from user to machine; initiative with user; machine cannot refuse
output LU = { !coffee, !tea } : from machine to user; initiative with machine; user cannot refuse
LI ∩ LU = ∅ , LI ∪ LU = L
48 © Jan Tretmans
Input-Output Transition Systems
IOTS( LI , LU ) ⊆ LTS( LI ∪ LU )
An IOTS is an LTS with inputs and outputs, and always-enabled inputs:
for all states s, for all inputs ?a ∈ LI : s =?a⇒
Example: LI = { ?dub, ?kwart }, LU = { !coffee, !tea }
[Diagram: coffee machine completed with ?dub / ?kwart self-loops so that every state enables every input]
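One way to obtain an IOTS from an LTS is to complete it with ignoring self-loops for every input a state does not enable; a sketch (a hypothetical helper; demonic completion to a chaos state, as on the later uioco slide, is an alternative):

```python
# Sketch: "angelic" input completion: add a self-loop for every input
# that a state does not enable, so all inputs are accepted and ignored.
def input_enable(states, inputs, trans):
    completed = set(trans)
    for s in states:
        enabled = {a for (p, a, q) in trans if p == s}
        for i in inputs:
            if i not in enabled:
                completed.add((s, i, s))   # accept and ignore the input
    return completed
```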
49 © Jan Tretmans
Input-Output Transition Systems: Example Models
[Diagram: several input-enabled coffee-machine models over ?dub, ?kwart, !coffee, !tea, !choc]
50 © Jan Tretmans
Preorders on Input-Output Transition Systems
implementation i ∈ IOTS( LI , LU ) , specification s ∈ LTS( LI , LU )
imp ⊆ IOTS( LI , LU ) × LTS( LI , LU )
Observing IOTS: the system's inputs interact with the environment's outputs, and vice versa.
51 © Jan Tretmans
Correctness: Implementation Relation ioco
i ioco s =def ∀σ ∈ Straces(s) : out( i after σ ) ⊆ out( s after σ )
where
p --δ--> p ⇔ ∀ !x ∈ LU ∪ {τ} : p -!x-/-> (quiescence: no output or internal action enabled)
out( P ) = { !x ∈ LU | p -!x-> , p ∈ P } ∪ { δ | p --δ--> p , p ∈ P }
Straces( s ) = { σ ∈ (L ∪ {δ})* | s =σ⇒ }
p after σ = { p' | p =σ⇒ p' }
52 © Jan Tretmans
Correctness: Implementation Relation ioco
i ioco s =def ∀σ ∈ Straces(s) : out( i after σ ) ⊆ out( s after σ )
Intuition: i ioco-conforms to s iff
• if i produces output x after trace σ, then s can produce x after σ
• if i cannot produce any output after trace σ, then s cannot produce any output after σ ( quiescence δ )
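For small, tau-free systems this condition can be checked directly by exploring the suspension traces of the specification up to a bound; a sketch (illustrative code, exhaustive only up to the given depth, with quiescence represented by the label "delta"):

```python
# Sketch of a bounded ioco check on finite, tau-free transition systems.
DELTA = "delta"

def out(trans, states, outputs):
    """out(P): enabled outputs, plus delta for quiescent states."""
    res = set()
    for s in states:
        enabled = {a for (p, a, q) in trans if p == s and a in outputs}
        res |= enabled or {DELTA}
    return res

def after(trans, states, label, outputs):
    """One step of the suspension automaton (tau-free sketch)."""
    if label == DELTA:   # quiescence: keep the states with no output
        return {s for s in states
                if not any(p == s and a in outputs for (p, a, q) in trans)}
    return {q for (p, a, q) in trans if p in states and a == label}

def ioco(trans_i, i0, trans_s, s0, outputs, depth):
    """out(i after sigma) subset of out(s after sigma) for all
    suspension traces sigma of s up to the given depth."""
    labels = {a for (_, a, _) in trans_s} | {DELTA}
    frontier = [({i0}, {s0})]
    for _ in range(depth + 1):
        nxt = []
        for (I, S) in frontier:
            if not (out(trans_i, I, outputs) <= out(trans_s, S, outputs)):
                return False
            for a in labels:
                S2 = after(trans_s, S, a, outputs)
                if S2:   # sigma.a is a suspension trace of s
                    nxt.append((after(trans_i, I, a, outputs), S2))
        frontier = nxt
    return True
```

Note how the check catches both kinds of violation from the intuition above: a forbidden output, and forbidden quiescence (delta where the specification requires an output).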
53 © Jan Tretmans
Implementation Relation ioco : Examples
[Diagram: candidate coffee-machine implementations compared with specification s ( ?dub → !coffee ); some satisfy ioco, others do not]
54 © Jan Tretmans
Implementation Relation ioco : an equation solver for y² = x
specification s : after ? x ( x ≥ 0 ), output ! √x or ! -√x ; for x < 0 , the next input ? y may follow (no output specified)
implementation i : after ? x ( x ≥ 0 ), output ! √x only
Then i ioco s , but not s ioco i .
55 © Jan Tretmans
Model-Based Testing with LTS (recap)
56 © Jan Tretmans
Test Cases
A model of a test case is a transition system with:
♦ labels in L ∪ { θ } , where θ is the 'quiescence' label (observe that no output arrives)
♦ tree-structured, 'finite', deterministic
♦ sink states pass and fail
♦ from each state: either one input !a together with all outputs ?x, or all outputs ?x and θ
Note the mirrored roles: inputs of the IUT (e.g. dub) are outputs of the test case, and outputs of the IUT (e.g. coffee, tea) are observed.
[Diagram: example test case: supply !dub, observe ?coffee / ?tea / θ, supply !kwart, . . . , ending in pass or fail]
57 © Jan Tretmans
Model-Based Testing with LTS (recap)
58 © Jan Tretmans
Test Generation
i ioco s =def ∀σ ∈ Straces(s) : out( i after σ ) ⊆ out( s after σ )
[Diagram: after trace σ, the specification s allows out( s after σ ) = { !x, !y, δ } ; the implementation i produces out( i after σ ) = { !x, !z, δ } ; the test observes out( test after σ ) = LU ∪ { θ } and gives pass for allowed observations ( ?x, ?y, θ ) and fail for forbidden ones ( ?z )]
59 © Jan Tretmans
Test Generation Algorithm
To generate a test case t(S) from a transition system specification, with S ≠ ∅ a set of states ( initially S = s0 after ε ), apply the following steps recursively, non-deterministically:
1. end the test case: pass
2. supply input !a ( with S after ?a ≠ ∅ ): continue with t( S after ?a ) ; meanwhile outputs may arrive: allowed outputs ?x continue with t( S after !x ) , forbidden outputs ?y lead to fail
3. observe all outputs: allowed outputs (or δ) !x ∈ out( S ) continue with t( S after !x ) ( with t( S after δ ) for θ ) ; forbidden outputs (or δ) !y ∉ out( S ) lead to fail
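The three recursive choices can be sketched as follows (a simplified, tau-free version; the test case is returned as a nested dict mapping stimuli and observations to subtrees or verdicts, with "theta" observing quiescence; illustrative code):

```python
# Sketch of the test-generation algorithm on a tau-free, finite spec.
import random

DELTA, THETA = "delta", "theta"

def out(trans, states, outputs):
    res = set()
    for s in states:
        enabled = {a for (p, a, q) in trans if p == s and a in outputs}
        res |= enabled or {DELTA}
    return res

def after(trans, states, label, outputs):
    if label == DELTA:
        return {s for s in states
                if not any(p == s and a in outputs for (p, a, q) in trans)}
    return {q for (p, a, q) in trans if p in states and a == label}

def gen_test(trans, S, inputs, outputs, depth, rng):
    if depth == 0:
        return "pass"                       # choice 1: end test case
    allowed = out(trans, S, outputs)
    stimuli = [a for a in inputs if after(trans, S, a, outputs)]
    test = {}
    if stimuli and rng.random() < 0.5:      # choice 2: supply an input
        a = rng.choice(stimuli)
        test[a] = gen_test(trans, after(trans, S, a, outputs),
                           inputs, outputs, depth - 1, rng)
    else:                                   # choice 3: observe quiescence
        test[THETA] = (gen_test(trans, after(trans, S, DELTA, outputs),
                                inputs, outputs, depth - 1, rng)
                       if DELTA in allowed else "fail")
    for x in outputs:                       # in both cases: all outputs
        test[x] = (gen_test(trans, after(trans, S, x, outputs),
                            inputs, outputs, depth - 1, rng)
                   if x in allowed else "fail")
    return test
```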
60 © Jan Tretmans
Test Generation Example
[Diagram: tests generated from the coffee-machine specification s ( ?dub, then !tea or !coffee ): supply !dub, then observe; ?tea and ?coffee lead to pass or fail according to s, ?choc leads to fail, θ leads to fail]
61 © Jan Tretmans
Test Generation Example: the equation solver for y² = x
To cope with non-deterministic behaviour, tests are not linear traces, but trees.
specification: after ? x ( x ≥ 0 ), output ! √x or ! -√x
[Diagram: a test tree: supply 9, accept either answer ( ?3 or ?-3 , otherwise fail); supply 4, accept ?2 or ?-2 ( pass , otherwise fail )]
62 © Jan Tretmans
Model-Based Testing with LTS (recap)
63 © Jan Tretmans
Test Execution
Test execution = all possible parallel executions (test runs) of test t with implementation i, ending in state pass or fail.
Test run: t || i =σ⇒ pass || i'  or  t || i =σ⇒ fail || i'
Rules:
• if t --a--> t' and i --a--> i' then t || i --a--> t' || i'
• if i --τ--> i' then t || i --τ--> t || i'
• if t --θ--> t' and i --δ--> i' then t || i --θ--> t' || i'
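A test run against a (deterministic, for simplicity) implementation can be sketched as follows; by convention here, labels starting with "?" are stimuli supplied to the IUT, "!" labels are observed outputs, and "theta" observes quiescence (the interfaces `impl_out` / `impl_next` are illustrative assumptions, not from the slides):

```python
# Sketch: run a test tree (nested dict, verdicts "pass"/"fail" at the
# leaves) against a deterministic implementation.
THETA = "theta"

def run_test(test, state, impl_out, impl_next):
    """impl_out(s): the output the IUT produces in s, or None if
    quiescent; impl_next(s, a): the state after action a."""
    while isinstance(test, dict):
        x = impl_out(state)
        if x is not None:                       # IUT takes the initiative
            state = impl_next(state, x)
            test = test.get(x, "fail")          # unexpected output: fail
        else:
            stim = [a for a in test if a.startswith("?")]
            if stim:                            # supply the one stimulus
                state = impl_next(state, stim[0])
                test = test[stim[0]]
            else:                               # quiescent: observe theta
                test = test.get(THETA, "fail")
    return test
```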
64 © Jan Tretmans
Test Execution Example
[Diagram: implementation i ( ?dub, then !tea or !choc ) and a test t]
Two test runs:
• t || i =dub·tea⇒ pass || i'
• t || i =dub·choc⇒ fail || i'' , so i fails t
65 © Jan Tretmans
Validity of Test Generation
For every test t generated with the algorithm we have:
Soundness : t will never fail with a correct implementation
    i ioco s implies i passes t
Exhaustiveness : each incorrect implementation can be detected with some generated test t
    not ( i ioco s ) implies ∃ t : i fails t
66 © Jan Tretmans
Model-Based Testing with LTS (recap, now with correctness = ioco)
68 © Jan Tretmans
Some Model-Based Testing Approaches and Tools
AETG, Agatha, Agedis, Autolink, Conformiq, Cooper, Cover, G∀st, Gotcha, Leirios, Phact/The Kit, QuickCheck, Reactis, RT-Tester, SaMsTaG, SpecExplorer, Statemate, STG, TestComposer, TestGen (Stirling), TestGen (INT), TGV, TorX, TorXakis, Tveda, Uppaal-Tron, . . .
69 © Jan Tretmans
A Tool for Transition Systems Testing: TorX
• on-the-fly test generation and test execution
• implementation relation: ioco
• mainly applicable to reactive / state-based systems
• specification languages: LOTOS, Promela, FSP, Automata
[Diagram: TorX derives the next input from the specification, offers it to the IUT, observes the IUT's output, checks it against the specification, and reports pass / fail / inconclusive; guidance is manual (user) or automatic]
70 © Jan Tretmans
TorX Case Studies
• Conference Protocol : academic
• EasyLink TV-VCR protocol : Philips
• Cell Broadcast Centre component : LogicaCMG
• "Rekeningrijden" Payment Box protocol : Interpay
• V5.1 Access Network protocol : Lucent
• Easy Mail Melder : LogicaCMG
• FTP Client : academic
• "Oosterschelde" storm surge barrier control : LogicaCMG
• DO/DG dose control : ASML/Tangram
• Laser interface : ASML/Tangram
• The new Dutch electronic passport : Min. of Int. Aff.
71 © Jan Tretmans
TorX Case Study Lessons
• model construction
♦ difficult: missing or bad specs
♦ leads to detection of design errors
♦ not yet supported by an integrated environment
♦ research on test-based modelling
• adapter / test environment
♦ development is cumbersome
♦ specific for each system
• longer and more flexible tests; full automation: test generation + execution + analysis
• no notion of test selection or specification coverage yet
♦ only random coverage or user-guided test purposes
72
Model-Based Testing, Verification,
and the Test Assumption
73 © Jan Tretmans
Formal Testing with Transition Systems
• specification s ∈ LTS ; implementation IUT ∈ IMPS, modelled as iIUT ∈ IOTS ; correctness: ioco
• test generation: gen : LTS → ℘(TTS) ; test suite Ts ⊆ TTS
• test execution: exec : TESTS × IMPS → ℘(OBS) ; verdict: passes : IOTS × TTS → {pass, fail}
Proof of soundness and exhaustiveness:
∀ i ∈ IOTS . ( ∀ t ∈ gen(s) . i passes t ) ⇔ i ioco s
Test assumption:
∀ IUT ∈ IMPS . ∃ iIUT ∈ IOTS . ∀ t ∈ TTS . IUT passes t ⇔ iIUT passes t
74 © Jan Tretmans
Comparing Transition Systems: An Implementation and a Model
IUT ≈ iIUT ⇔ ∀ e ∈ E . obs( e, IUT ) = obs( e, iIUT )
75 © Jan Tretmans
Formal Testing : Test Assumption
Test assumption :
∀ IUT . ∃ iIUT ∈ MOD.
∀ t ∈ TEST . IUT passes t ⇔ iIUT passes t
[Diagram: the same test t executed on the physical IUT and on its model iIUT]
76 © Jan Tretmans
Completeness of Formal Testing
IUT passes Ts ⇔def ∀ t ∈ Ts . IUT passes t
Test assumption: ∀ t ∈ TEST . IUT passes t ⇔ iIUT passes t
Proof obligation: ∀ i ∈ MOD . ( ∀ t ∈ Ts . i passes t ) ⇔ i imp s
Definition: IUT confto s ⇔def iIUT imp s
Then: does IUT passes Ts ⇔ IUT confto s hold?
IUT confto s
⇔ iIUT imp s (definition)
⇔ ∀ t ∈ Ts . iIUT passes t (proof obligation)
⇔ ∀ t ∈ Ts . IUT passes t (test assumption)
⇔ IUT passes Ts (definition)
77
Variations of ioco
78 © Jan Tretmans
Variations on a Theme
i ioco s ⇔ ∀σ ∈ Straces(s) : out( i after σ ) ⊆ out( s after σ )
i ≤ior s ⇔ ∀σ ∈ ( L ∪ {δ} )* : out( i after σ ) ⊆ out( s after σ )
i ioconf s ⇔ ∀σ ∈ traces(s) : out( i after σ ) ⊆ out( s after σ )
i iocoF s ⇔ ∀σ ∈ F : out( i after σ ) ⊆ out( s after σ )
i uioco s ⇔ ∀σ ∈ Utraces(s) : out( i after σ ) ⊆ out( s after σ )
Further variations:
• i mioco s : multi-channel ioco
• i wioco s : non-input-enabled ioco
• i eco e : environmental conformance
• i sioco s : symbolic ioco
• i (r)tioco s : (real) timed ioco (Aalborg, Twente, Grenoble, Bordeaux, . . .)
• i iocor s : refinement ioco
• i hioco s : hybrid ioco
• i qioco s : quantified ioco
• . . .
79 © Jan Tretmans
Underspecification: uioco
i ioco s ⇔ ∀σ ∈ Straces(s) : out( i after σ ) ⊆ out( s0 after σ )
[Diagram: specification s with two ?a-branches from s0 to s1 and s2 , and further ?a / ?b transitions leading to outputs !x, !y, !z]
• out( s0 after ?b ) = ∅ , but ?b ∉ Straces(s) : under-specification, so anything is allowed after ?b
• out( s0 after ?a·?a ) = { !x } and ?a·?a ∈ Straces(s) , but from s2 , ?a·?a is under-specified : should anything be allowed after ?a·?a ?
80 © Jan Tretmans
Underspecification: uioco
i uioco s ⇔ ∀σ ∈ Utraces(s) : out( i after σ ) ⊆ out( s0 after σ )
Utraces(s) = { σ ∈ Straces(s) | ∀ σ1·?a·σ2 = σ , ∀ s' : s =σ1⇒ s' implies s' =?a⇒ }
Now s is under-specified in s2 for ?a : anything is allowed.
ioco ⊂ uioco
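Membership in Utraces can be sketched as a check that every input along the trace is enabled in all states reached so far (a tau-free, illustrative sketch; quiescence is omitted for brevity):

```python
# Sketch: a trace is a Utrace iff every input along it is enabled in
# *all* states reached at that point (tau-free simplification).
def enabled(trans, state):
    return {a for (p, a, q) in trans if p == state}

def utrace(trans, s0, sigma, inputs):
    """True iff sigma is a Utrace of the spec rooted at s0."""
    states = {s0}
    for a in sigma:
        if a in inputs and not all(a in enabled(trans, s) for s in states):
            return False     # some reached state refuses the input
        states = {q for (p, b, q) in trans if p in states and b == a}
        if not states:
            return False     # sigma is not even a trace
    return True
```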
81 © Jan Tretmans
Underspecification: uioco
Alternatively, via a chaos process χ for under-specified inputs:
[Diagram: the specification is completed with a transition to χ for every unspecified input ( ?a , ?b ); χ can perform any action in LI ∪ LU (and τ), so anything is allowed after an unspecified input]
i uioco s ⇔ ∀σ ∈ Utraces(s) : out( i after σ ) ⊆ out( s0 after σ )
82
Testing ofComponent-Based Systems
83 © Jan Tretmans
Component-Based Development
Software . . .
• is very complex
• has many quality issues
• is very expensive to develop
• but we increasingly depend on it!
Contribution to a solution: Component-Based Development, a "Lego" approach:
• building components and combining components into systems
• divide and conquer
• common in many other engineering disciplines: construction, electronics, cars
84 © Jan Tretmans
Component-Based Development
[Diagram: a system composed of interacting components]
85 © Jan Tretmans
Component-Based Development
Potential benefits:
• independence of component and system development
♦ master complexity
♦ third-party development, outsourcing
♦ specialization
• reuse
♦ standard components
♦ reduce system development cost and time
♦ improve quality
• substitution
♦ choice and replacements from other suppliers
♦ update components
86 © Jan Tretmans
Component-Based Development
Prerequisites:
• "good" components
♦ precise specification
♦ correct implementation
• infrastructure and glue to connect components
• standardization of interfaces
♦ at the component supplier side, and
♦ at the component user side
"Good" components are independent from the system:
• reusable : components that fit in many systems
• substitutable : the system must work with substituted components
87 © Jan Tretmans
Components and Systems
• system : (autonomous) entity interacting with other entities (= environment)
• system boundary : common frontier between system and environment
• function of system : what the system is intended to do
• behaviour : what the system does to implement its function = sequence of states and actions
88 © Jan Tretmans
Components and Systems
• structure : internal composition that enables the system to generate behaviour: a set of components with possible interactions
• component : yet another system
[Diagram: a system containing components A and B, in an environment]
89 © Jan Tretmans
Components and Systems
• user : part of the environment who or that is interested in the function of the system (= role of a system)
• service : behaviour of a system as perceived and received by its user
• provider : role of a system in delivering service
• a system can simultaneously be provider and user
[Diagram: user, system, and components A and B]
90 © Jan Tretmans
Components and Systems
• service interface : part of the boundary where service is delivered
• use interface : part of the boundary of the user at which the user receives service
• service specification : specification of all possible behaviours that shall be perceivable at the service interface
[Diagram: user, system, and components A and B, with service and use interfaces at the boundaries]
91 © Jan Tretmans
Testing Component-Based Systems
Traditional approaches:
• bottom-up (with drivers)
• top-down (with stubs)
• big-bang
• mixed
Testing of:
• components
• interactions (integration)
• the whole system, where later detection of errors is more costly
92 © Jan Tretmans
Testing Component-Based Systems
Testing challenges:
• reusability : testing in many/all different environments (cf. Ariane)
• substitutability : the system must accept substitute components
• independence :
♦ the component tester does not know the use environment
♦ the integration tester has no code (black-box), no expertise about the component, and cannot repair or maintain the component
• functionality is compositional, to a certain extent; but emerging and non-functional properties are not : performance, memory usage, reliability, security
• components are developed once, but tested often ?
93 © Jan Tretmans
Testing Component-Based Systems
Testing of components A and B (A uses the service provided by B):
• testing B : does B provide the correct service ?
• testing A : does A provide the correct service ? does A use the service of B correctly ?
Disadvantages of traditional approaches:
• bottom-up : A is not tested with different B
• top-down : B is not tested with different A
94
Model-Based Testingof Components
95 © Jan Tretmans
Model-Based Testing of Components
Testing component A:
• ideal:
♦ access to all interfaces
♦ a formal specification model of the complete behaviour at all interfaces
♦ coordinated testing at all interfaces, based on the formal model, e.g. with ioco
• often in practice:
♦ only a specification of the provided service is available
♦ it is unspecified how the component should use other components
96 © Jan Tretmans
Model-Based Testing of Components
Testing component A: the specification of A only prescribes the provided service of A, typically
1. what A provides to its user
2. what A expects from its user
This is ideal for bottom-up testing at the service interface of A, but not for testing A in isolation.
[Diagram: a test of A, based on the model of A, at A's provided interface in the composed system]
Model-Based Testing of Components
Testing component A: suppose also such a specification of B is available, prescribing
♦ what B provides to its user (= A)
♦ what B expects from its user (= A)
Then the specification of B can be used to test whether A behaves according to the expectations of B, using what B provides: the model of B acts as an "intelligent stub".
This requires something else than ioco !
98 © Jan Tretmans
Model-Based Testing of Components
Formalization of components (in a client-server like setting):
• labelled transition systems
♦ what are the labels ?
♦ input enabled ?
• labels: all interactions between components must be modelled atomically as labels:
• method calls
• method returns
• methods called
• methods returned
• (assuming no side-effects)
[Diagram: an IUT component with method invocations on its provided interface, and the methods it invokes on used components]
99 © Jan Tretmans
Model-Based Testing of Components
Specifications:
• ideal: sA ∈ LTS( LI , LU )
• practice: sA ∈ LTS( LIp , LUp ) and sB ∈ LTS( LIu , LUu )
where
LI = LIp ∪ LIu = provided method calls ∪ used method returns
LU = LUp ∪ LUu = provided method returns ∪ used method calls
[Diagram: a tester interacting with IUT component A through method calls and returns on both the provided interface ( LIp , LUp ) and the used interface ( LIu , LUu )]
100 © Jan Tretmans
Model-Based Testing of Components
Input-enabledness: ∀ s of IUT, ∀ ?a ∈ LI : s =?a⇒ ?
No ! Not for method returns ?
[Diagram: tester and IUT component A, with method calls and returns on the provided and used interfaces]
101 © Jan Tretmans
Implementation Relation wioco
(u)ioco, domain-extended conservatively to non-input-enabled implementations: wioco
i uioco s =def ∀σ ∈ Utraces(s) : out( i after σ ) ⊆ out( s after σ )
i wioco s =def ∀σ ∈ Utraces(s) : out( i after σ ) ⊆ out( s after σ )
    and in( i after σ ) ⊇ in( s after σ )
where
in( s after σ ) = { ?a ∈ LI | s after σ must ?a }
s after σ must ?a = ∀ s' ( s =σ⇒ s' implies s' =?a⇒ )
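The must-input set used here can be sketched as the intersection of enabled inputs over all states of `s after σ` (tau-free, illustrative code):

```python
# Sketch: in(P) = the inputs that every state of P = `s after sigma`
# enables, i.e. the inputs that P *must* accept.
def in_set(trans, states, inputs):
    must = set(inputs)
    for s in states:
        enabled = {a for (p, a, q) in trans if p == s and a in inputs}
        must &= enabled
    return must
```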
102 © Jan Tretmans
Implementation Relation wioco
Test the implementation of A ( IA ) with respect to the model of A ( MA ), according to implementation relation wioco:
IA wioco MA =def ∀σ ∈ Utraces( MA ) : out( IA after σ ) ⊆ out( MA after σ ) and in( IA after σ ) ⊇ in( MA after σ )
103 © Jan Tretmans
Implementation Relation wioco
[Diagram: recap of the coffee-machine implementations and their ioco verdicts against specification s]
104 © Jan Tretmans
Implementation Relation wioco
[Diagram: the same implementations judged with wioco against s]
105 © Jan Tretmans
Implementation Relation wioco
[Diagram: further wioco examples against s]
106 © Jan Tretmans
Implementation Relation wioco
[Diagram: non-input-enabled implementations judged with wioco against s]
107 © Jan Tretmans
Implementation Relation eco
(u)ioco for testing with a specification of the environment: eco
i eco e =def ∀σ ∈ Utraces(e) ∩ L* : uit( i after σ ) ⊆ in( e after σ ) [ and in( i after σ ) ⊇ uit( e after σ ) ]
where
in( e after σ ) = { ?a ∈ LI | e after σ must ?a }
e after σ must ?a = ∀ e' ( e =σ⇒ e' implies e' =?a⇒ )
uit( i after σ ) = out( i after σ ) \ { δ }
108 © Jan Tretmans
Implementation Relation eco
Test the implementation of A ( IA ) with respect to the model of B ( MB ), according to implementation relation eco:
IA eco MB =def ∀σ ∈ Utraces( MB ) ∩ L* : uit( IA after σ ) ⊆ in( MB after σ ) and in( IA after σ ) ⊇ uit( MB after σ )
109 © Jan Tretmans
Implementation Relation eco
[Diagram: an environment specification e for the coffee machine, and candidate implementations]
110 © Jan Tretmans
Implementation Relation eco
[Diagram: implementations judged with eco against environment specification e]
111 © Jan Tretmans
Implementation Relation eco
[Diagram: further eco examples against environment specification e]
112 © Jan Tretmans
Test Generation Algorithm for eco
To generate a test case t(E) from a transition system specification of the environment, with E ≠ ∅ a set of states ( initially E = e0 after ε ), apply the following steps recursively, non-deterministically:
1. end the test case: pass
2. supply input !a , with !a ∈ uit( E ) and E after !a ≠ ∅ : continue with t( E after !a ) ; meanwhile, allowed outputs ?x ∈ in( E ) continue with t( E after ?x ) , forbidden outputs ?y ∉ in( E ) lead to fail
3. observe all outputs: allowed outputs ?x ∈ in( E ) and θ continue with t( E after ?x ) ( with t( E ) for θ ) ; forbidden outputs ?y ∉ in( E ) lead to fail
113 © Jan Tretmans
Test Generation for eco
[Diagram: an environment specification e, an implementation i, and a test t generated from e; a test run of t with i leads to fail, so i is not eco-conforming to e]
114 © Jan Tretmans
wioco and eco
• Testing with wioco and eco can be performed concurrently, but it still is partial testing!
• Coordination and dependence between actions at both interfaces is not tested. Example: A shall look up information in a database (= component B); A queries B, but does not use this information and instead invents the information itself.
• This is not the "ideal" testing of component A in isolation, but only a step in that direction!
• More research required . . .
115 © Jan Tretmans
Compositional Testing
[Diagram: a counterexample over actions but, ok, err, x, y : components i1, i2 and specifications s1, s2 with i1 ioco s1 and i2 ioco s2 , yet i1 || i2 is not ioco s1 || s2]
116 © Jan Tretmans
Compositional Testing
If i1 ioco s1 and i2 ioco s2 , does i1 || i2 ioco s1 || s2 hold?
If s1 and s2 are input-enabled ( s1, s2 ∈ IOTS ), then ioco is preserved!
117 © Jan Tretmans
Concluding
• Testing can be formal, too (M.-C. Gaudel, TAPSOFT'95)
♦ testing shall be formal, too
• A test generation algorithm is not just another algorithm:
♦ proof of soundness and exhaustiveness
♦ definition of test assumption and implementation relation
• For labelled transition systems:
♦ (u/w)ioco, eco for expressing conformance between implementation and specification
♦ sound and exhaustive test generation algorithms
♦ tools generating and executing tests: TGV, TestGen, Agedis, TorX, . . .
118 © Jan Tretmans
Perspectives
Model-based formal testing can improve the testing process:
• the model is a precise and unambiguous basis for testing
♦ design errors are found during validation of the model
• longer, cheaper, more flexible, and provably correct tests
♦ easier test maintenance and regression testing
• automatic test generation and execution
♦ full automation: test generation + execution + analysis
• the extra effort of modelling is compensated by better tests
119 © Jan Tretmans
Thank You