TRANSCRIPT

Applications of Automated Model-Based Testing with TorX
Ed Brinksma, Course 2004
© Ed Brinksma/Jan Tretmans
TorX Case Studies

Conference Protocol                          (academic)
EasyLink TV-VCR protocol                     (Philips)
Cell Broadcast Centre component              (CMG)
"Rekeningrijden" Payment Box protocol        (Interpay)
V5.1 Access Network protocol                 (Lucent)
Easy Mail Melder                             (CMG)
FTP Client                                   (academic)
"Oosterschelde" storm surge barrier control  (CMG)
The Conference Protocol Experiment

Academic benchmarking experiment, initiated for test tool evaluation and comparison
Based on really testing different implementations
Simple, yet realistic protocol (chatbox service)
Specifications in LOTOS, Promela, SDL, EFSM
28 different implementations in C:
  one of them (assumed-to-be) correct
  the others manually derived mutants
http://fmt.cs.utwente.nl/ConfCase
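The benchmarking idea behind the experiment, running one test procedure against every implementation and recording which mutants are caught, can be sketched as follows. This is a toy illustration, not the actual tooling; all names and the trivial "test run" are hypothetical stand-ins.

```python
# Toy sketch of the benchmarking setup: run one test procedure
# against every implementation and record which ones pass and fail.
# All names are hypothetical; real runs would invoke TorX/TGV/PHACT.

def evaluate(run_test, implementations):
    """Tally pass/fail verdicts over a dict of implementations."""
    verdicts = {"pass": [], "fail": []}
    for name in sorted(implementations):
        outcome = "pass" if run_test(implementations[name]) else "fail"
        verdicts[outcome].append(name)
    return verdicts

# Stand-ins for the C implementations: the correct one echoes a
# message, the mutants deviate.
correct     = lambda msg: msg
mutant_drop = lambda msg: None          # drops the message
mutant_swap = lambda msg: msg[::-1]     # garbles the message

def run_test(impl):
    # A trivial "test run": send "hi", expect "hi" back.
    return impl("hi") == "hi"

impls = {"000": correct, "111": mutant_drop, "222": mutant_swap}
print(evaluate(run_test, impls))
```

A real campaign replaces `run_test` by a full tool run; the tallying structure stays the same.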
The Conference Protocol

[Diagram: conference protocol entities (CPEs) communicate via the UDP layer; the conference service offers the primitives join, leave, send, and receive]
Conference Protocol Test Architecture

[Diagram: the TorX tester controls the CPE (= IUT) at the upper PCO (UT-PCO = C-SAP) and observes it through lower PCOs (LT-PCOs) at the U-SAPs of the UDP layer, over channels A, B, C]
The Conference Protocol Experiments

TorX - LOTOS, Promela: on-the-fly ioco testing
  Axel Belinfante et al., Formal Test Automation: A Simple Experiment. IWTCS 12, Budapest, 1999.
Tau Autolink - SDL: semi-automatic batch testing
TGV - LOTOS: automatic batch testing with test purposes
  Lydie Du Bousquet et al., Formal Test Automation: The Conference Protocol with TGV/TorX. TestCom 2000, Ottawa.
PHACT/Conformance KIT - EFSM: automatic batch testing
  Lex Heerink et al., Formal Test Automation: The Conference Protocol with PHACT. TestCom 2000, Ottawa.
Conference Protocol Results

Results (28 implementations each):

  tool / spec             fail  pass  "core dump"  passing implementations
  TorX  LOTOS               25     3            0  000 444 666
  TorX  Promela             25     3            0  000 444 666
  TGV   LOTOS (random)      25     3            0  000 444 666
  TGV   LOTOS (purposes)    24     4            0  000 444 666 332
  PHACT EFSM                21     6            1  000 444 666 289 293 398

  (000 is the assumed-to-be-correct implementation)
Conference Protocol Analysis

Mutants 444 and 666 react to PDUs from non-existent partners:
  no explicit reaction is specified for such PDUs,
  so they are ioco-correct, and TorX does not test such behaviour
So, for LOTOS/Promela with TGV/TorX:
  all ioco-erroneous implementations were detected
EFSM:
  two "additional-state" errors not detected
  one implicit-transition error not detected
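The ioco intuition behind this analysis can be illustrated with a small sketch: after every trace the specification describes, the implementation's outputs must be among those the specification allows, and traces the specification says nothing about (such as PDUs from non-existent partners) are never checked. This is a deliberate simplification of real ioco, which works on suspension traces and quiescence; all traces and names below are illustrative.

```python
# Minimal sketch of the ioco idea: after every trace of the
# specification, the implementation's outputs must be a subset of
# those the specification allows. Traces not in the spec (e.g. PDUs
# from non-existent partners) are simply not checked, which is why
# mutants 444 and 666 count as ioco-correct.

def ioco(impl_outputs, spec_outputs):
    """impl_outputs / spec_outputs: dict mapping trace -> set of outputs."""
    for trace, allowed in spec_outputs.items():
        if trace in impl_outputs and not impl_outputs[trace] <= allowed:
            return False
    return True

spec = {("join",): {"join_pdu"}, ("join", "leave"): {"leave_pdu"}}
good = {("join",): {"join_pdu"}}
bad  = {("join",): {"leave_pdu"}}            # wrong output after "join"
odd  = {("pdu_from_stranger",): {"reset"}}   # unspecified trace: allowed

print(ioco(good, spec), ioco(bad, spec), ioco(odd, spec))
```

The `odd` implementation shows the blind spot the slide describes: behaviour on unspecified traces never influences the verdict.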
Conference Protocol Analysis

TorX statistics:
  all errors found after 2 - 498 test events
  maximum length of tests: > 500,000 test events
EFSM statistics:
  82 test cases with "partitioned tour method" (= UIO)
  length per test case: < 16 test events
TGV with manual test purposes: ~ 20 test cases of various length
TGV with random test purposes: ~ 200 test cases of 200 test events
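The very long TorX runs come from its on-the-fly style: at every step the tester randomly either stimulates the SUT with a specification-allowed input or observes an output and checks it, continuing until a failure or a step bound. A minimal sketch, with a toy echo SUT standing in for a real implementation (all names are assumptions):

```python
import random

# Sketch of on-the-fly random testing in the TorX style: at each step
# either stimulate the SUT with an allowed input or observe an output
# and check it against the model's expectation. Toy model: the SUT
# should always output the last stimulus it received.

def on_the_fly_test(inputs, sut, max_steps=100, seed=0):
    rng = random.Random(seed)
    expected = None                      # model state: last stimulus sent
    for step in range(1, max_steps + 1):
        if rng.random() < 0.5:
            expected = rng.choice(inputs)
            sut.send(expected)           # stimulate
        else:
            if sut.receive() != expected:    # observe and check
                return ("fail", step)
    return ("pass", max_steps)

class EchoSUT:
    """Toy SUT: outputs the last input it received (None initially)."""
    def __init__(self): self.last = None
    def send(self, msg): self.last = msg
    def receive(self): return self.last

verdict, steps = on_the_fly_test(["a", "b"], EchoSUT())
print(verdict, steps)
```

Because tests are generated while they run, their length is bounded only by `max_steps`, which is how runs of over 500,000 events arise in practice.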
EasyLink Case Study

[Diagram: TV and VCR communicate; this communication is the object of testing]

EasyLink protocol between TV and VCR
Simple, but realistic features:
  preset download
  WYSIWYR (what you see is what you record)
  EPG download
  ...
EasyLink Test Architecture

MBB (= Magic Black Box):
  allows a PC to monitor the communication between TV and VCR
  allows the PC to send messages to mimic the TV or the VCR
TorX is distributed over the PC and a workstation

[Diagram: TV and VCR connected via the MBB; the MBB is attached to the PC, which connects to the workstation; a remote control (RC) provides manual interaction with the TV]
Testing the Preset Download Feature

What?
  check whether the TV correctly implements preset download,
  based on a Promela specification
How?
  let PC play the role of the VCR and initiate preset download
  receive settings from TV
  WHILE (TRUE) {
    let PC initiate preset download
    let PC non-deterministically stop preset download
    check for consistency in presets
  }
Feature interaction:
  shuffle presets on TV using the RC, all under control of the PC
EasyLink Experiences

Results:
  the test environment influences what can be tested:
    testing power is limited by the functionality of the MBB
  initially, the state of the TV is unknown:
    the tester must be prepared for all possible states
  some "hacks" needed in specification and tool architecture in order to decrease the state space
  automatic specification-based testing is feasible
  the tool architecture is also suitable to cope with user interaction
  some (non-fatal) non-conformances detected
CMG - CBC Component Test

Test one component of the Cell Broadcast Centre
LOTOS (process algebra) specification of 28 pp.
Using the existing test execution environment
Based on automatic generation of an "adapter" based on IDL

Comparison (simple):        existing test   TorX
  code coverage                      82 %   83 %
  detected mutants / 10                 5      7

Conclusion:
  TorX is at least as good as conventional testing
  (with the potential to do better)
  LOTOS is not nice (= terrible) for specifying such systems
Interpay "Rekeningrijden" Highway Tolling System

"Rekeningrijden"

Characteristics:
  Simple protocol
  Parallelism: many cars at the same time
  Encryption
  Real-time issues
  System had passed the traditional testing phase
"Rekeningrijden": Phases for Automated Testing

IUT study
  informal and formal specification
Available tools study
  semantics and openness
Test environment
  test architecture, test implementation, SUT specification
  testing of the test environment
Test execution
  test campaigns, execution, analysis
"Rekeningrijden" Highway Tolling System

[Diagram: Onboard Units in cars communicate wirelessly with the Road Side Equipment, which talks to the Payment Box (PB) over UDP/IP]
"Rekeningrijden": Test Architecture I

[Diagram: TorX, driven by a spec of the PB, is connected directly to the Payment Box at the PCO]
"Rekeningrijden": Test Architecture II

[Diagram: TorX uses a spec of PB + UDP/IP; the SUT is the Payment Box embedded in a UDP/IP test context; the PCO is at the tester side, the IAP (implementation access point) at the Payment Box]
"Rekeningrijden": Test Architecture III

[Diagram: TorX uses a spec of PB + ObuSim + TCP/IP + UDP/IP; the test context contains an Onboard Unit simulator (ObuSim) reached over TCP/IP; the SUT is the Payment Box, accessed over UDP/IP at the IAP]
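In all three architectures the PCO is the concrete point where the tester exchanges PDUs with the (embedded) SUT over UDP/IP. A minimal sketch of such a UDP point of control and observation; addresses, buffer size, and encoding are illustrative assumptions, not the actual Interpay interface:

```python
import socket

# Minimal sketch of a UDP point of control and observation (PCO):
# the tester sends stimuli to the SUT's address and observes
# responses, with a timeout bounding each observation.

class UdpPco:
    def __init__(self, sut_addr, timeout=1.0):
        self.sut_addr = sut_addr
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.sock.settimeout(timeout)

    def stimulate(self, pdu: bytes):
        """Send a stimulus PDU to the SUT."""
        self.sock.sendto(pdu, self.sut_addr)

    def observe(self):
        """Return the next observed PDU, or None if nothing arrives."""
        try:
            pdu, _ = self.sock.recvfrom(4096)
            return pdu
        except socket.timeout:
            return None
```

The `None` result of `observe` is exactly the time-out observation that the quiescence discussion below the architecture slides turns on.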
"Rekeningrijden": Test Campaigns

Introduction and use of test campaigns:
  management of test tool configurations
  management of IUT configurations
  steering of test derivation
  scheduling of test runs
  archiving of results
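The bookkeeping a test campaign bundles together can be sketched as a small data structure: configuration of tool and IUT, a seed that steers (random) test derivation, scheduled runs, and archived verdicts. All field and parameter names are hypothetical.

```python
from dataclasses import dataclass, field

# Sketch of the bookkeeping a "test campaign" provides: a named
# bundle of tool/IUT configuration, derivation parameters, scheduled
# runs, and archived verdicts. Names are illustrative assumptions.

@dataclass
class TestCampaign:
    name: str
    tool_config: dict                   # test tool configuration
    iut_config: dict                    # IUT configuration
    seed: int = 0                       # steers (random) test derivation
    results: list = field(default_factory=list)

    def run(self, execute, repetitions):
        """Schedule and execute runs, archiving each verdict."""
        for i in range(repetitions):
            verdict = execute(self.tool_config, self.iut_config,
                              self.seed + i)
            self.results.append((self.seed + i, verdict))
        return self.results

campaign = TestCampaign("pb-smoke", {"depth": 100}, {"lanes": 2}, seed=7)
campaign.run(lambda tool, iut, seed: "pass", repetitions=3)
print(campaign.results)
```

Keeping the seed with the archived verdict is what makes a failing random run reproducible later.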
"Rekeningrijden": Issues

Parallelism:
  very easy
Encryption:
  not all events can be synthesized:
  leads to reduced testing power
Real-time:
  how to cope with real-time constraints?
  efficient computation for on-the-fly testing?
  lack of theory: quiescence vs. time-out
"Rekeningrijden" Problem: Quiescence in ioco vs. Time-out

[Diagram, left: TorX tests the PB using a time-out t_q to observe quiescence: Input, Observe, Timeout -> Quiescence, Input]
[Diagram, right: with Spec := Spec + Tick, the same time-out is interpreted as an explicit Tick action in the extended specification: Input, Observe, Timeout -> Tick, Input]
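The gap the slide points at: ioco's quiescence ("no output, ever") can only be approximated in a real-time system by a finite time-out t_q, so a merely slow SUT may be misjudged as quiescent. A minimal sketch of the time-out observation, using a queue as a stand-in for the SUT's output channel:

```python
import queue

# Sketch of approximating quiescence by a time-out: if the SUT
# produces no output within t_q seconds, the tester concludes
# "delta" (quiescence). A real-time SUT that is merely slow would be
# misjudged, which is the missing-theory problem the slide raises.

def observe(output_queue, t_q=0.05):
    try:
        return output_queue.get(timeout=t_q)
    except queue.Empty:
        return "delta"   # quiescence observed (by time-out)

q = queue.Queue()
print(observe(q))        # nothing produced within t_q: quiescence
q.put("ack")
print(observe(q))        # output arrives before the time-out
```

The Spec := Spec + Tick variant keeps this mechanism but models the time-out itself as an explicit action of the specification, rather than identifying it with quiescence.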
"Rekeningrijden" Problem: Action Refinement

Spec := Refine + Buffer

[Diagram, left: TorX and the PB exchange the concrete actions Input0 and Input1 directly; unexpected interleavings lead to Timeout / Unexpected observations]
[Diagram, right: the abstract action Input01 is refined into the sequence Input0, Input1, with Error and Full as exceptional outcomes of the refinement/buffer]
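The refinement step can be sketched as a mapping from abstract specification actions to the sequences of concrete interface actions that implement them, so that the SUT's concrete dialogue can be checked against the abstract model. The refinement table below is an illustrative guess, not the actual Interpay mapping:

```python
# Sketch of action refinement: each abstract specification action is
# mapped to the sequence of concrete interface actions implementing
# it. The table entries are illustrative assumptions.

REFINEMENT = {
    "Input01": ["Input0", "Input1"],   # abstract -> concrete sequence
    "Error":   ["Timeout"],
}

def refine(abstract_trace):
    """Expand an abstract trace into its concrete counterpart."""
    concrete = []
    for action in abstract_trace:
        concrete.extend(REFINEMENT.get(action, [action]))
    return concrete

print(refine(["Input01", "Error"]))
```

The Buffer part of Spec := Refine + Buffer (not sketched here) additionally absorbs interleavings of the refined sequences from many parallel cars.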
"Rekeningrijden": Issues

Modelling language: LOTOS, Promela
  the spec for testing differs from the spec for validation
  development of the specification is an iterative process
Development of the test environment is laborious
Parameters are fixed in the model
  preprocessing: M4/CPP
Promela problem: guarded inputs
Test campaigns for bookkeeping and control of experiments
Probabilities incorporated
"Rekeningrijden": Results

Test results:
  1 error found during validation (a design error)
  1 error found during testing (a coding error)
Automated testing:
  beneficial: high volume and reliability
  many and long tests executed (> 50,000 test events)
  very flexible: adaptation and many configurations
A step ahead in the formal testing of realistic systems