Automated Test Design™ © 2011 Conformiq, Inc.
CONFORMIQ DESIGNER
Thoughts on TDL
Stephan Schulz
Rapporteur Meeting, Munich, Dec 2010
TDL – What’s New?
• A number of efforts have been undertaken in the past to standardize ways of describing tests at ETSI
• Somehow they all fell through for one reason or another – why?
• This is one view on them, but there are likely to be more
• The work on TDL should learn from them and avoid repeating the “same mistakes”
• As with (other) model-based testing related work, the general situation has changed and it is time to try again
Previous Attempts: TTCN
• The earliest attempt we know of to standardize a way of describing tests at ETSI was (the birth of) TTCN
  – That was the initial goal when starting to design TTCN-2
• We know how it ended … in a programming language
• Why?
  – Because the foundation (a clearly defined test execution) was missing?
Previous Attempts: TTCN-3 GFT
• In 2000 ETSI released the first edition of TTCN-3
  – At its core a textual test scripting language
  – But also offering alternative “presentation formats” – one being the so-called graphical format “GFT” (built on an MSC-like notation)
• In 2009 ETSI TC MTS declared the GFT part “historical” because this presentation format (as well as others) was not really adopted
• Why?
  – Because it was driven by only a single tool vendor?
  – Because it tried to express too much graphically? (GFT was “100%” compatible with the core notation, i.e., 100s of pages of standard)
  – Because it had no (graphical) way of expressing data?
Previous Attempts: HLTD WI
• In 2008 ETSI started a work item on “Requirements for High Level Test Descriptions”
  – Created after a Siemens presentation and answering a concrete need at ETSI
  – Multiple rapporteur meetings and standards drafts
• In 2010 ETSI TC stopped the work item due to lack of progress & convergence
  – Lots of work and analysis … no standard
• Why?
  – Disinterest from the wider MTS community? No tool vendors?
Previous Attempts: TPLan
• In 2008 ETSI started an attempt to take test specification further by defining a notation for expressing so-called “test purposes”
  – See the presentation by Steve Randall
  – Intended for describing “what” we are testing – not “how”
  – Purely textual: “a way of writing structured English” via a dictionary
    • Precondition, stimulus, response
• Reasonable success at ETSI
  – Used in a number of ETSI conformance test specification projects
  – 3GPP has adapted “a simplified version”
• No known use (besides in standardization) in industry
  – Only very rudimentary tool support (no commercial tools)
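The precondition/stimulus/response structure mentioned above can be illustrated with a TPLan-style test purpose. This is a rough sketch for flavour only: the identifier and prose are invented, and the with / ensure that / when / then keywords follow the general shape of the notation (ETSI ES 202 553), not an exact excerpt from it.

```
TP id   : TP_EXAMPLE_0001
summary : 'Verify that the IUT answers a valid Echo request'
with { the IUT in an initialized state }
ensure that {
    when { the tester sends a valid Echo request }
    then { the IUT sends an Echo response }
}
```

The “with” clause carries the precondition, “when” the stimulus, and “then” the expected response – each phrase built from words registered in the dictionary.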
Why TDL? Why Now?
• It has become significantly easier to produce tools for domain-specific languages
• The need in standardization is still there
  – TTCN-3 code cannot be reviewed by committees
  – 3GPP and others (e.g., OMA) have been, and still are, making heavy use of test descriptions with no agreed format to date
• Industrial test specification seems to be moving up in abstraction
  – TTCN-3 is “generally” used in industry (and ETSI) in conjunction with frameworks created by expert users … on top of which tests are specified
  – A new industry trend seems to be replacing TTCN-3 with keyword-driven test tools
  – Deployment of MBT is picking up
  – In 2011 testers increasingly come with little programming background and little desire to pick it up (in part because there are alternatives)
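The keyword-driven trend mentioned above can be sketched in a few lines of Python. This is an illustrative toy, not any specific commercial tool: keywords map to implementation functions, and a test case is a plain list of steps that a non-programmer could author in a table.

```python
# Minimal keyword-driven test runner (illustrative sketch, not a real tool).
# A keyword library maps human-readable step names to Python callables;
# a test case is just a sequence of (keyword, arguments) pairs.

def send_echo_request(state, version):
    # Record the stimulus; a real implementation would build and send a packet.
    state["last_version"] = version

def expect_echo_response(state):
    # Simulated check: the IUT only answers well-formed (Version=6) requests.
    assert state.get("last_version") == 6, "no Echo response received"

KEYWORDS = {
    "Send Echo Request": send_echo_request,
    "Expect Echo Response": expect_echo_response,
}

def run_test(steps):
    state = {}
    for keyword, args in steps:
        KEYWORDS[keyword](state, *args)
    return "PASS"

test_case = [
    ("Send Echo Request", (6,)),
    ("Expect Echo Response", ()),
]
print(run_test(test_case))  # PASS
```

The point of the style is that the test layer stays free of programming syntax; only the keyword library underneath is written by programmers.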
Conformiq Requirements on TDL
• Should be suitable for test generation and should be able to serve as a basis for manual and automated test execution
• Should be suitable for standardization and industrial/internal use
• It should not require knowledge of programming syntax
  – Graphical/MSC-like vs. tabular vs. prose vs. transfer format
  – Leave the actual representation open?
• The representation must be suitable for printing and reading
  – Ideally all in one place, as is already done today in informal approaches
• Should be based on a limited number of concepts
  – Must not attempt to replicate TTCN-3 in power of expression
• Data must finally appear as an “equal citizen”
Example for Inspiration: ETSI IPv6

Test Identifier: IP6_HDR_NOD_GEN_VIO_001
Test Objective: To ensure that an IPv6 IUT is able to process Version=6 packets and discard Version={0-3, 5, 7-15} packets it receives (Clause 5.1.1)
Standard Reference: RFC2460 [5] clause 3, figure 1
PICS Reference (in this document): Clause A.2 (only Version=0, 5, 7, 15 are tested)
PIXIT Reference (in this document): Clause A.
Pre-Conditions: In the test configuration (shown in Figure A.1), run Common Test Setup A.6.1.1

Preamble: None

Step 1
  Action: The tester transmits an Echo request to the IUT with an IPv6 header having the Version field set to 6
  Pass Condition: The tester receives an Echo response from the IUT
Step 2a
  Action: The tester transmits an Echo request to the IUT with an IPv6 header having the Version field set to 0; repeat three times
  Pass Condition: In each of the three attempts, the tester does not receive any invalid packet from the IUT within 30 seconds
Step 2b
  Action: The tester transmits an Echo request to the IUT with an IPv6 header having the Version field set to 6
  Pass Condition: The tester receives an Echo response from the IUT
Step 3a
  Action: The tester transmits an Echo request to the IUT with an IPv6 header having the Version field set to 5; repeat three times
  Pass Condition: In each of the three attempts, the tester does not receive any invalid packet from the IUT within 30 seconds
Step 3b
  Action: The tester transmits an Echo request to the IUT with an IPv6 header having the Version field set to 6
  Pass Condition: The tester receives an Echo response from the IUT
Step 4a
  Action: The tester transmits an Echo request to the IUT with an IPv6 header having the Version field set to 7; repeat three times
  Pass Condition: In each of the three attempts, the tester does not receive any invalid packet from the IUT within 30 seconds
Step 4b
  Action: The tester transmits an Echo request to the IUT with an IPv6 header having the Version field set to 6
  Pass Condition: The tester receives an Echo response from the IUT
Step 5a
  Action: The tester transmits an Echo request to the IUT with an IPv6 header having the Version field set to 15; repeat three times
  Pass Condition: In each of the three attempts, the tester does not receive any invalid packet from the IUT within 30 seconds
Step 5b
  Action: The tester transmits an Echo request to the IUT with an IPv6 header having the Version field set to 6
  Pass Condition: The tester receives an Echo response from the IUT

Postamble: Run the common test postamble (clause A.6.1.4)
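As a sketch of how the informal steps above could become executable, the following Python models the step logic against a simulated IUT. The simulation itself is an assumption for illustration: a conforming IUT answers an Echo request only when the Version field is 6 and silently discards all other values; nothing here sends real packets.

```python
# Sketch of the IP6_HDR_NOD_GEN_VIO_001 step logic (illustrative only).
# Assumption: a conforming IUT answers an Echo request when Version=6
# and silently discards requests with any other Version value.

def iut_handle(version):
    """Simulated IUT: return an Echo response for Version=6, else nothing."""
    return "echo_response" if version == 6 else None

def run_ip6_hdr_nod_gen_vio_001():
    verdict = "PASS"
    # Step 1: Version=6 must yield an Echo response.
    if iut_handle(6) != "echo_response":
        verdict = "FAIL"
    # Steps 2a-5a: each invalid version is sent three times and must
    # yield no packet; steps 2b-5b re-check Version=6 in between.
    for invalid_version in (0, 5, 7, 15):
        for _ in range(3):
            if iut_handle(invalid_version) is not None:
                verdict = "FAIL"
        if iut_handle(6) != "echo_response":
            verdict = "FAIL"
    return verdict

print(run_ip6_hdr_nod_gen_vio_001())  # PASS
```

The interleaved Version=6 checks mirror the 2b/3b/4b/5b steps: they confirm the IUT is still responsive after each batch of invalid packets, so a silent discard is not mistaken for a crashed IUT.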
Example for Inspiration: OMA
Example for Inspiration: MSC + Data
Example for Inspiration: Tabular