TP#16 — oneM2M Approach to Testing
Source: ETSI
Meeting Date: 2015-03-23

Uploaded by adele-mccormick on 17-Jan-2016

TRANSCRIPT

Page 1: Title slide

Page 2

Presentation Outline

• Achieving interoperability
  – Conformance versus interoperability testing
• Test specification development
  – Test purposes
  – Conformance tests
  – Interoperability tests

Page 3

Interoperability is …

• The ultimate aim of standardization in Information and Communication Technology
• The red thread running through the entire standards development process; it is not an isolated issue
  – Not something to be somehow fixed at the end
• Best-practice approach of an SDO
  – Produce Base Standards and Test Specifications with the required degree of parallelism
  – Base standards should be designed with interoperability in mind
• Profiles can help to reduce potential non-interoperability
• Two complementary forms of testing
  – Conformance testing
  – Interoperability testing (i.e., a more formal form of IOT)
• Testing provides vital feedback into the standardization work

Page 4

About Conformance Testing

• Is black-box testing
  – Stimulation and response

[Diagram: the Test System, driven by the requirements, stimulates and observes the Device through a Point of Control and Observation (PCO).]
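The stimulation-and-response loop through a PCO can be sketched as follows (a minimal Python illustration; the `Pco`, `EchoDevice` and `run_test` names are invented for this sketch and are not part of any ETSI or oneM2M test system):

```python
class EchoDevice:
    """Stand-in for the device under test: echoes each stimulus back."""
    def handle(self, message: str) -> str:
        return message

class Pco:
    """Point of Control and Observation: the only interface the tester uses."""
    def __init__(self, device):
        self._device = device
        self._last_response = None
    def send(self, stimulus: str) -> None:
        # Black-box view: we only stimulate and later observe the response.
        self._last_response = self._device.handle(stimulus)
    def receive(self) -> str:
        return self._last_response

def run_test(pco: Pco, stimulus: str, expected: str) -> str:
    """Stimulate, observe, and assign a verdict against one requirement."""
    pco.send(stimulus)
    return "pass" if pco.receive() == expected else "fail"

verdict = run_test(Pco(EchoDevice()), "ping", "ping")
print(verdict)   # -> pass
```

The point of the sketch is that the tester never looks inside the device; only the behaviour at the PCO decides the verdict.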

Page 5

Conformance is usually tested layer-by-layer

[Diagram: the Test System tests the L3 layer of the SUT. An L3 Test drives a Lower Tester (through a Lower Adapter) at the L2/PHY interface, and an Upper Tester (through an Upper Adapter) at the SUT's API/MMI.]

SUT = System Under Test
IUT = Implementation Under Test

Page 6

Characteristics of Conformance Testing

• Tests the IUT against requirements specified in a base specification or profile (standard)
  – Gives a high level of confidence that key components of a device or system are working as they were specified and designed to do
  – Usually limited to one requirement per test
• High IUT control and observability
  – Can explicitly test error behaviour
  – Can provoke and test non-normal (but legitimate) scenarios
  – Can be extended to include robustness tests
• Test execution can be automated and repeated
  – Requires test system development (and executable test cases)
  – Can be expensive (e.g., a radio-based test system)
  – Tests under ideal conditions
• Tests are thorough and accurate but limited in scope
  – At the level of detailed protocol messages, service primitives, or procedure calls

Page 7

Limitations of Conformance Testing

• Does not prove end-to-end functionality (interoperability) between communicating systems
  – Conformance-tested implementations may still not interoperate
  – This is often a specification problem rather than a testing problem! Hence the need for minimum requirements coverage or profiles
• Tests individual system components
  – But a system is often greater than the sum of its parts!
  – Does not test the user's 'perception' of the system
• Standardised conformance tests do not include proprietary 'aspects'
  – Difficult to test via proprietary interfaces, e.g., user APIs
  – Can, however, be done by a manufacturer with its own conformance tests for proprietary requirements

Page 8

About Interoperability Testing

• End-to-end interoperability of devices
  – What is being tested?
  – Assignment of verdicts?
• Need to distinguish between
  – More formal interoperability testing (= test specifications), versus
  – Informal interoperability events used to validate/develop technologies, e.g., interop events

[Diagram: Device1, Device2, …, Devicen connected end-to-end.]

Page 9

Methodology for Interoperability Testing

[Diagram: interoperability testing of terminals. Two terminals, an EUT/QE and an EUT, each with a full stack (API, MMI, L3, L2, PHY), form the SUT and are connected over a Means of Communication (MoC).]

QE = Qualified Equipment
EUT = Equipment Under Test

WI-0027 Testing Framework

Page 10

Characteristics of Interoperability Testing

• It is system testing
  – Tests a complete device or a collection of devices
• Shows that (two or more) devices interoperate
  – But within a limited scenario
• Tests at a 'high' level (the perception of users)
  – Tests the 'whole', not the parts or layers
    • E.g., protocol stacks integrated with applications
  – Tests functionality
    • Shows that the function is accomplished (but not how)
• Does not usually involve test systems
  – Uses existing interfaces (standard/proprietary)
• Looks at end-to-end functionality
  – Less thorough but wide in scope
  – Gives a high level of confidence that devices (or components in a system) will interoperate with other devices (components)

Page 11

Limitations of Interoperability Testing

• Does not prove that a device is conformant
  – Devices may still interoperate even though they are non-conformant to a base specification or standard
• Cannot explicitly test error behaviour or unusual scenarios
  – Invalid conditions may need to be forced (lack of controllability)
  – Has limited coverage (does not fully exercise the device)
• Usually performed manually
  – Difficult to automate; tests take more time to prepare than to execute
• Does not prove 100% interoperability with other implementations with which no testing has been done
  – 'A' may interoperate with 'B' and 'B' may interoperate with 'C', but it doesn't necessarily follow that 'A' will interoperate with 'C'.
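The last point can be made concrete with a toy example (the pairwise verdict data below is invented purely for illustration):

```python
# Recorded pairwise interoperability verdicts: A-B and B-C pass, yet A-C fails.
# Pairwise results do not compose, so every pairing of interest must be tested.
interop_results = {
    frozenset({"A", "B"}): "pass",
    frozenset({"B", "C"}): "pass",
    frozenset({"A", "C"}): "fail",
}

def interoperates(x: str, y: str) -> bool:
    """Look up the recorded pairwise verdict; untested pairs count as unknown."""
    return interop_results.get(frozenset({x, y})) == "pass"

print(interoperates("A", "B"), interoperates("B", "C"), interoperates("A", "C"))
# -> True True False
```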

Page 12

Conformance and Interoperability Testing are Complementary

• Standardization experience
  – As you move up a system stack, the emphasis should change from conformance to interoperability testing
    • Lower-layer protocols, infrastructure: emphasis on conformance testing
    • Middleware, enablers: combination of conformance and interoperability testing
    • Services, applications, systems: emphasis on interoperability testing
• Conformance testing can be viewed as a pre-requisite to interoperability testing

[Diagram: as interoperability increases up the stack, the balance shifts from Conformance Testing to Interop Testing.]

Page 13

Stages in Test Specifications Development

1. Base Standard or Profile Specification
2. Requirement Catalogue
3. Implementation Check List (ICS/IFS)
4. Identification of Test Group Structure (TSS)
5. Specification of Test Purposes (TP)
6. Specification of Test Cases (TC)
7. Validation of Test Cases
8. Specification of Test Descriptions (TD)

Page 14

Conformance Test Specification Development

[Flow: Standard → Test Purposes → Test Descriptions → TTCN-3 Test Suite → Compilation → Executable Tests, supported by the Requirements Catalogue and/or ICS/IXIT and by Test Case Parameterisation and Selection.]

Page 15

Implementation Conformance Statement (ICS)

Q#   ICS Question                       Reference     Status        Support
Q1   Support of Feature F1              [x] Clause a  OPTIONAL      Yes/No
Q2   Support of Feature F2              [x] Clause b  MANDATORY     Yes/No
:    :                                  :             :             :
Qn   Support of Message M1              [y] Clause c  CONDITIONAL   Yes/No
:    :                                  :             :             :
Qk   Support of Message Parameter P1    [y] Clause d  OPTIONAL      Yes/No

• Tabular overview of the features and options that are implemented by a given product

• Checklist of the capabilities supported by the Implementation Under Test (IUT)

• Allows quick & easy first good guess on potential interoperability between two or more implementations
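That "first good guess" can be illustrated with a small Python sketch. This is a simplified yes/no support model with hypothetical feature names, not an actual ICS proforma or any oneM2M-defined procedure:

```python
def ics_mismatches(ics_a: dict, ics_b: dict) -> list:
    """Return features that one implementation supports but the other does not.

    Each ICS is modelled as a mapping from feature name to a boolean
    'Support' answer; a feature missing from an ICS counts as unsupported.
    """
    return sorted(f for f in set(ics_a) | set(ics_b)
                  if ics_a.get(f, False) != ics_b.get(f, False))

# Invented example answers for two implementations:
ics_vendor1 = {"F1": True, "F2": True, "M1": False}
ics_vendor2 = {"F1": True, "F2": True, "M1": True}

# M1 is supported on one side only, so it is a potential interoperability gap:
print(ics_mismatches(ics_vendor1, ics_vendor2))   # -> ['M1']
```

A real comparison would also weigh the Status column (a mismatch on an OPTIONAL feature matters less than one on a MANDATORY feature), which this sketch deliberately omits.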

Page 16

Test Purposes

• Precise descriptions of the purpose of a test in relation to a particular (base standard) requirement
  – Meaningful to a larger audience than technical experts
• Define the functionality being tested, i.e., WHAT?
  – Should be formulated using a similar level of information content and wording as the corresponding requirement description
  – Should only mention directly relevant aspects of interactions, but not describe them in detail
  – Reference a test configuration (i.e., architecture)
  – Identify one or more test purposes per requirement
• Specified in
  – Natural language, or a standardized test purpose notation (e.g., TPLan)
  – But definitely not coded!

Page 17

Example of a TP (ITS-G5)

TP Id:           TP/GEONW/PON/FPB/BV-01
Test objective:  Check source packet buffering into the UC forwarding buffer for unreachable unicast destinations (absence of a suitable next-hop candidate)
Reference:       EN 302 636-4-1 [1], clauses 7.5.3, 9.3.6.3, 9.3.8.2, 9.3.8.3
Config Id:       CF03
PICS Selection:  PICS_GN_GUC_SRC

Initial conditions:
with {
    the IUT being in the "initial state" and
    the IUT not having received any Beacon information from ItsNodeB and
    the IUT having a Location Table Entry for ItsNodeA (see note) and
    the IUT having been requested to send a GUC packet addressed to ItsNodeA
        containing TrafficClass.SCF set to 1
}

Expected behaviour:
ensure that {
    when {
        the IUT receives a Beacon packet from ItsNodeB
    }
    then {
        the IUT selects ItsNodeB as the next hop and
        the IUT sends the buffered GUC packet
    }
}

NOTE: The Location Table Entry is created by sending any GeoNetworking packet, originated by ItsNodeA, from ItsNodeC to the IUT.

Page 18

Conformance Test Suite

• A collection of detailed test cases or scripts that implement test purposes
  – Can be compiled and executed, e.g., when using TTCN-3
• Specifies HOW to test
  – Also implements handling of, e.g., unexpected or background behaviour, and returning the SUT to the initial state after a test
• Composition of test cases
  – Each individual test has a preamble, a test body (i.e., the implementation of the Test Purpose), and a postamble
  – Test components may be used to clearly separate interactions with different logical IUT interfaces during a test
• Assigns test verdicts
• Should be parameterizable at run-time, e.g., with the SUT address
• A test suite is not a test system
  – Tests focus on IUT behaviour
  – The test system also handles message encoding and transport
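The preamble / test body / postamble composition and verdict assignment can be sketched in Python (an illustrative analogy only; real TTCN-3 verdicts follow the pass/fail/inconc/error rules of the standard, and all names here are invented):

```python
def execute_test_case(preamble, body, postamble) -> str:
    """Run one test case built from three callables.

    The postamble always runs, returning the SUT to its initial state
    even when the test body fails or raises.
    """
    verdict = "none"
    preamble()                      # drive the IUT into the required state
    try:
        verdict = "pass" if body() else "fail"
    except Exception:
        verdict = "error"           # unexpected behaviour, not a plain fail
    finally:
        postamble()                 # clean up / reset the SUT
    return verdict

verdict = execute_test_case(lambda: None, lambda: 1 + 1 == 2, lambda: None)
print(verdict)   # -> pass
```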

Page 19

What is TTCN-3?

• Testing and Test Control Notation Version 3
• Internationally standardized language developed specifically for executable test specification
  – Independent of a specific IUT or IUT interfaces
  – Independent of a test execution environment
• Allows unambiguous implementation of tests
• Look and feel of a regular programming language
• Good tool support (many commercial tools available)
• Successfully deployed at 3GPP, OMA, ETSI, WiMAX Forum, etc., and in industry in a variety of application domains
  – e.g., telecom, automotive, software
• Good text book available
  – http://www.wiley.com/legacy/wileychi/ttcn-3/

Page 20

Example TTCN-3 Test Case

testcase TC_COR_0047_01() runs on Ipv6Node system EtherNetAdapter {
    f_cf02Up();              // Configure test system for HS->RT
                             // No preamble required in this case
    f_TP_HopsSetToOne();     // Perform test
                             // No postamble required in this case
    f_cf02Down();            // Return test system to initial state
}

function f_TP_HopsSetToOne() runs on Ipv6Node {
    var Ipv6Packet v_ipPkt;
    var FncRetCode v_ret := f_echoTimeExceeded(1, v_ipPkt);
    if (v_ret == e_success and v_ipPkt.icmpCode == 0) {
        setverdict(pass);
    } else {
        setverdict(fail);
    }
}

function f_echoTimeExceeded(in UInt8 p_hops, out Ipv6Packet p_ipPkt)
runs on Ipv6Node return FncRetCode {
    ipPort.send(m_echoReqWithHops(p_hops));
    alt {
        [] ipPort.receive(mw_anyTimeExceeded) -> value p_ipPkt { return e_success; }
        [] ipPort.receive { return e_error; }
    }
}

Page 21

Validation of (Conformance) Tests

• Typical test suite validation requires that
  – Tests compile with several TTCN-3 tools
  – Tests are executed on one or more platforms against various implementations of the standard (usually by oneM2M members)
• Requires very close co-operation with test tool suppliers, test platform providers, equipment manufacturers, etc.
  – Strict and well-defined processes between
    • The test lab (which executes tests[, certifies] and provides feedback)
    • And oneM2M WG TST (which updates the tests)
  – Timely availability of stable base specifications, test platforms (tools) and implementations is essential

Page 22

Interoperability Test Descriptions

• Specify the detailed steps to be followed to achieve the stated test purpose, i.e., HOW?
• State steps clearly and unambiguously, without unreasonable restrictions on the actual method
  – Example: "Answer incoming call", NOT "Pick up telephone handset"
• Written in a structured and tabulated natural language so tests can be performed manually

Page 23

Example Interoperability TD

Interoperability Test Description

Identifier:     TD_M2M_NH_07
Objective:      AE creates a container resource in the registrar CSE via a container Create Request
Configuration:  M2M_CFG_01
References:     [1] 10.2.4.1; [2] 7.3.5.2.1

Pre-test conditions:
  – Resource <AE> has been created in the registrar CSE

Test Sequence:

Step  Type         Description
1     Stimulus     AE is requested to send a container Create Request (POST) to create container <container>
2     Check (Mca)  Sent POST request contains:
                     Request URI: <CSEBase>/<AE>?rty=container
                     Host: registrar CSE
                     Payload: container resource <container> to be created
3     Check        Check, if possible, that the <container> resource is created in the registrar CSE
4     Check (Mca)  Registrar sends a response containing:
                     Code = 200
                     Content-Location: <CSEBase>/<AE>/<container>
                     Payload: created <container>
5     Verify       AE indicates successful operation
6                  Repeat steps 1-5, creating a container resource with a short lifetime
7                  Check that the created resource is removed after the expiry of its lifetime
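Step 2 of the sequence above amounts to a field check on the outgoing request. A minimal Python sketch, assuming a simplified dict representation of the POST request rather than the actual oneM2M HTTP binding (the `host` value and the `check_create_request` helper are invented for illustration):

```python
def check_create_request(req: dict) -> bool:
    """Step-2-style check: POST method, container resource type in the
    request URI, and a payload carrying the resource to be created."""
    return (req.get("method") == "POST"
            and req.get("uri", "").endswith("?rty=container")
            and "payload" in req)

request = {
    "method": "POST",
    "uri": "<CSEBase>/<AE>?rty=container",
    "host": "registrar-cse.example",        # hypothetical host name
    "payload": {"container": {}},           # resource to be created
}

print(check_create_request(request))   # -> True
```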