Automated Test Suite Generation for Time-Continuous Simulink Models

Reza Matinnejad, Shiva Nejati, Lionel Briand (SnT Center, University of Luxembourg)
Thomas Bruckmann (Delphi Automotive Systems, Luxembourg)

Upload: lionel-briand

Post on 17-Feb-2017


TRANSCRIPT

Page 1: Automated Test Suite Generation for Time-Continuous Simulink Models


Automated Test Suite Generation for Time-Continuous Simulink Models

Reza Matinnejad Shiva Nejati Lionel Briand SnT Center, University of Luxembourg

Thomas Bruckmann Delphi Automotive Systems, Luxembourg

Page 2: Automated Test Suite Generation for Time-Continuous Simulink Models

Simulink Models

Simulation

2

Code-Generation

Page 3: Automated Test Suite Generation for Time-Continuous Simulink Models

Simulink Models -- Simulation

Simulation models

3

[Diagram: a time-continuous Simulink model combining a hardware model and a network model; mixed discrete-continuous behavior]

Page 4: Automated Test Suite Generation for Time-Continuous Simulink Models

Simulink Models -- Code Generation

Code-generation models

4

Time-discrete behavior

[Diagram: a time-discrete Simulink model is translated into C code]

Page 5: Automated Test Suite Generation for Time-Continuous Simulink Models

5

Simulink Model Testing Challenges

Page 6: Automated Test Suite Generation for Time-Continuous Simulink Models

Simulink Testing Challenge I

Incompatibility: existing testing techniques are not applicable to simulation models (with time-continuous behaviors)

6

Page 7: Automated Test Suite Generation for Time-Continuous Simulink Models

[Block diagrams: a fuel-level controller (FuelLevelSensor input; Gain, Gain1, Add, Add1 blocks; an integrator feeding the FuelLevel output) in two variants, one using a ContinuousIntegrator and one using a DiscreteIntegrator]

Incompatibility Challenge -- Example

7

Simulation Model: Not Applicable
Code Generation Model: Applicable

Page 8: Automated Test Suite Generation for Time-Continuous Simulink Models

Incompatibility with the Underlying Technique

8

• The techniques rely on SAT/Constraint solvers, and inherit their limitations in handling

• Time-continuous blocks

• Complex mathematical functions

• Floating-point operations

Page 9: Automated Test Suite Generation for Time-Continuous Simulink Models

Simulink Testing Challenge II

Low Fault-Revealing Ability: existing testing techniques make unrealistic assumptions about test oracles

9

Page 10: Automated Test Suite Generation for Time-Continuous Simulink Models

Low Fault-Revealing Ability Challenge

• Testing is mainly driven by structural coverage

• Structural coverage might be effective when automated test oracles are available

10

• Test oracles are likely to be manual in practice

• Covering a fault may not help reveal it.

Page 11: Automated Test Suite Generation for Time-Continuous Simulink Models

Low Fault-Revealing Ability Example

11

[Plots: Faulty Model Output vs. Correct Model Output for two test inputs]

One test input covers the fault and is likely to reveal it; the other covers the fault but is very unlikely to reveal it.

Page 12: Automated Test Suite Generation for Time-Continuous Simulink Models

12

Our Goal

Generating Fault Revealing Test Suites

for both simulation and code generation models

Page 13: Automated Test Suite Generation for Time-Continuous Simulink Models

13

Our Approach

Search-based Test Generation

Driven by Output-Diversity and Anti-Patterns

Page 14: Automated Test Suite Generation for Time-Continuous Simulink Models

14

Search-Based Test Generation

Overall loop: start from an initial test suite, slightly modify each test input, and repeat until the maximum resources are spent.

Search Procedure:
    S ← Initial Candidate Solution
    repeat until maximum resources spent:
        R ← Tweak(S)
        if Fitness(R) > Fitness(S):
            S ← R
    return S

Output-based Heuristics
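The search procedure on this slide is a simple (1+1) hill climb. A minimal Python sketch follows; the `tweak` operator (Gaussian perturbation) and the toy fitness function are illustrative assumptions, not the paper's actual operators, which work on Simulink input signals:

```python
import random

random.seed(0)  # deterministic for the toy run below

def tweak(candidate, sigma=0.1):
    """Hypothetical tweak operator: perturb each test input with Gaussian noise."""
    return [x + random.gauss(0.0, sigma) for x in candidate]

def search(initial, fitness, budget=2000):
    """(1+1) hill climbing: keep the tweaked candidate only if fitness improves."""
    s = initial
    for _ in range(budget):
        r = tweak(s)
        if fitness(r) > fitness(s):
            s = r
    return s

# Toy usage: the "test input" is a single number; fitness peaks at x = 3.
best = search([0.0], fitness=lambda v: -(v[0] - 3.0) ** 2)
```

In the actual approach, `Fitness` is one of the output-based heuristics named on the next slides (failure patterns or output diversity), evaluated by simulating the model.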

Page 15: Automated Test Suite Generation for Time-Continuous Simulink Models

Output-Based Heuristics

Failure Patterns

Output Diversity

15

Page 16: Automated Test Suite Generation for Time-Continuous Simulink Models

Failure-Based Test Generation

16

• Maximizing the likelihood of presence of specific failure patterns (Instability, Discontinuity) in output signals

[Plots: example output signals over time illustrating the Instability and Discontinuity patterns]
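The two failure patterns on this slide can be quantified over a sampled output signal. A minimal sketch, assuming plausible formulations (sum of step-to-step changes for instability, largest single-step jump for discontinuity); the paper's exact objective functions differ:

```python
def instability(signal):
    """Sum of step-to-step changes; large values mean the output oscillates rapidly."""
    return sum(abs(b - a) for a, b in zip(signal, signal[1:]))

def discontinuity(signal):
    """Largest single-step jump; a crude proxy for a discontinuity in the output."""
    return max(abs(b - a) for a, b in zip(signal, signal[1:]))

smooth = [0.0, 0.1, 0.2, 0.3, 0.4]
jumpy = [0.0, 1.0, -1.0, 1.0, -1.0]
```

The search maximizes such a measure over the simulated output, steering test inputs toward outputs that exhibit the failure pattern.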

Page 17: Automated Test Suite Generation for Time-Continuous Simulink Models

Output Diversity -- Vector-Based

17

[Plot: two diverse output signals, Output vs. Time (Output Signal 1, Output Signal 2)]

Page 18: Automated Test Suite Generation for Time-Continuous Simulink Models

18

Output Diversity -- Feature-Based

Signal features:

• value: instant-value (v), constant (n)

• derivative: sign-derivative (s, n), covering increasing (n), decreasing (n), and constant-value (n, v); extreme-derivatives

• second derivative: discontinuity, 1-sided discontinuity, discontinuity with strict local optimum, 1-sided continuity with strict local optimum

[Plot: an example output signal annotated with feature instances A, B, and C]

Page 19: Automated Test Suite Generation for Time-Continuous Simulink Models

19

Evaluation

How does the fault revealing ability of our algorithm compare with that of

Simulink Design Verifier?

Page 20: Automated Test Suite Generation for Time-Continuous Simulink Models

Simulink Design Verifier (SLDV)

• Underlying Technique: Model Checking and SAT solvers

• Test objective: Testing is guided by structural coverage

20

Page 21: Automated Test Suite Generation for Time-Continuous Simulink Models

Our Approach vs. SLDV

21

Faults:        1  2  3  4  5  6  7  8  9 10 11 12 13 14 15 16 17 18 19 20 21 22
Our Approach:  5 14  2 20 20 20 20 20 20 15 15 20 16 20 11  5 20 14 17 11 20  4

Legend: per-fault markers show whether SLDV found the fault or could not find it

# The number of fault-revealing runs of our algorithm (out of 20)

• Our approach outperformed SLDV in revealing faults

Page 22: Automated Test Suite Generation for Time-Continuous Simulink Models

SimCoTest Tool

https://sites.google.com/site/simcotesttool/

22

SimCoTest: Simulink Controller Tester

Page 23: Automated Test Suite Generation for Time-Continuous Simulink Models

Conclusion

• We distinguished two challenges in Simulink model testing: incompatibility and low fault-revealing ability

23

• We proposed two output-based test generation algorithms for Simulink models: failure-based and output diversity

• Our output diversity test generation algorithm outperformed Simulink Design Verifier in revealing faults in Simulink models

Page 24: Automated Test Suite Generation for Time-Continuous Simulink Models


Automated Test Suite Generation for Time-Continuous Simulink Models

Reza Matinnejad ([email protected]) Shiva Nejati Lionel Briand SnT Center, University of Luxembourg

Thomas Bruckmann Delphi Automotive Systems, Luxembourg

Page 25: Automated Test Suite Generation for Time-Continuous Simulink Models

Incompatibility with the Underlying Technique

25

• The techniques rely on SAT/Constraint solvers, and inherit their limitations in handling

• Time-continuous blocks

• Complex mathematical functions

• Floating-point operations

• Supporting library code and system functions is cumbersome

Page 26: Automated Test Suite Generation for Time-Continuous Simulink Models

26

Output Diversity vs. Coverage-Based

[Plots: the Faulty Model Output of a test case generated based on Output Diversity and of one based on Structural Coverage, each against the Correct Model Output]

The correct output signal is not required for test generation!

Page 27: Automated Test Suite Generation for Time-Continuous Simulink Models

Existing Simulink Testing Techniques

27

Test Oracle       | Test Objective       | Underlying Technology
Specified Oracles | Violating Assertions | Model Checking
Manual Oracles    | Structural Coverage  | SAT/Constraint Solvers
Implicit Oracles  |                      |

Page 28: Automated Test Suite Generation for Time-Continuous Simulink Models

Incompatibility Issues due to Underlying Technology

28

Test Oracle       | Test Objective       | Underlying Technology
Specified Oracles | Violating Assertions | Model Checking
Manual Oracles    | Structural Coverage  | SAT/Constraint Solvers
Implicit Oracles  |                      |

Page 29: Automated Test Suite Generation for Time-Continuous Simulink Models

Test Oracle Assumption

29

Test Oracle       | Test Objective       | Underlying Technology
Specified Oracles | Violating Assertions | Model Checking
Manual Oracles    | Structural Coverage  | SAT/Constraint Solvers
Implicit Oracles  |                      |

The effectiveness of coverage-driven test generation is not yet ascertained for Simulink testing!

Page 30: Automated Test Suite Generation for Time-Continuous Simulink Models

30

• Model checking is not applicable to Simulink models with time-continuous blocks

Incompatibility Issues due to Underlying Technology (cont.)

• Constraint solvers are not effective at handling floating-point operations (e.g., trig functions or square root)

• Supporting library code and system functions is cumbersome

Page 31: Automated Test Suite Generation for Time-Continuous Simulink Models

31

Low Fault-Revealing Ability when Test Oracles are Manual (cont.)

• When test oracles are manual, the existing techniques only focus on structural coverage

• Structural coverage, although necessary, is not sufficient to generate fault-revealing test cases for Simulink models

Page 32: Automated Test Suite Generation for Time-Continuous Simulink Models

Manual Test Oracle

32

[Block diagrams: a Correct Model and a Faulty Model of the same controller, differing in where the Sum block is placed (point A vs. point B); plots show the Faulty Model Output and the Correct Model Output for a Test Input Generated Based on Coverage]

• For manual test oracles, to be able to reveal faults, test outputs should noticeably deviate from the correct output

Page 33: Automated Test Suite Generation for Time-Continuous Simulink Models

Why does SLDV perform poorly compared to our approach?

33

[Plots: a Test Input Generated by SLDV and a Test Input Generated by Our Algorithm (Input vs. Time)]

• Though the outputs produced by SLDV cover faulty parts of the models, they either do not deviate or only slightly deviate from the correct output

• We conjecture that SLDV's poor performance is due to its test input generation strategy

Page 34: Automated Test Suite Generation for Time-Continuous Simulink Models

Simulink Testing Challenges (CPS)

• Mixed discrete-continuous behavior (combination of algorithms and continuous dynamics)

•  Inputs/outputs are signals (functions over time)

• Simulation is inexpensive but not yet systematically automated

• Partial test oracles

34

Page 35: Automated Test Suite Generation for Time-Continuous Simulink Models

35

Signal Segments Adaptation to Model Coverage

P=1

P=2

P=7

•  The algorithm starts from an initial P (e.g., P = 1) and gradually increases P only if:

•  Coverage has reached a plateau below 100%, and

•  Coverage actually increased the last time the algorithm increased P
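The slide's P = 1, 2, 7 illustrations suggest input signals built from P constant segments. A minimal sketch of generating such a piecewise-constant signal; the function name, parameters, and segment-splitting rule are illustrative assumptions, not the paper's exact construction:

```python
import random

def piecewise_constant_signal(p, sim_steps, lo=0.0, hi=1.0, seed=0):
    """Build an input signal made of p constant segments over sim_steps samples.
    Each segment takes a random value in [lo, hi] (hypothetical value range)."""
    rng = random.Random(seed)
    seg_len = sim_steps // p
    signal = []
    for i in range(p):
        value = rng.uniform(lo, hi)
        # the last segment absorbs the remainder so len(signal) == sim_steps
        n = seg_len if i < p - 1 else sim_steps - seg_len * (p - 1)
        signal.extend([value] * n)
    return signal

sig = piecewise_constant_signal(p=7, sim_steps=100)
```

Increasing P enlarges the search space of input signals, which is why the algorithm raises it only when coverage stalls below 100% and the previous increase actually paid off.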

Page 36: Automated Test Suite Generation for Time-Continuous Simulink Models

Vector-Based Output Diversity Objective Function: Ov

• Generates a test suite whose test outputs maximize the vector-based diversity function Ov

36

[Plot: the output signals of five test cases TC1..TC5, forming the test suite outputs TSO (q = 5)]
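One plausible form of Ov treats each sampled output signal as a vector and rewards suites whose outputs are far apart. The nearest-neighbour formulation below is an illustrative assumption, not the paper's exact definition of Ov:

```python
import math

def euclidean(sig_a, sig_b):
    """Distance between two equally sampled output signals."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(sig_a, sig_b)))

def vector_diversity(outputs):
    """Hypothetical Ov: sum, over each output, of the distance to its
    nearest neighbour in the suite. Maximizing it spreads the q outputs apart."""
    total = 0.0
    for i, sig in enumerate(outputs):
        total += min(euclidean(sig, other)
                     for j, other in enumerate(outputs) if j != i)
    return total

clustered = [[0.0, 0.0], [0.1, 0.1], [0.0, 0.1]]
spread = [[0.0, 0.0], [5.0, 5.0], [0.0, 9.0]]
```

A suite whose outputs nearly coincide scores close to zero, so the search is pushed toward inputs that exercise visibly different model behaviors.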

Page 37: Automated Test Suite Generation for Time-Continuous Simulink Models

Feature-Based Output Diversity Objective Function: Of

• Generates a test suite whose test outputs maximize the feature-based diversity function Of

37

[Plot: the output signals of five test cases TC1..TC5, forming the test suite outputs TSO (q = 5)]
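Of diversifies signals in a feature space rather than sample-by-sample. A minimal sketch, assuming a toy feature vector (fractions of increasing/decreasing steps and the largest step, a crude extreme-derivative stand-in) and pairwise Manhattan distances; the paper's feature set (the taxonomy on the earlier slide) and its exact Of differ:

```python
def signal_features(signal):
    """Hypothetical feature vector: fraction of increasing steps, fraction of
    decreasing steps, and the largest absolute step."""
    steps = [b - a for a, b in zip(signal, signal[1:])]
    n = len(steps)
    inc = sum(1 for d in steps if d > 0) / n
    dec = sum(1 for d in steps if d < 0) / n
    return (inc, dec, max(abs(d) for d in steps))

def feature_diversity(outputs):
    """Hypothetical Of: sum of pairwise Manhattan distances between feature vectors."""
    fvs = [signal_features(s) for s in outputs]
    return sum(sum(abs(a - b) for a, b in zip(fvs[i], fvs[j]))
               for i in range(len(fvs)) for j in range(i + 1, len(fvs)))

rising = [0.0, 1.0, 2.0, 3.0]
falling = [3.0, 2.0, 1.0, 0.0]
```

Two signals that differ only slightly in raw values can still have identical features, so feature-based diversity targets qualitatively different output shapes rather than merely distant vectors.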

Page 38: Automated Test Suite Generation for Time-Continuous Simulink Models

RQ1: Sanity

38

•  Our algorithm with both objective functions performed significantly better than Random for all test suite sizes

Page 39: Automated Test Suite Generation for Time-Continuous Simulink Models

RQ2: Vector-based vs. Feature-based

39

•  Feature-based diversity (Of) performed better than vector-based diversity (Ov) for all test suite sizes