Advanced Software Engineering: Software Testing (COMP 3702)
Instructor: Anneliese Andrews
A Andrews - Software Engineering: Software Testing'06
News & Project
News
- Updated course program
- Reading instructions: the book, deadline 23/3

Project
- IMPORTANT: read the project description thoroughly
- Schedule, deadlines, activities
- Requirements (7-10 papers), project areas
- Report, template, presentation
Lecture
- Chapter 4 (Lab 1): black-box testing techniques
- Chapter 12 (Lab 2): statistical testing, usage modelling, reliability
Why test techniques?
Exhaustive testing (using all possible inputs and conditions) is impractical:
- We must use a subset of all possible test cases
- The subset must have a high probability of detecting faults

We need processes that help us select test cases, so that different people have an equal probability of detecting faults.

- Effective testing detects more faults: focus attention on specific types of fault, and know you are testing the right thing
- Efficient testing detects faults with less effort: avoid duplication; systematic techniques are measurable
Dimensions of testing
Testing combines techniques that focus on:
- Testers: who does the testing
- Coverage: what gets tested
- Potential problems: why you are testing (risks / quality)
- Activities: how you test
- Evaluation: how to tell whether the test passed or failed

All testing should involve all five dimensions.
Testing standards (e.g. IEEE)
Black-box testing
[Figure: the system as a black box. The input test data contains a subset Ie of inputs causing anomalous behaviour; the output test results contain a subset Oe of outputs which reveal the presence of defects.]
Equivalence partitioning

[Figure: partitioning is based on input conditions, e.g. user queries, numerical data, output format requests, responses to prompts, command key input, and mouse picks on menus.]
Equivalence partitioning
[Figure: valid and invalid input partitions are mapped by the system to output partitions.]

If an input condition:
- is a range: one valid and two invalid classes are defined
- requires a specific value: one valid and two invalid classes are defined
- is a boolean: one valid and one invalid class are defined
Test Cases
Which test cases have the best chance of uncovering faults?
- Values as near to the mid-point of the partition as possible
- Values at the boundaries of the partition

The mid-point of a partition typically represents the "typical" values; boundary values represent the atypical or unusual values. Equivalence partitions are usually identified from specifications and experience.
Equivalence Partitioning Example

Consider a system specification which states that a program will accept between 4 and 10 input values (inclusive), where each input value must be a 5-digit integer greater than or equal to 10000. What are the equivalence partitions?
Example Equivalence Partitions
Input values:
- Less than 10000 (invalid), e.g. 9999
- Between 10000 and 99999 (valid), e.g. 50000
- More than 99999 (invalid), e.g. 100000

Number of input values:
- Less than 4 (invalid), e.g. 3
- Between 4 and 10 (valid), e.g. 7
- More than 10 (invalid), e.g. 11
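These partitions can be turned directly into a small test oracle; a minimal Python sketch (function names are mine, not from the course materials) that classifies candidate inputs and checks one representative value per partition:

```python
def partition_for_value(v):
    """Classify one input value: valid 5-digit integers are 10000..99999."""
    if v < 10000:
        return "too small"
    if v > 99999:
        return "too large"
    return "valid"

def partition_for_count(n):
    """Classify the number of input values: valid counts are 4..10."""
    if n < 4:
        return "too few"
    if n > 10:
        return "too many"
    return "valid"

# One representative test value per partition, as on the slide:
for v, expected in [(9999, "too small"), (50000, "valid"), (100000, "too large")]:
    assert partition_for_value(v) == expected
for n, expected in [(3, "too few"), (7, "valid"), (11, "too many")]:
    assert partition_for_count(n) == expected
```

Each partition contributes one test case; any member of a class is assumed to be as good as any other for fault detection.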
Boundary value analysis

[Figure: boundary value analysis considers the boundaries of the input conditions (user queries, numerical data, output format requests, responses to prompts, command key input, mouse picks on menus) as well as the output domain.]
Boundary value analysis
- Range a..b: test a, b, just below a, and just above b
- Number of values: test max, min, just below min, and just above max
- Output bounds should be checked
- Boundaries of externally visible data structures (e.g. arrays) should be checked

Example: a component with input range 0..99 is tested with -1, 0, 99, 100; one with input range -9..499 is tested with -10, -9, 499, 500.
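The rule can be mechanised; a small Python sketch (the helper name is mine) that emits the four boundary candidates for an inclusive range and reproduces both examples above:

```python
def boundary_values(lo, hi):
    """Boundary-value candidates for the inclusive range lo..hi:
    the value just below the range, both bounds, and the value just above."""
    return [lo - 1, lo, hi, hi + 1]

# The two component ranges from the slide:
assert boundary_values(0, 99) == [-1, 0, 99, 100]
assert boundary_values(-9, 499) == [-10, -9, 499, 500]
```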
Some other black-box techniques
- Risk-based testing, random testing
- Stress testing, performance testing
- Cause-and-effect graphing
- State-transition testing
Error guessing
- Also known as exploratory testing, "happy testing", ...
- Always worth including
- Can detect some failures that systematic techniques miss

Consider:
- Past failures (fault models)
- Intuition and experience
- Brainstorming: "What is the craziest thing we can do?"
- Lists in the literature
Usability testing
Characteristics:
- Accessibility
- Responsiveness
- Efficiency
- Comprehensibility

Environments:
- Free-form tasks
- Procedure scripts
- Paper screens
- Mock-ups
- Field trials
Specification-based testing
A formal method: test cases are derived from a (formal) specification of the requirements or design, e.g. a model such as a state chart.

Specification → Test case generation → Test execution
Model-based Testing
[Figure: development phases (requirements, specification, top-level design, detailed design, coding) against test phases (unit test, integration, validation); a usage model built from the requirements drives the validation test phase.]
Statistical testing /Usage based testing
Usage specification → Test case generation → Test execution → Failure logging → Reliability estimation
[Figure: an example test case (1.1.3 Setup, 1.1.4 Call), a failure report (#13, output failure), and a usage model of a menu: states Shown and Hidden with menu items 1-3, and transitions Invoke, Up, Down, Select.]
Usage specification models
Algorithmic models
Grammar model State hierarchy model
<test_case> ::= <no_commands> @ <command> <select>;
<no_commands> ::= ( <unif_int>(0,2) [prob(0.9)] | <unif_int>(3,5) [prob(0.1)]);
<command> ::=(<up> [prob(0.5)] | <down> [prob(0.5)]);
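The grammar can be executed directly; a minimal Python sketch of the generator (my own rendering of the rules above, assuming <select> terminates every test case):

```python
import random

def gen_test_case(rng):
    """One test case from the grammar: <no_commands> commands
    (0-2 with prob 0.9, 3-5 with prob 0.1), each command 'up' or
    'down' with prob 0.5, followed by 'select'."""
    if rng.random() < 0.9:
        n = rng.randint(0, 2)   # <unif_int>(0,2)
    else:
        n = rng.randint(3, 5)   # <unif_int>(3,5)
    commands = ["up" if rng.random() < 0.5 else "down" for _ in range(n)]
    return commands + ["select"]

rng = random.Random(7)
print(gen_test_case(rng))
```

Each run draws a statistically representative command sequence, so the generated suite mirrors the expected usage frequencies.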
[Figure: state hierarchy model. A usage node decomposes into user types, user types into users and subtypes, and each user into the services it uses.]
Usage specification models
Domain-based models:
- Operational profile
- Markov model
[Figure: an operational profile: the user selects from a menu, splitting between normal use (90%) and system mode Z (10%), then between functions x and y and menu items 1-3 with probabilities such as 25%/50%/25% and 40%/30%/30%. Alongside it, the menu Markov model (states Shown and Hidden, transitions Invoke, Up, Down, Select).]
Operational profiles
Statistical testing / Usage-based testing
[Figure: a random sample of test cases is drawn from the usage model and executed against the code.]
Usage Modelling
- Each transition corresponds to an external event
- Probabilities are set according to the future use of the system
- The model supports reliability prediction
[Figure: a GUI usage model. States: Main Window and Dialog Box. Transitions: Invoke, Right-click, Move, Resize, CANCEL or OK with valid hour, Click on OK with non-valid hour, Close Window, Terminate.]
Markov model
- System states are seen as nodes
- Transitions between nodes carry probabilities

Conditions for a Markov model:
- Probabilities are constant
- No memory of past states
Transition matrix (rows = from node, columns = to node):

      N1    N2    N3    N4
N1   P11   P12   P13   P14
N2   P21   P22    0    P24
N3   P31    0    P33   P34
N4   P41    0     0    P44

[Figure: the corresponding graph with nodes N1-N4 and arcs labelled P12, P13, P14, P21, P24, P31, P34, P41.]
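Generating a test path is then a random walk over this matrix; a Python sketch with hypothetical probabilities (the slide only names the Pij symbolically), keeping the same zero entries:

```python
import random

# Hypothetical transition probabilities: the zero entries match the
# missing arcs in the matrix above, and each row sums to 1.
P = {
    "N1": {"N1": 0.1, "N2": 0.4, "N3": 0.3, "N4": 0.2},
    "N2": {"N1": 0.5, "N2": 0.3, "N3": 0.0, "N4": 0.2},
    "N3": {"N1": 0.6, "N2": 0.0, "N3": 0.2, "N4": 0.2},
    "N4": {"N1": 0.7, "N2": 0.0, "N3": 0.0, "N4": 0.3},
}

def walk(P, start, steps, rng):
    """A random walk over the chain: one statistically generated test path."""
    path, state = [start], start
    for _ in range(steps):
        nodes, probs = zip(*P[state].items())
        state = rng.choices(nodes, weights=probs)[0]
        path.append(state)
    return path

print(walk(P, "N1", 5, random.Random(3)))
```

Sampling many such paths exercises frequently used transitions more often, which is exactly the bias statistical testing wants.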
Model of a program
The program is seen as a graph:
- One entry node (invoke) and one exit node (terminate)
- Every transition from node Ni to node Nj has a probability Pij
- If there is no connection between Ni and Nj, then Pij = 0

[Figure: the graph of nodes N1-N4 with an input and an output marked, and arcs labelled with their probabilities Pij.]
Clock Software

[Figure: the clock application. The main window shows the clock and date (24 Aug 1997) with an Options menu: Analog, Digital, Clock Only, Seconds, Date, Change Time/Date…, Info…, Exit. The Change Time/Date dialog shows the current time (11:10:27a) and current date (Sat 24 Aug 1996) with fields for a new time and a new date, and an OK button. The Info dialog shows "Vaporware Clock, version 1.0" and an OK button.]
Input Domain – Subpopulations
- Human users: keystrokes, mouse clicks
- System clock: time/date input
- Combination usage: time/date changes from the OS while the clock is executing

Create one Markov chain to model the input from the user.
Operation modes of the clock
- Window = {main window, change window, info window}
- Setting = {analog, digital}
- Display = {all, clock only}
- Cursor = {time, date, none}
State of the system
A state of the system under test is an element of the set S, where S is the cross product of the operational modes.

States of the clock:
- {main window, analog, all, none}
- {main window, analog, clock-only, none}
- {main window, digital, all, none}
- {main window, digital, clock-only, none}
- {change window, analog, all, time}
- {change window, analog, all, date}
- {change window, digital, all, time}
- {change window, digital, all, date}
- {info window, analog, all, none}
- {info window, digital, all, none}
Top Level Markov Chain
The Window operational mode is chosen as the primary modeling mode.
[Figure: top-level chain. States: not invoked, main window, info window, change window, terminated. Arcs: invoke (prob = 1) from not invoked to the main window; options.info, options.change, and options.exit, each with prob = 1/3, from the main window; ok (prob = 1) from the info window back to the main window; end (prob = 1) from the change window back to the main window.]

Rules for Markov chains:
- Each arc is assigned a probability between 0 and 1 inclusive
- The sum of the exit-arc probabilities from each state is exactly 1
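Both rules are mechanically checkable; a small Python sketch (the validator name is mine) applied to the top-level clock chain, with the slide's arc probabilities:

```python
def validate_chain(P, tol=1e-9):
    """Check the two rules: every arc probability lies in [0, 1],
    and the exit probabilities from each state sum to exactly 1."""
    for arcs in P.values():
        if any(not 0.0 <= p <= 1.0 for p in arcs.values()):
            return False
        if abs(sum(arcs.values()) - 1.0) > tol:
            return False
    return True

# Top-level clock chain; the terminated state has no exit arcs, so it
# is omitted from the dictionary.
chain = {
    "not invoked": {"main window": 1.0},                      # invoke
    "main window": {"info window": 1/3, "change window": 1/3,
                    "terminated": 1/3},                       # options.*
    "info window": {"main window": 1.0},                      # ok
    "change window": {"main window": 1.0},                    # end
}
assert validate_chain(chain)
```

Running such a check before test generation catches a mis-specified usage model early.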
Top Level Model – Data Dictionary
invoke
  Input: Invoke the clock software
  Notes: Main window displayed in full. Tester should verify window appearance, setting, and that it accepts no illegal input.

options.change
  Input: Select the "Change Time/Date..." item from the "Options" menu
  Notes: All window features must be displayed in order to execute this command. The change window should appear and be given the focus. Tester should verify window appearance and modality and ensure that it accepts no illegal input.

options.info
  Input: Select the "Info..." item from the "Options" menu
  Notes: The title bar must be on to apply this input. The info window should appear and be given the focus. Tester should verify window appearance and modality and ensure that it accepts no illegal input.

options.exit
  Input: Select the "Exit" option from the "Options" menu
  Notes: The software will terminate; end of test case.

end
  Input: Choose any action and return to the main window
  Notes: The change window will disappear and the main window will be given the focus.

ok
  Input: Press the OK button on the info window
  Notes: The info window will disappear and the main window will be given the focus.
Level 2 Markov Chain
Submodel for the Main Window

[Figure: states (analog, all), (analog, clock only), (digital, all), (digital, clock only). options.analog and options.digital switch between the analog and digital settings; options.clock-only reduces the display to the clock face and double-click restores the full window; options.seconds and options.date toggle within a state; invoke, options.info, options.change, ok, end, and options.exit connect the submodel to the top-level chain.]
Data Dictionary – Level 2

invoke
  Input: Invoke the clock software
  Notes: Main window displayed in full. Invocation may require that the software be calibrated by issuing either an options.analog or an options.digital input. Tester should verify window appearance, setting, and ensure that it accepts no illegal input.

options.change
  Input: Select the "Change Time/Date..." item from the "Options" menu
  Notes: All window features must be displayed in order to execute this command. The change window should appear and be given the focus. Tester should verify window appearance and modality and ensure that it accepts no illegal input.

options.info
  Input: Select the "Info..." item from the "Options" menu
  Notes: The title bar must be on to apply this input. The info window should appear and be given the focus. Tester should verify window appearance and modality and ensure that it accepts no illegal input.

options.exit
  Input: Select the "Exit" option from the "Options" menu
  Notes: The software will terminate; end of test case.

end
  Input: Choose any action (cancel or change the time/date) and return to the main window
  Notes: The change window will disappear and the main window will be given the focus. This action may require that the software be calibrated by issuing either an options.analog or an options.digital input.
Data Dictionary – Level 2 (continued)

ok
  Input: Press the OK button on the info window
  Notes: The info window will disappear and the main window will be given the focus. This action may require that the software be calibrated by issuing either an options.analog or an options.digital input.

options.analog
  Input: Select the "Analog" item from the "Options" menu
  Notes: The digital display should be replaced by an analog display.

options.digital
  Input: Select the "Digital" item from the "Options" menu
  Notes: The analog display should be replaced by a digital display.

options.clock-only
  Input: Select the "Clock Only" item from the "Options" menu
  Notes: The clock window should be replaced by a display containing only the face of the clock, without a title, menu, or border.

options.seconds
  Input: Select the "Seconds" item from the "Options" menu
  Notes: The second hand/counter should be toggled on or off depending on its current status.

options.date
  Input: Select the "Date" item from the "Options" menu
  Notes: The date should be toggled on or off depending on its current status.

double-click
  Input: Double-click, using the left mouse button, on the face of the clock
  Notes: The clock face should be replaced by the entire clock window.
Level 2 Markov Chain
Submodel for the Change Window

[Figure: entered via options.change and left via end. States: cursor in the time field and cursor in the date field. Transitions: move switches between the two fields; edit time and edit date modify the field contents.]
Data Dictionary
options.change
  Input: Select the "Change Time/Date..." item from the "Options" menu
  Notes: All window features must be displayed in order to execute this command. The change window should appear and be given the focus. Tester should verify window appearance and modality and ensure that it accepts no illegal input.

end
  Input: Choose either the "Ok" button or hit the cancel icon and return to the main window
  Notes: The change window will disappear and the main window will be given the focus. This action may require that the software be calibrated by issuing either an options.analog or an options.digital input.

move
  Input: Hit the tab key to move the cursor to the other input field, or use the mouse to select the other field
  Notes: Tester should verify cursor movement and also verify both options for moving the cursor.

edit time
  Input: Change the time in the "new time" field or enter an invalid time
  Notes: The valid input format is shown on the screen.

edit date
  Input: Change the date in the "new date" field or enter an invalid date
  Notes: The valid input format is shown on the screen.
Software Reliability

[Figure: failure rate plotted against time.]

Techniques:
- Markov models
- Reliability growth models
Dimensions of dependability
Dependability has four principal dimensions:
- Availability: the ability of the system to deliver services when requested
- Reliability: the ability of the system to deliver services as specified
- Safety: the ability of the system to operate without catastrophic failure
- Security: the ability of the system to protect itself against accidental or deliberate intrusion
Costs of increasing dependability
[Figure: cost rises steeply as the required dependability increases from low through medium, high, and very high to ultra-high.]
Availability and reliability
Reliability: the probability of failure-free system operation over a specified time, in a given environment, for a given purpose.

Availability: the probability that a system, at a point in time, will be operational and able to deliver the requested services.

Both of these attributes can be expressed quantitatively.
Reliability terminology

- System failure: an event that occurs at some point in time when the system does not deliver a service as expected by its users
- System error: erroneous system behaviour, where the behaviour of the system does not conform to its specification
- System fault: an incorrect system state, i.e. a system state that is unexpected by the designers of the system
- Human error or mistake: human behaviour that results in the introduction of faults into a system
Usage profiles / Reliability
Removing X% of the faults in a system will not necessarily improve the reliability by X%!
[Figure: the space of possible inputs contains regions of erroneous inputs; different users (User 1, User 2, User 3) exercise different subsets of the input space.]
Reliability achievement
- Fault avoidance: minimise the possibility of mistakes; trap mistakes
- Fault detection and removal: increase the probability of detecting and correcting faults
- Fault tolerance: run-time techniques
Reliability quantities
- Execution time: the CPU time actually spent by the computer in executing the software
- Calendar time: the time people normally experience, in terms of years, months, weeks, etc.
- Clock time: the elapsed time from start to end of computer execution in running the software
Reliability metrics
Nonhomogeneous Poisson Process (NHPP) Models
N(t) follows a Poisson distribution. The probability that N(t) equals a given integer n is:

P{N(t) = n} = [m(t)]^n e^(-m(t)) / n!,  n = 0, 1, 2, ...

m(t) is called the mean value function; it describes the expected cumulative number of failures in [0, t).
The Goel-Okumoto (GO) model
Assumptions:
- The cumulative number of failures detected at time t follows a Poisson distribution
- All failures are independent and have the same chance of being detected
- All detected faults are removed immediately and no new faults are introduced
- The failure process is modelled by an NHPP with mean value function m(t) given by:

m(t) = a(1 - e^(-bt)),  a > 0, b > 0
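The mean value function and its derivative, the failure intensity, are easy to evaluate; a Python sketch with hypothetical parameters (a is the expected total number of failures, b a per-fault detection rate; both values below are mine, for illustration):

```python
import math

def go_mean(t, a, b):
    """GO mean value function m(t) = a(1 - exp(-b t)):
    expected cumulative failures detected by time t."""
    return a * (1.0 - math.exp(-b * t))

def go_intensity(t, a, b):
    """Failure intensity lambda(t) = dm/dt = a b exp(-b t),
    which decays monotonically as faults are removed."""
    return a * b * math.exp(-b * t)

# Hypothetical a = 100 expected failures, b = 0.05 per hour:
print(go_mean(24, 100, 0.05), go_intensity(24, 100, 0.05))
```

Fitting a and b to observed failure times lets the model predict how many failures remain and when a reliability target will be reached.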
Goel-Okumoto
[Figure: the shapes of the mean value function m(t), rising toward the asymptote a, and the intensity function λ(t), decaying from ab toward 0, for the GO model.]
S-shaped NHPP model
m(t) = a[1 - (1 + bt)e^(-bt)],  b > 0

[Figure: the S-shaped curve of m(t) against t.]

The corresponding intensity function is

λ(t) = dm(t)/dt = a b^2 t e^(-bt)
The Jelinski-Moranda (JM) model
Assumptions:
1. Times between failures are independent, exponentially distributed random quantities
2. The number of initial failures is an unknown but fixed constant
3. A detected fault is removed immediately and no new fault is introduced
4. All remaining faults contribute the same amount to the software failure intensity
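Assumption 4 gives the JM failure intensity directly; a Python sketch with hypothetical parameters (N is the initial fault count, phi the per-fault contribution; the concrete values are mine, for illustration):

```python
def jm_intensity(i, N, phi):
    """JM failure intensity before the i-th failure:
    lambda_i = phi * (N - i + 1).  Removing one fault lowers the
    intensity by exactly phi (assumption 4)."""
    return phi * (N - i + 1)

# Hypothetical N = 10 initial faults, phi = 0.02 per fault per hour:
print([jm_intensity(i, 10, 0.02) for i in (1, 5, 10)])
```

The intensity steps down by a constant phi after each repair, which is the model's characteristic (and often criticised) simplification.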
Next weeks
This week
- Read the project description thoroughly and decide on a subject
- Optional exercise tomorrow: Project

Next week (April 7, April 12)
- Lab 1: Black-box testing