Chapter 7: Testing Activities, Management, and Automation
- Major Testing Activities
- Test Management
- Testing Automation
Test Planning and Preparation

Major testing activities:
- Test planning and preparation
- Execution (testing)
- Analysis and follow-up

Test planning:
- Goal setting
- Overall strategy

Test preparation:
- Preparing test cases & test suite(s) (systematic: model-based; our focus)
- Preparing test procedure
Test Planning: Goal Setting and Strategic Planning

Goal setting:
- Quality perspectives of the customer
- Quality expectations of the customer
- Mapping to internal goals and concrete (quantified) measurement
  - Example: customer's correctness concerns => specific reliability target

Overall strategy, including:
- Specific objects to be tested
- Techniques (and related models) to use
- Measurement data to be collected
- Analysis and follow-up activities

Key: plan the "whole thing"!
The Test Plan

- Allocate resources
  - affects the specific models and techniques chosen
  - simple models based on checklists and partitions require fewer resources
- Decide on using new models (Sec. 7.1.2) or adaptations of existing models (Ch. 12)
- Generic steps and activities in test model construction:
  - information source identification and data collection (in-field or anticipated usage? code?)
  - analysis and initial model construction
  - model validation and incremental improvement
The test plan (testplan.html)
Test Preparation

Procedure for test preparation:
- Preparing test cases (model-based)
  - individual test cases
  - test case allocation
- Preparing the test procedure
  - basis for the test procedure
  - order, flow, follow-up

General concepts:
- Test run: operation instances
- Input variable: test point
- Input space: all possible input variable values
- Test case: static object + input to enable test runs to start, execute, and finish
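The general concepts above can be illustrated with a minimal sketch (the function under test and the values are hypothetical, not from the text): a test case pairs a static object with the input values, i.e., test points drawn from the input space, needed for a complete test run.

```python
def sqrt_floor(x):          # hypothetical object under test
    return int(x ** 0.5)

# A test case: an input (a test point from the input space) plus the
# expected output, enough to let a test run start, execute, and finish.
test_cases = [
    {"input": 0, "expected": 0},
    {"input": 16, "expected": 4},
]

def run_test(case):
    """One test run: execute the object under test and report pass/fail."""
    actual = sqrt_floor(case["input"])
    return actual == case["expected"]

results = [run_test(c) for c in test_cases]
print(results)  # [True, True]
```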
Test Cases Based on Formal Models

Most organized, systematic test cases are derived from formal testing models:
- Directly, via newly constructed models
- Indirectly, via existing test cases, etc.

Model construction steps:
- Information source identification and data collection
- Analysis and initial model construction
- Model validation and improvement

Model usage:
- Defining test cases (details with individual models/techniques)
- Indirectly in analysis/follow-up (Part IV)
Individual Test Case Preparation

Individual test cases (micro-level) vs. test suite (macro-level)

From multiple sources:
- Actual runs (usage-based)
- Implementation-based (white-box)
- Specification-based (black-box)
- May use similar/earlier products
- (direct) record and replay (less often) – WinRunner and SilkTest
- (via) formal models (OP, control flow testing, etc.)

Defining input values (model => test cases):
- Initial/intermediate/interactive input (expected output too?)
- Exercise a path/slice/track/etc.
- In testing terminology: sensitization
Test Suite Preparation

Test suite (macro-level):
- Collection of individual test cases that will be run in a test sequence until some stopping criteria are satisfied

Existing suites: what and where? suitability? selection/screening?

Construction/generation of new ones:
- Organization & management
  - too valuable to throw away
  - information on various attributes stored
- Adding new test cases
  - Estimate the number of new test cases
  - Specify the new (individual) test cases
  - Integrate them with existing test cases
- Allocation to systems/operations
  - OP-/structure-based allocation
  - Both old and new test cases in the suite
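OP-based allocation of new test cases can be sketched in a few lines (the operations, probabilities, and budget below are illustrative assumptions, not from the text): the number of new cases per operation is made proportional to its usage probability in the operational profile.

```python
# Hypothetical operational profile: operation -> usage probability.
operational_profile = {"login": 0.5, "search": 0.3, "checkout": 0.2}
new_case_budget = 20    # estimated number of new test cases to add

# Allocate the budget proportionally to the profile.
allocation = {
    op: round(prob * new_case_budget)
    for op, prob in operational_profile.items()
}
print(allocation)  # {'login': 10, 'search': 6, 'checkout': 4}
```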
Test Procedure Preparation

Key consideration: sequencing
- General: simple to complex
- Dependency among test cases
- Defect-detection-related sequencing
- Sequencing to avoid accidents
- Problem-diagnosis-related sequencing
- Natural grouping of test cases

Other considerations:
- Effectiveness/efficiency concerns
- Smooth transition between test runs
- Management/resource/personnel/etc.
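Two of the sequencing considerations above, dependencies among test cases and simple-to-complex ordering, can be combined mechanically with a topological sort. A sketch, assuming Python 3.9+ and hypothetical test case names and complexity scores:

```python
from graphlib import TopologicalSorter

# Hypothetical test cases with complexity scores (lower = simpler).
complexity = {"t_setup": 1, "t_basic": 2, "t_edit": 3, "t_delete": 3}
# Hypothetical dependencies: each case maps to the cases it depends on.
depends_on = {"t_basic": {"t_setup"},
              "t_edit": {"t_basic"},
              "t_delete": {"t_basic"}}

ts = TopologicalSorter(depends_on)
ts.prepare()
order = []
while ts.is_active():
    # Among cases whose dependencies are satisfied, run simpler ones first.
    for case in sorted(ts.get_ready(), key=lambda t: complexity[t]):
        order.append(case)
        ts.done(case)
print(order)
```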
Test Execution

Major testing activities:
- Test planning and preparation
- Execution (testing)
- Analysis and follow-up

Test execution:
- Want a smooth transition from one test run to another
- Execution planning and management
- Related activities are an important part:
  - failure identification and measurement
  - other measurement
Test Execution

General steps:
- Allocating test time (& resources)
- Invoking tests (& collecting execution information)
- Identifying system failures (& gathering information for follow-up actions)

Allocating test time:
- planned, but needs monitoring & adjusting
- OP-based or coverage-based
- Alternative: bottom-up approach
  - individual test cases (test time)
  - (sum up) overall allocation by OP or coverage areas
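The bottom-up alternative amounts to simple arithmetic: estimate time per individual test case, then sum up by operation or coverage area to get the overall allocation. A sketch with illustrative numbers (the areas, case names, and minutes are assumptions):

```python
# Minutes estimated per individual test case, keyed by (area, case).
case_times = {
    ("login", "tc1"): 10, ("login", "tc2"): 15,
    ("search", "tc3"): 20, ("search", "tc4"): 5, ("search", "tc5"): 10,
}

# Sum up the per-case estimates into an overall allocation per area.
allocation = {}
for (area, _case), minutes in case_times.items():
    allocation[area] = allocation.get(area, 0) + minutes

print(allocation)  # {'login': 25, 'search': 35}
```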
Test Execution

Invoking tests: follow the plan, providing input variable values over the whole execution duration in the pre-planned sequence.

Invoking tests (OP-based):
- OP => input variables (test points)
- Follow probabilistic distributions (could be dynamically determined)
- Sequence (what to test first?): COTS, product, super system

Invoking tests (coverage-based):
- Organize sensitized test cases
- Sequence (coverage hierarchies)

Common part: retest due to
- Defect fix => verify the fix
- Code or feature change
- General regression test
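OP-based invocation, "follow probabilistic distributions", can be sketched as drawing each test run from the operational profile, so heavily used operations get tested more often. The profile below is an illustrative assumption:

```python
import random

# Hypothetical operational profile: operation -> usage probability.
operational_profile = {"login": 0.5, "search": 0.3, "checkout": 0.2}

rng = random.Random(42)   # fixed seed so the sequence is reproducible
ops, probs = zip(*operational_profile.items())

# Draw 10 test runs according to the profile's distribution.
invocations = rng.choices(ops, weights=probs, k=10)
counts = {op: invocations.count(op) for op in ops}
print(counts)  # counts roughly proportional to the profile
```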
Test Execution

Identifying system failures (the oracle problem):
- Similar for OP-/coverage-based testing
- Analyze test output for deviations
- Determine: is the deviation a failure?
- Handling normal vs. failed runs
  - non-blocking failure handling

Solving the oracle problem:
- Theoretically undecidable
- Some cases obvious: crash, hang, etc.
- Practically based on heuristics:
  - product domain knowledge
  - cross-checking with other products
  - implementation knowledge & internals
  - limited dynamic consistency checking
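One of the heuristics above, cross-checking with other products, can be sketched as comparing the implementation under test against an independent trusted implementation. The function under test here is hypothetical and deliberately faulty (it rounds instead of truncating) so the oracle has something to flag:

```python
import math

def isqrt_under_test(n):
    return round(n ** 0.5)   # faulty on purpose: should truncate, not round

def oracle(n, actual):
    """Cross-check against math.isqrt, standing in for the 'other product'."""
    return actual == math.isqrt(n)

# Any deviation flagged by the oracle is a potential failure to analyze.
failures = [n for n in range(50) if not oracle(n, isqrt_under_test(n))]
print(failures[:3])  # first deviating test point is n = 3
```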
Test Execution

Failure observation and measurement:
- When is a deviation determined to be a failure?
- Establish when the failure occurred
  - used in reliability and other analyses
- Failure information (e.g., ODC): what/where/when/severity/etc.

Defect handling and test measurement:
- Defect status and changes (controlled)
- Information gathering during testing
  - example template: Table 7.1 (p. 93)
- Follow-up activities:
  - fix-verification cycle
  - other possibilities (defer, invalid, etc.)
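Controlled defect status changes, including the fix-verification cycle and the defer/invalid outcomes above, can be sketched as a small state machine. The field names and allowed transitions below are illustrative assumptions, loosely following the ODC-style what/where/when/severity attributes:

```python
# Allowed status transitions for the (hypothetical) defect process.
ALLOWED = {
    "new":   {"open", "invalid", "deferred"},
    "open":  {"fixed"},
    "fixed": {"closed", "open"},   # re-opened if fix verification fails
}

defect = {"id": 1, "what": "crash", "where": "parser",
          "severity": 2, "status": "new"}

def transition(defect, new_status):
    """Apply a status change only if the defect process allows it."""
    if new_status not in ALLOWED.get(defect["status"], set()):
        raise ValueError(f"illegal transition {defect['status']} -> {new_status}")
    defect["status"] = new_status

transition(defect, "open")     # defect accepted for fixing
transition(defect, "fixed")    # fix applied
transition(defect, "closed")   # fix verified by re-test
print(defect["status"])        # closed
```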
Testing Analysis and Follow-up

Major testing activities:
- Test planning and preparation
- Execution (testing)
- Analysis and follow-up

Test analysis and follow-up:
- Execution and other measurements analyzed
- Analysis results as the basis for follow-up
- Feedback and follow-up:
  - defect fixing
  - decision making (exit testing? etc.)
  - adjustment and improvement
Testing Analysis and Follow-up

Input to analysis:
- Test execution information
  - particularly failure cases
  - timing and characteristics data

Analysis and output:
- Basic individual (failure) cases
  - problem identification/reporting
  - repeatable problem setup
- Overall reliability and other analyses (Module V)

Follow-up activities:
- Defect analysis and removal (& re-test)
- Decision making and management
- Test process and quality improvement
Testing Analysis and Follow-up

For individual test runs:
- Success: continue with normal testing
- Failure: see below

Analysis and follow-up for failed runs:
- Understanding the problem by studying the execution record
- Recreating the problem (confirmation)
- Problem diagnosis
  - may involve multiple related runs
- Locating the faults
- Defect fixing (fault removal)
  - commonly via adding/removing/modifying code
  - sometimes involves design changes
- Re-running/re-testing to confirm the defect fix
Testing Analysis and Follow-up

Analysis and follow-up for overall testing:
- Reliability analysis and follow-up:
  - assess product reliability; determine whether the goal has been achieved (release); steps to achieve it if not; identify low-reliability areas
- Coverage analysis and follow-up:
  - surrogate for reliability; input for the stopping criterion
- Defect analysis and follow-up:
  - defect distribution; identify high-defect areas
- Focus of Part IV

Follow-up activities (similar):
- Decision making and management
- Test process and quality improvement
Test Management

People's roles and responsibilities in formal and informal testing.

In informal testing:
- "plug-and-play" by novice users
- "run-and-observe" by testers (or automation tools)
- Informal testing with ad hoc knowledge for easy, simple systems
- Deceptively "easy", but not all failures or problems are easy to recognize

In formal testing:
- Role of "code owners" (multiple roles in unit testing)
- Testers, organized in teams (integration and system testing)
- Management/communication structure
- Third-party (IV&V) testing
Test Management

Test team organization:
- Vertical: project-oriented
  - product domain knowledge
  - staffing/resource management issues (personnel reassigned)
- Horizontal: task-oriented
  - teams perform one kind of testing on many different products
  - even distribution of staff/resources
  - lack of internal knowledge/expertise
- Mixed models might work better

Users and third-party testers:
- User involvement in beta testing and other variations
- IV&V with third-party testing/QA (DoD uses this extensively)

Impact of new technologies:
- CBSE and COTS impact
- security and dependability requirements
Test Automation

Basic understanding:
- Automation is needed for large systems
- Fully automated testing: theoretically impossible
- Focus on specific needs/areas

Key issues to consider:
- Specific needs and potential for automation
- Are existing tools available and suitable?
  - related: cost/training/etc.
- Constructing specific tools?
  - Additional cost in usage & support
Test Automation

Automation by test activity areas:
- Automated test planning & preparation
- Automated test execution
- Automated test measurement, analysis, and follow-up
- Slightly different grouping due to the tight coupling of measurement & analysis

Automation for test execution:
- Many debuggers: semi-automatic (testers may intervene)
- Task sequencing/scheduling tools
- Load/test generators: script => runs
  - test scripts are generally easier to obtain
Test Automation

Automation for test planning/preparation:
- Test planning: human-intensive
  - not much can be done (as with inspection and formal verification)
- Test model construction: similar to the above
  - automation possible on a small scale
- Test case generation: our focus

Test case generation:
- From the test model to test cases
- Specific to individual techniques
  - e.g., cover checklist items, paths, etc.
- Various specific tools
- Key: which specific testing technique is supported by the specific tool?
The automated test plan
The test script for successful reservation
The test results for SilkTest
WinRunner Status window to indicate that the application is currently being learned
WinRunner - The test script and the initialization window for a test
The test script for main menu
```ada
-- 1. Compile and link the executable with command "lt test01.ada"
-- 2. Execute the test with command "test01 > zout"
-- 3. Verify that the file zout does not have the word "Error" in it anywhere
-- 4. Verify that the file zout shows the same value on each side of the equal signs
...
procedure Test01 is

   Test_Passed    : BOOLEAN := TRUE;
   Actual_Results : INTEGER;

   -- This procedure checks specific values for the SQRT function
   procedure Check_Sqrt (Test_Value       : in INTEGER;
                         Expected_Results : in INTEGER) is
   begin
      Actual_Results := Math_Pkg.Sqrt (Test_Value);
      Text_Io.Put_Line (INTEGER'Image (Expected_Results) & " = " &
                        INTEGER'Image (Actual_Results));
      Test_Passed := Test_Passed and (Expected_Results = Actual_Results);
   end Check_Sqrt;

begin
   ...
   -- This section checks a set of values for the SQRT function
   Check_Sqrt (Test_Value => 0,  Expected_Results => 0);
   Check_Sqrt (Test_Value => 16, Expected_Results => 4);
   ...
   -- This section verifies that sqrt & squared are inverses of each other
   for I in 0 .. 1000 loop
      if I /= Math_Pkg.Sqrt (Math_Pkg.Squared (I)) then
         Text_Io.Put_Line ("Error Detected for value of" & INTEGER'Image (I));
         Test_Passed := FALSE;
      end if;
   end loop;
   ...
   -- This section outputs a summary of the results; makes output analysis faster
   if Test_Passed then
      Text_Io.Put_Line ("Test shows no problems");
   else
      Text_Io.Put_Line ("*** Error detected, Analyze results ***");
   end if;
   ...
end Test01;
```
Example of test output:

```
This is a test of the Math_Pkg Sqrt function
...
 0 = -1
 4 =  4
...
Error Detected for value of 0
*** Error detected, Analyze results ***
End of test
```
Test Automation

Test measurement, analysis, and follow-up:
- The analyses dictate which measurements are needed
- Most common: reliability/coverage
  - need timing data for individual test runs

Defect measurement, needed in most cases:
- defect tracking tools
- collect data on personnel, components
- can identify problematic areas for improvement

Reliability analysis related tools:
- Analysis/modeling tools
- Collecting execution/input/etc. data
- More in Chapter 22
Test Automation

Coverage-based testing:
- measuring coverage and comparing it to pre-set goals
- usually requires more detailed information than reliability analysis

Test coverage tools:
- Different levels/definitions of coverage => different tools
- Example tools:
  - McCabe: execution (control flow) paths
  - S-TCAT: functional coverage; call-pair information
  - ATAC: data flow coverage

Test coverage steps:
- Preparation: program instrumentation
- Measurement step: run and collect data
- Analysis step: analysis for coverage
- Example: Fig. 7.1 (p. 100)
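The three coverage steps above can be sketched using Python's tracing hook as a stand-in for tool-based program instrumentation (real tools such as McCabe or ATAC instrument the program itself; the program under test here is hypothetical):

```python
import sys

def absval(x):        # hypothetical program under test
    if x < 0:
        x = -x
    return x

executed = set()      # preparation: set up "instrumentation" state

def tracer(frame, event, arg):
    # Record each executed line inside the program under test.
    if event == "line" and frame.f_code.co_name == "absval":
        executed.add(frame.f_lineno)
    return tracer

sys.settrace(tracer)  # measurement step: run tests, collect data
absval(5)             # covers the test and the return
absval(-5)            # also covers the negation branch
sys.settrace(None)

# analysis step: compare executed lines against the 3 executable lines
print(len(executed))  # 3
```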
Summary

Test activities:
- Planning & preparation: focus of Part II
- Execution & measurement: common
- Analysis & follow-up: focus of Part IV

Test management:
- Different roles and responsibilities
- Good management required

Test automation:
- Set realistic expectations
- Specific areas for automation, especially in execution, measurement, and analysis