
SBLIM: Test Suite

Test Suite Specification and Implementation Details

Revision 4 Last updated June 13, 2003

Document Owner: Heidi Neumann

LTC Systems Management [email protected]

Distribution: SBLIM Open Source Project

http://oss.software.ibm.com/developerworks/projects/sblim/

Master Copy is located on PC of document owner


1 Document Control
  1.1 Change History
  1.2 Reviewers List
  1.3 Contributing Authors
2 Introduction
  2.1 Purpose of this document
  2.2 Test Approach / Requirements
    2.2.1 Interface Test
    2.2.2 Consistence Test
    2.2.3 Specification Test
  2.3 Problem Areas
3 SBLIM Test Suite
  3.1 Prerequisites
  3.2 Architecture
    3.2.1 File Structure
    3.2.2 User Interfaces
      3.2.2.1 Script : run.sh
      3.2.2.2 Script : interface.pl
      3.2.2.3 Script : consistence.pl
4 Interface Test
  4.1 Input File
    4.1.1 Class specific Syntax
      4.1.1.1 Header Syntax
      4.1.1.2 Paragraph Syntax
      4.1.1.3 Example
    4.1.2 Association specific Syntax
      4.1.2.1 Header Syntax
      4.1.2.2 Paragraph Syntax
      4.1.2.3 Example
  4.2 Report File
    4.2.1 Class specific Syntax
    4.2.2 Association specific Syntax
  4.3 Instance Interface
    4.3.1 enumInstanceNames() or enumInstances()
    4.3.2 getInstance()
    4.3.3 setInstance()
    4.3.4 createInstance()
    4.3.5 deleteInstance()
    4.3.6 execQuery()
  4.4 Association Interface
    4.4.1 associatorNames() & associators()
    4.4.2 referenceNames() & references()
    4.4.3 Algorithm to Handle the Input Parameters "role", "resultRole" and "resultClass"
      4.4.3.1 associatorNames()
      4.4.3.2 referenceNames()
      4.4.3.3 Report File
  4.5 Method Interface
  4.6 Property Interface
  4.7 Indication Interface
  4.8 Common Interface Tests
    4.8.1 Check for expected Exception
5 Consistence Test
  5.1 Capabilities
  5.2 Input File
    5.2.1 Class specific Syntax
      5.2.1.1 Header Syntax
      5.2.1.2 Paragraph Syntax on Class Level
      5.2.1.3 Paragraph Syntax on Property Level
      5.2.1.4 Paragraph Syntax on Instance Level
      5.2.1.5 Example
    5.2.2 Association specific Syntax
      5.2.2.1 Header Syntax
      5.2.2.2 Paragraph Syntax on Reference Level
      5.2.2.3 Example
  5.3 Report File
    5.3.1 Class Specific Syntax
    5.3.2 Association Specific Syntax
  5.4 Class Specific Test Cases
    5.4.1 Class Level
    5.4.2 Property Level
    5.4.3 Instance Level
  5.5 Association Specific Test Cases
    5.5.1 Association Level
6 Specification Test


1 Document Control

1.1 Change History

Revision  Date        Description
1         04/04/2003  Initial draft version
2         04/25/2003  Reorganization of the input and report file descriptions for the interface test (moved from chapter 3 to chapter 4); chapter 5 "Consistence Test" written
3         05/15/2003  Chapter 5: additional return codes on property level defined; input / report file descriptions and return codes for the association test defined
4         06/13/2003  Chapter 6: comment added, marked as a future task; review of the complete document in preparation for the announcement

1.2 Reviewers List

Name                 Role & Responsibility
Robert Kieninger
Adrian Schuur
Viktor Mihajlovski

1.3 Contributing Authors

Name              e-mail address     Description
Robert Kieninger  [email protected]  Input for the requirements and purpose of the test suite


2 Introduction

2.1 Purpose of this document Performing "hand-operated" tests of the available CIM Instrumentation can become a nightmare if tests are required for different operating systems, Linux distributions, releases and hardware platforms. To guarantee that a CIM Instrumentation functions correctly in different environments, a test suite is being developed that offers the user a well-defined set of test types for defining Instrumentation-dependent test scenarios, together with the possibility to perform automated function verification tests against the installed CIM Instrumentation. This SBLIM Test Suite is the subject of this document. The reader should be familiar with CIM and CIM terminology, and should have some deeper knowledge of provider implementations.

2.2 Test Approach / Requirements The test suite acts as a CIM client and uses a generic command line interface (the SBLIM WBEMCLI package) that can be parameterized to perform the CIMOM calls. The command line interface WBEMCLI can communicate with different CIMOM implementations, such as OpenCimom and Pegasus. The test suite itself is script based and interprets the return values of the WBEMCLI command. The test suite has to be easily extendable with tests for new classes / providers.

The test suite covers the following three types of tests for CIM Instrumentation: one that generically verifies that all required provider interfaces are properly implemented, one that checks whether the returned values are really meaningful, and one that checks whether the returned values follow the specification. All tests have to be as self-checking as possible.

2.2.1 Interface Test The most important and supposedly "easiest" test scenario to implement is a general functionality test of the interface types supported by the provider. Known types are the Instance, Association, Property, Method and Indication (or Event) interfaces. Each interface type supports a defined set of operations. The first iteration of the test suite will be able to execute each operation of the Instance and Association interfaces and interpret the returned results. A requirement on the design of the test suite is the possibility to easily support the other interface types; a modularized approach will be the choice here.

The user is able to define, at the level of individual operations, which tests to perform. This requires knowledge of the interface types and their operations, and allows the user to specify the scope of the test scenario. The user should also be able to specify a certain exception he expects: some implementations require that an exception like "NOT_SUPPORTED" is thrown by the provider. This makes sense if an operation is callable but has no functionality.
For example, the createInstance() call of the OperatingSystem instance provider does not make sense; supporting this operation would mean that an OS should install another OS over itself! To offer the client a self-describing return code, the decision was made to return "NOT_SUPPORTED". Returning a status of "OK" would imply that the operation was performed successfully, which is wrong, because no functionality is implemented for this operation.


2.2.2 Consistence Test The consistence test distinguishes between class and association level. On class level, the number of reported instances is tested against the system state. The number of valid instances is collected by a script or a single command. Taking file systems as an example, the provider should return neither more nor fewer instances than there are file systems known to the operating system. This checks whether the provider returns duplicates or fails to report some of the instances.

On property level, two kinds of checks are important for the correctness of the returned property values: one checks the observance of threshold values, the other checks returned property values against system values. A final test step checks the consistence of the property values within each single instance.

On association level, the number of reported target instances is tested against the association definition and the system state. A further test checks the instances reported by the association provider against the instances reported by the instance provider of the target class. The system values are collected by test scripts; to automate the execution of these test steps, the system test scripts follow a certain syntax.

2.2.3 Specification Test The specification test makes use of the meta-information about the model. With this test type it is possible to determine whether the provider implements the class definition in the required manner. For example, the definition of a class may mark a certain property as required, but the provider does not return a value for this property; this is a violation of the CIM specification.
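The class-level comparison of 2.2.2 (file-system example) can be sketched with standard tools. This snippet is not part of the suite; the filter list for pseudo file systems is an assumption, and the real system scripts may use different commands and filters:

```shell
#!/bin/sh
# Sketch (not part of the suite): derive the number of file systems known
# to the OS, to compare against the instance count returned by a
# file-system provider. The pseudo-file-system filter is an assumption.
count_filesystems() {
    grep -v -E 'proc|sysfs|devpts|tmpfs' | grep -c .
}

# canned "mount"-style output for demonstration; in a test script this
# would simply be:  mount | count_filesystems
SAMPLE='/dev/sda1 on / type ext3 (rw)
proc on /proc type proc (rw)
/dev/sda2 on /home type ext3 (rw)'
COUNT=$(printf '%s\n' "$SAMPLE" | count_filesystems)
echo "file systems known to the OS: $COUNT"
```

The resulting count would then be compared with the number of instances the provider enumerates; a mismatch indicates duplicates or missing instances.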

2.3 Problem Areas Due to the dynamic nature of the returned data it will be hard, if not impossible, to create tests that check for semantically correct results (see chapter 2.2.2, Consistence Test). However, because of the layered architecture it should be possible to create a dummy version of the "Resource Encapsulation Library" that allows the providers to generate reproducible results, which can then be used for automated validation of correct data processing within the providers.

Figure 1 : Architecture of CIM Instrumentation Packages

[Figure: an Instrumentation Package consists of several CMPI providers that plug into the CIMOM through the CMPI interface; the providers obtain their resource data (R) through the Resource Encapsulation Library (data gathering).]


3 SBLIM Test Suite

3.1 Prerequisites To guarantee correct functioning of the SBLIM Test Suite, the user has to make sure that the following prerequisites are satisfied:

• the CIMOM is configured and functioning correctly
• the CIMOM is running
• the model to test is completely loaded into the CIMOM repository
• the installation process of the CIM Instrumentation (provider) completed successfully
• the CIMOM is capable of calling the provider
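A minimal pre-flight check along the lines of this list might look as follows. This is a sketch only; the process name "cimserver" is an assumption for a Pegasus-style CIMOM and must be adjusted for the CIMOM actually in use:

```shell
#!/bin/sh
# Sketch (not part of the suite): check that the CIMOM process is running
# before starting a test run. The process name "cimserver" is an
# assumption (Pegasus); other CIMOMs use different names.
is_running() {
    # $1 = process name, stdin = "ps -e"-style output
    grep -qw "$1"
}

# canned snapshot for demonstration; in real use:  ps -e | is_running cimserver
PS_SNAPSHOT='  101 ?        00:00:01 cimserver
  202 ?        00:00:00 sshd'
if printf '%s\n' "$PS_SNAPSHOT" | is_running cimserver; then
    STATUS=running
else
    STATUS=stopped
fi
echo "CIMOM: $STATUS"
```

The other prerequisites (repository loaded, provider registered) cannot be checked this cheaply and are left to the user.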

3.2 Architecture The entry point to the test suite is the class level: each test type (see "2.2 Test Approach / Requirements") is performed against one single class name. To automate the test for a collection of classes, a small shell script can be written which calls the test suite entry point with the respective class names. The programming languages used are Perl and shell scripting.

For each class and test type, an input file must be written by the user. The semantics of this file follow a well-defined structure, described in chapters "4.1 Input File" and "5.2 Input File". For each performed test of a certain class a report file is created. This statistical data describes the status of each test step ("ok" or "failed – '…and a description…'"), the time spent in system and user mode (the resource consumption of the test process), a summary of the succeeded and failed test steps, and the overall times spent in system and user mode.

3.2.1 File Structure

File Name                            Type          Functionality
run.sh                               shell script  contains the collection of test types to perform (interface, consistence, specification)
cimom.pm                             perl module   contains the access information for the specific CIMOM -> edit to fit your system
interface.pl                         perl script   parses the input file and selects the corresponding interface type
instance.pm                          perl module   implementation of the instance interface tests
associator.pm                        perl module   implementation of the association interface tests
consistence.pl                       perl script   parses the input file and selects the corresponding level to test (class or association)
consistence.pm                       perl module   implementation of the class and association level tests
system/<platform>/createKeyFiles.sh  shell script  creates two files, ComputerSystem.keys and OperatingSystem.keys; these offer the CreationClassName and Name of the classes for reference by other classes, e.g. Process
cim                                  directory     contains the input files for the interface tests
system/<platform>                    directory     contains the input files for the consistence tests
stat                                 directory     contains all report files after test execution; one report file per executed test
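The small automation script recommended in 3.2 could look like the following sketch. The class list is illustrative only; RUN is made overridable purely so the loop can be tried without the suite installed — in real use it is simply ./run.sh:

```shell
#!/bin/sh
# Sketch of a wrapper that calls the test suite entry point once per class.
# The class names are examples; RUN defaults to the real entry point.
RUN=${RUN:-./run.sh}
CLASSES="Linux_ComputerSystem Linux_OperatingSystem Linux_RunningOS"

run_all() {
    for CLASS in $CLASSES; do
        echo "=== $CLASS ==="
        $RUN "$CLASS"
    done
}
```

Calling run_all then executes the interface and consistence tests for every class in the list, producing one report file per class in the stat directory.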

Each type of test (Interface, Consistence and Specification) is divided into two major parts, where the first part is mandatory and the second part is a recommendation to automate the execution of tests for more than one class / association. The first part is writing the necessary definition files, later called input files, which define the classes / associations to test. The input files have to follow a certain syntax, described in the corresponding chapter. In addition it can be necessary to write shell scripts for the system data gathering. The second part is writing a small shell script to automate the execution of the test types. The script "run.sh" can be used as entry point.

3.2.2 User Interfaces

3.2.2.1 Script : run.sh
The script run.sh is the entry point for a user who wants to use the whole functionality of the test suite. The script executes first the interface test and second the consistence test for the given class.

Usage : First the variable SYSTEM_PATH has to be edited to fit the actual system. The variables HOST and TEST_DIR should already be initialized correctly, but they can be changed if necessary.

./run.sh <class name>

3.2.2.2 Script : interface.pl
The perl script interface.pl executes the interface operations against the provider as defined by the input file. In addition, the parameter -assocParams specifies that the algorithm which handles the validation of the input parameters "role", "resultRole" and "resultClass" of the association operations is to be tested.

Usage :
./interface.pl <class name>
./interface.pl <class name> -assocParams

3.2.2.3 Script : consistence.pl
The perl script consistence.pl executes the consistence test and compares the values returned by the provider against the system data as defined by the input file.


If the consistence test is started for the first time on a certain machine, the script system/<platform>/createKeyFiles.sh is executed to create the platform-specific key files. If this step fails, the complete test will fail. Under some circumstances it can be necessary to enhance the functionality of this test script, depending on the classes to test.

Usage :
./consistence.pl <class name> <system/<platform>>
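To illustrate what createKeyFiles.sh produces, here is a minimal sketch based only on the description in 3.2.1. The key/value file format shown is an assumption, as is the TESTHOST override; the real platform scripts may gather and store the values differently:

```shell
#!/bin/sh
# Sketch (format assumed): write ComputerSystem.keys and
# OperatingSystem.keys so other classes (e.g. Process) can reference
# the CreationClassName and Name key properties.
OUTDIR=${OUTDIR:-.}
HOST=${TESTHOST:-$(hostname)}   # TESTHOST is a hypothetical override

cat > "$OUTDIR/ComputerSystem.keys" <<EOF
CreationClassName=Linux_ComputerSystem
Name=$HOST
EOF

cat > "$OUTDIR/OperatingSystem.keys" <<EOF
CSCreationClassName=Linux_ComputerSystem
CSName=$HOST
CreationClassName=Linux_OperatingSystem
Name=$HOST
EOF

echo "key files written to $OUTDIR"
```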


4 Interface Test

4.1 Input File The input file defines the class / association to test. For each class / association one input file has to exist in the directory "cim". The name of the file has to be <class name>.cim.

The syntax of the input files distinguishes two parts. The header contains overall information about the class / association itself. The header is followed by paragraphs describing the interface type and operation to test. It is not necessary to insert all operations of a certain interface: adding a paragraph to the input file means this interface operation will be tested; leaving a paragraph out means it will not.

4.1.1 Class specific Syntax

4.1.1.1 Header Syntax
The key words class and objectPath are mandatory. The class qualifier defines the name of the class. The objectPath qualifier defines a fully qualified object path of a sample instance of this class. The object path must not contain a newline! The object path is necessary to perform operations like createInstance() and deleteInstance(). If the intention is only to test a certain returned exception, the sample instance can be a dummy instance and does not need to exist on the test system.

The qualifier volatile should only be used for classes which represent volatile data, for example the class CIM_Process. The intention of this qualifier is to provide a mechanism for handling the dynamic nature of some data. It is very likely that each call to, especially, the getInstance() operation of a provider for volatile data can return either ok or "NOT_FOUND". To avoid failing on principle, the test has to be aware of this difference.

******************************************************************************
class : <class name>
objectPath : <object path of a sample instance>
[ volatile ]
\n

4.1.1.2 Paragraph Syntax
Each paragraph defines the interface type Instance and the operation to perform. Supported key words for operations are enumInstanceNames, enumInstances, get, set, create, delete and execQuery. The qualifier Expected Exception is optional and only needs to be defined if a certain exception is expected to be thrown (as described in "2.2.1 Interface Test").

-----------------------------------------------------------------------------
Instance enumInstanceNames
\n
-----------------------------------------------------------------------------
…


-----------------------------------------------------------------------------
Instance delete [Expected Exception : <insert expected exception type>]
\n
-----------------------------------------------------------------------------
Instance execQuery

4.1.1.3 Example
*****************************************************************************
class : Linux_ComputerSystem
objectPath : Linux_ComputerSystem.CreationClassName=Linux_ComputerSystem,Name=testsystem.ibm.com
\n
-----------------------------------------------------------------------------
Instance enumInstanceNames
\n
-----------------------------------------------------------------------------
Instance create Expected Exception : NOT_SUPPORTED

4.1.2 Association specific Syntax
An association is defined in MOF syntax as follows:

[ Association ]
class Linux_RunningOS : CIM_RunningOS
{
   [Override "Antecedent"]
   Linux_OperatingSystem REF Antecedent;
   [Override "Dependent"]
   Linux_ComputerSystem REF Dependent;
};

4.1.2.1 Header Syntax
The key words association and objectPath are mandatory. The association qualifier defines the name of the association. The objectPath qualifier defines a fully qualified object path of a sample instance of this association. The object path must not contain a newline! The object path is necessary to perform instance operations like createInstance() and deleteInstance(). If the intention is only to test a certain returned exception, the sample instance can be a dummy instance and does not need to exist on the test system.

The qualifiers sourceRole, targetRole, sourceClass and targetClass are mandatory and define the association-specific reference properties. The sourceRole specifies the name of the source reference; it is recommended to use the name of the first reference in the association, for example "Antecedent". The targetRole specifies the name of the target reference; it is recommended to use the name of the second reference in the association, for example "Dependent".
The sourceClass specifies the class name of the source reference, for example "Linux_OperatingSystem". If the provider supports a CIM_<class name> and returns instances with class names of other, derived classes, list the CIM_<class name> first, followed by all the derived class names. The targetClass specifies the class name of the target reference, for example "Linux_ComputerSystem"; the same rule for derived class names applies.


******************************************************************************
association : <association name>
objectPath : <object path of a sample instance>
sourceRole : <role of the source reference>
targetRole : <role of the target reference>
sourceClass : <class name(s) of the source reference>
targetClass : <class name(s) of the target reference>
\n

4.1.2.2 Paragraph Syntax
Each paragraph defines the interface type Association and the operation to perform. Supported operations are associators, associatorNames, references and referenceNames.

-----------------------------------------------------------------------------
Association associators
\n
-----------------------------------------------------------------------------
…
-----------------------------------------------------------------------------
Association referenceNames

4.1.2.3 Example
*****************************************************************************
association : Linux_RunningOS
objectPath : Linux_RunningOS.Antecedent="Linux_OperatingSystem.CSCreationClassName=Linux_ComputerSystem,CSName=testsystem.ibm.com,CreationClassName=Linux_OperatingSystem,Name=testsystem.ibm.com",Dependent="Linux_ComputerSystem.CreationClassName=Linux_ComputerSystem,Name=testsystem.ibm.com"
sourceRole : Antecedent
targetRole : Dependent
sourceClass : Linux_OperatingSystem
targetClass : Linux_ComputerSystem
\n
-----------------------------------------------------------------------------
Instance enumInstanceNames
\n
-----------------------------------------------------------------------------
Instance create Expected Exception : NOT_SUPPORTED
\n
-----------------------------------------------------------------------------
Association associators
\n
-----------------------------------------------------------------------------
Association associatorNames
\n
-----------------------------------------------------------------------------
Association references
\n
-----------------------------------------------------------------------------
Association referenceNames
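Since the header keywords above are mandatory, an input file can be sanity-checked before running interface.pl. The following checker is not part of the suite; it only mirrors the rules described in 4.1.2.1:

```shell
#!/bin/sh
# Sketch (not part of the suite): verify that an association input file
# contains every mandatory header keyword of 4.1.2.1.
check_header() {
    FILE=$1
    RC=0
    for KEY in association objectPath sourceRole targetRole sourceClass targetClass; do
        grep -q "^$KEY :" "$FILE" || { echo "missing keyword: $KEY"; RC=1; }
    done
    return $RC
}

# demonstrate on a minimal canned header (real files live in the cim directory)
TMP=$(mktemp)
cat > "$TMP" <<'EOF'
association : Linux_RunningOS
objectPath : Linux_RunningOS.Antecedent="...",Dependent="..."
sourceRole : Antecedent
targetRole : Dependent
sourceClass : Linux_OperatingSystem
targetClass : Linux_ComputerSystem
EOF
if check_header "$TMP"; then RESULT=ok; else RESULT=bad; fi
echo "header check: $RESULT"
rm -f "$TMP"
```

Running the same check on a class input file would use the keywords class and objectPath instead.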


4.2 Report File The report file is generated per tested class. For each executed interface test one report file exists in the directory “stat”. The name of the file is <class name>.stat. The syntax of the report files is similar to the syntax of the input files. The report file contains some additional information. Each interface type has its own “chapter”. If an input file defines tests on instance and association operations this test results in a report file, containing one chapter for the results of the instance interface test and one chapter containing the results of the association interface tests. Each chapter header specifies the start date of the test. The paragraph of the single operations depends on the returned execution status. Each report file contains a summary to report an overall statistic of the tests. It contains the number of succeeded and failed tests and the times the whole test case spend in user and system mode. 4.2.1 Class specific Syntax Possible return codes of the operations are defined in chapter “4.3 Instance Interface”. If the test returned with “ok”, a statistic of the time spent in system and user mode is reported. In addition if an enumeration was tested the number of returned instances is reported. 
Example :
******************************************************************************
class : Linux_ComputerSystem
Fri Apr 4 8:43:09 2003
------------------------------------------------------------------------------
Instance enumInstanceNames
Status : ok
RC : 1
Number of returned Instances : 1
user time : 0.01   child user time : 0      ut : 0.01
system time : 0    child system time : 0    st : 0
------------------------------------------------------------------------------
Instance create
Status : failed - unexpected exception occurred - <wbemcli output>
RC : 2
------------------------------------------------------------------------------
******************************************************************************
Linux_ComputerSystem
Number of Tests where Status ok     : 1
Number of Tests where Status failed : 1
user time : 0.02   child user time : 0      ut : 0.02
system time : 0    child system time : 0.01 st : 0.01

4.2.2 Association specific Syntax

Possible return codes of the operations are defined in chapter "4.4 Association Interface". The traversal of an association can be performed in two directions. The test distinguishes between them. Therefore the
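A summary of the shape shown in the example above can be tallied with a short script. The following is a sketch only; the field names are taken from the example, and the exact whitespace in real `.stat` files is an assumption:

```python
import re

# Hypothetical parser for the summary lines of a <class name>.stat report
# file, matching "Number of Tests where Status ok : 1" style lines.
SUMMARY_RE = re.compile(r"Number of Tests where Status (ok|failed)\s*:\s*(\d+)")

def tally_report(text):
    """Return {'ok': n, 'failed': m} summed over a report file's summary lines."""
    counts = {"ok": 0, "failed": 0}
    for status, num in SUMMARY_RE.findall(text):
        counts[status] += int(num)
    return counts

report = """\
******************************************************************************
Linux_ComputerSystem
Number of Tests where Status ok     : 1
Number of Tests where Status failed : 1
"""
print(tally_report(report))  # {'ok': 1, 'failed': 1}
```

Such a helper could be used to aggregate results over all files in the "stat" directory.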


status of both directions is reported separately. If both tests returned with "ok", a statistic of the time spent in system and user mode is reported.

Example :
******************************************************************************
class : Linux_RunningOS
Fri Apr 4 8:44:59 2003
------------------------------------------------------------------------------
Instance enumInstanceNames
Status : ok
RC : 1
Number of returned Instances : 1
user time : 0.01   child user time : 0      ut : 0.01
system time : 0    child system time : 0    st : 0
------------------------------------------------------------------------------
Instance create
Status : ok - operation returned with NOT_SUPPORTED
RC : 1
------------------------------------------------------------------------------
******************************************************************************
association : Linux_RunningOS
Fri Apr 4 8:45:33 2003
------------------------------------------------------------------------------
Association associators
RC : 1.1
Status : Linux_OperatingSystem to Linux_ComputerSystem : ok
Number of returned Instances : 1
Status : Linux_ComputerSystem to Linux_OperatingSystem : ok
Number of returned Instances : 1
user time : 0.03   child user time : 0      ut : 0.03
system time : 0    child system time : 0.01 st : 0.01
------------------------------------------------------------------------------
Association associatorNames
RC : 1.1
Status : Linux_OperatingSystem to Linux_ComputerSystem : ok
Number of returned Instances : 1
Status : Linux_ComputerSystem to Linux_OperatingSystem : ok
Number of returned Instances : 1
user time : 0.05   child user time : 0      ut : 0.05
system time : 0    child system time : 0.02 st : 0.02
------------------------------------------------------------------------------
Association references
RC : 1.1
Status : Linux_OperatingSystem to Linux_ComputerSystem : failed - <description>
------------------------------------------------------------------------------
Association referenceNames
RC : 1.1
Status : Linux_OperatingSystem to Linux_ComputerSystem : failed - <description>


------------------------------------------------------------------------------
******************************************************************************
Linux_RunningOS
Number of Tests where Status ok     : 4
Number of Tests where Status failed : 2
user time : 0.08   child user time : 0.03   ut : 0.11
system time : 0.03 child system time : 0.02 st : 0.05

4.3 Instance Interface

The CIM Instance Provider Interface supports the following operations:
• enumInstances()
• enumInstanceNames()
• getInstance()
• setInstance()
• createInstance()
• deleteInstance()
• execQuery()

The perl functions to test the instance operations are implemented in the perl module instance.pm. The module exports the functions enumerate, get, create, delete, execQuery and check_returnedException.

4.3.1 enumInstanceNames() or enumInstances()

The perl function enumerate is responsible for testing the instance provider operations enumInstanceNames() and enumInstances(). The perl function sends an enumInstanceNames() or enumInstances() request to the CIMOM and validates the returned values.

Test                                                RC  Return Code Description
--------------------------------------------------  --  ----------------------------------------------
check if an exception occurred                       2  failed - exception occurred - <wbemcli output>
check if the returned instance(s) is/are of the      3  failed - returned instance(s) is/are of wrong
requested class type                                    class type
                                                     1  ok
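The mapping from validation steps to return codes can be sketched as follows. This is illustrative Python, not the actual instance.pm code; the error-marker strings and object-path format are assumptions based on typical wbemcli output:

```python
# Sketch of the "enumerate" validation (assumed logic): scan the lines
# returned for the enumeration request and map them to the return codes
# 2, 3 and 1 from the table above.
def check_enumeration(lines, class_name):
    for line in lines:
        if "Exception" in line or "ERROR" in line:  # assumed error markers
            return 2   # failed - exception occurred - <wbemcli output>
    for line in lines:
        # an object path looks like host:5988/root/cimv2:<class>.<keys>
        if class_name not in line:
            return 3   # failed - returned instance(s) is/are of wrong class type
    return 1           # ok

paths = ["testsystem.ibm.com:5988/root/cimv2:Linux_ComputerSystem.Name=x"]
print(check_enumeration(paths, "Linux_ComputerSystem"))   # 1
print(check_enumeration(paths, "Linux_OperatingSystem"))  # 3
```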

4.3.2 getInstance()

The perl function get is responsible for testing the instance provider operation getInstance(). The perl function sends an enumInstanceNames() request to the CIMOM to get the object paths of all available instances. For each object path a getInstance() request is sent to the CIMOM and the returned value is validated.


Test                                                RC  Return Code Description
--------------------------------------------------  --  ----------------------------------------------
check if enumInstanceNames() returned with an        2  failed - enumeration - exception occurred -
exception                                               <wbemcli output>
check if the instance(s) returned by                 3  failed - enumeration - returned instance(s)
enumInstanceNames() are of the requested class          is/are of wrong class type
type
check if an exception occurred                      12  failed - exception occurred - <wbemcli output>
check if the returned instance is of the            13  failed - returned instance is of wrong class
requested class type                                    type - <wbemcli output>
check if a volatile class type returned with        14  warning - volatile class type returned with
another exception than NOT_FOUND                        exception - <wbemcli output>
                                                     1  ok

4.3.3 setInstance()

Currently not supported.

4.3.4 createInstance()

The perl function create is responsible for testing the instance provider operation createInstance(). The perl function sends a createInstance() request to the CIMOM and validates the returned value. If successful, a getInstance() operation on the new instance is executed to test whether the instance was really created.

Test                                                RC  Return Code Description
--------------------------------------------------  --  ----------------------------------------------
check if an exception occurred                       2  failed - exception occurred - <wbemcli output>
check if instance was created                        3  failed - instance was not created
check if in getInstance an exception occurred       12  failed - exception occurred - <wbemcli output>
                                                     1  ok

4.3.5 deleteInstance()

The perl function delete is responsible for testing the instance provider operation deleteInstance(). The perl function sends a deleteInstance() request to the CIMOM and validates the return value. If successful, a getInstance() operation on the old instance is executed to test whether the instance was really deleted.

Test                                                RC  Return Code Description
--------------------------------------------------  --  ----------------------------------------------
check if an exception occurred                       2  failed - exception occurred - <wbemcli output>
check if instance was deleted                        3  failed - instance was not deleted
check if in getInstance an exception occurred       12  failed - exception occurred - <wbemcli output>
                                                     1  ok


4.3.6 execQuery() Currently not supported.

4.4 Association Interface

The CIM Association Provider Interface supports the following operations:
• associators()
• associatorNames()
• references()
• referenceNames()

The perl functions to test the association operations are implemented in the perl module associator.pm. The module exports the functions associators, references and check_role_resultRole_resultClass_Params.

4.4.1 associatorNames() & associators()

The perl function associators is responsible for testing the association provider operations associatorNames() and associators(). For each reference class name (source and target reference) the perl function associators sends an enumInstanceNames() request to the CIMOM to get the object paths of all available instances. For each object path an associatorNames() or associators() request is sent to the CIMOM and the returned values are checked. If the qualifier sourceClass or targetClass defines a list of class names, only the first entry with the CIM_<class name> is taken to perform this test.

! The parameter "assocClass" is set to the class name of the association. If this parameter is not supported by the provider, the complete test will fail !

Test                                                RC  Return Code Description
--------------------------------------------------  --  ----------------------------------------------
check if in enumInstanceNames() an exception         2  failed - enumeration <source class name> -
occurred                                                exception occurred - <wbemcli output>
check if an exception occurred                      12  failed - (associators | associatorNames) -
                                                        exception occurred - <wbemcli output>
check if the returned instance is of the            13  failed - returned instance(s) is/are of wrong
expected target class type                              class type <target class name>
                                                     1  ok

4.4.2 referenceNames() & references() The perl function references is responsible for testing the association provider operations referenceNames() and references().


For each reference class name (source and target reference) the perl function references sends an enumInstanceNames() request to the CIMOM to get the object paths of all available instances. For each object path a referenceNames() or references() request is sent to the CIMOM and the returned values are checked. If the qualifier sourceClass or targetClass defines a list of class names, only the first entry with the CIM_<class name> is taken to perform this test.

! The parameter "resultClass" is set to the class name of the association. If this parameter is not supported by the provider, the complete test will fail !

Test                                                RC  Return Code Description
--------------------------------------------------  --  ----------------------------------------------
check if in enumInstanceNames() an exception         2  failed - enumeration <source class name> -
occurred                                                exception occurred - <wbemcli output>
check if an exception occurred                      12  failed - (references | referenceNames) -
                                                        exception occurred - <wbemcli output>
check if the returned instance is of the            13  failed - returned instance(s) is/are of wrong
expected target class type                              class type <target class name>
                                                     1  ok

4.4.3 Algorithm to Handle the Input Parameters "role", "resultRole" and "resultClass"

The perl function check_role_resultRole_resultClass_Params is responsible for testing the algorithm which validates the parameters "role", "resultRole" and "resultClass".
• associators() supports role, resultRole and resultClass
• associatorNames() supports role, resultRole and resultClass
• references() supports role and resultClass
• referenceNames() supports role and resultClass

First, the meaning / handling of the input parameters "assocClass" and "resultClass" within the association operations needs to be clarified. The associators() / associatorNames() calls return instance(s) / object path(s) of the referenced target class. The references() / referenceNames() calls return instance(s) / object path(s) of the association class itself. The parameter "assocClass" defines the name of the association class the requester is interested in. The parameter "resultClass" defines the class name of the returned instance(s) / object path(s). So for references() / referenceNames() the parameters "assocClass" and "resultClass" have the same meaning: both define the name of the association class. This behavior is visible in the client request, which only supports setting the "resultClass" parameter for the references() / referenceNames() calls. The CIMOM takes the value of the "resultClass" parameter and submits it as the "assocClass" parameter to the provider. So don't be confused when using the association provider interface. As described in the two chapters before, support of the parameter "assocClass" is mandatory for the execution of the tests. That is why an additional test for this parameter is not performed. For references() / referenceNames() only the role parameter is checked.
The test requires that referenceNames() and references() on the one hand, and associatorNames() and associators() on the other, use the same algorithm to validate the input parameters. Ideally all four operations would use the same algorithm. This requirement is based on the implementation of the perl function check_role_resultRole_resultClass_Params: only the referenceNames() and associatorNames() calls are used for the execution of the test.
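The core idea of the parameter validation can be sketched as a pair of calls: one with "resultClass" set to the target class, which must return instances, and one with it set to the source class, which must return nothing. The function and callback names below are illustrative, not the suite's Perl API:

```python
# Sketch of the resultClass check behind check_role_resultRole_resultClass_Params:
# a provider that honors resultClass returns instances only for the target
# class. Return codes 17, 18 and 1 match the table in chapter 4.4.3.1.
def check_result_class(assoc_call, path, source_class, target_class):
    if not assoc_call(path, result_class=target_class):
        return 17   # failed - resultClass parameter seems to be ignored (1)
    if assoc_call(path, result_class=source_class):
        return 18   # failed - resultClass parameter seems to be ignored (2)
    return 1        # ok

# A provider that honors resultClass returns only matching object paths:
def honoring(path, result_class):
    if result_class == "Linux_ComputerSystem":
        return ["Linux_ComputerSystem.Name=testsystem.ibm.com"]
    return []

print(check_result_class(honoring, "Linux_OperatingSystem.Name=x",
                         "Linux_OperatingSystem", "Linux_ComputerSystem"))  # 1
```

A provider that ignores the parameter and always returns everything would trip the second call and yield return code 18.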


The perl function check_role_resultRole_resultClass_Params sends an enumInstanceNames() request for the sourceClass to the CIMOM to get the object paths of all available instances. The described tests are only performed if the source is not a CIM_<class name>. If the input file defines a list of class names starting with the common CIM_<class name>, the CIM_<class name> is skipped and the enumeration is performed for each of the derived class names. For each object path returned by the enumeration, the tests described below against associatorNames() and referenceNames() are executed. To test the algorithm the other way around, the perl function check_role_resultRole_resultClass_Params switches the content of the sourceClass and the targetClass: the targetClass becomes the sourceClass and the former sourceClass becomes the targetClass. Then the test is performed again.

4.4.3.1 associatorNames()

For each object path returned by the enumeration, the perl function check_resultClass is called. The function tests the validation of the input parameter "resultClass". If this test was successful, the perl function check_role_resultRole is called to test the validation of the input parameters "role" and "resultRole". The order is mandatory and based on the implementation of check_role_resultRole. The "resultClass" parameter in check_role_resultRole is set for each call. For completeness the perl functions check_resultClass and check_role_resultRole set the "resultClass" parameter to each possible targetClass. This is important if a list of class names for a certain reference is supported by the association provider.

Test                                                RC  Return Code Description
--------------------------------------------------  --  ----------------------------------------------
check if in enumInstanceNames() an exception         2  failed - enumeration <source class name> -
occurred                                                exception occurred - <wbemcli output>
check if in associatorNames() an exception          12  failed - exception occurred - <wbemcli output>
occurred
set the value of the resultClass parameter to       17  failed - resultClass parameter seems to be
the target class name - check if instance(s) of         ignored (1)
the target class is/are returned
set the value of the resultClass parameter to       18  failed - resultClass parameter seems to be
the source class name - check if no instances           ignored (2)
are returned                                         1  ok - no instance(s) returned
set the value of the role parameter to the          13  failed - role parameter seems to be ignored (1)
source role - check if instance(s) of the            1  ok
target class is/are returned
set the value of the role parameter to the          14  failed - role parameter seems to be ignored (2)
target role - check if no instances are              1  ok - no instance(s) returned
returned
set the value of the resultRole parameter to        15  failed - resultRole parameter seems to be
the target role - check if instance(s) of the           ignored (1)
target class is/are returned


                                                     1  ok
set the value of the resultRole parameter to        16  failed - resultRole parameter seems to be
the source role - check if no instances are             ignored (2)
returned                                             1  ok - no instance(s) returned

4.4.3.2 referenceNames()

For each object path returned by the enumeration, the perl function check_role_resultRole is called. The function tests the validation of the input parameter "role".

Test                                                RC  Return Code Description
--------------------------------------------------  --  ----------------------------------------------
check if in enumInstanceNames() an exception         2  failed - enumeration <source class name> -
occurred                                                exception occurred - <wbemcli output>
check if in referenceNames() an exception           12  failed - exception occurred - <wbemcli output>
occurred
set the value of the role parameter to the          13  failed - role parameter seems to be ignored (1)
source role - check if the source class plays        1  ok
the specified role in the association instance
set the value of the role parameter to the          14  failed - role parameter seems to be ignored (2)
target role - check if no instances are              1  ok - no instance(s) returned
returned

4.4.3.3 Report File

The report file is generated per tested association and only if the test of the algorithm was requested explicitly by calling ./interface.pl <class name> -assocParams. For each tested association one report file is created in the directory "stat". The name of the file is params_<class name>.stat. The file header specifies the name of the association and the start date of the test. Each interface type (associatorNames() and referenceNames()) has its own "chapter". The chapter contains information about the object path of the source class, the name of the target class and the tested input parameter. Reported are the status description of the test and the number of returned instances. Each report file ends with a summary reporting an overall statistic of the tests: the number of succeeded and failed tests and the time the whole test case spent in user and system mode. The report file may become very large, but each chapter is self-describing and should be easy to read.

Example :
******************************************************************************
class : Linux_RunningOS
test algorithm to validate the input parameter role, resultRole and resultClass
Fri Apr 4 8:44:59 2003


------------------------------------------------------------------------------
associatorNames
sourceClass : testsystem.ibm.com:5988/root/cimv2:Linux_OperatingSystem.CSCreationClassName=Linux_ComputerSystem,CSName=testsystem.ibm.com,CreationClassName=Linux_OperatingSystem,Name=testsystem.ibm.com
targetClass : Linux_ComputerSystem
test content : resultClass is set to Linux_ComputerSystem
Status : ok
Number of returned Instances : 1
test content : resultClass is set to Linux_OperatingSystem
Status : ok - no instance(s) returned
------------------------------------------------------------------------------
referenceNames
sourceClass : testsystem.ibm.com:5988/root/cimv2:Linux_OperatingSystem.CSCreationClassName=Linux_ComputerSystem,CSName=testsystem.ibm.com,CreationClassName=Linux_OperatingSystem,Name=testsystem.ibm.com
targetClass : Linux_ComputerSystem
test content : role is set to Antecedent
Status : ok
Number of returned Instances : 1
test content : role is set to Dependent
Status : ok - no instance(s) returned
------------------------------------------------------------------------------
associatorNames
sourceClass : testsystem.ibm.com:5988/root/cimv2:Linux_OperatingSystem.CSCreationClassName=Linux_ComputerSystem,CSName=testsystem.ibm.com,CreationClassName=Linux_OperatingSystem,Name=testsystem.ibm.com
targetClass : Linux_ComputerSystem
test content : resultRole is set to Dependent … resultClass is set to Linux_ComputerSystem
Status : ok
Number of returned Instances : 1
test content : resultRole is set to Antecedent … resultClass is set to Linux_ComputerSystem
Status : ok - no instance(s) returned
------------------------------------------------------------------------------
associatorNames
sourceClass : testsystem.ibm.com:5988/root/cimv2:Linux_ComputerSystem.CreationClassName=Linux_ComputerSystem,Name=testsystem.ibm.com
targetClass : Linux_OperatingSystem
test content : resultClass is set to Linux_OperatingSystem
Status : ok
Number of returned Instances : 1


test content : resultClass is set to Linux_ComputerSystem
Status : ok - no instance(s) returned
------------------------------------------------------------------------------
…
------------------------------------------------------------------------------
******************************************************************************
Linux_RunningOS
Number of Tests where Status ok     : 16
Number of Tests where Status failed : 0
user time : 0.01   child user time : 0.13   ut : 0.14
system time : 0.02 child system time : 0.08 st : 0.1

4.5 Method Interface

An automated test for the Method Provider interface is currently not implemented, but planned.

4.6 Property Interface

An automated test for the Property Provider interface is currently not implemented, but planned.

4.7 Indication Interface

An automated test for the Indication Provider interface is currently not implemented, but planned.

4.8 Common Interface Tests

4.8.1 Check for expected Exception

The perl function check_returnedException can be used for each supported instance and association operation. The intention is to test whether a certain exception, like NOT_SUPPORTED, is thrown by the provider. If the input file contains the qualifier "Expected Exception" for a certain operation, the perl function check_returnedException is called and the returned value is checked for the expected exception. The perl function sends the appropriate request to the CIMOM and validates that the returned value is the expected exception.

Test                                                RC  Return Code Description
--------------------------------------------------  --  ----------------------------------------------
check if an exception occurred                       8  failed - <expected exception> expected - check
                                                        behavior of the provider by hand !
check if an unexpected exception occurred            9  failed - unexpected exception occurred -
                                                        <wbemcli output>
                                                     1  ok - operation returned with <expected
                                                        exception>
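The three outcomes in the table above can be sketched as follows. This is an illustrative Python rendering of the idea, not the actual check_returnedException code, and the exception-matching rule (substring of the message or exception type name) is an assumption:

```python
# Sketch of an expected-exception check: run the operation and map the
# outcome to return codes 1, 8 and 9 from the table above.
def check_expected_exception(operation, expected):
    try:
        operation()
    except Exception as exc:
        if expected in str(exc) or type(exc).__name__ == expected:
            return 1   # ok - operation returned with <expected exception>
        return 9       # failed - unexpected exception occurred
    return 8           # failed - <expected exception> expected

class NotSupportedError(Exception):  # stand-in for a CIM NOT_SUPPORTED error
    pass

def create():
    raise NotSupportedError("NOT_SUPPORTED")

print(check_expected_exception(create, "NOT_SUPPORTED"))       # 1
print(check_expected_exception(lambda: None, "NOT_SUPPORTED")) # 8
```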


5 Consistence Test

5.1 Capabilities

The consistence test offers different capabilities on two different levels: class specific and association specific test cases are distinguished.

On class level the following cases can be tested for consistence:
• test the number of provider reported instances against the number reported by the system itself
  o The number of available file system instances is a subset of the entries in /etc/fstab and /etc/mtab (only supported fs types and without duplicates).
• test the property values reported by the provider against the values reported by the system itself
  o The provider reported Name for ComputerSystem is the hostname. The property has to contain the same value as given by the "hostname" call.
• test the observance of threshold values for property values
  o The reported value has to be <, > or = the user defined value.
• test the property values of a certain instance to be consistent with each other
  o The processor speed of ID "0" is 930 MHz and of ID "1" 450 MHz. The provider has to report within the instance for ID "0" a CurrentClockSpeed of 930 MHz and nothing else.

On association level the following cases can be tested for consistence:
• test the number of reported target instances against the association definition (system values)
  o The association RunningOS between ComputerSystem and OperatingSystem is one-to-one. The provider has to report one instance for both directions.
• test if the associators() call supported by the association provider reports instance(s) that is / are equivalent to the corresponding instance(s) reported by the getInstance() call against the class provider
  o The association provider reported a ComputerSystem instance. This instance has to be equal to the ComputerSystem instance reported by the getInstance() call against the class provider of ComputerSystem.

5.2 Input File The input file defines the classes to test. Each class / association, where the consistence test should be performed, has to have its input file in the directory “system/<platform>”. The name of the file has to be <class name>.system. Otherwise the consistence test is not executed for this class / association.


5.2.1 Class specific Syntax

5.2.1.1 Header Syntax

The key word class is mandatory and defines the name of the class. The qualifier volatile should only be used for classes which represent volatile data, for example the class CIM_Process. The intention of this qualifier is to provide a mechanism to handle the dynamic nature of some data. It is very likely that each call, especially of the getInstance() operation of a provider for volatile data, returns an answer varying between ok and "NOT_FOUND". To avoid failing on principle, the test has to be aware of this difference.

******************************************************************************
class : <class name> [ volatile ]
\n

5.2.1.2 Paragraph Syntax on Class Level

Each line defines the name of the class, the compare operation to execute and the number of reported instances.

Supported key words for the compare operations are:
• -eq   ok if instance count is equal to the expected number
• -lt   ok if instance count is less than the expected number
• -gt   ok if instance count is greater than the expected number

Supported syntax and key words for the number of reported instances are:
• -cmd <command to execute>   command has to return a number
• <value>

The execution of the command ( -cmd ) has to return only one value. A list of values can not be handled!

-----------------------------------------------------------------------------
<class name> : (compare operation) (expected number)
<class name> : (compare operation) (expected number)
\n

5.2.1.3 Paragraph Syntax on Property Level

Each line defines the name of the property, the compare operation to execute and the system value. The values given for this test case are checked at each reported instance, independent of the system object it represents. This is important to know, because checking of instance specific values is not possible with this test scenario. The next chapter describes how to check value consistence within a certain instance.

Supported key words for the compare operations are:
• -eq     ok if property value is equal to the system value
• -lt     ok if property value is less than the system value
• -gt     ok if property value is greater than the system value
• -grep   ok if property value contains a certain regular expression
• -set    ok if property value contains a value
• -empty  ok if property value does not contain a value
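The compare-operation keywords above map naturally onto a small dispatch function. The following sketch is an assumed interpretation of the keyword descriptions, not the suite's actual Perl implementation:

```python
import re

# Sketch of the property-level compare operations: each keyword from the
# list above becomes a predicate over the property value and system value.
def compare(op, prop_value, system_value=None):
    if op == "-eq":
        return prop_value == system_value
    if op == "-lt":
        return float(prop_value) < float(system_value)
    if op == "-gt":
        return float(prop_value) > float(system_value)
    if op == "-grep":
        return re.search(system_value, prop_value) is not None
    if op == "-set":
        return prop_value not in (None, "")
    if op == "-empty":
        return prop_value in (None, "")
    raise ValueError("unknown compare operation: " + op)

print(compare("-eq", "testsystem.ibm.com", "testsystem.ibm.com"))  # True
print(compare("-grep", "Linux 2.4.19", r"2\.4"))                   # True
print(compare("-empty", ""))                                       # True
```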


Supported syntax and key words for the system values are:
• -cmd <command to execute>
• -file <filename>[<line index>]
• <value>

The execution of the command ( -cmd ) has to return only one value. A list of values can not be handled! If the enumeration of the class returns more than one instance, the given system value has to fulfill the compare operation (-eq, -lt …) for each instance! Otherwise the test will fail by default. The tester has to be aware of this behavior when writing the input files for the consistence test.

-----------------------------------------------------------------------------
<property name> : (compare operation) (system value)
<property name> : (compare operation) (system value)

5.2.1.4 Paragraph Syntax on Instance Level

This test case checks the data consistence within a certain instance. It is difficult to realize and needs some special attention. Adding the corresponding paragraph to the input file is only the first of two steps. The following two lines have to be placed in the <class name>.system file. The first line defines the test type "Instance Level" and a script to execute. The second line describes the order for the mapping between the values in the generated system data file and the CIM properties. The compare operation is always "equal".

-----------------------------------------------------------------------------
Instance Level : <fully qualified script name>
Property Order : <unique key> <prop1> <prop2> … <propN>

The second step is writing a script called <class name>.sh. The script must be located in the system/<platform> directory. The script collects the system values on instance level and generates a file <class name>.instance in the system/<platform> directory. The script has to support the option -rm to delete the generated <class name>.instance file.

The content of the generated file has to be organized as follows:
• a paragraph contains the system values of one specific instance
• the order of the values has to follow the property order defined in the input file of the class
• the delimiter between paragraphs is an empty line

5.2.1.5 Example

*****************************************************************************
class : Linux_ComputerSystem
\n
-----------------------------------------------------------------------------
Linux_ComputerSystem : -eq 1
\n
-----------------------------------------------------------------------------
CreationClassName : -eq -file ComputerSystem.keys[0]
Name : -eq -file ComputerSystem.keys[1]
EnabledStatus : -eq 2
-----------------------------------------------------------------------------
Instance Level : Linux_ComputerSystem.pl
Property Order : Name PrimaryOwnerName PrimaryOwnerContact


5.2.2 Association specific Syntax

5.2.2.1 Header Syntax

The key word association is mandatory and defines the name of the association.

******************************************************************************
association : <class name>
\n

5.2.2.2 Paragraph Syntax on Reference Level

Each line defines the name of the source class, the name of the target class, the compare operation to execute and the number of reported instances. In addition to testing the number of returned instances reported by the association provider via the associators() call, these instances are compared to the instance(s) reported by the corresponding instance provider of the target class.

Syntax to define the target class name is:
• -target <target class name>

Supported key words for the compare operations are:
• -eq   ok if instance count is equal to the expected number
• -lt   ok if instance count is less than the expected number
• -gt   ok if instance count is greater than the expected number

Supported syntax and key words for the number of reported instances are:
• -cmd <command to execute>   command has to return a number
• <value>

The execution of the command ( -cmd ) has to return only one value. A list of values can not be handled!

-----------------------------------------------------------------------------
<source class name> : (target) (compare operation) (expected number)
<source class name> : (target) (compare operation) (expected number)
\n

5.2.2.4 Example

*****************************************************************************
association : Linux_RunningOS
\n
-----------------------------------------------------------------------------
Linux_ComputerSystem : -target Linux_OperatingSystem -eq 1
Linux_OperatingSystem : -target Linux_ComputerSystem -eq 1

5.3 Report File
The report file is generated per tested class. The consistency test creates one report file in the directory "stat". The name of the file is <class name>.system.


5.3.1 Class Specific Syntax
The syntax of the report files is similar to the syntax of the input files, but the report file contains some additional information. Each level has its own "chapter". Each chapter header specifies the name of the class, the start date of the test and the tested level. Each report file contains a summary reporting the overall statistics of the tests: the number of succeeded and the number of failed tests.
******************************************************************************
class : Linux_ComputerSystem Fri Apr 4 8:43:09 2003
------------------------------------------------------------------------------
Class Level
------------------------------------------------------------------------------
equal Status : ok … counted instances 1 equal 1
******************************************************************************
class : Linux_ComputerSystem Fri Apr 4 8:43:09 2003
------------------------------------------------------------------------------
Property Level
------------------------------------------------------------------------------
CreationClassName equal Status : ok … property value was : Linux_ComputerSystem
Name equal Status : ok … property value was : testsystem.ibm.com
EnabledStatus equal Status : ok … property value was : 2
------------------------------------------------------------------------------
******************************************************************************
Linux_ComputerSystem
Number of Tests where Status ok     : 4
Number of Tests where Status failed : 0

5.3.2 Association Specific Syntax
The syntax of the report files is similar to the syntax of the input files, but the report file contains some additional information. The header of the report file specifies the class name of the association and the start date of the test. Each "source to target" line of the input file has its own "chapter". Each chapter header specifies the source class name and the target class name.
Each paragraph contains the fully qualified object path of the source instance, plus the return codes and status descriptions for the test that checks the number of returned references and for the test that compares the referenced instances with the instances reported by the getInstance() call. The report file ends with a summary reporting the overall statistics of the tests: the number of succeeded and the number of failed tests.


******************************************************************************
class : Linux_RunningOS Fri Apr 4 8:43:09 2003
******************************************************************************
sourceClass : Linux_ComputerSystem
targetClass : Linux_OperatingSystem
------------------------------------------------------------------------------
source : testsystem.ibm.com:5988/root/cimv2:Linux_ComputerSystem.CreationClassName=Linux_ComputerSystem,Name=testsystem.ibm.com
check number of returned instances : RC 1 Status : ok … counted instances 1 equal 1
check returned referenced instances : RC 1 Status : ok
******************************************************************************
sourceClass : Linux_OperatingSystem
targetClass : Linux_ComputerSystem
------------------------------------------------------------------------------
source : testsystem.ibm.com:5988/root/cimv2:Linux_OperatingSystem.CSCreationClassName=Linux_ComputerSystem,CSName=testsystem.ibm.com,CreationClassName=Linux_OperatingSystem,Name=testsystem.ibm.com
check number of returned instances : RC 1 Status : ok … counted instances 1 equal 1
check returned referenced instances : RC 1 Status : ok
------------------------------------------------------------------------------
******************************************************************************
Linux_RunningOS
Number of Tests where Status ok     : 4
Number of Tests where Status failed : 0
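The summary at the end of a report file simply counts the individual test results. A sketch of how the per-class totals could be derived from the "Status :" lines of a report (illustrative helper only, not the suite's actual implementation):

```python
def summarize(report_lines):
    """Count succeeded and failed tests from 'Status : ...' report lines,
    as shown in the 'Number of Tests where Status ok/failed' summary."""
    ok = sum(1 for line in report_lines if "Status : ok" in line)
    failed = sum(1 for line in report_lines if "Status : failed" in line)
    return ok, failed

lines = ["check number of returned instances : RC 1 Status : ok",
         "check returned referenced instances : RC 1 Status : ok"]
print(summarize(lines))  # -> (2, 0)
```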

5.4 Class Specific Test Cases

5.4.1 Class Level
The number of instances reported by a provider has to be in sync with the system view.

Each test is listed with its possible return codes (RC) and return code descriptions:

check if in enumInstanceNames() an exception occurred
    RC 2   failed - enumeration - exception occurred - <wbemcli output>

-eq (equal)
    RC 3   failed - counted instances <instance count> not equal <expected number>
    RC 6   warning - counted instances <instance count> not equal <expected number> (class defined as "volatile")
    RC 1   ok - counted instances <instance count> equal <expected number>

-lt (less than)
    RC 4   failed - counted instances <instance count> not less than <expected number>
    RC 7   warning - counted instances <instance count> not less than <expected number> (class defined as "volatile")
    RC 1   ok - counted instances <instance count> less than <expected number>

-gt (greater than)
    RC 5   failed - counted instances <instance count> not greater than <expected number>
    RC 8   warning - counted instances <instance count> not greater than <expected number> (class defined as "volatile")
    RC 1   ok - counted instances <instance count> greater than <expected number>
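The mapping above is regular: each compare operation has a fixed "failed" code, classes marked volatile downgrade that failure to the corresponding warning code, and success is always 1. A compact sketch of that mapping (hypothetical helper, not the suite's code):

```python
def class_level_rc(op, counted, expected, volatile=False):
    """Return the class-level return code for an instance-count check:
    1 = ok; 3/4/5 = failed for -eq/-lt/-gt; 6/7/8 = warning when the
    class is defined as volatile. Illustrative only."""
    passed = {"-eq": counted == expected,
              "-lt": counted < expected,
              "-gt": counted > expected}[op]
    if passed:
        return 1
    failed_rc = {"-eq": 3, "-lt": 4, "-gt": 5}[op]
    # volatile classes report a warning code (failed code + 3) instead
    return failed_rc + 3 if volatile else failed_rc

print(class_level_rc("-eq", 1, 1))                 # -> 1
print(class_level_rc("-eq", 2, 1))                 # -> 3
print(class_level_rc("-eq", 2, 1, volatile=True))  # -> 6
```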

5.4.2 Property Level
The property values have to be in sync with the system view, with an expected value or with values predefined by the implemented model. In addition, the property values have to respect defined threshold values.

Each test is listed with its possible return codes (RC) and return code descriptions:

check if in enumInstances() an exception occurred
    RC 2   failed - enumeration - exception occurred - <wbemcli output>

check if the property name was found in the instance
    RC 3   failed - <property name> not found in CIM Instance <object path>

-eq (equal)
    RC 4   failed - property value <property value> not equal <system value>
    RC 1   ok - property value <property value> equal <system value>

-lt (less than)
    RC 5   failed - property value <property value> not less than <system value>
    RC 1   ok - property value <property value> less than <system value>

-gt (greater than)
    RC 6   failed - property value <property value> not greater than <system value>
    RC 1   ok - property value <property value> greater than <system value>

-grep
    RC 7   failed - property value does not fulfill regular expression
    RC 1   ok - property value fulfills regular expression

-set
    RC 8   failed - property value not set
    RC 1   ok - property value set to <property value>

-empty
    RC 9   failed - property value set to <property value>
    RC 1   ok - property value not set
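The property-level checks combine value comparisons with the pattern and presence tests (-grep, -set, -empty). A sketch of the return-code logic (hypothetical helper; the real suite drives these checks through Perl and wbemcli):

```python
import re

def property_level_rc(op, value, operand=None):
    """Property-level return codes from the table above:
    1 = ok; 4/5/6 = failed for -eq/-lt/-gt; 7 = -grep mismatch;
    8 = -set but value missing; 9 = -empty but value present.
    Illustrative only; comparisons are applied to the raw values."""
    if op == "-grep":
        return 1 if value is not None and re.search(operand, value) else 7
    if op == "-set":
        return 1 if value not in (None, "") else 8
    if op == "-empty":
        return 9 if value not in (None, "") else 1
    passed = {"-eq": value == operand,
              "-lt": value < operand,
              "-gt": value > operand}[op]
    return 1 if passed else {"-eq": 4, "-lt": 5, "-gt": 6}[op]

print(property_level_rc("-eq", "2", "2"))               # -> 1
print(property_level_rc("-grep", "ibm.com", r"\.com$"))  # -> 1
print(property_level_rc("-empty", "x"))                  # -> 9
```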

5.4.3 Instance Level
The property values have to be in sync with the system view. The system view is represented by the generated <class name>.instance file; each "chapter" in this file represents one instance. This test checks if the values in the instance file are consistent with the reported CIM instances.

Each test is listed with its possible return codes (RC) and return code descriptions:

check if in enumInstances() an exception occurred
    RC 2   failed - enumeration - exception occurred - <wbemcli output>

check if the key name was found in the instance
    RC 3   failed - key <key name> not found in CIM Instance <instance>

check if an instance with the current key value was found within the reported instances
    RC 4   failed - instance with <key name> = <key value> not found

check if an instance with the current key value was found within the reported instances and the class is defined as "volatile"
    RC 7   warning - instance with <key name> = <key value> not found

check if the property name was found in the instance
    RC 5   failed - <name> not found in CIM instance <instance>

check if the property value is equal to the system value
    RC 6   failed - property value <property value> not equal to system value <system value>
    RC 1   ok
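The core of the instance-level test is matching each reported CIM instance against a chapter of the generated <class name>.instance file by its key value. A sketch of that lookup (hypothetical helper; chapter parsing is assumed to have produced one dict per instance):

```python
def find_instance(chapters, key_name, key_value):
    """Search the parsed chapters of a <class name>.instance file for the
    one whose key property matches. A miss corresponds to RC 4 (failed)
    or RC 7 (warning, volatile class). Illustrative only."""
    for chapter in chapters:  # each chapter: dict of property -> system value
        if chapter.get(key_name) == key_value:
            return chapter
    return None

chapters = [{"Name": "testsystem.ibm.com", "EnabledStatus": "2"}]
print(find_instance(chapters, "Name", "testsystem.ibm.com") is not None)  # -> True
```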

5.5 Association Specific Test Cases

5.5.1 Association Level
The first step is an enumInstanceNames() call to the instance provider of the source class. Each reported object path is used as input for the associators() call to the association provider. The test is divided into two parts. The first one checks if the number of referenced instances is in sync with the system view. The second one checks if the returned instances are equal to the instances reported by the corresponding getInstance() call to the responsible instance provider (in this case, the one of the target class).

Each test is listed with its possible return codes (RC) and return code descriptions:

check if in enumInstanceNames() an exception occurred
    RC 2    failed - enumeration <source class> - exception occurred - <wbemcli output>

check if in associators() an exception occurred
    RC 12   failed - exception occurred - <wbemcli output>

check if associators() reported the right target class type
    RC 13   failed - returned instance is of wrong class type - <wbemcli output>

check if the instance returned by getInstance() is equal to the instance returned by associators()
    RC 14   failed - instances are not equal - <wbemcli output>

check if in getInstance() an exception occurred
    RC 15   failed - exception occurred - <wbemcli output>

check if in getInstance() another exception than CIM_ERR_NOT_SUPPORTED occurred
    RC 16   warning - volatile class type returned with exception - <wbemcli output>

-eq (equal)
    RC 3    failed - counted instances <instance count> not equal <expected number>
    RC 6    warning - counted instances <instance count> not equal <expected number> (class defined as "volatile")
    RC 1    ok - counted instances <instance count> equal <expected number>

-lt (less than)
    RC 4    failed - counted instances <instance count> not less than <expected number>
    RC 7    warning - counted instances <instance count> not less than <expected number> (class defined as "volatile")
    RC 1    ok - counted instances <instance count> less than <expected number>

-gt (greater than)
    RC 5    failed - counted instances <instance count> not greater than <expected number>
    RC 8    warning - counted instances <instance count> not greater than <expected number> (class defined as "volatile")
    RC 1    ok - counted instances <instance count> greater than <expected number>
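The two-part association check of 5.5.1 can be outlined as follows. The CIM calls are passed in as stand-ins (the suite actually drives them through wbemcli), so this is purely a structural sketch with hypothetical names:

```python
def check_association(source_paths, associators, get_instance,
                      op, expected, target_class):
    """For each source object path: count the instances returned by
    associators() and compare each returned instance against the one
    reported by getInstance() on the instance provider of the target
    class. Returns (count_ok, instances_equal) per source path."""
    results = []
    for path in source_paths:
        refs = associators(path, target_class)
        count_ok = {"-eq": len(refs) == expected,
                    "-lt": len(refs) < expected,
                    "-gt": len(refs) > expected}[op]
        instances_equal = all(get_instance(r["path"]) == r for r in refs)
        results.append((count_ok, instances_equal))
    return results

# Tiny stubbed run with illustrative data only:
inst = {"path": 'Linux_OperatingSystem.Name="x"', "EnabledStatus": "2"}
res = check_association(["cs1"], lambda p, t: [inst], lambda p: inst,
                        "-eq", 1, "Linux_OperatingSystem")
print(res)  # -> [(True, True)]
```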


6 Specification Test
The specification test will make use of the meta-information from the model. Within this test type it is possible to figure out whether a provider implements the class definition in the required manner. For example, the definition of a class may mark a certain property as required, but the provider does not return a value for this property; this is a violation of the CIM specification. Currently no hard requirement for this test scenario is known, so the complete description and the implementation are deferred to the future.
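As an illustration of what such a test could look like once implemented, a required-property check might compare the class meta-information against a returned instance. All names here are hypothetical, since the implementation is explicitly deferred:

```python
def check_required_properties(required, instance):
    """Return the required properties for which the provider delivered no
    value -- each hit would be a violation of the CIM specification.
    'required' would be derived from the model's meta-information;
    illustrative sketch only, this test is not yet specified."""
    return [name for name in required
            if instance.get(name) in (None, "")]

required = ["CreationClassName", "Name"]
instance = {"CreationClassName": "Linux_ComputerSystem", "Name": ""}
print(check_required_properties(required, instance))  # -> ['Name']
```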