Assessment of the Adequacy of a QA Test Environment
The success of any software development group is measured by whether its software changes introduce problems. Assuming the user-defined requirements were built into the software change, any remaining problems would be defects/bugs in the software modules associated with the change request. Adequate testing is the basis for eliminating these defects before a software change is migrated to the production environment.
Although various levels of testing may be performed, the QA test is the most important since it is performed after all of the related components of the change have been completed and the environment has been frozen to prevent other changes from occurring. In this article, the term 'QA test' is used to describe this type of testing; however, the term 'UAT' (User Acceptance Testing) is also used within the industry to describe the same type of testing.
There are two major objectives in assessing the adequacy of a QA test environment. The first is to determine whether the QA environment is established in a manner in which sufficient testing can occur to disclose defects. This objective requires reviewing the various testing approaches used by the organization and comparing them to the capabilities of the test environment. The second is to determine whether the environment has been established to preserve the integrity of the tests performed. The function of this objective is to ensure that only those individuals who are responsible for the test can impact the data used in the test. If possible, even those individuals responsible for the test should be restricted from directly altering the data except through the application-defined functions used to update and process it. Overall, the latter objective is much less critical than ensuring that a test environment is established which can perform the required level of testing.
Change control reviews assess whether the integrity of the programs is preserved as they are migrated through the libraries used in the change migration process. This type of review is beyond the scope of this article, as are the operation and maintenance of the QA test environment. The intent of this article is to assess the adequacy of how the environment was established.
The first question in assessing the test capabilities of a QA test environment is whether all production-related functions can be tested. In most cases it is impractical to test all production components. For instance, it would not be cost-effective to establish QA printers routed to specific destinations, create tapes for all outside deliveries, and establish QA links for all external transmissions.
One of the factors which determines whether most production-related functions can be tested is whether production data is used for QA testing or test data is built from scratch. If production files are being used, the next step is to determine whether they serve directly as input to the QA environment or are copied to a QA-equivalent set of files. If production files are used directly as the input files, the test is limited to a one-day cycle since the same data would not be available in the future for a retest. Maintaining a full set of production files as QA files may not be cost-effective, which is why many installations do not maintain full production copies but instead use cut-down files. However, a determination would have to be made as to whether these cut-down files would also limit the type of test which could be performed.
The next step in the assessment process should be to analyze the type of batch processing being supported. The first step is to determine whether the QA batch environment is established to operate the same as the production environment. If it is, then determine whether system resources are available during the business week to run both production and QA batch processing. If there are not sufficient system resources to run both, then the QA batch cycle can only run on weekends, which limits the testing that can be performed. As an alternative, the QA batch environment can be established to run only the specific portions of the equivalent production batch processing related to specific test requirements. This approach makes better use of system resources but requires additional operational maintenance of the QA batch environment.
Based on the QA files being used, a determination must also be made of the length of time for which they are maintained and the frequency with which they are refreshed. The answers to these questions determine the data which is available for retesting once software problems have been fixed.
The last critical step is to determine whether the software environment has been established to allow all software modules to be tested and moved to production without requiring an alteration to meet the production processing requirements. This point is mentioned because in environments such as the IBM MVS environment, certain types of software modules require a unique version for the QA and production environments. For instance, control cards used to specify VSAM DEFINEs would have a separate control card module for QA and production since the VSAM dataset is unique to each processing environment. In addition, unless production application PROCs have been defined with symbolics for the dataset high-level qualifiers, volumes (i.e., for non-SMS environments), and printer destinations, separate application PROCs would have to be defined for the production and QA test environments.
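The value of defining PROCs with symbolics can be sketched with a short Python example. The template, qualifier values, and dataset names below are hypothetical, and the substitution function only mimics, in simplified form, how JCL resolves &NAME. symbolics; it is an illustration, not an implementation of JCL processing.

```python
# Hypothetical sketch: a single template with symbolic parameters can serve
# both the QA and production environments, avoiding separate PROC definitions.
# All names and qualifier values below are illustrative assumptions.

PROC_TEMPLATE = "&HLQ..PAYROLL.MASTER"  # JCL-style symbolic for the high-level qualifier

ENV_SYMBOLICS = {
    "PROD": {"HLQ": "PROD"},
    "QA":   {"HLQ": "QA"},
}

def resolve(template, symbolics):
    """Substitute JCL-style &NAME. symbolics into a dataset-name template."""
    result = template
    for name, value in symbolics.items():
        result = result.replace("&" + name + ".", value)
    return result

# The same template resolves to a different dataset per environment:
print(resolve(PROC_TEMPLATE, ENV_SYMBOLICS["PROD"]))  # PROD.PAYROLL.MASTER
print(resolve(PROC_TEMPLATE, ENV_SYMBOLICS["QA"]))    # QA.PAYROLL.MASTER
```

With this approach, only the symbolic values differ between environments, so the same module can be tested in QA and migrated to production unaltered.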
Prior to performing an assessment of the security related to the QA environment, it is assumed that a change control review has been performed. This is because the programs migrated to the production environment carry a higher risk level than the controls which preserve the integrity of the QA test environment.
The first step to secure the QA test environment is to ensure that a standardized dataset naming convention is used. This is critical in order to identify all of the QA datasets and ensure that they are properly secured. The next step is to identify the datasets which are actually used in the online and batch environments. Depending on the platform, there may not be an automated method to extract this information. For instance, in an IBM MVS environment, the batch datasets can be identified within the QA batch jobs which are defined. The online datasets within a CICS environment could be extracted from the CSD (CICS System Definition) groups which are established for the QA environment.
The dataset names are then compared to ensure that they adhere to the standardized naming convention and analyzed to determine which individuals have the ability to alter them. Within the online environment, no individual should require alter access to the online datasets since they are updated by the online environment itself. An exception could be made for input files which must be altered to create test conditions that cannot be created through the online edit features of the application.
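The naming-convention comparison described above can be sketched in Python. The convention (a high-level qualifier of "QA") and the dataset names are assumptions made for illustration; an actual review would use the installation's own documented standard.

```python
# Hedged sketch: verify that extracted QA dataset names adhere to a
# hypothetical standardized convention (high-level qualifier "QA").
# The pattern and the dataset names are illustrative assumptions.
import re

# Assumed standard: HLQ "QA" followed by one or more 1-8 character qualifiers.
QA_NAME_PATTERN = re.compile(r"^QA(\.[A-Z0-9@#$]{1,8})+$")

def check_convention(datasets):
    """Return the dataset names which do not follow the assumed QA convention."""
    return [name for name in datasets if not QA_NAME_PATTERN.match(name)]

extracted = ["QA.PAYROLL.MASTER", "QA.GL.TRANS", "PAYROLL.QA.MASTER"]
for violation in check_convention(extracted):
    print(violation)  # PAYROLL.QA.MASTER
```

Datasets flagged here would either need to be renamed to the standard or investigated as files outside the secured QA environment.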
Individuals should also not have alter capabilities to the QA batch processing files since these jobs should be defined to the scheduling system. In an environment which does not have a QA batch environment established, it is common for programmers to submit jobs using their own IDs, which update datasets to which they have access.
In general, all individual user ability to alter QA files should be restricted to emergency IDs.
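This emergency-ID restriction can be checked with a simple sketch. The "EMERG" prefix and the access entries are assumptions for illustration; in practice the dataset-to-user access pairs would be pulled from the security product's reports.

```python
# Hedged sketch: flag non-emergency user IDs holding ALTER access to QA files.
# The access entries and the "EMERG" ID prefix are illustrative assumptions.
alter_access = {
    "QA.PAYROLL.MASTER": ["EMERG01"],
    "QA.GL.TRANS": ["EMERG01", "PGMR12"],
}

def flag_non_emergency(access, emergency_prefix="EMERG"):
    """Return (dataset, user) pairs where a non-emergency ID holds ALTER access."""
    return [(dataset, user)
            for dataset, users in access.items()
            for user in users
            if not user.startswith(emergency_prefix)]

for dataset, user in flag_non_emergency(alter_access):
    print(user + " holds ALTER on " + dataset)  # PGMR12 holds ALTER on QA.GL.TRANS
```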
The establishment of properly controlled QA test environments which provide the testing functions needed to deliver defect-free software changes is not common throughout the industry. This is attributed to the cost of establishing and maintaining the environment, along with the disciplines the development group must maintain to ensure that the QA environment remains synchronized with the production environment.
For a free proposal to perform an audit of your organization or provide SOX support & testing services, contact Mitchell Levine of Audit Serve at (203) 972-3567 or via e-mail at Levinemh@auditserve.com.
Copyright 2006, Audit Serve, Inc. All rights reserved. Reproduction, which includes links from other Web sites, is prohibited except by permission in writing.