Development Best Practices:
This month's column continues with software testing best practices (see my April column for Part 1), focusing on creating test strategies and test plans. A solid testing strategy provides the framework necessary to implement your testing methodology. A separate strategy should be developed for each system under development, taking into account the development methodology being used and the specific application architecture.
The heart of any testing strategy is the master testing strategy document. It aggregates all the information from the requirements, system design and acceptance criteria into a detailed plan for testing. A detailed master strategy should cover the following:
Restate the business objective of the application and define the scope of the testing. The scope statement should itemize each major testing activity and mark it explicitly as in scope or out of scope.
The system under test should be measured by its compliance with the requirements and the user acceptance criteria. Each requirement and acceptance criterion must be mapped to specific test plans that validate and measure the expected results for each test being performed. The objectives should be listed in order of importance and weighted by risk.
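The mapping described above can be kept in a simple traceability matrix. The following is a minimal sketch, assuming hypothetical requirement and test-case identifiers (REQ-001, TC-101, and a 1-5 risk scale are illustrative, not from the article): each requirement carries a risk weight and a list of the tests that validate it, so coverage gaps and test priority fall out directly.

```python
# Sketch of a risk-weighted requirements-to-tests traceability matrix.
# All identifiers and the risk scale are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str
    description: str
    risk: int                               # assumed scale: 1 (low) .. 5 (high)
    test_ids: list = field(default_factory=list)

def uncovered(requirements):
    """Requirements with no mapped test are traceability gaps."""
    return [r.req_id for r in requirements if not r.test_ids]

def by_priority(requirements):
    """Order requirements so the highest-risk items are tested first."""
    return sorted(requirements, key=lambda r: r.risk, reverse=True)

reqs = [
    Requirement("REQ-001", "User login", risk=5, test_ids=["TC-101", "TC-102"]),
    Requirement("REQ-002", "Report export", risk=2, test_ids=["TC-201"]),
    Requirement("REQ-003", "Audit logging", risk=4),      # no test mapped yet
]

print(uncovered(reqs))                       # gaps to close before execution
print([r.req_id for r in by_priority(reqs)])
```

A spreadsheet serves the same purpose on small projects; the point is that every requirement resolves to at least one test, and risk decides the order of work.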
Features and Functions to be Tested
Every feature and function must be listed for test inclusion or exclusion, along with a description of the exceptions. Some features may not be testable due to a lack of hardware, lack of environmental control, etc. Grouping the list by functional area adds clarity.
The approach section provides the detail necessary to describe the levels and types of testing. The basic V-model maps each phase of development to a corresponding level of testing and shows which types of testing are needed to validate the system.
More specific test types include functional, performance, backup and recovery, security, environmental, conversion, usability, installation and regression testing. The specific testing methodology should be described, and the entry/exit criteria for each phase noted in a matrix. A project plan that lists the resources and schedule for each testing cycle should also be created, mapping each testing task to the overall development project plan.
Testing Process and Procedures
The order of test execution and the steps necessary to perform each type of test should be described in sufficient detail to provide clear input into the creation of test plans and test cases. Procedures should include how test data is created, managed and loaded. Test cycles should be planned and scheduled based on system availability and deliverable dates from development. All application and environmental dependencies should be identified along with the procedures necessary to gain access to all the dependent systems.
Every level of testing must have a defined set of entry/exit criteria that are used to validate that all prerequisites for a valid test have been met. All mainstream software testing methodologies provide extensive entry/exit criteria and checklists. In addition to the standard list, items should be added based on specific testing needs. Some common additions are environmental availability, data availability, and validated code ready to be tested.
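Entry criteria of this kind reduce to a checklist that gates each phase. Below is a hedged sketch of that idea; the phase name, criteria strings, and the `entry_gate` helper are all assumptions for illustration, not prescribed by any methodology.

```python
# Sketch: a phase's entry criteria as a checklist that must be fully
# satisfied before the phase begins. Names are illustrative assumptions.
ENTRY_CRITERIA = {
    "system_test": [
        "environment available",
        "test data loaded",
        "code build validated",
        "unit tests passed",
    ],
}

def entry_gate(phase, completed):
    """Return the criteria still blocking entry into the given phase."""
    return [c for c in ENTRY_CRITERIA[phase] if c not in completed]

done = {"environment available", "test data loaded", "unit tests passed"}
blocking = entry_gate("system_test", done)
print(blocking)   # ['code build validated']
```

The same structure works for exit criteria: the phase is complete only when the gate returns an empty list.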
Each level of testing should define specific pass/fail acceptance criteria to ensure that all quality gates have been validated and that the test plan focuses on developing tests that validate the specific criteria defined in the user acceptance plan.
All testing tools should be identified and their use, ownership and dependencies defined. The tools category includes manual tools, such as templates in spreadsheets and documents, as well as automated tools for test management, defect tracking, regression testing and performance/load testing. Any specific skill sets required should be identified and compared against the skills already present on the project to highlight any training needs.
A plan to address the resolution of failed tests needs to be created that lists the escalation procedures to seek correction and retest of the failed tests, along with a risk mitigation plan for high-risk tests. Defect tracking should include basic metrics for compliance based on the number and type of defects found.
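The basic defect metrics mentioned above amount to counting defects across a few dimensions. A minimal sketch, assuming a flat defect log whose field names (`type`, `severity`) are illustrative rather than taken from any particular tracking tool:

```python
# Sketch: basic defect-tracking metrics -- counts by type and by
# severity -- computed from a flat defect log. Fields are assumptions.
from collections import Counter

defects = [
    {"id": "D-1", "type": "functional",  "severity": "high"},
    {"id": "D-2", "type": "functional",  "severity": "low"},
    {"id": "D-3", "type": "performance", "severity": "high"},
]

by_type = Counter(d["type"] for d in defects)
by_severity = Counter(d["severity"] for d in defects)

print(dict(by_type))       # e.g. {'functional': 2, 'performance': 1}
print(dict(by_severity))   # e.g. {'high': 2, 'low': 1}
```

In practice the same counts come from a defect-tracking tool's reports; the value is in trending them per cycle to feed the escalation and retest plan.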
Roles and Responsibilities
A matrix listing the roles and responsibilities of everyone involved in the testing activities, along with the anticipated amount of their time allocated to the project, must be prepared.
The entire testing process should be focused on process improvement. The strategy should list ways to monitor progress and provide constant feedback. This feedback can serve to enhance the process, deliverables and metrics used in the testing. Root cause analysis should be performed on all reported defects to help isolate the true nature of the problem and prevent repeat occurrences of the same class of defect.
All deliverables should be defined and their location specified. Common deliverables are test plans, test cases, test scripts, test matrix and a defect log.
All testing activities should be combined into one master testing schedule. The schedule should include an estimate of time for each task and the dependencies between tasks. Testing resources should be assigned to each task, and quality gates should be listed to ensure oversight of the entire process.
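Because every task in the master schedule depends on earlier deliverables, the execution order is just a topological sort of the dependency graph. A sketch using Python's standard-library `graphlib`; the task names form an assumed, simplified chain for illustration:

```python
# Sketch: ordering testing tasks so each cycle starts only after its
# prerequisites complete. The task chain is an illustrative assumption.
from graphlib import TopologicalSorter

# task -> set of tasks it depends on
tasks = {
    "unit test":        set(),
    "integration test": {"unit test"},
    "system test":      {"integration test"},
    "acceptance test":  {"system test"},
}

order = list(TopologicalSorter(tasks).static_order())
print(order)   # ['unit test', 'integration test', 'system test', 'acceptance test']
```

A project-management tool performs the same computation; modeling it explicitly makes circular or missing dependencies visible before the schedule is published.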
All the requirements of the testing environment need to be listed. Common ones include a description of the environment's use, management, hardware and software, specific tools needed, data loading and security requirements.
The skills of all personnel involved in the testing effort need to be assessed and the gaps noted so that a comprehensive training program can be designed. Specialty skills that will not be filled with in-house staff will require job descriptions and budgeting.
Risk and Contingencies
Planning in advance for risk, and for ways to mitigate it, is essential to a robust strategy. A risk assessment that is prioritized by severity of risk and covers technology, resource, schedule and environmental issues should feed a detailed plan to mitigate each red flag.
Approvals and Workflow
All items on the critical path must go through an approval cycle. The procedures for approval and escalation must be well defined and assigned to resources prior to the start of the testing.
The above covers the main sections of a well-drafted and documented testing strategy. The more detail you include in the strategy document, the less ambiguity and chance for deviation there will be throughout the project.
The completion of the strategy signals the beginning of the test planning phase. For each type of testing identified in the master test strategy there should be a test plan identifying the components to be tested, the location of the test data, the test environment needs, the test procedures, the resources required, and the test schedule. For each plan, a series of test conditions should be identified so that test cases with expected results can be generated for later execution.
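A test case generated from a condition pairs inputs with an expected result, and execution reduces to comparing expected against actual. A minimal sketch; the `TestCase` structure, identifiers and field names are assumptions for illustration:

```python
# Sketch: a test case derived from a test condition, carrying its
# expected result for later execution. All names are illustrative.
from dataclasses import dataclass

@dataclass
class TestCase:
    case_id: str
    condition: str
    inputs: dict
    expected: str
    actual: str = ""          # filled in during test execution

    def passed(self):
        return self.actual == self.expected

case = TestCase("TC-101", "valid login",
                {"user": "alice", "pw": "secret"},
                expected="home page shown")
case.actual = "home page shown"     # recorded by the tester or harness
print(case.passed())                # True
```

Whatever the representation (spreadsheet rows, a test-management tool, or code), the essential fields are the same: condition, inputs, expected result, and the recorded outcome.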
In summary, the strategy and planning documents are the most critical documents for successful testing. A good source of detail on testing documents is IEEE Std 829-1998, which specifies the form and content of a basic set of software test documents.
Robert Wegener is the director of solutions for RCG Information Technology's Web services. He has more than 20 years of information and business engineering experience in operations, customer service, transportation, finance, product development, telecommunications and information systems. Wegener also has extensive experience in various process and development methodologies, quality assurance and testing methodologies. He can be contacted by e-mail at firstname.lastname@example.org.