Development Best Practices:
Software Testing, Part 2 - Strategies and Plans

By Robert Wegener
Column published in DMReview.com, May 12, 2005

This month's column continues the discussion of software testing best practices (see my April column for Part 1), with a focus on creating test strategies and test plans. A solid testing strategy provides the framework necessary to implement your testing methodology. A separate strategy should be developed for each system, taking into account the development methodology being used and the specific application architecture.

The heart of any testing strategy is the master testing strategy document. It aggregates all the information from the requirements, system design and acceptance criteria into a detailed plan for testing. A detailed master strategy should cover the following:

Project Scope

Restate the business objective of the application and define the scope of the testing. The statement should be a list of activities that will be in scope or out of scope. A sample list would include:

  • List of software to be tested
  • Software configurations to be tested
  • Documentation to be validated
  • Hardware to be tested

Test Objectives

The system under test should be measured by its compliance with the requirements and the user acceptance criteria. Each requirement and acceptance criterion must be mapped to specific test plans that validate and measure the expected results of each test performed. The objectives should be listed in order of importance and weighted by risk.
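As an illustration only, the mapping of requirements and acceptance criteria to test plans can be captured in a simple traceability structure and ordered by risk. The requirement IDs, plan names and risk weights in this Python sketch are hypothetical, not taken from any particular project:

from dataclasses import dataclass

@dataclass
class TestObjective:
    requirement_id: str        # e.g., "REQ-001" from the requirements document
    acceptance_criterion: str  # the user acceptance criterion being validated
    test_plans: list           # test plans that validate this requirement
    risk_weight: int           # 1 (low) to 5 (high), used to order the objectives

objectives = [
    TestObjective("REQ-001", "Order totals match line items", ["TP-Checkout"], 5),
    TestObjective("REQ-002", "Reports export to CSV", ["TP-Reporting"], 2),
]

# List the objectives in order of importance, highest risk first.
for objective in sorted(objectives, key=lambda o: o.risk_weight, reverse=True):
    print(objective.requirement_id, objective.risk_weight, objective.test_plans)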

Features and Functions to be Tested

Every feature and function must be listed for test inclusion or exclusion, along with a description of the exceptions. Some features may not be testable due to a lack of hardware, lack of control and so on. The list should be grouped by functional area to add clarity (a simple inclusion/exclusion sketch follows the list below). The following is a basic list of functional areas:

  • Backup and recovery
  • Workflow
  • Interface design
  • Installation
  • Procedures (users, operational, installation)
  • Requirements and design
  • Messaging
  • Notifications
  • Error handling
  • System exceptions and third-party application faults

Testing Approach

The approach provides the detail necessary to describe the levels and types of testing. The basic V-Model shows what types of testing are needed to validate the system.

Figure 1: The V-Model

More specific test types include functionality, performance, backup and recovery, security, environmental, conversion, usability, installation and regression testing. The specific testing methodology should be described, and the entry/exit criteria for each phase noted in a matrix by phase. A project plan that lists the resources and schedule for each testing cycle should also be created, mapping the specific testing tasks to the overall development project plan.
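One way to note the entry/exit criteria in a matrix by phase is a simple keyed structure such as the sketch below; the phases and criteria shown are examples, not a prescribed list:

# Illustrative entry/exit criteria matrix, keyed by test phase.
entry_exit_matrix = {
    "unit": {
        "entry": ["code compiles", "peer review complete"],
        "exit": ["all unit tests pass", "coverage target met"],
    },
    "system": {
        "entry": ["unit phase exited", "test environment available"],
        "exit": ["no open severity-1 defects", "all planned cases executed"],
    },
    "acceptance": {
        "entry": ["system phase exited", "user acceptance plan approved"],
        "exit": ["acceptance criteria signed off by the business"],
    },
}

for phase, criteria in entry_exit_matrix.items():
    print(phase, "-> entry:", criteria["entry"], "exit:", criteria["exit"])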

Testing Process and Procedures

The order of test execution and the steps necessary to perform each type of test should be described in sufficient detail to provide clear input into the creation of test plans and test cases. Procedures should include how test data is created, managed and loaded. Test cycles should be planned and scheduled based on system availability and deliverable dates from development. All application and environmental dependencies should be identified along with the procedures necessary to gain access to all the dependent systems.
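As a minimal sketch of scripted test-data loading, the snippet below reads seed records for a test cycle from a version-controlled CSV file; the file path and helper name are assumptions for illustration, not part of any specific procedure:

import csv
from pathlib import Path

def load_test_data(data_file: Path) -> list:
    """Read seed records for a test cycle from a version-controlled CSV file."""
    with data_file.open(newline="") as handle:
        return list(csv.DictReader(handle))

if __name__ == "__main__":
    # Hypothetical seed file; in practice the location comes from the procedures.
    records = load_test_data(Path("test_data/customers.csv"))
    print(f"Loaded {len(records)} seed records for this test cycle")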

Test Compliance

Every level of testing must have a defined set of entry/exit criteria used to validate that all prerequisites for a valid test have been met. All mainstream software testing methodologies provide an extensive list of entry/exit criteria and checklists. In addition to the standard list, items should be added based on specific testing needs. Some common additions are environmental availability, data availability and validated code that is ready to be tested.

Each level of testing should also define specific pass/fail acceptance criteria, to ensure that all quality gates have been validated and that the test plan focuses on developing tests that validate the specific criteria defined by the user acceptance plan.
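A quality-gate check can be as simple as verifying that every pass/fail criterion for a phase has been met, as in this sketch; the criteria flags are invented for illustration rather than pulled from a real test-management tool:

def gate_passed(criteria_results: dict) -> bool:
    """Return True only when every pass/fail criterion for the phase is met."""
    return all(criteria_results.values())

system_test_results = {
    "all planned test cases executed": True,
    "no open severity-1 defects": True,
    "regression suite passing": False,
}

if not gate_passed(system_test_results):
    blockers = [name for name, met in system_test_results.items() if not met]
    print("Quality gate blocked by:", blockers)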

Testing Tools

All testing tools should be identified and their use, ownership and dependencies defined. The tools category includes manual tools, such as templates in spreadsheets and documents, as well as automated tools for test management, defect tracking, regression testing and performance/load testing. Any specific skill sets required should be identified and compared against the skills already identified for the project to highlight any training needs.

Defect Resolution

A plan to address the resolution of failed tests needs to be created. It should list the escalation procedures used to seek correction and retesting of failed tests, along with a risk mitigation plan for high-risk tests. Defect tracking should include basic metrics for compliance based on the number and type of defects found.
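Basic compliance metrics can be rolled up from the defect log by type and severity, as in this sketch; the defect records are made-up examples:

from collections import Counter

defect_log = [
    {"id": "D-101", "type": "functional", "severity": "high"},
    {"id": "D-102", "type": "usability", "severity": "low"},
    {"id": "D-103", "type": "functional", "severity": "medium"},
]

print("Defects by type:", dict(Counter(d["type"] for d in defect_log)))
print("Defects by severity:", dict(Counter(d["severity"] for d in defect_log)))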

Roles and Responsibilities

A matrix listing the roles and responsibilities of everyone involved in the testing activities, along with the anticipated amount of their time allocated to the project, must be prepared.

Process Improvement

The entire testing process should be focused on process improvement. The strategy should list ways to monitor progress and provide constant feedback. This feedback can serve to enhance the process, deliverables and metrics used in the testing. Root cause analysis should be performed on all reported defects to help isolate the true nature of each problem and prevent repeat occurrences.

Deliverables

All deliverables should be defined and their location specified. Common deliverables are test plans, test cases, test scripts, a test matrix and a defect log.

Schedule

All testing activities should be combined into one master testing schedule. The schedule should include an estimate of time for each task and the dependencies among tasks. Testing resources should be assigned to each task, and quality gates should be listed to ensure oversight of the entire process.

Environmental Needs

All the requirements of the testing environment need to be listed. Common ones include a description of the environment's use, management, hardware and software, specific tools needed, data loading and security requirements.

Resource Management

The skills of all personnel involved in the testing effort need to be assessed and the gaps noted so that a comprehensive training program can be designed. Specialty roles that cannot be filled by in-house staff will require job descriptions and budgeting.

Risk and Contingencies

Planning for risk in advance, and for ways to mitigate it, is essential to a robust strategy. A risk assessment, prioritized by severity and covering technology, resource, schedule and environmental issues, should feed a detailed plan to mitigate each red flag.
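A prioritized risk assessment can be kept as a simple register sorted by severity, as sketched below; the categories, severities and mitigations are invented for illustration:

risk_register = [
    {"category": "schedule", "risk": "Late code drop from development",
     "severity": 3, "mitigation": "Stagger test cycles; run a smoke suite first"},
    {"category": "environment", "risk": "Shared test environment outage",
     "severity": 4, "mitigation": "Reserve a fallback environment window"},
    {"category": "resource", "risk": "Performance-test specialist unavailable",
     "severity": 2, "mitigation": "Cross-train two in-house testers"},
]

# Highest severity first, so mitigation planning starts with the red flags.
for item in sorted(risk_register, key=lambda r: r["severity"], reverse=True):
    print(f"[{item['severity']}] {item['category']}: {item['risk']} -> {item['mitigation']}")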

Approvals and Workflow

All items on the critical path must go through an approval cycle. The procedures for approval and escalation must be well defined and assigned to resources prior to the start of the testing.

The above covers the main sections of a well-drafted and documented testing strategy. The more detail you include in the strategy document, the less ambiguity and chance for deviation there will be throughout the project.

The completion of the strategy signals the beginning of the test planning phase. For each type of testing identified in the master test strategy, there should be a test plan identifying the components to be tested, the location of the test data, the test environment needs, the test procedures, the resources required and the test schedule. For each plan, a series of test conditions should be identified so that test cases with expected results can be generated for later execution.
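For example, a single test condition with its expected result might be written as the pytest case below; apply_discount() is a hypothetical function standing in for the system under test:

import pytest

def apply_discount(order_total: float, discount_pct: float) -> float:
    """Hypothetical stand-in for the system under test."""
    return round(order_total * (1 - discount_pct / 100), 2)

def test_ten_percent_discount_on_100_dollar_order():
    # Test condition: a 10% discount applied to a $100.00 order.
    # Expected result: the discounted total is $90.00.
    assert apply_discount(100.00, 10) == pytest.approx(90.00)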

In summary, the strategy and planning documents are the most critical documents for any successful testing effort. A good source of detail on testing documents is IEEE Std 829-1998, which specifies the form and content of a basic set of software test documents.


Robert Wegener is the director of solutions for RCG Information Technology's Web services. He has more than 20 years of information and business engineering experience in operations, customer service, transportation, finance, product development, telecommunications and information systems. Wegener also has extensive experience in various process and development methodologies, quality assurance and testing methodologies. He can be contacted by e-mail at rwegener@rcgit.com.
