Development Best Practices:
Software Testing

Online columnist Robert Wegener | Column published in DMReview.com
April 21, 2005
By Robert Wegener

This month's column covers software testing best practices. Rather than a check-off list of best practices, it looks at how an organization can improve testing by integrating it into the development life cycle.

The first step on the road to improving the quality of the software testing function is to assess your development and testing organizations. The assessment should start by comparing your organization against the Capability Maturity Model (CMM). The model was developed by the Software Engineering Institute (SEI) to advance software engineering methodologies. It is similar to ISO 9001 but extends it with a framework for continuous process improvement.

The model consists of five levels. Level 1 is called the Initial Level and depicts an organization that takes an ad hoc approach and has few repeatable processes. This level relies on individual success over team success. Level 2 is called the Repeatable Level and incorporates basic management techniques that are established, defined and documented. Level 3 is called the Defined Level and describes an organization that has its own software process documented and integrated into the development life cycle. Level 4 is the Managed Level and adds a layer of monitoring and control over the development life cycle through data collection and analysis. The final level, Level 5, is called the Optimizing Level. At this level all processes are constantly being improved through feedback and the introduction of innovative processes. You can find more information at http://www.sei.cmu.edu/cmm/

Here is a quick road map to get you to each level:

Level 1: Initialize

  • Identify current state
  • Identify current resources
  • Assess the current testing capability state

Level 2: Repeatable

  • Institutionalize basic testing techniques and methods
  • Initiate a test planning process
  • Develop testing and debugging goals

Level 3: Defined/Integration

  • Control and monitor the testing process
  • Integrate testing into the software life cycle
  • Establish a technical training program
  • Establish a software testing organization

Level 4: Management and Measurement

  • Develop a software quality evaluation
  • Establish a test measurement program
  • Establish an organization-wide review program

Level 5: Optimization/Defect Prevention and Quality Control

  • Optimize the test process
  • Implement quality control measures
  • Apply process data for defect prevention
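The road map above can also be tracked programmatically. Below is a minimal sketch of a self-assessment in Python; the practice names come from the list above, but the scoring rule (a level counts only when every practice at that level and all lower levels is in place) is my own simplifying assumption, not part of the CMM itself:

```python
# Sketch of a CMM-style testing self-assessment.
# Practice names are taken from the road map above; the all-or-nothing
# scoring rule is an assumption made for illustration.

ROAD_MAP = {
    1: ["Identify current state", "Identify current resources",
        "Assess the current testing capability state"],
    2: ["Institutionalize basic testing techniques and methods",
        "Initiate a test planning process",
        "Develop testing and debugging goals"],
    3: ["Control and monitor the testing process",
        "Integrate testing into the software life cycle",
        "Establish a technical training program",
        "Establish a software testing organization"],
    4: ["Develop a software quality evaluation",
        "Establish a test measurement program",
        "Establish an organization-wide review program"],
    5: ["Optimize the test process",
        "Implement quality control measures",
        "Apply process data for defect prevention"],
}

def maturity_level(practices_in_place):
    """Highest level whose practices (and all lower levels') are all met."""
    level = 0
    for lvl in sorted(ROAD_MAP):
        if all(p in practices_in_place for p in ROAD_MAP[lvl]):
            level = lvl
        else:
            break
    return level

# Levels 1 and 2 complete, Level 3 only partially started:
done = set(ROAD_MAP[1]) | set(ROAD_MAP[2]) | {
    "Integrate testing into the software life cycle"}
print(maturity_level(done))  # -> 2
```

A spreadsheet serves the same purpose; the point is simply that each level is a concrete, checkable set of practices.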

The next step is to get your organization to at least Level 3 before you start buying tools. It is more important to have robust artifacts and well-trained staff than to have expensive tools on the shelf. The primary artifacts necessary for any testing organization include: test strategy, test costing, resource planning, test plans, test cases, test scripts and test management artifacts to track defects and add traceability.

After you have your artifacts in place and your staff trained, your next step is to focus on the who, what, when and how much of testing. Testing is viewed differently by many people. Here is just a sample of testing definitions:

  • "The process of operating a system or component under specified conditions, observing or recording the results, and making an evaluation of some aspect of the system or component." (IEEE/ANSI, 1990)
  • "The process of executing computer software in order to determine whether the results it produces are correct." (Glass, 1979)
  • "The process of executing a program with the intent of finding errors." (Myers, 1979)
  • "Program testing can be used to show the presence of bugs, but never their absence." (Dijkstra, 1972)
  • "The aim is not to discover errors but to provide convincing evidence that there are none, or to show that particular classes of faults are not present." (Hennell, 1984)
  • "Testing is the measure of software quality." (Hetzel, 1985)

The key point in testing is to provide a stable working system that recognizes failures and has a defined path for remediation and recovery. Complex systems must be designed to handle failures, and good testing practices focus on how the systems display, log and recover from errors.
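These error-path concerns can be expressed directly as test cases. The sketch below checks all three at once: that a failure is recognized, that it is logged, and that the caller gets a defined error to recover from. The `transfer` function and its logging behavior are hypothetical, invented purely for illustration:

```python
import logging
import unittest

log = logging.getLogger("payments")

def transfer(balance, amount):
    """Hypothetical component: fails loudly and logs on invalid input."""
    if amount <= 0 or amount > balance:
        log.error("rejected transfer of %s against balance %s",
                  amount, balance)
        raise ValueError("invalid transfer amount")
    return balance - amount

class TransferErrorHandlingTest(unittest.TestCase):
    def test_rejects_overdraft_and_logs(self):
        # Failure is recognized (ValueError), logged (assertLogs),
        # and remediable by the caller (a defined exception type).
        with self.assertLogs("payments", level="ERROR"):
            with self.assertRaises(ValueError):
                transfer(balance=100, amount=500)

    def test_happy_path_unaffected(self):
        self.assertEqual(transfer(balance=100, amount=40), 60)
```

Run with `python -m unittest` against the module containing the test class.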

Let's start with what should be tested. The basic V-Model shows the direct correlation between each system development phase and the validation required for it (see Figure 1).

Figure 1

Now that we have an idea of what to test, we will look at who is to do the testing. Starting from the bottom of the V-model with construction, it should be apparent that the developers will test the components. Developers are responsible for their individual contributions to the application and must ensure that the individual pieces they created meet the specification, adhere to coding standards and interface properly to the rest of the system.

The next phase of testing covers the stringing of all the components together into a working system. Many organizations call this a string or assembly test, and it is performed by the developers. The V-Model tends to group this into integration testing.
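A string/assembly test exercises the seams between components rather than the components themselves. Here is a minimal sketch with two hypothetical components (both function names and data are invented for illustration):

```python
# Two hypothetical components, each already unit-tested by its developer.

def parse_order(raw):
    """Component A: turns a raw record like 'ABC-1, 4' into (sku, qty)."""
    sku, qty = raw.split(",")
    return sku.strip(), int(qty)

def price_order(sku_qty, price_list):
    """Component B: consumes A's output and prices it."""
    sku, qty = sku_qty
    return price_list[sku] * qty

def test_string_parse_then_price():
    # The string test exercises the interface end to end:
    # does B really accept what A produces?
    price_list = {"ABC-1": 2.50}
    assert price_order(parse_order("ABC-1, 4"), price_list) == 10.0

test_string_parse_then_price()
```

Each component may pass its unit tests in isolation; the string test is what catches a mismatch in the data handed across the seam.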

The next phase, system testing, is where an independent group of testers validates the system as a whole by exercising the architecture through the functional and system requirements. The final test, performed by the end users of the system, validates that the requirements have been met.

Testing is an ongoing process and has to be integrated into the development methodology being used. Test strategies and plans should be designed in conjunction with each feature/function up front so that any requirement specific testing needs can be identified and included in the development of the component and the testing environment.

How much to test is a topic that is beyond the scope of this article. It is safe to say that, aside from the granularity of unit testing and white-box testing, system testing should provide 100 percent coverage of all functionality, including error handling and recoverability. A traceability matrix that shows each function and its related test cases is a must for ensuring maximum coverage. The more functionality validated, the better. Test functionality should be grouped into categories such as security, interface logic, business rules, usability, error handling, transaction handling, recovery and session handling.
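A traceability matrix of the kind described can be kept as simple structured data, which makes coverage gaps easy to query. The function and test-case names below are invented for illustration:

```python
# Hypothetical traceability matrix: each system function maps to the
# test cases that exercise it. "Coverage" here means every function
# has at least one test case, per the 100 percent goal above.

matrix = {
    "login/session handling":  ["TC-001", "TC-002"],
    "business rule: discount": ["TC-010"],
    "error handling: bad SKU": ["TC-020", "TC-021"],
    "recovery after crash":    [],           # gap: no test case yet
}

def untested(matrix):
    """Functions with no associated test case -- the coverage gaps."""
    return [fn for fn, cases in matrix.items() if not cases]

def coverage(matrix):
    """Fraction of functions exercised by at least one test case."""
    covered = sum(1 for cases in matrix.values() if cases)
    return covered / len(matrix)

print(untested(matrix))            # -> ['recovery after crash']
print(f"{coverage(matrix):.0%}")   # -> 75%
```

Commercial test-management tools maintain the same mapping; the essential point is that every function is accounted for and gaps are visible.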

Next month's article will continue with testing best practices with a focus on test strategies and test plans.



Robert Wegener is the director of solutions for RCG Information Technology's Web services. He has more than 20 years of information and business engineering experience in operations, customer service, transportation, finance, product development, telecommunications and information systems. Wegener also has extensive experience in various process and development methodologies, quality assurance and testing methodologies. He can be contacted by e-mail at rwegener@rcgit.com.


SourceMedia (c) 2005 DM Review and SourceMedia, Inc. All rights reserved.