DM Review | Covering Business Intelligence, Integration & Analytics
Data Integration:
Solution Integration: An Approach to Ensure Failure

By online columnists Greg Mancuso and Al Moreno
Column published in DMReview.com, August 12, 2004

During the past few months we have dealt with difficult data integration issues facing company IT groups. This month we are going to deviate slightly and deal instead with the more critical topic of business and functional integration. Look up the word "integration" in a thesaurus and you will find: addition, mixing, incorporation, combination, amalgamation and assimilation. All of these terms matter when organizations are trying to determine what hardware and software is required to process the company's information. Unfortunately, there remain organizations that attack this question with no functional or strategic planning in place.

In order to determine what is best in any technical situation, one needs to completely understand the requirements, the goals and the objectives. A mind-set exists within some organizations that if they select and deploy the best hardware and the best software, all of their data integration and analytical needs will be magically resolved. Nothing could be further from the truth. There is a critical link between the hardware, the software, the tools and the business requirements.

Organizations that insist on shopping for hardware and/or software without first documenting the organization's needs are setting themselves up to learn a very expensive lesson. Experience teaches us that the single most critical success factor is that an organization understands what drivers are creating the current need for a new solution. How many individuals make a major purchase without first determining what they want or need? Make no mistake: without understanding what you need, you are not likely to select the products or solution that will meet that need or help you deal effectively with your business objectives.

Consider an organization that approached the hardware and software selection process before any requirements were understood or formulated. The organization had only a rough idea of the end goal - the need to build a database. No analysis was done to determine what type of database was needed - OLTP, OLAP or hybrid. All that was really known was that a very large database was needed, and there was only a preliminary concept of how the data would be stored, accessed or reported. To complicate the situation, the organization had little expertise in crafting very large solutions on any database platform. The only certainties were that a data collection application was needed, and that the application required hardware and a database.

Some organizations choose to test the various capabilities of the tools being considered for their solutions using the proof of concept (POC) approach. The purpose of the POC is to evaluate how discrete functional requirements are satisfied by various components of the planned solution. Organizations should design a POC that simulates the company's "real-world" environment in a controlled setting and with limited data volumes and/or sources. Of course, in order to adequately plan and execute a POC, the basic tenet of success - understanding of the requirements - must be in place.
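A well-planned POC of this kind reduces to a list of discrete, measurable requirements with pass/fail criteria agreed before any vendor runs anything. The following minimal sketch illustrates the idea; the requirement names, measures and thresholds are hypothetical, since the column prescribes no particular format:

```python
# Illustrative sketch of a POC test plan. All names, volumes and
# thresholds below are invented examples, not the authors' figures.
from dataclasses import dataclass

@dataclass
class PocRequirement:
    name: str                      # discrete functional requirement under test
    measure: str                   # what the POC actually measures
    threshold: float               # pass/fail criterion agreed up front
    higher_is_better: bool = True  # direction of the comparison

# A "real-world" simulation at reduced scale: limited volumes, known sources.
poc_plan = [
    PocRequirement("nightly_load", "rows loaded per hour", 2_000_000),
    PocRequirement("ad_hoc_query", "95th pct response time (s)", 10.0, False),
    PocRequirement("concurrent_users", "sessions without degradation", 50),
]

def evaluate(observed: dict) -> dict:
    """Score observed POC results against the thresholds agreed beforehand."""
    results = {}
    for r in poc_plan:
        value = observed[r.name]
        results[r.name] = (value >= r.threshold if r.higher_is_better
                           else value <= r.threshold)
    return results

print(evaluate({"nightly_load": 2_500_000,
                "ad_hoc_query": 12.0,
                "concurrent_users": 60}))
# The ad hoc query requirement fails: 12 s exceeds the agreed 10 s ceiling.
```

The point is not the code but the discipline: every criterion is written down and measurable before the POC starts, so "passing" means something.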

The organization mentioned above chose this approach; however, because the requirements were not well understood, rather than plan one real-world POC, the company planned several. Each POC was to evaluate one technology layer without regard to any of the related tools and components. One POC tested the various database platforms; another tested the hardware infrastructure; still another tested the ETL process and other support software. Each tool and platform vendor was tasked with performing its own POC given only the most basic instruction - make the "<insert technology component here>" work. There was no guidance as to which platforms, tools or hardware should be mated when designing the POC solution. While such an approach might seem reasonable, it gives no thought to how each hardware platform might shape the design of the application, or to the fact that each database provides different methods for maximizing performance.

Even allowing for the obvious problem of not understanding what the business actually needed, the time and effort required to perform and evaluate so many POCs would be prohibitive and mostly unnecessary - effort better spent creating a full design that could actually be deployed as a viable solution. Each POC was supposedly designed to identify the best database platform, the best hardware platform or the best application mix (custom developed, commercial ETL, etc.), yet each component was reviewed and tested in a separate, unrelated POC. Without any controls or master integration plan, the results created more questions than answers. It soon became obvious that the best hardware, the best database and the best tools did not necessarily equate to the best integrated solution.

Software often performs better on one platform than another: the technology sector is full of strategic partnerships between vendors, and these partnerships often involve codeveloped applications and/or hooks into one another's products. The problem is compounded when benchmarks are performed in a vacuum (one for the database, one for the hardware, one for the software, etc.), because there is no way to accurately assess the inevitable interactions between the components or to measure which combination works best in the given situation. Without knowing which platforms you are designing for, applications must be built generically and cannot be optimized, since each hardware, software and tool choice imposes its own set of design implications.
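A toy numerical sketch makes the flaw concrete. All scores below are invented for illustration; the only claim carried over from the text is that interaction effects make end-to-end performance non-separable, so the winners of isolated per-layer POCs need not combine into the best integrated solution:

```python
# Hypothetical scores. "isolated" is what separate per-layer POCs report;
# "integrated" is the end-to-end throughput of each actual pairing,
# interactions (partnerships, hooks, codeveloped features) included.
isolated = {
    "db": {"db_A": 90, "db_B": 85},
    "hw": {"hw_X": 80, "hw_Y": 88},
}
integrated = {
    ("db_A", "hw_X"): 70,
    ("db_A", "hw_Y"): 40,   # db_A lacks hooks into hw_Y: poor interaction
    ("db_B", "hw_X"): 85,
    ("db_B", "hw_Y"): 75,
}

# Combining the per-layer winners vs. picking the best actual combination.
per_layer_pick = (max(isolated["db"], key=isolated["db"].get),
                  max(isolated["hw"], key=isolated["hw"].get))
best_pair = max(integrated, key=integrated.get)

print(per_layer_pick)                # ('db_A', 'hw_Y') - isolated POC winners
print(integrated[per_layer_pick])    # 40 - worst integrated result of all
print(best_pair)                     # ('db_B', 'hw_X') - actual best pairing
```

Each component "won" its own POC honestly; the combination of winners is still the worst deployable system in the table.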

Not surprisingly, every vendor managed to pass the initial POC benchmark requirements, and the company had no clue how to move forward, since each vendor had used a different combination of hardware, software and tools. Without a formal test plan or a stringent requirement that the POC environment mirror the conceptual production environment, the client knew no more at the end of the long, drawn-out process than they did at the start - only that many possible combinations would meet or exceed their stated benchmark goals.

A further consequence of performing many unrelated POCs is that an organization cannot adopt the architecture of any one vendor, since each component was tested in a vacuum. There is no way to determine whether the "best" component from each POC would truly satisfy the business requirements for an "integrated" solution.

In summary, companies considering the purchase of new technology must ask themselves critical questions. What are the business requirements, and what technical requirements follow from them? What interactions and technical integration are required? How will the hardware and software work together to solve the technical project requirements? Is my proposed solution the best for my needs, judged against measurable criteria with measurable outcomes?
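One common way to make that last question answerable is a weighted-criteria scoring matrix for candidate integrated solutions. The sketch below is purely illustrative - the criteria, weights and scores are invented, and the column only insists that selection rest on measurable criteria with measurable outcomes:

```python
# Hypothetical weighted-criteria matrix. Criteria, weights and scores are
# invented for illustration; real weights come from the business requirements.
weights = {
    "meets_business_reqs": 0.4,
    "integration_fit": 0.3,        # how well the pieces work together
    "reference_architecture": 0.2, # proven, previously implemented design
    "cost": 0.1,
}

candidates = {
    "solution_1": {"meets_business_reqs": 9, "integration_fit": 8,
                   "reference_architecture": 7, "cost": 5},
    "solution_2": {"meets_business_reqs": 6, "integration_fit": 9,
                   "reference_architecture": 9, "cost": 8},
}

def weighted_score(scores: dict) -> float:
    """Sum of score * weight over every criterion; weights must total 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[c] * scores[c] for c in weights)

ranked = sorted(candidates, key=lambda s: weighted_score(candidates[s]),
                reverse=True)
print(ranked)  # ['solution_1', 'solution_2']
```

The weighting forces the selection team to state, before scoring, how much each requirement actually matters - the opposite of choosing whichever vendor demos best.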

The ability to effectively integrate hardware and software to solve very specific issues is one of the critical success factors of any major project. IT groups should require vendors to provide truly "integrated" solutions based on tried and tested reference architectures - architectures that have been implemented before and that demonstrably solved a set of technical criteria similar to the one at hand. In addition, companies need a sound methodology in their selection process. Those tasked with selection must also be willing to forego decisions based solely on past experience, as the DBMS and software markets have evolved greatly, giving today's buyers choices that truly solve complex and specialized technical requirements.

In order to answer all of these critical questions, a company must have a detailed technical understanding of what it wants and needs to accomplish. If it lacks the internal subject matter expertise, it must be willing to go outside and secure the proper resources as part of the project. Cost pressures, attempts to limit budgets and a lack of understanding of the true users' needs are almost guaranteed to produce either failure or undesirable long-term results. IT groups that move forward without the analysis and without the expertise must then face the two most devastating project results: significant cost overruns and unhappy end users.



Greg Mancuso and Al Moreno are principals with Sinecon, a business intelligence consultancy specializing in data integration and BI/DW solution architecture design. Together they have more than 29 years of data warehouse and business intelligence experience and have implemented many large-scale solutions in both the U.S. and European markets. They may be reached at gmancuso@sinecon-llc.com or amoreno@sinecon-llc.com.


SourceMedia (c) 2006 DM Review and SourceMedia, Inc. All rights reserved.