Information Strategy:
The Master Data Management Challenge

Column published in DM Review Magazine, May 2005 issue
By Jane Griffin

In my last column, I talked about building a culture of data quality. This month I would like to talk about implementing an enterprise data quality initiative. The best place to begin a data quality initiative is with the data that tells you how much money you're making and how much money you're spending - for example, data about your customers, vendors and products. This is your organization's master data. It's also the most valuable nonmonetary asset your organization owns.

Let's start with a more precise definition of master data. Master data can be defined as the information required to create and maintain an enterprise-wide "system of record" for your core business entities, in order to capture business transactions and measure results for these entities. Obviously, this master data spans all organizational business functions.

For example, in the sales, marketing and customer service functions, master data can consist of customer numbers, service codes, warranty information, distribution information and partner information. In the supply chain function, master data might include information on products, item codes and suppliers. In the finance function, master data might include information on cost centers, department codes and company hierarchies. As you can see, master data is really the core data about your company.

Despite its importance, master data is often woefully inconsistent across the company. Data values that should uniquely describe entities are often different in different business units. Relationships between entities such as products and suppliers are often defined differently in different information systems. Numerous hierarchies also exist for master data entities. Further, identifiers that are supposed to be unique to each entity are sometimes either used multiple times or used incorrectly.

The cost of this inconsistency can be enormous. For example, inconsistent data can lead to operational inefficiencies and higher manufacturing costs. Most critical these days, however, is the fact that inconsistent data can hamper your company's ability to comply with federal regulations such as Sarbanes-Oxley.

Obviously, it's crucial that master data be managed effectively. Fortunately, there are established methodologies out there to help you implement a master data management (MDM) initiative. Most of them will probably work, but any methodology you choose should enable you to work with your data from the source systems all the way through to the end user. It should also enable you to manage data quality effectively for the long term.

The first step in the process is to select the initial entities that you want to include in your MDM initiative. At a minimum, you should include entities related to your customers, suppliers, products and finances. Once you have selected the initial entities to include in the MDM initiative, the next step should be to analyze each entity across your information systems for accuracy, completeness, structure, business rules compliance and uniformity.

As far as accuracy is concerned, let's say that the "product code" field should only contain numbers, but in some instances contains text. If this is the case, you probably have a data accuracy problem. Data completeness measures how consistently a field is actually populated and used. For example, if most of your systems use the term "item code" rather than "product code," so that "product code" appears in only 5 to 10 percent of your systems, it's probably better to standardize all systems on "item code."
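The two checks above can be sketched in a few lines of Python. This is a minimal, illustrative profile, not a production tool; the field names and sample records are hypothetical:

```python
# Hypothetical sample records extracted from one source system.
records = [
    {"product_code": "10042", "item_code": "10042"},
    {"product_code": "ABC-7", "item_code": "10077"},  # accuracy problem: text in a numeric field
    {"product_code": None,    "item_code": "10101"},  # completeness problem: field not populated
]

def accuracy_violations(records, field):
    """Return records whose field value is present but not purely numeric."""
    return [r for r in records
            if r[field] is not None and not str(r[field]).isdigit()]

def completeness(records, field):
    """Fraction of records in which the field is actually populated."""
    populated = sum(1 for r in records if r[field] not in (None, ""))
    return populated / len(records)

print(len(accuracy_violations(records, "product_code")))  # 1
print(completeness(records, "item_code"))                 # 1.0
```

In practice a data profiling tool runs checks like these against every candidate master data field and reports the violation counts and fill rates side by side.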

Data structure analysis focuses on examining how the various entities in your information systems are related to one another. For instance, if the way you identify entities such as "customer" or "product code" is not unique in each instance of that entity (perhaps two products can have the same code in different systems), you have a data structure problem.
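One simple structure check is to look for identifiers that exist in more than one system but point at different products. A rough sketch, with hypothetical system extracts:

```python
# Hypothetical (code -> product name) mappings from two source systems.
system_a = {"10042": "Widget, large", "10077": "Gasket"}
system_b = {"10042": "Bracket", "10099": "Bolt"}  # "10042" names a different product

def conflicting_codes(a, b):
    """Codes present in both systems that identify different products --
    a structure problem: the identifier is not unique across systems."""
    return sorted(code for code in a.keys() & b.keys() if a[code] != b[code])

print(conflicting_codes(system_a, system_b))  # ['10042']
```

Every code this check flags is a candidate for remediation before the entity can serve as an enterprise-wide system of record.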

Analyzing an entity for business rules compliance simply means asking whether information is entered into the system the way your company's defined business rules dictate. Finally, uniformity analysis focuses on determining whether the same data item has the same form from system to system.
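Both kinds of analysis reduce to testable predicates. The sketch below assumes a made-up business rule (international customer records must carry a two-letter country code) and a made-up uniformity target (one agreed phone number format); the rule, fields and sample data are illustrative only:

```python
import re

def rule_violations(records):
    """Records that break the (hypothetical) rule: international customers
    must have a two-letter uppercase country code."""
    return [r for r in records
            if r["international"] and not re.fullmatch(r"[A-Z]{2}", r.get("country", ""))]

def uniform_format(values, pattern):
    """Uniformity check: True if every value matches one agreed-upon format."""
    return all(re.fullmatch(pattern, v) for v in values)

records = [
    {"international": True,  "country": "DE"},
    {"international": True,  "country": "Germany"},  # breaks the rule
    {"international": False, "country": ""},
]
phone_a = ["404-555-0142", "404-555-0199"]  # system A: dashed form
phone_b = ["(404) 555-0142"]                # system B: a different form

print(len(rule_violations(records)))                            # 1
print(uniform_format(phone_a + phone_b, r"\d{3}-\d{3}-\d{4}"))  # False
```

A failed uniformity check like the phone number example is usually resolved by picking one canonical format and transforming the other systems' values to match it.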

Now for the part that makes an MDM initiative different from other data quality initiatives: these analyses are performed not just on one system, but across the company's entire information architecture. This cross-system analysis helps ensure that your company's data is accurate, uniform and complete, giving you the ability to get "one single version of the truth" from your company's information systems. Once you have profiled your enterprise master data, what you do next should be supported by an enterprise data governance process and a data management architecture. We'll cover those topics in future columns.
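The cross-system step can be pictured as comparing the same entity, field by field, across every system that holds it and reporting where the versions disagree. A minimal sketch, assuming three hypothetical system extracts keyed by a shared customer number:

```python
# Hypothetical extracts of the same customer entity from three systems.
crm     = {"C-100": {"name": "Acme Corp",  "city": "Atlanta"}}
billing = {"C-100": {"name": "ACME Corp.", "city": "Atlanta"}}
orders  = {"C-100": {"name": "Acme Corp",  "city": "Marietta"}}

def cross_system_conflicts(key, *systems):
    """For one entity, report every field whose value differs across systems."""
    fields = set().union(*(s[key].keys() for s in systems))
    return {f: sorted({s[key].get(f) for s in systems})
            for f in fields
            if len({s[key].get(f) for s in systems}) > 1}

conflicts = cross_system_conflicts("C-100", crm, billing, orders)
print(sorted(conflicts))  # the conflicting fields: ['city', 'name']
```

Each conflict the report surfaces is exactly the kind of disagreement that a governance process must then resolve: which value is the system of record, and how the other systems get corrected.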

The old axiom, "garbage in, garbage out," has never been truer. For this reason, it's critically important that your data be of the highest quality.



Jane Griffin is a Deloitte Consulting partner. Griffin has designed and built business intelligence solutions and data warehouses for clients in numerous industries. She may be reached via e-mail at janegriffin@deloitte.com

SourceMedia (c) 2006 DM Review and SourceMedia, Inc. All rights reserved.