Published in DM Review in May 2005.

Information Strategy: The Master Data Management Challenge

by Jane Griffin

In my last column, I talked about building a culture of data quality. This month I would like to talk about implementing an enterprise data quality initiative. The best place to begin a data quality initiative is with the data that tells you how much money you're making and how much money you're spending - for example, data about your customers, vendors and products. This is your organization's master data. It's also the most valuable nonmonetary asset your organization owns.

Let's start with a more precise definition of master data. Master data can be defined as the information required to create and maintain an enterprise-wide "system of record" for your core business entities, in order to capture business transactions and measure results for these entities. Obviously, this master data spans all organizational business functions.

For example, in the sales, marketing and customer service functions, master data can consist of customer numbers, service codes, warranty information, distribution information and partner information. In the supply chain function, master data might include information on products, item codes and suppliers. In the finance function, master data might include information on cost centers, department codes and company hierarchies. As you can see, master data is really the core data about your company.

Despite its importance, master data is often woefully inconsistent across the company. Data values that should uniquely describe an entity often differ from one business unit to another. Relationships between entities, such as products and suppliers, are often defined differently in different information systems. Multiple, conflicting hierarchies often exist for the same master data entity. Further, identifiers that are supposed to be unique to each entity are sometimes either reused for different entities or used incorrectly.

The cost of this inconsistency can be enormous. For example, inconsistent data can lead to operational inefficiencies and higher manufacturing costs. Most critical these days, however, is the fact that inconsistent data can hamper your company's ability to comply with federal regulations such as Sarbanes-Oxley.

Obviously, it's crucial that master data be managed effectively. Fortunately, there are established methodologies out there to help you implement a master data management (MDM) initiative. Most of them will probably work, but any methodology you choose should enable you to work with your data from the source systems all the way through to the end user. It should also enable you to manage data quality effectively for the long term.

The first step in the process is to select the initial entities that you want to include in your MDM initiative. At a minimum, you should include entities related to your customers, suppliers, products and finances. Once you have selected the initial entities to include in the MDM initiative, the next step should be to analyze each entity across your information systems for accuracy, completeness, structure, business rules compliance and uniformity.

As far as accuracy is concerned, suppose the "product code" field should contain only numbers, but in some instances contains text. If so, you probably have a data accuracy problem. Data completeness concerns how consistently a field is actually populated and used. For example, if most of your systems use the field "item code" and only 5 to 10 percent use "product code," it's probably better to standardize all systems on "item code."
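The two checks just described can be sketched in a few lines of code. This is a minimal illustration, not a production profiler; the record layout and field names ("item_code," "supplier") are hypothetical examples, not taken from any particular system.

```python
# Hypothetical extract of records to profile.
records = [
    {"item_code": "10042", "supplier": "Acme"},
    {"item_code": "A-17",  "supplier": "Acme"},  # text in a numeric field
    {"item_code": "10097", "supplier": None},    # missing value
]

def accuracy(records, field, is_valid):
    """Share of populated values that satisfy the validity rule."""
    values = [r[field] for r in records if r.get(field) is not None]
    return sum(is_valid(v) for v in values) / len(values)

def completeness(records, field):
    """Share of records in which the field is populated at all."""
    return sum(r.get(field) is not None for r in records) / len(records)

# "Item codes should contain only digits" is the accuracy rule here.
print(accuracy(records, "item_code", str.isdigit))  # 2 of 3 values pass
print(completeness(records, "supplier"))            # 2 of 3 records populated
```

In practice you would run checks like these against full extracts from each source system and track the scores over time.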

Data structure analysis focuses on examining how the various entities in your information systems are related to one another. For instance, if the identifiers for entities such as "customer" or "product" are not unique across instances of that entity (perhaps two different products can carry the same code in different systems), you have a data structure problem.
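A uniqueness check of this kind is straightforward to sketch. The system names and product records below are invented for illustration; the point is simply to flag any code that refers to more than one distinct thing across systems.

```python
from collections import defaultdict

def clashing_codes(extracts):
    """Return codes that map to more than one distinct product name
    across system extracts -- a structural uniqueness violation."""
    names_by_code = defaultdict(set)
    for rows in extracts.values():
        for row in rows:
            names_by_code[row["code"]].add(row["name"])
    return {code: names for code, names in names_by_code.items()
            if len(names) > 1}

# Hypothetical extracts: the same code identifies two different products.
extracts = {
    "erp": [{"code": "P100", "name": "Widget"},
            {"code": "P200", "name": "Gasket"}],
    "crm": [{"code": "P100", "name": "Valve"}],
}

print(clashing_codes(extracts))  # P100 names two different products
```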

Analyzing an entity for business rules compliance simply means asking whether information is entered into the system the way your company's defined business rules dictate it should be. Finally, uniformity analysis focuses on determining whether or not the same data item takes the same form from system to system.
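Both checks reduce to simple predicates over field values. The cost-center format below is a made-up rule for illustration; substitute whatever rules your company has actually defined.

```python
import re

# Hypothetical business rule: cost-center codes are "CC-" plus four digits.
COST_CENTER_RULE = re.compile(r"^CC-\d{4}$")

def rule_violations(values, rule):
    """Values that fail the defined business rule."""
    return [v for v in values if not rule.match(v)]

def is_non_uniform(values):
    """True if the same data item appears in more than one form."""
    return len(set(values)) > 1

print(rule_violations(["CC-1001", "1002", "CC-99"], COST_CENTER_RULE))
# -> ['1002', 'CC-99']

print(is_non_uniform(["Y", "YES", "true"]))  # three spellings of one flag
# -> True
```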

Now for the part that makes an MDM initiative different from other data quality initiatives: these analyses are performed not on one system alone, but across the company's entire information architecture. By performing this cross-system analysis, you will help ensure that your company's data is accurate, uniform and complete. This will give you the ability to get "one single version of the truth" from your company's information systems. Once you have profiled your enterprise master data, what you do next should be supported by an enterprise data governance process and a data management architecture. We'll cover those topics in future columns.
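To make the cross-system idea concrete, here is a minimal sketch of the "single version of the truth" question: do all systems agree on the master record for one entity? The system names, customer key and fields are hypothetical.

```python
def versions_of_truth(systems, key):
    """Collect each system's record for the same entity key."""
    return {name: records.get(key) for name, records in systems.items()}

# Hypothetical: two systems hold slightly different records for customer C-42.
systems = {
    "crm":     {"C-42": {"name": "ACME Corp.",       "country": "US"}},
    "billing": {"C-42": {"name": "Acme Corporation", "country": "US"}},
}

print(versions_of_truth(systems, "C-42"))
# Two different names for one customer: not yet a single version of the truth.
```

Reconciling disagreements like this one is precisely where the governance process comes in: someone must decide which version is authoritative.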

The old axiom, "garbage in, garbage out," has never been truer. For this reason, it's critically important that your data be of the highest quality.

Jane Griffin is a Deloitte Consulting partner. Griffin has designed and built business intelligence solutions and data warehouses for clients in numerous industries. She may be reached via e-mail at

Copyright 2006, SourceMedia and DM Review.