Meta Data & Knowledge Management:
Revisiting the Top 10 Meta Data Repository Mistakes, Part 1

Column published in DM Review Magazine, May 2002 Issue

By David Marco

In March and April of 1999, I wrote a two-part column for DM Review regarding the top 10 mistakes to avoid when building a meta data repository. Now that it is 2002, it is time to update this list to reflect the changes that have occurred in the meta data repository market space. This month we will examine the first five mistakes on our list.

1. Not Defining the Tangible Business and Technical Objectives of the Meta Data Repository. This is the top mistake that most companies make. Quite often, the meta data repository team neglects to clearly define the specific business and technical value that their meta data repository will provide. It is critical to define these objectives up front, as they will guide all subsequent project activity. When selling the concept of meta data to your corporation's senior management, remember that there are only two things they understand: increasing revenues and decreasing costs. If you are not talking about one of those two, you are the IT version of the schoolteacher in the old Peanuts cartoons: blah, blah, blah, blah, blah.

Clear business and technical objectives are definable and measurable. This activity is imperative because, once the meta data repository is completed, the management team will have to justify the initiative's expenditures. Keep in mind that a meta data repository, like a data warehouse, is not a project; it is a process. The repository will need to grow to support the ever-expanding role of the data warehouse/data marts and operational systems it serves. In addition, as business users become more sophisticated, their demands will increase substantially. Once a cost justification can be quantified for the initial release of the repository, gaining funding for follow-up releases becomes much simpler.
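To make the point concrete, here is a minimal sketch (in Python) of what "definable and measurable" can look like when each objective is written down with an explicit metric, baseline and target, so it can be revisited when the cost justification is due. The objective names and numbers are hypothetical placeholders, not figures from this column.

# Hypothetical example: recording repository objectives so they stay measurable.
from dataclasses import dataclass

@dataclass
class RepositoryObjective:
    name: str        # what the repository is expected to deliver
    metric: str      # how success will be measured
    baseline: float  # the value before the repository exists
    target: float    # the value the team commits to after go-live

objectives = [
    RepositoryObjective("Reduce redundant impact analysis",
                        "analyst hours per warehouse release", 120.0, 40.0),
    RepositoryObjective("Shorten source-to-target research",
                        "days to document a new source system", 10.0, 3.0),
]

for obj in objectives:
    print(f"{obj.name}: {obj.metric}, {obj.baseline} -> {obj.target}")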

2. Examining Meta Data Tools Before Defining Requirements. It is surprising how often I receive calls from companies asking me to suggest a meta data tool for their repository project. My standard response is, "What are your repository's requirements?" Typically, the reply from the other end of the line is silence. This is a troubling situation. The meta data repository requirements must guide the tool selection process, not follow it. When a tool is selected before requirements are defined, the requirements are quite often bent to match the tool's capabilities rather than to solve business problems.

As we discussed, clear requirements for the meta data project are critical as they provide the lighthouse for all subsequent project activities. Without this beacon, it becomes all too probable that the project's course will go awry.

3. Selecting a Meta Data Tool Without Conducting an Evaluation. All of the major meta data vendor tools maintain and control the repository in a different manner. Finding the tool that best suits your company requires careful analysis. Educated consumers will be the most satisfied because they understand exactly what they're buying and what they're not buying.

Remember that whichever tool is purchased, not one of them will make meta data integration "easy," regardless of the marketing materials or salesperson's hype. A successful meta data project requires knowledge, discipline, talented employees and good old-fashioned hard work, just like any other major IT endeavor. While none of the tools eliminate these needs, for some companies it is better to purchase a tool and work around its limitations, as opposed to building everything from scratch.
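One simple way to structure such an evaluation is a weighted scorecard that ties each candidate tool back to the requirements defined under mistake number two. The sketch below shows only the mechanics; the criteria, weights, vendor names and scores are hypothetical placeholders, not recommendations from this column.

# Hypothetical weighted scorecard for comparing candidate meta data tools.
criteria_weights = {
    "integration with our ETL and modeling tools": 0.35,
    "meta model extensibility": 0.25,
    "business user access and reporting": 0.25,
    "total cost of ownership": 0.15,
}

# Scores (0-5) assigned during a hands-on evaluation of each candidate tool.
candidate_scores = {
    "Tool A": {"integration with our ETL and modeling tools": 4,
               "meta model extensibility": 3,
               "business user access and reporting": 4,
               "total cost of ownership": 2},
    "Tool B": {"integration with our ETL and modeling tools": 3,
               "meta model extensibility": 5,
               "business user access and reporting": 3,
               "total cost of ownership": 4},
}

for tool, scores in candidate_scores.items():
    weighted = sum(criteria_weights[c] * s for c, s in scores.items())
    print(f"{tool}: weighted score {weighted:.2f} out of 5")

Even a basic scorecard such as this forces the team to articulate exactly what it is buying, and what it is not buying, before the purchase order is cut.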

4. Not Creating a Meta Data Repository Team. Very often, companies neglect to form a dedicated meta data repository team. The team should be responsible for maintaining, controlling and providing access into the meta data repository. The typical meta data repository team at full staff will consist of one or two data modelers, two meta data integration developers, two meta data access developers, one or two business analysts, a meta data repository architect and a project leader. Keep in mind that some of the roles can be fulfilled by the same resource, depending on the size and schedule of the effort.

It is important for the meta data repository project leader to report to the same person as the head of the business intelligence system. This creates a peer-level relationship between the meta data repository and the data warehouse team leaders. The meta data repository team and the business intelligence team must work together because each team's work directly impacts the other. Flawed or muddled data warehouse architecture will directly impact the quality of the meta data repository. Conversely, a poorly designed repository will greatly reduce the effectiveness of the data warehouse.

5. Having Too Many Manual Processes in the Meta Data Integration Architecture. The process for loading and maintaining the meta data repository needs to be as automated as possible. Less-than-successful meta data implementations typically contain far too many manual processes in their integration architectures. The task of manually keying meta data becomes much too time-consuming for the meta data repository team. With careful analysis and some development effort, the vast majority of these manual processes can be removed.

Often, much of the business meta data will require some sort of manual activity just to capture the information. Additional processes will most likely need to be developed to allow the business leaders and analysts to modify the business meta data. Unfortunately, some companies manually key a great deal of their business meta data, which makes the repository non-scalable, stale and impossible to maintain over time.
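To illustrate the kind of automation mistake number five calls for, the sketch below harvests technical meta data (tables, columns, data types) directly from a database catalog instead of having someone re-key it. SQLite is used only so the example is self-contained and runnable; the table, columns and repository details are hypothetical.

# Hypothetical sketch: harvest technical meta data from a database catalog
# rather than keying it by hand.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (customer_id INTEGER, customer_name TEXT, region TEXT)")

harvested = []
tables = conn.execute("SELECT name FROM sqlite_master WHERE type = 'table'").fetchall()
for (table_name,) in tables:
    # PRAGMA table_info returns (cid, name, type, notnull, default, pk) per column.
    for _cid, col_name, col_type, *_rest in conn.execute(f"PRAGMA table_info({table_name})"):
        harvested.append({"table": table_name, "column": col_name, "type": col_type})

# These rows would be loaded into the repository's technical meta data tables;
# business definitions are then added by stewards rather than re-keyed.
for row in harvested:
    print(row)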

Next month's column will revisit the remaining mistakes.

David Marco is an internationally recognized expert in the fields of enterprise architecture, data warehousing and business intelligence and is the world's foremost authority on meta data. He is the author of Universal Meta Data Models (Wiley, 2004) and Building and Managing the Meta Data Repository: A Full Life-Cycle Guide (Wiley, 2000). Marco has taught at the University of Chicago and DePaul University, and in 2004 he was selected to the prestigious Crain's Chicago Business "Top 40 Under 40."  He is the founder and president of Enterprise Warehousing Solutions, Inc., a GSA schedule and Chicago-headquartered strategic partner and systems integrator dedicated to providing companies and large government agencies with best-in-class business intelligence solutions using data warehousing and meta data repository technologies. He may be reached at (866) EWS-1100 or via e-mail at DMarco@EWSolutions.com.
