Meta Data & Knowledge Management:
Managed Meta Data Environment: A Complete Walk-Through, Part 6

Column published in DM Review Magazine, September 2004 Issue

By David Marco

This column is adapted from the book Universal Meta Data Models by David Marco & Michael Jennings (John Wiley & Sons).

In the last several columns, I presented the six major components of a managed meta data environment (MME): meta data sourcing layer, meta data integration layer, meta data repository, meta data management layer, meta data marts and meta data delivery layer. This installment will discuss the MME's fourth component, the meta data management layer.

The meta data management layer provides systematic management of the meta data repository and the other MME components (see Figure 1). As with the other layers, the approach to this component differs greatly depending on whether a meta data integration tool is used or the entire MME is custom-built. If an enterprise meta data integration tool is used to construct the MME, a meta data management interface is most likely built into the product; if it is not, you will need to build one yourself. The meta data management layer performs the following functions: archive, backup, database modifications, database tuning, environment management, job scheduling, load statistics, purging, query statistics, query and report generation, recovery, security processes, source mapping and movement, user interface management and versioning.

Archive. The archive function allows the meta data architect to set the criteria or event that triggers the MME archiving process. It should be able to archive all of the meta data in the meta data repository and the related meta data marts, and it should allow for specific meta data tables to be archived when necessary.
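
As a rough illustration of what an archive routine might look like, the following Python sketch moves rows older than a retention window from a repository table into a companion archive table. It assumes a SQLite-style repository and a load_date column; the table names and retention criteria are placeholders, not part of any particular meta data integration tool.

# Hypothetical archive sketch: table names, the load_date column and the
# retention window are illustrative assumptions.
import sqlite3
from datetime import datetime, timedelta

def archive_table(conn: sqlite3.Connection, table: str, retention_days: int = 365) -> int:
    """Move rows older than the retention window into <table>_archive."""
    cutoff = (datetime.now() - timedelta(days=retention_days)).strftime("%Y-%m-%d")
    cur = conn.cursor()
    # Create the archive table on first use, mirroring the source structure.
    cur.execute(f"CREATE TABLE IF NOT EXISTS {table}_archive AS SELECT * FROM {table} WHERE 0")
    # Copy old rows into the archive, then remove them from the active table.
    cur.execute(f"INSERT INTO {table}_archive SELECT * FROM {table} WHERE load_date < ?", (cutoff,))
    cur.execute(f"DELETE FROM {table} WHERE load_date < ?", (cutoff,))
    conn.commit()
    return cur.rowcount  # number of rows archived in this run

The same routine could be run against every table in the repository and its meta data marts, or against specific tables when necessary.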

Backup. Backup functionality is often confused with archiving. Archiving targets the storage of older and less-needed versions of the MME, while backups refer to the process of making sure the current MME is stored in a separate database so that a backup version can be brought online if the production version of the MME is corrupted or if any of its components fail. Often, the backup strategy is implemented at a hardware level through the use of disk mirroring. Best practices in this area include storing the copy in a different physical location than the production version of the MME.
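
For a small, file-based repository, a backup can be as simple as taking a consistent online copy of the database. The sketch below assumes SQLite purely for illustration; a production MME would more likely rely on the RDBMS's native backup facilities or hardware-level disk mirroring, with the copy written to a separate physical location.

# Minimal backup sketch, assuming a file-based (SQLite) repository; paths are
# placeholders. A production MME would normally use the RDBMS's own tooling.
import sqlite3

def backup_repository(source_path: str, backup_path: str) -> None:
    """Copy the live repository database into a separate backup database."""
    src = sqlite3.connect(source_path)
    dst = sqlite3.connect(backup_path)
    try:
        src.backup(dst)  # consistent online copy, even while readers are active
    finally:
        src.close()
        dst.close()

# backup_repository("mme_repository.db", "/offsite/mme_repository_backup.db")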

Database Modifications. Because the meta model is implemented in an open, relational database, often tables and columns within the meta model need to be added, modified or deleted. The meta data management layer needs to not only assist in this process, but also track the changes that have been made to the MME.
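
One way to both assist with and track meta model changes is to route every DDL change through a helper that records it in a change log. The sketch below is illustrative only: the mme_schema_change_log table and its columns are invented for the example.

# Illustrative sketch: apply a meta model change and record it in a change log.
# The mme_schema_change_log table and its columns are assumptions.
import sqlite3
from datetime import datetime

def apply_meta_model_change(conn: sqlite3.Connection, ddl: str, author: str, reason: str) -> None:
    """Execute a DDL change against the meta model and log who made it and why."""
    cur = conn.cursor()
    cur.execute(
        """CREATE TABLE IF NOT EXISTS mme_schema_change_log (
               applied_at TEXT, applied_by TEXT, reason TEXT, ddl_text TEXT)"""
    )
    cur.execute(ddl)  # e.g. "ALTER TABLE data_element ADD COLUMN steward TEXT"
    cur.execute(
        "INSERT INTO mme_schema_change_log VALUES (?, ?, ?, ?)",
        (datetime.now().isoformat(), author, reason, ddl),
    )
    conn.commit()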

Database Tuning. Tuning the meta data repository and its associated meta data marts is a very important part of the meta data management layer. First, a sound indexing strategy ensures that reports run efficiently. When analyzing physical meta model structures, it is common to see indexes only on primary keys, which is typically a sign of a missing or poor indexing strategy. Second, database tuning helps you identify and remove dormant meta data within the repository. A large MME that has been in production for a few years commonly contains a good deal of dormant meta data. A sound MME will itself contain meta data providing operational statistics on how the MME's meta data is used, which assists in identifying dormant meta data.
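
If the MME records statistics about its own use, dormant meta data can be flagged with a query along these lines. The mme_meta_tables and mme_query_statistics tables, their columns and the six-month window are assumptions for illustration, not a standard meta model.

# Hypothetical dormancy check: the statistics tables, columns and the 180-day
# window are assumptions.
import sqlite3

DORMANT_SQL = """
SELECT t.table_name
FROM   mme_meta_tables AS t
LEFT JOIN mme_query_statistics AS q
       ON q.table_name = t.table_name
      AND q.query_date >= date('now', '-180 days')
GROUP BY t.table_name
HAVING COUNT(q.table_name) = 0   -- no recorded queries in the last six months
"""

def find_dormant_meta_data(conn: sqlite3.Connection) -> list[str]:
    """Return meta data tables with no recorded use in the last six months."""
    return [row[0] for row in conn.execute(DORMANT_SQL)]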


Figure 1: Meta Data Management Layer

Environment Management. Many meta data professionals make the mistake of believing that when they implement an MME, they are implementing and maintaining one system. In reality, they are building and maintaining three (possibly more) systems: production, testing and development.

The production version of the MME is the system that resides in the "production environment" of an organization and is the version that the end users access. The testing version is used to test fixes for "bugs" found in the production version of the MME. The development version of the MME is used to test future major MME enhancements.

The names and number of MME environments differ based on the organization's internal IT standards; however, the three environments mentioned here are the most common. In any event, a good meta data management layer can handle any required number of environments and names. The environment management portion of the meta data management layer needs to organize and control the management and migration between these three system versions.
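A simple way to keep the environments explicit, and to keep migration scripts from accidentally targeting the wrong one, is to name them in configuration. The sketch below assumes file-based connection strings; the environment names mirror the three discussed above.

# Sketch only: environment names follow the column; the connection strings are
# placeholders for whatever the organization's IT standards dictate.
import sqlite3

MME_ENVIRONMENTS = {
    "development": "mme_dev.db",   # future major enhancements are built here
    "testing":     "mme_test.db",  # fixes for production bugs are verified here
    "production":  "mme_prod.db",  # the version end users access
}

def connect(environment: str) -> sqlite3.Connection:
    """Open a connection to the named MME environment."""
    return sqlite3.connect(MME_ENVIRONMENTS[environment])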

Job Scheduling. The program and process jobs that are executed to load and to access the MME need to be scheduled and managed. The job-scheduling portion of the meta data management layer is responsible for both event-based and batch-triggered job scheduling.
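
The two trigger styles can be sketched with nothing more than the Python standard library: a timed (batch) trigger and an event handler that kicks off the same load. The job body, the one-hour delay and the event shape are illustrative assumptions; a real MME would typically use the organization's enterprise scheduler.

# Minimal sketch of batch-triggered vs. event-based job scheduling using the
# standard library; the job body and timing are placeholders.
import sched
import time

scheduler = sched.scheduler(time.time, time.sleep)

def load_meta_data_repository() -> None:
    print("running meta data extraction and integration jobs")

# Batch-triggered: run the load after a fixed delay (stand-in for a nightly window).
scheduler.enter(60 * 60, 1, load_meta_data_repository)

# Event-based: run the same load as soon as an upstream event arrives.
def on_source_model_changed(event: dict) -> None:
    load_meta_data_repository()

# scheduler.run()  # start the batch scheduler loop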

Load Statistics. The meta data extraction and integration layers of the MME generate a great deal of valuable loading statistics. These load statistics need to be stored historically within the meta data repository portion of the MME (a hypothetical table layout is sketched after the list below). Examples of the most common types of load statistics include:

  • How long did it take a particular process thread to run (clock time and CPU time)?
  • How long did the entire meta data extraction and integration layers take to run (both clock and CPU time)?
  • What errors were encountered in the meta data extraction and integration layers?
  • What were the categories (e.g., informational, warning, severe, critical, etc.) of the errors that were logged?
  • How many rows were inserted, changed or deleted in each table of the meta model?
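
One possible repository table for holding these statistics is sketched below; the table and column names are assumptions chosen to match the questions in the list, not part of a published meta model.

# Assumed layout for historical load statistics; names are illustrative only.
import sqlite3

LOAD_STATISTICS_DDL = """
CREATE TABLE IF NOT EXISTS mme_load_statistics (
    run_id          INTEGER,
    process_name    TEXT,     -- extraction or integration process/thread
    clock_seconds   REAL,     -- elapsed wall-clock time
    cpu_seconds     REAL,     -- CPU time consumed
    error_count     INTEGER,
    error_category  TEXT,     -- informational, warning, severe, critical
    rows_inserted   INTEGER,
    rows_updated    INTEGER,
    rows_deleted    INTEGER,
    load_date       TEXT
)
"""

def record_load_statistics(conn: sqlite3.Connection, stats: tuple) -> None:
    """Append one row of load statistics for a repository load run."""
    conn.execute(LOAD_STATISTICS_DDL)
    conn.execute(
        "INSERT INTO mme_load_statistics VALUES (?,?,?,?,?,?,?,?,?,?)", stats
    )
    conn.commit()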

Next month I will finish walking through the functions of the meta data management layer.

...............................................................................

Check out DMReview.com's resource portals for additional related content, white papers, books and other resources.

David Marco is an internationally recognized expert in the fields of enterprise architecture, data warehousing and business intelligence and is the world's foremost authority on meta data. He is the author of Universal Meta Data Models (Wiley, 2004) and Building and Managing the Meta Data Repository: A Full Life-Cycle Guide (Wiley, 2000). Marco has taught at the University of Chicago and DePaul University, and in 2004 he was selected to the prestigious Crain's Chicago Business "Top 40 Under 40."  He is the founder and president of Enterprise Warehousing Solutions, Inc., a GSA schedule and Chicago-headquartered strategic partner and systems integrator dedicated to providing companies and large government agencies with best-in-class business intelligence solutions using data warehousing and meta data repository technologies. He may be reached at (866) EWS-1100 or via e-mail at DMarco@EWSolutions.com.
