DM Review | Covering Business Intelligence, Integration & Analytics
Knowledge: The Essence of Meta Data:
The Meta Data Support Model, Part 2

By R. Todd Stephens, Ph.D.
Column published in DMReview.com, June 16, 2005

Last month we took an in-depth look at the first part of this model, which defines a support approach that can be applied to any technology. This month we will focus on the lower section of the model and how these elements can be applied to the meta data services organization. Figure 1 provides an image of the model described here.

Figure 1: Meta Data Support Model

Focusing on the middle section, the meta data support model asks five basic questions of the technology environment:

  • What meta data products and services are available to me?
  • How can I utilize these products and services within my environment?
  • Who can help me in case I need some professional guidance?
  • Are the meta data applications ready for enterprise usage?
  • How am I doing in comparison to others and against best practices?

Who Can Help Me?

The customer doesn't want to feel alone and without support. Too often, consultants and IT projects come into the environment and leave without letting the customer know how to get further help. The customer support model provides three specific areas where the support team can provide help and assurance: an online environment, a subject matter expert (SME) network and external resources.

Online Environment

Everything is moving online; most business and application functions will be available online in the near future. The meta data support organization needs to take the same approach. The Web allows customers to access information at any time, day or night. Automating processes through the Web creates a self-service environment where the vast majority of business and application functions are accessible 24 hours a day. When you start a meta data business initiative, you will need to review your business processes that impact customers. This is a great time to determine how you can simplify those processes or interactions while providing new services over the Web. This could include incorporating impact analysis programs and other customer-focused initiatives. The Internet lends itself to direct interaction with each and every customer. As your online environment evolves, you will continually refine it, expanding the programs that have been successful and curtailing the ones that have generated little bottom-line return on investment (Ruud & Deutz, 2002).

External Resources

The Web is made up of billions of pages, and there are plenty of resources available to us on the topic of meta data. Bringing in external resources can add substance to your implementation. Many organizations utilize industry experts such as Gartner simply to verify their strategy or support the implementation plans. The academic world publishes an enormous amount of research as well; IEEE and ACM are great sites. Many conferences publish their proceedings online, or, with a simple e-mail, you may be able to request an article from the author. External resources such as books, online articles, subject matter experts and industry leaders are excellent resources for the meta data implementation environment.

SME Network

User groups are independently run, volunteer groups that meet on a regular basis to discuss and share information on a variety of technology topics. Joining a developer or meta data user group is an excellent, inexpensive way to receive technical education and meet with your peers to get more out of the latest platforms, products, technologies and resources. Collaboration tools, blogs, wikis, communities of practice (CoP) and discussion groups are also forms of SME networks. The basic purpose of collaboration is the creation of value in a manner that involves shared efforts. For collaboration to be successful, it must address common objectives, shared resolve and partnership behaviors. This, for most companies, is much harder than it looks. Creating strategic and economic value through collaborative programs requires a balancing act of concepts, processes, resources and behaviors that are tendered by special leadership (Sloan, 2005).

The success of collaboration requires three primary elements. The first and most important is a collaborative culture that recognizes the value of collaboration and rewards those who model collaborative behavior. The second is the establishment of a solid collaboration technology foundation that minimizes choices among similar products but provides the widest range of channels to accommodate varying communication needs within and between business processes. The third is the presence of processes for aligning investments with the business, discovering collaborative opportunities, methodologies for modeling collaborative behavior and integration with planning, to provide perspectives and priorities for investments in collaborative work (Rasmus, 2003).

Is it Ready?

Information technology routinely engages in project work, skunk works and pilot-type projects where the customer isn't really sure if the application is production ready. Meta data groups need to communicate clearly that their products and services are production ready, covered under service level agreements (SLAs) and supported from a hardware and software perspective.


Operations

Operations consist of many different areas, including hardware support, operating system support, licensing, backups, disaster recovery, application support, networking and even audits. The size of the repository application should not impact operations, since processes and procedures should be standardized so that additional applications can be added without reinventing the wheel. One area that is often overlooked is program and data controls, which ensure the quality and integrity of the meta data itself. Input controls should be reviewed on a regular basis, and the data should be audited periodically to identify any gaps in the system.
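As a minimal sketch of what such an input control might look like, the following audits incoming meta data records for gaps. The record layout and field names here are illustrative assumptions, not a real repository schema:

```python
# Illustrative input-control check: flag records missing required fields.
# REQUIRED and the record format are hypothetical examples.

REQUIRED = {"asset_id", "name", "steward"}

def audit(records):
    """Return the ids of records with a 'gap' (a missing or empty required field)."""
    gaps = []
    for rec in records:
        populated = {k for k, v in rec.items() if v}
        if not REQUIRED <= populated:
            gaps.append(rec.get("asset_id", "<unknown>"))
    return gaps

records = [
    {"asset_id": "A1", "name": "CUSTOMER", "steward": "jdoe"},
    {"asset_id": "A2", "name": "", "steward": "mlee"},  # empty name is a gap
]
print(audit(records))  # ['A2']
```

Running a check like this on a schedule gives the audit trail the paragraph above calls for.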

Technical Support

Technical support provides an interface that can answer questions users may have about the meta data environment. Meta data services should provide an extensive list of customer support options, including Web-based and telephone access to technical experts and product updates, so that issues can be resolved quickly and effectively. The technical support group may offer a range of product and industry knowledge and hands-on experience by phone, 24 hours a day. Alternatively, users can take advantage of online support services to receive product documentation, updates, application notes and access to a searchable knowledge base.

How am I Doing?

People want to know how well they are doing in comparison to other groups and against industry-wide benchmarks. Are we utilizing the repository correctly? Are we getting enough value from the repository to defend the investment? Is the repository growing in utility? These are some of the questions the user community is asking.


One service the meta data services group can offer is to review how an organization is actively utilizing the repository within its environment. The repository should be integrated into the day-to-day functions of the business. For example, we can review how the architecture community is utilizing the enterprise asset collection to manage the environment. The architecture environment can act as a governance body for enterprise assets, which requires the organization to know what assets it has and how they are being used. Assets can and should be mapped to a domain model, which also requires an association with the repository environment. Architecture is just one example of an organization that should utilize the enterprise asset catalog. Customers should be able to request a SME review from the meta data group at any time. Assurance that they are doing the right thing can make all the difference in the world.


Metrics have always been an important part of information technology; unfortunately, they are generally an afterthought to the implementation. The natural progression of a system moving through innovation, incubation and migration (or the SDLC of choice) is to eventually measure the impact and value the system brings to the technology portfolio.

In the world of meta data, there is a nearly infinite number of possible metrics to review and capture. Two categories that move to the forefront revolve around content and usage. Content metrics describe what information you have inside the repositories; without considering how the data is used, content focuses on the what. Perhaps the most obvious example of a content metric is the object count. An object count sounds like a simple concept, except that you must first define what counts as an object. In the database world, there are plenty of options to consider when counting objects: entities, attributes, tables, databases, fields and specific components of the meta data descriptors. Other asset types, such as Web services, simplify the object count since the object itself is the service. As with any object, there is a specific meta-model that contains the meta data descriptors used to describe the asset. We can measure the breadth and scope of these meta data elements for each object type, as well as the percentage of completeness of the model itself. Some objects may have an extended meta-model with 20 meta data elements, while others may contain only a few. The number of attachments is another measurement we can take on a specific asset; the thinking here is that objects with extended unstructured documents are better documented than those with only a few attachments. Examples of attachments include logical models, UML models, user guides, installation instructions, etc.
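The meta-model completeness percentage described above can be sketched in a few lines. The asset structure and element names here are hypothetical assumptions for illustration, not a real repository model:

```python
# Hypothetical sketch of a content metric: percentage of meta-model
# elements actually populated for a given asset.

def completeness(asset: dict, meta_model: list) -> float:
    """Return the percentage of meta-model elements that are populated."""
    filled = sum(1 for field in meta_model if asset.get(field))
    return 100.0 * filled / len(meta_model)

meta_model = ["name", "owner", "description", "domain", "steward"]
asset = {"name": "CUSTOMER", "owner": "CRM team", "description": "", "domain": "Sales"}

print(completeness(asset, meta_model))  # 3 of 5 elements populated -> 60.0
```

Tracking this figure per object type makes the "percentage of completeness" metric concrete and trendable over time.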

The other key metric is usage. Remember, you can have all of the content in the world, but without usage you haven't done much more than build a neat inventory. Usage is the key to delivering long-term value to the organization. The first usage metric class focuses on the user. Many Web-based applications utilize three high-level classifications for user traffic. A "hit" is each individual file sent to a browser by the Web server. A "page view" is each time a visitor views a Web page on your site, irrespective of how many hits are generated. Web pages are composed of files: every image in a page is a separate file, so when a visitor looks at a page (i.e., a page view), they may see numerous images, graphics and pictures and generate multiple hits. For example, a page with 10 pictures generates 11 hits when requested (10 for the pictures and one for the HTML file). A page view can contain hundreds of hits, which is why we measure page views and not hits.

Additionally, there is a high potential for confusion here, because there are two types of "hits." The hits discussed in this article are those recorded in log files and interpreted by log analysis. A second type of "hit" is counted and displayed by a simple hit counter, which records one hit every time a Web page is viewed; this is also problematic because it does not distinguish unique visitors. The third traffic class is the visitor: a human being whose actions are "human" events, because only humans navigate the Internet (Opentracker, 2005). We can also track the length of time a person stays on the repository, what time of day is most popular and which day carries the heaviest traffic. These time-based metrics help ensure the repository is up and operational 100 percent of the time during high-traffic periods.
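The hit versus page-view distinction can be illustrated with a toy access log. The log entries and the rule that pages end in ".html" are assumptions made for the example, not a real log format:

```python
# Illustrative sketch of hits vs. page views using a toy access log.
# Every requested file is a hit; only the HTML page itself is a page view.

log = [
    "/index.html", "/logo.gif", "/chart1.png",  # one page view, three hits
    "/assets.html", "/icon.gif",                # one page view, two hits
]

hits = len(log)  # every file request counts as a hit
page_views = sum(1 for path in log if path.endswith(".html"))

print(hits, page_views)  # 5 hits, but only 2 page views
```

This is exactly why the article recommends reporting page views rather than hits: the hit count inflates with page decoration, not with reader interest.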

Now, if we move away from the user and focus attention on the actual page or artifact, other metrics provide insight. We can tell which asset pages are viewed the most and which artifacts have the highest download rates. These simple metrics may alter the way you present artifacts and even generate new classifications. Having links on the repository for most popular, most downloaded or latest additions adds value to the meta data environment. These are defined as usage-based classifications; in other words, use of the repository actually defines the classification of the assets. Assuming your repository has some advanced features, you can measure how many subscriptions each asset has, how many transactions are processed by a component or what the reuse level is within the application. Remember, you can generate any number of metrics, but you should focus only on the ones that can generate action, support the expansion of the brand and help managers understand the environment.
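A usage-based classification such as "most popular" or "most downloaded" is just a ranking over per-asset counters. The asset names and counts below are invented for illustration:

```python
# Hypothetical sketch of usage-based classification: rank repository assets
# by view and download counts to drive "most popular" links.

usage = {
    "customer-model": {"views": 420, "downloads": 35},
    "order-service":  {"views": 180, "downloads": 60},
    "party-schema":   {"views": 95,  "downloads": 12},
}

most_viewed = max(usage, key=lambda a: usage[a]["views"])
most_downloaded = max(usage, key=lambda a: usage[a]["downloads"])

print(most_viewed, most_downloaded)  # customer-model order-service
```

Note that the two rankings can disagree, which is itself useful: an asset that is read often but rarely downloaded may need better packaging rather than better promotion.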

The described model provides a framework for implementing a meta data support group within an enterprise of any size. Most organizations are not going to invest enormous amounts of funds (although they should) into the repository environment. For this reason, we should try to move as many business functions to the online environment as possible.

Perhaps the day will come when meta data investment is standard across the industry, but we still have a ways to go. That is a good thing in the sense that it forces us to continuously review our business model and ensure value is being delivered every day. One instruction I give to my team is that every time we get a question or inquiry, we should review the online environment to see if the question could have been answered by the information posted online. We should always strive to deliver personal support, but including a link in the e-mail reinforces the fact that the information is readily available. Business and application functions should be moved to the online environment as much as possible. Chief executives recognize that technology, effectively planned, implemented and supported, can measurably contribute toward the realization of business objectives. The support model simply allows your meta data world to evolve and produce additional value over time.



R. Todd Stephens, Ph.D. is the director of Meta Data Services Group for the BellSouth Corporation, located in Atlanta, Georgia. He has more than 20 years of experience in information technology and speaks around the world on meta data, data architecture and information technology. Stephens recently earned his Ph.D. in information systems and has more than 70 publications in the academic, professional and patent arena. You can reach him via e-mail at Todd@rtodd.com or to learn more visit http://www.rtodd.com/.


SourceMedia (c) 2006 DM Review and SourceMedia, Inc. All rights reserved.