BI Strategy: Strategies for Enterprise Content Management

By Rich Cohen, online columnist
Column published in DMReview.com, April 13, 2006

In last month's column, I discussed how to integrate enterprise content management (ECM) into your company's existing BI infrastructure. The column was well received, and I have had many requests to delve more deeply into the ECM lifecycle. ECM is also important because it is becoming increasingly difficult to use and interpret structured content - such as database content - without access to the supporting unstructured content that fills in the gaps in the corporate knowledge base. So, in this month's column, I'd like to discuss some useful strategies for managing enterprise content (EC).

To review, the phases of the EC lifecycle (as illustrated in Figure 1) are:

  • Create/contribute/aggregate
  • Manage
  • Assemble
  • Deliver

Figure 1: The EC Life Cycle

Before I discuss ECM strategies, I want to preface the conversation by saying that ECM is an enormous topic. There are literally thousands of white papers, brochures, etc. available that debate every nuance of ECM. For this column, however, I'm going to focus on three areas: developing the right standards, governance requirements and processes for ECM; choosing the right technology components to fit your needs; and providing easy access to EC for consumers.

Let's start with the creation of EC. EC usually comes in one of three forms: paper or electronically stored documents, digitized content, and Web content. At times there may be gray areas as to what type of content a specific item is (for instance, when a paper document is scanned and later converted to HTML text), but the strategies for managing the creation of that content are basically the same.

The creation/contribution/aggregation phase usually begins with the manufacture and authentication of EC - no matter the format. It is critical to have review and approval processes, driven by the relevant business area, that validate content contributions using business rules and templates - checks and balances that help contributors create meaningful, accurate content. To create the content, it is often best to standardize on desktop publishing and other content creation tools (digital media, Web authoring, etc.) throughout the company to regulate content and make it easier to contribute.
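To make the idea of rule-driven validation concrete, here is a minimal Python sketch. The Contribution fields, required metadata and allowed content types are assumptions chosen for illustration, not a reference to any particular ECM product's API.

from dataclasses import dataclass, field


@dataclass
class Contribution:
    title: str
    author: str
    body: str
    content_type: str            # e.g., "document", "digitized", "web"
    metadata: dict = field(default_factory=dict)


# Illustrative business rules; a real organization would define its own.
REQUIRED_METADATA = {"department", "retention_class", "review_date"}
ALLOWED_TYPES = {"document", "digitized", "web"}


def validate_contribution(item: Contribution) -> list:
    """Return a list of rule violations; an empty list means 'ready for approval'."""
    problems = []
    if item.content_type not in ALLOWED_TYPES:
        problems.append("unknown content type: " + item.content_type)
    if not item.title or not item.author:
        problems.append("title and author are mandatory")
    missing = REQUIRED_METADATA - item.metadata.keys()
    if missing:
        problems.append("missing metadata fields: " + ", ".join(sorted(missing)))
    return problems


draft = Contribution(
    title="Q2 Pricing Guidelines",
    author="j.smith",
    body="...",
    content_type="document",
    metadata={"department": "finance", "retention_class": "7y"},
)
print(validate_contribution(draft))   # ['missing metadata fields: review_date']

A check like this, run before content ever reaches a reviewer, is one way the business rules and templates mentioned above can be enforced consistently across contributors.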

It is also critical to pick the right technology to manage EC. Once it has been contributed and/or aggregated, approved content is stored in a central repository where it is organized and codified for broad enterprise use and dissemination. One key to successful EC management is to focus on the data taxonomy and structure as well as the storage and retrieval mechanisms and processes. It is one thing to aggregate data, but if the processes by which the data is organized and stored in the central repository are cumbersome, or the data structures are not optimal, response times will be slow and it will take longer to aggregate and then disseminate the EC.
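As a rough illustration of what "organized and codified" can mean in practice, the following Python sketch maps content items into a hierarchical taxonomy with a simple keyword index. The category paths, keywords and lookup behavior are illustrative assumptions, not a prescribed repository schema.

from collections import defaultdict


class Taxonomy:
    """Maps slash-delimited category paths (e.g., 'finance/pricing') to content IDs."""

    def __init__(self):
        self._by_path = defaultdict(set)
        self._by_keyword = defaultdict(set)

    def classify(self, content_id, path, keywords):
        # Register the item under its full path and every ancestor path, so a
        # search on 'finance' also returns items filed under 'finance/pricing'.
        parts = path.split("/")
        for i in range(1, len(parts) + 1):
            self._by_path["/".join(parts[:i])].add(content_id)
        for kw in keywords:
            self._by_keyword[kw.lower()].add(content_id)

    def find(self, path=None, keyword=None):
        hits = None
        if path is not None:
            hits = set(self._by_path.get(path, set()))
        if keyword is not None:
            kw_hits = self._by_keyword.get(keyword.lower(), set())
            hits = kw_hits if hits is None else hits & kw_hits
        return hits if hits is not None else set()


tax = Taxonomy()
tax.classify("doc-001", "finance/pricing", ["q2", "guidelines"])
print(tax.find(path="finance"))                 # {'doc-001'}
print(tax.find(path="finance", keyword="q2"))   # {'doc-001'}

The point of the sketch is that retrieval is only as good as the classification step: if the taxonomy is well designed, broad and narrow queries both resolve quickly, which is exactly where cumbersome structures slow dissemination down.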

Architectural flexibility and scalability are the keys to making the most out of the technology you choose for the management phase of the EC lifecycle. Whatever tools you choose should be able to effectively manage version control, workflow coordination and enterprise collaboration functions. They should lend themselves to being implemented using different data structures and repository strategies such as multiserver, distributed or federated repositories.

Finally, the technology package you choose should be scalable to grow as the amount and types of EC increase so that you are not constantly put in the position of needing to re-architect your ECM solution when content types change or - as will inevitably be the case - EC surpasses database content as the predominant information source for the company.

Management involves more than technology, however. I've heard it said that EC, once created, lives forever. That is often the case, but once EC is no longer needed except for reference purposes, it can be archived based on business rules. This way, project contributions remain in the archives even after teams have disbanded, enabling new project teams to benefit from past work.

However, if content has truly outlived its use, it may be destroyed via standardized and documented protocols that take into account the sensitivity of the documents to be destroyed. In other words, do not throw sensitive materials in the trash or merely delete files - shred them or implement true, tested data-file destruction algorithms. More often than not, though, the value in archived content means that it can be reused, starting the lifecycle again.
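A simple way to picture rule-based archiving and disposal is a retention table keyed by content class. The classes, retention periods and disposal methods in this Python sketch are invented for illustration and are not a compliance recommendation.

from datetime import date, timedelta

# Content class -> (retention period, disposal method once the period expires).
RETENTION_RULES = {
    "project":   (timedelta(days=365 * 3), "archive"),
    "financial": (timedelta(days=365 * 7), "secure_destroy"),
    "marketing": (timedelta(days=365 * 1), "delete"),
}


def disposition(content_class, created, today=None):
    """Return 'retain', 'archive', 'delete' or 'secure_destroy' for one item."""
    today = today or date.today()
    period, method = RETENTION_RULES.get(
        content_class, (timedelta(days=365 * 10), "archive")  # conservative default
    )
    if today - created < period:
        return "retain"
    return method


print(disposition("project", date(2005, 1, 1), today=date(2006, 4, 13)))    # retain
print(disposition("financial", date(1998, 1, 1), today=date(2006, 4, 13)))  # secure_destroy

Codifying the rules this way keeps archival and destruction decisions documented and repeatable rather than left to individual judgment.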

The assembly of codified, standardized EC for distribution - via production servers - is pretty straightforward. In fact, I am often conflicted about breaking out management and assembly as separate EC lifecycle phases, because they are truly symbiotic. Simply put, what is needed most is technology that accommodates different data structures and taxonomies, facilitates efficient data storage and retrieval, and makes it easy to move static content, standardized presentation templates and other repository content from content repositories to production servers. Again, flexibility and scalability are keys to success. The toolset you choose should work with different management strategies and be scalable to grow with the volume of EC.
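As a small example of the assembly step, the sketch below selects approved items from a repository and produces a manifest that production servers could use to verify what they receive. The "approved" flag, the manifest layout and the checksum choice are assumptions for illustration.

import hashlib
import json

repository = [
    {"id": "doc-001", "path": "finance/pricing/q2.html", "approved": True, "body": "<p>...</p>"},
    {"id": "doc-002", "path": "drafts/notes.html", "approved": False, "body": "<p>...</p>"},
]


def build_manifest(items):
    """Bundle only approved content, with checksums so servers can verify each transfer."""
    entries = []
    for item in items:
        if not item["approved"]:
            continue
        digest = hashlib.sha256(item["body"].encode("utf-8")).hexdigest()
        entries.append({"id": item["id"], "path": item["path"], "sha256": digest})
    return {"manifest_version": 1, "entries": entries}


print(json.dumps(build_manifest(repository), indent=2))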

The final phase of the EC lifecycle, delivery, is also straightforward. What counts most in the delivery phase is a high level of usability and the ability to deliver content globally, through multiple channels and in a variety of formats. The user interface, regardless of delivery channel, should be intuitive and easy to use. As far as channels go, it almost goes without saying that the chosen technology should have strong support for XML-formatted EC and should deliver the EC through as many channels as practicable - from PDAs to desktops to portable MP3 players if needed.

Finally, I believe the best delivery mechanism is a centralized content portal that can be customized based on user roles and security levels and that can deliver content in multiple file formats such as popular graphics tools, MP3 files, standard word processing documents and streaming video.
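To illustrate role- and channel-aware delivery, here is a brief Python sketch that filters content by user role and picks a format the delivery channel can render. The roles, formats and channel capabilities are assumed values, not a portal product's configuration.

CONTENT = [
    {"id": "doc-001", "min_role": "employee", "formats": {"html", "pdf", "xml"}},
    {"id": "vid-014", "min_role": "manager", "formats": {"mp4"}},
]

ROLE_RANK = {"guest": 0, "employee": 1, "manager": 2, "executive": 3}
CHANNEL_FORMATS = {"desktop": {"html", "pdf", "xml", "mp4"}, "pda": {"html", "xml"}}


def deliverable(user_role, channel):
    """Return the items this user may see, each in a format the channel can render."""
    visible = []
    for item in CONTENT:
        if ROLE_RANK[user_role] < ROLE_RANK[item["min_role"]]:
            continue                      # user's role is below the item's minimum
        usable = item["formats"] & CHANNEL_FORMATS[channel]
        if usable:
            visible.append({"id": item["id"], "format": sorted(usable)[0]})
    return visible


print(deliverable("employee", "pda"))      # [{'id': 'doc-001', 'format': 'html'}]
print(deliverable("manager", "desktop"))   # both items; desktop supports all formats

The same pattern - check the user's role and security level first, then negotiate the format per channel - is what a centralized content portal does behind the scenes.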

Once again, I have just begun to scratch the surface of the ECM debate. Hopefully, I have laid out enough information, however, to give you some food for thought about industry-leading strategies for creating, managing and delivering EC.

Whether you take my advice or not, it is critical that you do think about it. Enterprise content is mushrooming at an incredible rate, and I believe it will soon surpass structured content as the knowledge source for many workers and consumers, if it hasn't already. It is almost like the days when computers first came into widespread corporate use (not that I remember that). The early adopters and pioneers of ECM technologies will almost certainly reap the greatest benefits from their investments.

 


Rich Cohen is a principal in Deloitte Consulting LLP's Information Dynamics practice where he is responsible for the strategy, development and implementation of data governance, data warehousing, decision support and data mining engagements to support the emergence of world-class business intelligence applications. Cohen has more than 27 years of experience in the design, development, implementation and support of information technology in a variety of industries. Over the last 18 years, he has had extensive experience in the creation of technology strategies, implementations and deployment of CRM and business intelligence solutions to drive improved business performance.
