DM Review Magazine | Covering Business Intelligence, Integration & Analytics
My previous columns have focused on building a data management Center of Excellence to help coordinate the stewardship of corporate information. Data stewardship isn't something you "just do," though. It's a complex, ongoing task that requires companies to become information innovators. There are five building blocks that support any enterprise data stewardship program (DSP). How well you understand and use these building blocks will largely determine how well you manage your corporate information.
Data stewardship can be defined as custodianship of the access, integrity and content of a company's information. Although many IT authors have preached on the need for sound data stewardship for more than a decade, it always amazes me how many companies haven't bought into the philosophy. These companies often have no defined data ownership model. They also often don't have policies and standard processes in place for data control, which creates silos of data management. Or, for companies that do have them, they aren't enforced or they have an unclear mandate with poorly defined authority.
Those companies that are moving toward sound data stewardship, however, do things differently. They have several characteristics in common. These companies have formed a centralized stewardship committee with business and IT representation and have tasked it with developing a stewardship policy with clearly defined roles, procedures and data definitions standardized across the company. The policy includes a system of audits and checks implemented to test and confirm adherence to policy. The stewardship committee is often proactively involved in policy implementation.
To move toward better practices, however, it is critical to have senior management buy-in to support data as a corporate asset. It is also crucial to foster corporate-wide cultural adoption of data stewardship tenets. On the technical side, leading practices in data stewardship include creating an actively managed metadata environment that is accessible to end users so that they can better understand the corporate data structure and be readily involved in change management.
Figure 1 depicts the building blocks in a sound data stewardship program.
Figure 1: Data Management Building Blocks
The next block in the DSP is data setup and maintenance. This block is primarily concerned with how data is organized, stored, and maintained in corporate IT systems. In companies that don't have a consistent DSP, data setup and maintenance is chaotic; it is performed by multiple groups and in multiple systems. There is often a high volume of informal communication combined with a lack of defined setup and maintenance processes. These companies also typically store data redundantly, keeping identical data under different names and with different data types.
Companies that are on the path to a sound DSP have solved these problems, to a degree. They have defined data setup and maintenance processes that are proactively coordinated by a centralized group. They also use workflow automation tools for data setup and use exception monitoring for maintenance. In addition, there is usually a single IT system designated for data setup and maintenance for each logical group of data.
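To make the exception-monitoring idea above concrete, here is a minimal sketch in Python. The field names and validation rules are hypothetical, invented for the example; a real stewardship group would define its own rules and route flagged records through its workflow tooling.

```python
# Hypothetical sketch: exception monitoring for data maintenance.
# REQUIRED_FIELDS and the country rule are illustrative assumptions.

REQUIRED_FIELDS = {"customer_id", "name", "country"}

def find_exceptions(record):
    """Return a list of rule violations for one data record."""
    problems = []
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            problems.append(f"missing required field: {field}")
    country = record.get("country")
    if country and len(country) != 2:
        problems.append("country must be a 2-letter ISO code")
    return problems

def monitor(records):
    """Pass clean records onward; queue exceptions for the stewardship group."""
    clean, exceptions = [], []
    for rec in records:
        issues = find_exceptions(rec)
        if issues:
            exceptions.append((rec, issues))
        else:
            clean.append(rec)
    return clean, exceptions
```

In a centralized setup group, the `exceptions` queue would feed a review work list rather than silently dropping records.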
Those companies moving toward better practices for data setup and maintenance have gone the extra mile, however. In these companies, setup and maintenance functions can be performed in any IT system, as needed, and are available in near real-time across the enterprise. There is also agreement across departments on data fields included in the data setup function. Finally, in these companies, the role of the centralized setup and maintenance group has evolved to include data subject-matter experts, in addition to operations support.
Expanding on the data setup and maintenance process, the next building block is data management. Data management is the function of coordinating the movement of data through the enterprise technical architecture. Data management also encompasses coordinating data security, controlling access to data, and managing changes in data as they occur in the course of business operations. In some companies today, however, data management is a pain point. In these companies, there are usually poor processes in place to manage data, and typically there is no formal data management organization.
As a result, changes to data are made extemporaneously, on an as-needed basis, by business units, with no consistent quality checks performed or audit trails established. Or, if they do track changes and perform quality checks, these companies use informal tools (e.g., spreadsheets and home-grown databases). Also, unfortunately, these companies usually have no formalized failover strategy to manage recovery from a catastrophic systems failure event. They are treading on dangerous ground.
The path to data management improvement includes establishing a formal process for data identification and management that requires audit trails, effective dating mechanisms, and policies to resolve data exceptions. It also includes implementing role-based security to coordinate data access. Finally, those companies that are determined to improve their data management procedures often have a defined failover strategy that is based on their organizational disaster recovery strategy.
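The audit-trail and effective-dating mechanisms mentioned above can be sketched briefly, assuming an in-memory record store (a real system would back this with a database). All class and field names here are illustrative, not drawn from the column.

```python
# Hypothetical sketch: an effective-dated record whose version history
# doubles as the audit trail.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Version:
    value: dict
    effective_from: date
    changed_by: str          # who made the change, for the audit trail

@dataclass
class EffectiveDatedRecord:
    key: str
    versions: list = field(default_factory=list)  # full change history

    def update(self, value, effective_from, changed_by):
        """Record a change; nothing is ever overwritten or deleted."""
        self.versions.append(Version(value, effective_from, changed_by))

    def as_of(self, when):
        """Return the version in force on a given date, or None."""
        live = [v for v in self.versions if v.effective_from <= when]
        return max(live, key=lambda v: v.effective_from) if live else None
```

Because updates append rather than overwrite, the record can answer both "what is true now?" and "what did we believe on this date, and who said so?"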
Those companies that are moving toward better data management practices have formalized their data management structures to an even greater degree. These companies have role-based data access and formalized security processes. They also have formalized, flexible, and robust processes for managing their data, which results in fewer data exceptions and/or anomalies. When exceptions do occur, automated processes are in place to handle them so they don't disrupt workflow. Lastly, at leading-practice companies, there are multiple production sites that provide real-time failover operations so that catastrophic systems failure events cause minimal operational disruption.
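Role-based data access, mentioned in the two preceding paragraphs, can be sketched as a simple field-level filter. The roles and permitted fields below are hypothetical assumptions for illustration only.

```python
# Hypothetical sketch: role-based field-level data access.
# Role names and permitted fields are illustrative.

ROLE_FIELDS = {
    "analyst": {"customer_id", "region", "revenue"},
    "steward": {"customer_id", "region", "revenue", "ssn"},
}

def read_record(record, role):
    """Return only the fields the caller's role is permitted to see."""
    allowed = ROLE_FIELDS.get(role, set())   # unknown roles see nothing
    return {k: v for k, v in record.items() if k in allowed}
```

In practice this filtering would live in the data access layer or the database itself, so that every consumer is governed by the same policy.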
Building upon sound data management, the next block is data quality assurance. This is the process of confirming the reliability and accuracy of corporate data. It involves cleaning, updating, and standardizing data. It is also perhaps the most publicized aspect of enterprise data stewardship, and it is quite possibly the thorniest. Poor data quality leaves companies with poor access to timely, accurate, consistent information. These companies often have no single version of the truth for corporate data and, thus, cannot effectively analyze their operations or accurately forecast the future. These companies also often have multiple informal data control efforts in place due to a lack of confidence in their data quality.
Companies endeavoring to improve their data quality have determined the most accurate data sources within the enterprise, and they have cross-validated those sources so that information produced from multiple systems gives the same answers. These companies also have published policies, service level agreements, and metrics for managing and measuring data quality. There are also clearly defined "data owners" in these companies who have responsibility for data related to their departments and who are empowered to resolve quality issues.
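Cross-validating sources so that information produced from multiple systems gives the same answers amounts to a record-by-record reconciliation. The sketch below is a minimal version; the source names, key, and fields in the usage are illustrative assumptions.

```python
# Hypothetical sketch: reconciling two data sources field by field.

def cross_validate(source_a, source_b, key, fields):
    """Compare records shared by two sources; report field-level mismatches."""
    index_b = {rec[key]: rec for rec in source_b}
    mismatches = []
    for rec in source_a:
        other = index_b.get(rec[key])
        if other is None:
            mismatches.append((rec[key], "missing in second source"))
            continue
        for f in fields:
            if rec.get(f) != other.get(f):
                mismatches.append((rec[key], f))
    return mismatches
```

The mismatch list is exactly the work queue a "data owner" would be empowered to resolve.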
For those companies with better data stewardship programs, data quality has become embedded in their very culture. Consistently high data quality is assumed because they have implemented automated audits, checks, and initiatives to continuously test and improve data quality. Most of these companies also have a formal data quality scorecard based on commonly agreed upon metrics, and they distribute the scorecard to the entire company.
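A data quality scorecard of the kind described above might compute per-field metrics such as completeness and uniqueness. This is a toy sketch under assumed metric definitions; the field names in the usage are invented for the example.

```python
# Hypothetical sketch: a two-metric data quality scorecard.

def completeness(records, field):
    """Share of records with a non-empty value for the field."""
    if not records:
        return 0.0
    return sum(1 for r in records if r.get(field)) / len(records)

def uniqueness(records, field):
    """Share of non-empty values for the field that are distinct."""
    values = [r[field] for r in records if r.get(field)]
    return len(set(values)) / len(values) if values else 0.0

def scorecard(records, fields):
    """Build a per-field scorecard suitable for company-wide distribution."""
    return {f: {"completeness": round(completeness(records, f), 2),
                "uniqueness": round(uniqueness(records, f), 2)}
            for f in fields}
```

A scorecard like this only works as a shared yardstick if, as the column notes, the metrics themselves are commonly agreed upon first.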
The other four building blocks culminate in the final one: an integrated technical architecture. In companies that don't have an appropriate DSP, the technical architecture is not really an architecture; rather, it is a morass of disparate, non-integrated IT systems. Further, there is an absence of automated interfaces between most systems, or if interfaces do exist, they are highly customized to individual systems. There is also poor workflow and an absence of rules engines. Companies seeking to improve their technical architectures have 1) integrated their IT infrastructure to move data between systems on a near-real-time basis, 2) implemented some basic automation and standardized data formats for internal and external interfaces, and 3) employed basic workflow and rules engines to better manage data flow.
Companies with sound programs for integrated technical architectures go even further. These companies employ a common messaging infrastructure with support for multiple data dissemination models (e.g., publish/subscribe, push/pull). There is also a single point for all inbound and outbound communication, as well as enterprise-wide use of standard data formats. Business users have the advantage of being able to use sophisticated workflow and rules engines to manage processes.
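The publish/subscribe model named above can be sketched as a tiny in-process message bus. A real messaging infrastructure would be a dedicated broker; this toy class, with invented topic names in the usage, only shows the dissemination pattern.

```python
# Hypothetical sketch: an in-process publish/subscribe message bus.
from collections import defaultdict

class MessageBus:
    def __init__(self):
        # topic name -> list of subscriber callbacks
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a callback for all future messages on a topic."""
        self._subscribers[topic].append(handler)

    def publish(self, topic, message):
        """Deliver a message to every subscriber of the topic."""
        for handler in self._subscribers[topic]:
            handler(message)
```

The decoupling is the point: publishers need not know which downstream systems consume a data change, which is what makes a single point of inbound and outbound communication feasible.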
There is a continuum of data stewardship practices. No company gets it all wrong. No company gets it all right. However, those companies that continually seek better ways of managing their corporate information put themselves in a better position to handle the complexities of today's globally competitive business environments. Their access to timely, accurate, consistent information gives them a leg up on competitors that lack such quality information. By implementing the practices I've outlined here for the five building blocks of data stewardship, you should be able to produce and maintain better quality information and, thus, move closer to the "get it right" end of the data stewardship spectrum.
Rich Cohen is a principal in Deloitte Consulting LLP's Information Dynamics practice where he is responsible for the strategy, development and implementation of data governance, data warehousing, decision support and data mining engagements to support the emergence of world-class business intelligence applications. Cohen has more than 27 years of experience in the design, development, implementation and support of information technology in a variety of industries. Over the last 18 years, he has had extensive experience in the creation of technology strategies, implementations and deployment of CRM and business intelligence solutions to drive improved business performance.