Data Warehousing Lessons Learned:
The Search for an Information Quality Safe Harbor
Progress in improving information and data quality will be constrained by human factors - fear, in particular - unless enlightened management devises and implements an information quality safe harbor.
By definition, a safe harbor provides shelter from the storm. In this case, the storm is the organizational and technical disruption caused when information quality defects surface unexpectedly. As proposed here, an information quality safe harbor consists of three dimensions:
- A commitment by enterprise management - executive sponsorship - to support and even reward (as appropriate) the surfacing of information quality issues with the intention of resolving and eliminating them in the interest of improved customer service, product development and other business imperatives.
- A method of prioritizing the issues that are surfaced and capturing the resolutions for reuse and productivity improvements.
- Assembling a cross-functional "tiger team" to address the issues that are surfaced on a priority basis.
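The prioritization and resolution-capture dimension above can be sketched in code. This is a minimal illustration only; the article prescribes no schema or scoring rule, so the fields and the impact-times-frequency score below are assumptions.

```python
from dataclasses import dataclass

@dataclass
class QualityIssue:
    # Hypothetical fields; the article does not prescribe a schema.
    description: str
    impact: int          # 1 (low) to 5 (high) business impact -- assumed scale
    frequency: int       # 1 (rare) to 5 (constant) occurrence -- assumed scale
    resolution: str = "" # captured once resolved, for reuse

    @property
    def priority(self) -> int:
        # One simple scoring rule (an assumption): impact weighted by frequency.
        return self.impact * self.frequency

def prioritize(issues):
    """Return issues ordered highest priority first."""
    return sorted(issues, key=lambda i: i.priority, reverse=True)

# Illustrative issues of the kinds the article mentions.
issues = [
    QualityIssue("Missing fields in customer records", impact=3, frequency=4),
    QualityIssue("Order and billing databases not synchronized", impact=5, frequency=3),
    QualityIssue("Ambiguous data elements in System A", impact=2, frequency=5),
]

for issue in prioritize(issues):
    print(issue.priority, issue.description)
```

Capturing the `resolution` text on each issue is what makes resolutions reusable the next time a similar defect surfaces; any richer scoring model (cost of poor quality, customer impact) could replace the simple product used here.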
The idea is that the tiger team is tasked with supporting and helping to resolve the data and information quality issues that surface. This differs from what usually happens when an individual staff member surfaces a problem: often the individual is charged with fixing it, thus increasing her or his workload. Situations do exist in which a given department is better qualified to address an issue than an outsider, especially someone from corporate headquarters. In such an instance, the tiger team plays a support role, with leadership provided by the department in question. In other cases, however, the issues may be cross-functional or multidivisional, and no one individual, department or function could adequately address the matter on its own. A management process for cross-functional problem solving is required.
Without an information quality safe harbor, it is possible to make progress with information quality, but only to a limited extent. The obvious issues will be surfaced - data redundancies, missing fields, ambiguous data elements. In many firms, the data and information quality situation is so bad that significant cost efficiencies can be gained from the low-hanging fruit. However, in other situations, the real tough issues - databases not synchronized, business processes misaligned with the technology and dysfunctional staff attitudes toward quality - will get swept under the rug.
Individuals reporting data and information problems should not be subject to punishment (the obvious exception being intentional wrongdoing). Many IT applications and processes have never worked right. Some enterprise resource planning (ERP) systems are so large and complex that it is difficult to say whether they are functioning as designed. They are consistently, and sometimes inconsistently, inaccurate and have been so since the day of their introduction. The individual, team or project surfacing the information quality issues is likely to be at risk in a variety of ways. In short, this risk is similar to the classic risk of being the messenger who is shot for delivering bad news. However, in this case, the risk is nontrivial and carries potential enterprise-wide impact. Fear may result in glossing over or denial of quality issues. Therefore, it must be managed with a process appropriate to enterprise management. Wise corporations will strive to drive out fear and establish a safe harbor for quality improvement processes through both explicit rule-making and executive actions and leadership.
Conduct an information quality survey. Ask the information consumers, "On a scale of 1 to 10, how reliable (credible) is the information reported by System A?" Ask the same question of the information producers and information managers. Any discrepancies in responses between these three groups can be interesting sources of insight and quality improvement. For example, if the information consumers think the information is trustworthy but the information producers do not, there may be a problem that requires investigation, because the producers suspect they are producing defective output. Notice that in many situations they are unlikely to admit to a problem in the absence of an information quality safe harbor or equivalent assurances. Conversely, if the information consumers lack trust in the information but the producers and information managers find it credible, it may be useful for IT management to undertake confidence-building measures to surface any unresolved issues and align perceptions.
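The three-group comparison above can be sketched as a small script. The ratings and the two-point divergence threshold are invented for illustration; the article specifies only the 1-to-10 scale and the three respondent groups.

```python
# Hypothetical mean reliability ratings (scale of 1 to 10) for "System A"
# from each of the three respondent groups named in the survey.
ratings = {
    "information consumers": 8.2,
    "information producers": 5.1,
    "information managers": 7.4,
}

# Flag any pair of groups whose perceptions diverge by more than a
# chosen threshold (2 points here -- an assumption, not a standard).
THRESHOLD = 2.0

flagged = []
groups = list(ratings)
for i, a in enumerate(groups):
    for b in groups[i + 1:]:
        gap = abs(ratings[a] - ratings[b])
        if gap > THRESHOLD:
            flagged.append((a, b))
            print(f"Investigate: {a} ({ratings[a]}) vs. {b} ({ratings[b]}), gap {gap:.1f}")
```

With these sample numbers, the consumer-producer and producer-manager gaps both exceed the threshold, mirroring the article's example of producers who suspect defective output that consumers have not yet noticed.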
Like so many tasks in IT, defining, designing and implementing information quality is a bootstrap operation requiring iteration, a process of learning from mistakes and a commitment to business results. That is precisely why a policy that establishes an information quality safe harbor is critical. Without an accepted commitment to identifying, surfacing and communicating information quality mistakes (lessons learned), people will experience fear and uncertainty about their jobs and shut down the information quality assessment, communication and improvement process. In any case, be prepared for "roll up the sleeves" hard work. This is both a top-down and bottom-up task, because the impact on information quality of relations between systems can only be evaluated by including both sides of the interface.
Lou Agosta, Ph.D., is a business intelligence strategist with IBM WorldWide Business Intelligence Solutions focusing on competitive dynamics. He is a former industry analyst with Giga Information Group and has served many years in the trenches as a database administrator. He is the author of The Essential Guide to Data Warehousing. Contact him at LoAgosta@us.ibm.com.