Improve Supply Chain Visibility by Harvesting Data from Your ESB

Article published in DM Direct Newsletter, August 19, 2005 Issue

By Tieu Luu

A company's supply chain processes are usually supported by multiple homegrown or commercial applications, with separate solutions for sales, manufacturing and procurement and some type of integration solution linking them together to support interdependencies in the processes. A typical flow through these systems starts with the creation of an order in the sales system, which generates transactions that are sent to the manufacturing department's planning and production systems; these, in turn, generate requisitions and purchase orders in the procurement systems that are then transmitted to the company's suppliers. Each of these transactions represents a key step in the organization's supply chain, and by harvesting the data from them an organization can gain visibility into those steps. Linking together the data from related transactions provides additional insight. For example, linking a purchase order with the corresponding shipping notice received from the supplier yields performance metrics such as the time elapsed between contract execution and receipt of goods. This information can then be fed back into the supplier selection process as part of the selection criteria.
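To make the purchase-order example concrete, the sketch below computes lead times by joining harvested purchase orders to their shipping notices. The record shapes, field names and PO numbers are hypothetical stand-ins for whatever your systems actually emit.

```python
from datetime import date

# Hypothetical transaction records harvested from procurement and supplier feeds.
purchase_orders = [
    {"po_number": "PO-1001", "issued": date(2005, 7, 1)},
    {"po_number": "PO-1002", "issued": date(2005, 7, 5)},
]
shipping_notices = [
    {"po_number": "PO-1001", "received": date(2005, 7, 11)},
    {"po_number": "PO-1002", "received": date(2005, 7, 19)},
]

def lead_times(pos, notices):
    """Link each purchase order to its shipping notice and compute days elapsed."""
    received = {n["po_number"]: n["received"] for n in notices}
    return {
        po["po_number"]: (received[po["po_number"]] - po["issued"]).days
        for po in pos
        if po["po_number"] in received
    }

print(lead_times(purchase_orders, shipping_notices))
# {'PO-1001': 10, 'PO-1002': 14}
```

Per-supplier aggregates of these lead times are the kind of metric that can feed back into supplier selection.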

Leveraging the Enterprise Service Bus for Visibility

With the enterprise service bus (ESB) quickly becoming the foundation for the next wave of integration, companies will increasingly use an ESB to integrate their internal supply chain systems as well as their business processes with partners. Once the ESB is deployed and the systems and processes are integrated, it becomes a goldmine flowing with valuable data: the transactions executing among the company's various supply chain systems and partners. By adding a simple data extraction service to the bus, you can harvest data from those transactions and achieve an end-to-end view of the supply chain. This data can also augment traditional business intelligence systems with a real-time view of events as they occur.

Most ESBs already have some type of auditing service that captures and stores information about the transactions passing through. Typically, this is used only to capture system performance and operational metrics, but there is no reason such services couldn't be enhanced for business intelligence purposes as well. This may require capturing additional information from the transactions and storing it in a dimensional model that is more suitable for querying and reporting from a business perspective.
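One way to picture that enhancement is a small function that flattens a raw audit record into a fact row keyed by business dimensions (date, endpoint, supplier). The audit-record shape and field names here are hypothetical; real ESB auditing services vary.

```python
# Hypothetical raw audit record captured by an ESB's auditing service.
audit_record = {
    "msg_id": "abc-123",
    "timestamp": "2005-08-01T14:30:00",
    "endpoint": "procurement.po.create",
    "payload": {"po_number": "PO-1001", "supplier": "ACME", "total": 4200.00},
}

def to_fact_row(record):
    """Flatten an audit record into a fact row with dimension keys,
    so it can be queried along business dimensions rather than just
    operational ones."""
    return {
        "date_key": record["timestamp"][:10],       # time dimension
        "endpoint_key": record["endpoint"],         # process-step dimension
        "supplier_key": record["payload"]["supplier"],  # supplier dimension
        "po_number": record["payload"]["po_number"],
        "amount": record["payload"]["total"],       # the measured fact
    }

print(to_fact_row(audit_record))
```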

Because of their data transformation capabilities, ESBs contain valuable metadata in the form of source-target mappings that help you understand the various syntaxes and semantics of the enterprise's data. Additionally, if the integration solution is developed using best practices, it will have a canonical representation of that data. This canonical representation can be used to create the dimensional model for storing the harvested data, since it represents the enterprise's agreed-upon syntax and semantics. The mapping logic used in the transformations can be reused to transform the harvested data into the canonical format for storage.
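A minimal sketch of reusing those mappings: per-source field maps (of the kind an ESB's transformation layer already maintains) rename source-specific fields to canonical names. The system names, field names and mappings below are invented for illustration.

```python
# Hypothetical source-to-canonical field mappings, one per source system.
MAPPINGS = {
    "sales_sys": {"ordNo": "order_id", "qty": "quantity", "sku": "product_id"},
    "mfg_sys":   {"orderId": "order_id", "count": "quantity", "item": "product_id"},
}

def to_canonical(source, message):
    """Rename source-specific fields to the enterprise's canonical names."""
    mapping = MAPPINGS[source]
    return {canonical: message[field]
            for field, canonical in mapping.items()
            if field in message}

print(to_canonical("sales_sys", {"ordNo": "O-9", "qty": 3, "sku": "P-7"}))
# {'order_id': 'O-9', 'quantity': 3, 'product_id': 'P-7'}
```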

Significant cleansing and validation capabilities are also built into the transformations. These can reduce much of the cleansing and validation work that must otherwise be performed while harvesting the data. By extracting the data from the transactions at the appropriate points in the processes, i.e., after they've passed the validation steps, you can be confident that you have quality data for reporting.

Strategically Placed Data Extraction Service

The technical solution can be built as a service that is strategically placed at the points in the processes (see Figure 1) from which you want to extract data. Externally, it would appear as a simple, asynchronous service that accepts messages in various formats. Internally, it would perform a series of steps implemented as a sequential process linking together several services. The first is a pass-through data extraction service that accepts the messages and makes a copy of the ones you're interested in harvesting data from. These copies are then fed into a transformation service that converts them into a canonical format and performs additional validation and cleansing. Next, the messages are sent to a matching and correlation service that applies rules-based logic to identify redundant transactions and correlate related data that is manifested in different transactions; having all messages in a single canonical format makes this much easier. The final service takes these messages, parses the XML and stores the relevant elements in an operational data store.
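The four steps above can be sketched as a chain of small functions. This is a toy in-memory version under invented message shapes: the cleansing rule, the correlation key and the list-backed "operational data store" are all stand-ins for real rules, matching logic and a database.

```python
import copy

def extraction_service(messages, wanted_types):
    """Pass-through step: copy only the message types worth harvesting."""
    return [copy.deepcopy(m) for m in messages if m["type"] in wanted_types]

def transformation_service(messages):
    """Stand-in for canonical transformation plus cleansing/validation."""
    for m in messages:
        m["po_number"] = m["po_number"].strip().upper()
    return messages

def correlation_service(messages):
    """Rules-based stand-in: drop redundant copies of the same transaction,
    here keyed on (type, po_number)."""
    seen, unique = set(), []
    for m in messages:
        key = (m["type"], m["po_number"])
        if key not in seen:
            seen.add(key)
            unique.append(m)
    return unique

def storage_service(messages, store):
    """Persist the relevant elements into an operational data store (a list here)."""
    store.extend(messages)

ods = []
inbound = [
    {"type": "purchase_order", "po_number": " po-1001 "},
    {"type": "purchase_order", "po_number": "PO-1001"},  # same txn, routed twice
    {"type": "heartbeat", "po_number": ""},              # not worth harvesting
]
storage_service(
    correlation_service(transformation_service(
        extraction_service(inbound, {"purchase_order"}))),
    ods,
)
print(ods)  # one canonical purchase-order row
```

Keeping each step a separate service, as the article describes, lets you swap in a real transformation engine or matching engine without touching the rest of the chain.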

A visual reporting tool can be built off this operational data store to serve real-time reporting needs. The data from the operational data store can also be loaded into a larger data warehouse that combines this data with other sources for more strategic analysis. The complexity of the solution will depend on your analytical needs.

Figure 1: Data extraction service strategically placed at key points in supply chain processes executing in ESB.

Analyze the Processes and Data Flowing Through the ESB

You can use the ESB's graphical process design tools to analyze the business processes executing in the bus and identify which points you can tap into for visibility. Trace these down to individual service endpoints' WSDLs to see the messages passing through those points and, therefore, what data is available to be harvested. Identify the core data elements from each of these messages.

Since the data being collected is purely transactional, it alone will usually not be sufficient to give a comprehensive understanding of what's going on at that point in the supply chain. For example, if you are harvesting data from advanced shipping notices to determine what products are coming into or leaving the enterprise, the shipping notice may not contain very detailed information about the product being shipped. It may contain only some type of product ID, so it would be necessary to link that ID to the corresponding data in an item master database to obtain a comprehensive view of exactly what's being shipped. Alternatively, an analyst may want to start with the item master and look at the advanced shipping notices executing against each item to determine real-time inventory levels for those items. In either case, the data derived from the transactions will have to be combined with data from master or reference databases before it becomes useful information for analysis.
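The item-master lookup described above amounts to a simple enrichment join. The product IDs, notice fields and item-master attributes below are hypothetical.

```python
# Hypothetical item master reference data, keyed by product ID.
item_master = {
    "P-100": {"description": "Widget, 10mm", "unit_cost": 1.25},
    "P-200": {"description": "Bracket, steel", "unit_cost": 3.10},
}

def enrich_shipping_notice(notice, master):
    """Join the bare product ID on a shipping notice to item-master detail,
    leaving the notice unchanged if the ID is unknown."""
    detail = master.get(notice["product_id"], {})
    return {**notice, **detail}

asn = {"notice_id": "ASN-7", "product_id": "P-100", "qty": 40}
print(enrich_shipping_notice(asn, item_master))
```

The same join run in the other direction (item master first, notices grouped per item) supports the real-time inventory view the article mentions.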

Another characteristic of the harvested data is that it is transient, which means downstream activities or additional logic in the target systems may change it. Thus, you need mechanisms in place to validate the data captured from the ESB against the target systems, to ensure that changes occurring downstream do not give you inaccurate results. Alternatively, you may want to target for harvesting only those transactions whose data is not likely to be changed by downstream activities.

Most importantly, you need to ensure that targeting the transactions flowing in the ESB gives you a complete view of the enterprise. For example, if you were tracing purchase orders to track how much was spent on certain products, your view would be inaccurate if some purchasing transactions did not pass through the ESB. Similarly, make sure you don't count the same transaction more than once: one transaction may pass through the ESB several times as it is routed to different systems in the enterprise, and the same data may be manifested in different transactions. Thus you need to be able to uniquely identify the data from those transactions as they pass through various points in the business processes.
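The double-counting risk can be illustrated with a spend total that deduplicates on a transaction ID before summing. The event shape and the assumption that every observation carries a stable `txn_id` are hypothetical simplifications.

```python
def total_spend(events):
    """Sum PO amounts, counting each transaction once even though the same
    message is observed at several routing points on the bus."""
    seen = set()
    total = 0.0
    for e in events:
        if e["txn_id"] not in seen:
            seen.add(e["txn_id"])
            total += e["amount"]
    return total

events = [
    {"txn_id": "T1", "amount": 100.0, "hop": "sales->mfg"},
    {"txn_id": "T1", "amount": 100.0, "hop": "mfg->procurement"},  # same txn, second hop
    {"txn_id": "T2", "amount": 50.0,  "hop": "sales->mfg"},
]
print(total_spend(events))  # 150.0, not 250.0
```

Note this guards only against counting a transaction twice; it cannot detect transactions that never touched the ESB at all, which is why the completeness check in the paragraph above still matters.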

The time, effort and money invested to migrate to the use of an enterprise service bus are significant. However, once you have it successfully deployed, you can get a lot more out of your investment than just solving your integration needs. By analyzing the processes and data flowing through your ESB, you will discover that there is a wealth of information there that can be harvested to give you visibility and insight into key points of your supply chain. With better visibility and insight, you can begin to see improvements in the management of your supply chain such as better inventory control, reduced lead times and earlier problem detection and resolution.


Tieu Luu is an associate with Booz Allen Hamilton, designing enterprise data and integration architectures for large federal agencies. Prior to Booz Allen Hamilton, Luu held lead engineering positions at Grand Central Communications and Mercator Software where he worked on the development of integration platforms. You can reach him at luu_tieu@bah.com.
