Engineering the Predictive Enterprise

  Article published in DM Direct Newsletter
May 5, 2006 Issue
 
  By Doug Freud and Rohit Tandon

According to Forrester, by mid-2006 more than 85 percent of consumer-oriented firms plan to target marketing messages to one or more inbound interaction channels.1 But of the firms that do this today, most use only basic business rules instead of predictive analytics to determine the message the customer receives.

Predictive analytics, which examines historical information using data mining algorithms, allows businesses to make more accurate predictions about future events. IDC found that business applications involving predictive analytics software generate an average ROI of 145 percent, much higher than other types of business intelligence (BI) software.2 Whatever the reasons companies have not yet implemented these applications, it is becoming increasingly apparent across a variety of industries that businesses must evolve their processes to include predictive analytics or they will not remain competitive.

The Time is Now

What has changed? Descriptive and predictive analytics are not new! Many techniques and algorithms have been readily available in traditional workbenches for the last 25 years. What has changed substantively is not the ability to analyze data, but the need to deploy predictive analytics in order to remain competitive. The data in Figure 1 indicate that consumers are becoming more sophisticated and are not nearly as likely to stick with a product or service out of brand loyalty. Even the most loyal demographics, such as older consumers, are now more likely to switch to a different company because it is easier to do so. Commoditization decreases brand loyalty and increases competition; a case in point is the deregulation and subsequent commoditization of the telecom industry. Not surprisingly, one of the first problems the telecom sector targeted with predictive analytics was customer churn.

Figure 1: Percentage of Consumers Who Try to Stick to Well-Known Brands3

To efficiently attract new customers, retain current customers and maximize customer value, organizations need to optimize decision-making by using predictive analytics embedded in the business process. Perhaps the most conspicuous example is Amazon's book recommendations, which are embedded in the business process and based on association rules, or market basket analysis.
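To illustrate the kind of logic behind such recommendations, the sketch below derives simple "customers who bought X also bought Y" rules by computing support and confidence over a handful of transactions. The product names, baskets and thresholds are invented for the example; a production recommendation engine would of course work at far larger scale.

from itertools import permutations

# Hypothetical purchase histories (market baskets) for the illustration.
baskets = [
    {"book_a", "book_b"},
    {"book_a", "book_b", "book_c"},
    {"book_a", "book_c"},
    {"book_b", "book_c"},
    {"book_a", "book_b"},
]

n = len(baskets)
min_support, min_confidence = 0.3, 0.6   # thresholds chosen arbitrarily here

# Enumerate one-to-one rules of the form "if X was bought, recommend Y".
items = {item for basket in baskets for item in basket}
for x, y in permutations(items, 2):
    both = sum(1 for b in baskets if x in b and y in b)
    only_x = sum(1 for b in baskets if x in b)
    support = both / n           # how often X and Y co-occur overall
    confidence = both / only_x   # how often Y appears when X does
    if support >= min_support and confidence >= min_confidence:
        print(f"If {x} then {y}: support={support:.2f}, confidence={confidence:.2f}")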

In addition to these competitive pressures, several technology enablers will accelerate the adoption of predictive analytics, in some cases in real time: database vendors are integrating predictive capabilities directly into their platforms, and service-oriented architecture (SOA) allows applications and business processes to be integrated with actionable predictive analytics.

Companies that do not make data-driven decisions will lose customers at a faster rate, fail to take advantage of cross-selling opportunities and slow their overall growth.

Touchpoints, Interactions and the Cost of Switching

Technology has fundamentally altered the way customers and prospects interact with businesses and has lowered the cost of entry for nontraditional competitors. Furthermore, Internet usage has crossed the chasm and is now pervasive. Inefficient judgment- or rules-based decision-making, combined with the empowerment of the consumer via the Internet, will endanger many organizations. The companies that will survive and flourish are the ones that recognize the inflection point and shift their strategy from a product approach to a customer-centric focus. Predictive analytics is a key theme in the customer-centric strategy because it helps organizations understand customer behavior and anticipate upcoming needs (lifecycle or up-/cross-sell).

One result of this explosion of technology is that businesses must provide customers with their choice of interaction channels. In the financial services industry, for example, brick-and-mortar branches alone are no longer sufficient to attract and retain high-value customers. How many times do you visit your bank's branch office versus its online banking Web site? Customers and prospects expect the flexibility of multiple interaction channels, including secure Web sites, online chat and responsive service via email. The expectation is not only that such touchpoints exist, but also that they are well coordinated with more traditional touchpoints such as the brick-and-mortar branch, the ATM and the call center. Failing to meet these expectations is likely to cause dissatisfaction and may prompt customers to reevaluate their loyalty.

Exogenous changes such as the federal "do-not-call" registry and customer privacy acts are exerting hitherto unknown pressures, such as a reduction in outbound campaigns. This in turn means that when customers call into inbound call centers, businesses must seize the opportunity, and do so in real time using predictive analytics. For example, instead of routing all calls to interactive voice response (IVR), businesses should build an IVR bypass that identifies the caller in real time, uses his or her propensity-to-purchase score to bypass the IVR and routes the call to a call center agent for up-/cross-sell or remediation. In addition, the predictive recommendation engine should be able to capture customer information in real time and recalculate the customer's attrition, value and propensity-to-purchase scores.
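A minimal sketch of the IVR-bypass idea described above, assuming the caller can be identified and a current propensity-to-purchase score looked up: the call is routed to an agent when the score clears a threshold. The customer IDs, scores, threshold and function name are hypothetical.

# Hypothetical, pre-computed propensity-to-purchase scores keyed by customer id.
propensity_scores = {"C1001": 0.82, "C1002": 0.35, "C1003": 0.67}

BYPASS_THRESHOLD = 0.6  # assumed cut-off; in practice set from campaign economics

def route_inbound_call(customer_id: str) -> str:
    """Decide whether an identified inbound caller bypasses the IVR."""
    score = propensity_scores.get(customer_id, 0.0)
    if score >= BYPASS_THRESHOLD:
        # High propensity: skip the IVR and route to an agent for up-/cross-sell.
        return "agent_queue"
    return "ivr"

for cid in ("C1001", "C1002", "C9999"):
    print(cid, "->", route_inbound_call(cid))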

The Internet lowers the cost of switching by making it easier to shop and compare, and it enables the change without the consumer having to physically visit a place of business to open or close accounts. The consumer now has the ability to make real-time decisions about the organizations they buy from. If customers can make decisions in real time, it is imperative that enterprises enable real-time decisions as well, powered by predictive analytics. For example, if the system can assess the likelihood of attrition of high-value customers beforehand, an enticing offer should be presented via a rep's work list or, depending on privacy settings, the next time the customer calls in.

In addition to empowering the consumer, the Internet also lowers the cost of entry for nontraditional competitors. Companies can compete without the expense of building brick-and-mortar branches and the associated personnel costs. Furthermore, they have the added advantage of designing an IT infrastructure that does not need to accommodate and integrate numerous legacy systems. In order to remain competitive and adapt to changing business conditions, existing financial services behemoths will need to adopt a customer-centric focus powered by predictive analytics.

Predictive Enterprise Starts with a 360-Degree View of the Customer

Although building an analytical warehouse in a large company with numerous lines of business and customer touchpoints is an expensive undertaking, it is the critical first step in engineering the predictive enterprise. Building the warehouse, however, is only the first of many hurdles to be cleared before achieving a return on investment. The warehouse typically supports traditional BI systems such as ad hoc reporting, ROLAP, MOLAP and KPI dashboards. All of these systems enable knowledge workers to understand the business and make decisions that help improve underlying processes.

Although these systems deliver valuable returns, their ROI is often difficult to measure. The warehouse, if designed properly, will also support predictive analytics. The potential ROI on predictive analytics is not only substantial, but in many instances is also easier to measure.

The industry as a whole is starting to embrace predictive analytics, and most organizations are building analytical models within various lines of business. According to a recent Accenture Research Report, Insight Driven Marketing, E*TRADE is using data across lines of business and various customer touchpoints to create insights that feed effective marketing campaigns.4 The key to their improved marketing effectiveness was building a 360-degree view of the customer combined with traditional BI reporting and predictive analytics.

Figure 2: Creating the Customer View

By building an analytical warehouse that includes interaction data and by exploiting traditional BI combined with the power of predictive analytics, companies are able to:

  1. Understand current and historical customer behavior;
  2. Predict future events;
  3. Act on those predictions (real-time decisions); and
  4. Realize a feedback loop and optimization.

Although the ROI is potentially substantial, there are numerous pitfalls on the way to successfully implementing predictive analytics. Organizations should not focus only on ROI, but instead should think in terms of total cost of ownership (TCO). If architected correctly, however, the ROI can also be measured using a feedback loop into the BI systems.

Figure 3: Measuring ROI with a Feedback Loop
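One way to picture this feedback loop is to record each model-driven action alongside its eventual outcome and compare the response rate of targeted customers against a holdout group. The records and numbers below are invented purely to illustrate the calculation.

# Hypothetical campaign results fed back from the operational systems.
# Each record: (was_targeted_by_model, responded)
outcomes = [
    (True, True), (True, False), (True, True), (True, True),
    (False, False), (False, True), (False, False), (False, False),
]

def response_rate(records):
    return sum(1 for _, responded in records if responded) / len(records)

targeted = [r for r in outcomes if r[0]]
holdout = [r for r in outcomes if not r[0]]

lift = response_rate(targeted) / response_rate(holdout)
print(f"Targeted response rate: {response_rate(targeted):.0%}")
print(f"Holdout response rate:  {response_rate(holdout):.0%}")
print(f"Lift attributable to the model: {lift:.1f}x")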

Reducing TCO by Integrating Predictive Analytics within the IT Infrastructure

The key to achieving a positive ROI on predictive analytics is being able to act on predictions, and for most organizations this presents one of the more significant challenges. Gartner recently identified this challenge and labeled it the "Execution Gap."5 Their hypothesis is that for most companies, converting data into knowledge is not the problem; converting knowledge into compelling offers and making them operational via the IT infrastructure is where many companies struggle.

There are a number of reasons why the execution gap exists. To understand the causes, we will examine the typical process involved in building a predictive analytics model.

The end result of most predictive analytics techniques is a model that uses historical data to forecast an outcome or amount and to provide some indication of confidence in that prediction. A typical marketing example is an algorithm that uses inputs such as age, gender and purchase history to predict the likelihood of a future sale. The model is derived using historical information from a previous campaign, which includes who bought and who did not buy. The entire database is then scored and rank ordered, and the organization targets the customers and prospects who are most likely to buy. An overly simplified set of steps (sketched in code after this list):

  1. Extract, clean, prepare relevant data;
  2. Use appropriate algorithm to build model on a sample of learning data;
  3. Validate model with different data; and
  4. Use model to score.
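The same four steps, expressed as a minimal scikit-learn sketch. The campaign file, its column names and the choice of logistic regression are assumptions made purely for the illustration, not a prescription.

import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# 1. Extract, clean and prepare relevant data (hypothetical campaign history).
data = pd.read_csv("campaign_history.csv")           # assumed file and columns
X = data[["age", "gender_code", "purchases_12m"]].fillna(0)
y = data["bought"]                                    # 1 = bought, 0 = did not buy

# 2. Use an appropriate algorithm to build a model on a sample of learning data.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# 3. Validate the model with different (held-out) data.
print("Validation AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# 4. Use the model to score and rank-order the full customer base.
data["propensity"] = model.predict_proba(X)[:, 1]
targets = data.sort_values("propensity", ascending=False).head(1000)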

The algorithms used to create models range from simple linear regression to the latest machine-learning techniques. In the end, the model tells us what data is important and how to mathematically combine that data to make a prediction. Although the creation of the model via data mining algorithms involves complicated mathematics, scoring these models within the IT infrastructure does not seem like it should be such a formidable challenge. In theory, after the model is built, scoring is no more complicated than solving for x for each customer once the unknowns have been estimated. Although it does not seem like a difficult problem, there are a number of significant challenges to overcome.
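To make the "solving for x" point concrete: once a logistic regression model has been fitted, scoring a customer amounts to plugging that customer's values into a fixed equation. The coefficients and inputs below are made up for the example.

import math

# Hypothetical coefficients produced during model building.
intercept, b_age, b_purchases = -2.0, 0.03, 0.45

def score(age, purchases_12m):
    # Linear combination of the inputs, passed through the logistic function.
    x = intercept + b_age * age + b_purchases * purchases_12m
    return 1.0 / (1.0 + math.exp(-x))

print(score(age=42, purchases_12m=3))  # probability of a future sale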

Approaches to Scoring

The process of picking a scoring approach should rely on underlying business requirements, which should be established before models are built, as part of the underlying methodology. There are numerous considerations in the requirement definition process, including which applications might consume scores, the business cost of incorrect predictions and the importance of the temporal dimension when scoring.

If, for example, a customer talking to a support representative reveals that a piece of data important to a model has changed (e.g., marital status, employment, location), is there a requirement for the model to be dynamically rescored and a recommendation presented within a CRM operational system? Are the model and the ability to score it built into the CRM application? If so, how do you update the application when the next version of the model is published? What if the next version of the model uses a different underlying algorithm or requires different data?

Determining the scoring approach will depend on a number of factors, including the channel (e.g., direct mail, Web, call center, branch), the likelihood of change in the predictors in the model (e.g., gender does not change often, but fields such as job status and residence are more likely to change) and other optimization considerations. Gartner has classified scoring approaches as follows:

Figure 4: Gartner's Scoring Approach Classification

Before an organization implements a real-time scoring approach, they should determine that it fulfills a business requirement. The following graphic displays the continuum of scoring options.

Figure 5: Continuum of Scoring Options

No matter which scoring approach or combination of approaches is implemented, it should be apparent that scoring and integrating results into enterprise applications is not a trivial matter. Coherently integrating and supporting predictive analytics across the enterprise requires a governance model, careful requirement definition and a flexible model management capability. In addition, the enterprise needs an architecture that can propagate scores to CRM processes at varying frequencies and receive feedback on "hit/miss" ratios to measure model performance. An SOA is one way to accomplish this, as sketched below.
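As a rough sketch of the SOA idea, the fragment below exposes scoring and outcome-feedback operations as a small HTTP service built only from the Python standard library. The endpoints, payload fields, coefficients and hit/miss tally are illustrative assumptions, not a reference design.

import json
import math
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical model coefficients and a simple hit/miss tally for feedback.
COEF = {"intercept": -2.0, "age": 0.03, "purchases_12m": 0.45}
feedback = {"hits": 0, "misses": 0}

def score(payload):
    x = (COEF["intercept"] + COEF["age"] * payload["age"]
         + COEF["purchases_12m"] * payload["purchases_12m"])
    return 1.0 / (1.0 + math.exp(-x))

class ScoringService(BaseHTTPRequestHandler):
    def do_POST(self):
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        if self.path == "/score":
            result = {"propensity": score(body)}
        elif self.path == "/feedback":
            # Record whether a prediction turned into a sale (hit) or not (miss).
            feedback["hits" if body["hit"] else "misses"] += 1
            result = dict(feedback)
        else:
            self.send_error(404)
            return
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(json.dumps(result).encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), ScoringService).serve_forever()

A CRM process would then POST customer attributes to /score at whatever frequency the channel requires and later POST the observed outcome to /feedback, so that the hit/miss ratio can flow back into the BI layer.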

Other major issues to consider are data volumes, data security, data movement and performance. The implication is that to control TCO, predictive models should be created, stored, scored and managed within the database. If an organization attempts to use tools that require moving data outside of the database as part of a production system, the potential integration costs are substantial, and exposure to security risks is higher because data moves outside of the controlled environment. With exponential growth in data volumes expected to continue, it is easy to see that in situ, or database-resident, data mining techniques will prove more efficient from a cost, response-time and performance point of view in the longer term.
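A toy illustration of database-resident scoring, using an in-memory SQLite database purely as a stand-in for the warehouse: the model's coefficients are applied inside a single SQL UPDATE, so customer rows never leave the database. The table, columns and coefficients are assumptions for the example.

import math
import sqlite3

# In-memory SQLite stands in for the analytical warehouse in this sketch.
conn = sqlite3.connect(":memory:")
conn.create_function("logistic", 1, lambda x: 1.0 / (1.0 + math.exp(-x)))

conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, age REAL, "
             "purchases_12m REAL, propensity REAL)")
conn.execute("INSERT INTO customers (age, purchases_12m) VALUES (42, 3), (29, 0)")

# Hypothetical coefficients produced during model building.
intercept, b_age, b_purch = -2.0, 0.03, 0.45

# The model is applied where the data lives: rows are scored in place,
# with no extraction or movement outside the database.
conn.execute(
    "UPDATE customers SET propensity = logistic(? + ? * age + ? * purchases_12m)",
    (intercept, b_age, b_purch),
)

for row in conn.execute(
        "SELECT id, propensity FROM customers ORDER BY propensity DESC"):
    print(row)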

References:

  1. Elana Anderson with Eric Schmitt, Tenley McHarg, and Sally M. Cohen. "Inbound Marketing Goes Mainstream." Forrester, 19 September 2005.
  2. "Predictive Analytics and ROI: Lessons from IDC's Financial Impact Study." IDC, September 2003.
  3. "Insight Driven Marketing." Accenture Research Report, 2001.
  4. Accenture.
  5. Gareth Herschel. "Management Update: Applying Analytical Techniques to Gain Customer Insights." Gartner, 11 June 2003.

Doug Freud is technical manager of Analytics & Data Mining at Oracle Corporation. He has over 15 years of experience helping organizations design, develop and implement analytically based solutions. His areas of expertise include data mining, text mining and statistics. Prior to joining Oracle, Freud was at SPSS Inc., where he worked in both the marketing and professional services organizations. He may be reached at doug.freud@oracle.com.

Rohit Tandon is director of Business Intelligence and Data Warehousing at Oracle Corporation, where he is responsible for strategy, planning and implementation of solutions for his client base. He has 15 years of IT consulting experience, with the last 10 years focused on program managing and architecting BI/DW and predictive analytics solutions for large clients. His current focus is around creation of technology strategies and their implementation, allowing enterprises to move toward customer centricity. He may be reached at rohit.tandon@oracle.com.


