The Analytics Paradox: Why good forecasts go bad and how to predict more profitably


There is a universal paradox that says: “You are free to choose, but you are not free from the consequences of your choice.” In the world of big data and analytics, this could not be a truer statement. In a bid to stay ahead of the game, companies large and small are investing heavily in building out their data analytics infrastructure for faster, better and more profitable decision-making.

But in our interview with Bruce Rebhan, Sr. Director for Analytics & Business Intelligence at Amtrak, a company that employs more than 20,000 people and serves more than 30 million passengers annually, he posed a pointed question: once the analytics group has produced its first big ‘a-ha!’ moment of insight, will senior management be ready? And are they ready for the consequences of their action (or inaction)?

This fascinating interview reveals why good forecasts can sometimes go bad, how analytics can effectively bring about culture change, and how it can really spur revenue growth – if used the right way. Read on.

 

Corinium: You’ve had experience in both IT and corporate strategic roles in the past. How have you successfully bridged the gap between these sometimes conflicting departments?

Bruce Rebhan: The single best thing business organizations can do to improve the quality of their partnership with IT is to make sure they understand the traditional software development process that IT organizations usually follow to deliver systems to their clients. That process is called the Software Development Lifecycle, or SDLC.

The SDLC process usually starts with requirements gathering, where IT asks the business what the system they’re building needs to accomplish. This is a logical step: before IT can design and build technology, they need a very good understanding of what that technology needs to be able to do and how it will be used. However, this step assumes that we can find people who can articulate, in great detail, everything the system could possibly need to do. That works very well if we are automating a process. For example, maybe we’re building an ecommerce site to handle shopping and sales transaction processes. That process is a well-known quantity to the sales and marketing team; a business leader can define it in great detail from beginning to end, and IT will have good, detailed requirements to work with.

But when you are talking about data and analytics, the outcomes we are trying to deliver are often largely unknown quantities, and the business side won’t have a very good answer if they are asked “What do you need?” Instead, at the start, the two teams need to work together to answer some pretty superficial questions (e.g. what were our sales by channel last week?). The answers to that first set of questions spur a new set of diagnostic questions (which channels are growing and which are shrinking?), which in turn lead to more valuable questions (what are the likely drivers of revenue trends in a particular channel?).

Another old habit of IT projects is to divide the work into isolated work streams that let teams work largely alone. A client team comes together with the requirements team, then the requirements analyst works with the design team to design a solution, and then the design team meets with the development team who will actually build the product. This assembly-line approach means that any miscommunication, assumptions or changes are going to propagate across the process without being flagged, and you will end up with something that does not get used.

The approach I’ve used with my teams is to steal from the product development discipline and methodology. If you talk to a product manager, they will usually gravitate toward three things: customer centricity, speed to market, and failing fast.

A product owner has to be customer-centric: someone who understands the business customer very well and knows how they might use better data to do their job. Instead of asking what the customer wants, the product owner should become an expert in the customer’s problems. If I know your problems and challenges and opportunities very well, then I don’t need to ask what solution you need; I can develop a solution for you.

 


 

And then speed to market comes into play: I want to put that idea in front of the customer very fast, in the form of a prototype or wireframe or Excel mock-up or anything I can use to convey the essence of the solution in a way my customer can grasp. When the customer sees the proposed solution, their reaction will take the form of ‘I like that’, ‘I don’t really like or need that’, and ‘I see you did X, but could you do Y instead?’, which leads us to a new iteration of the product.

Now we see the benefits of failing fast: the customer feedback will be rich in content, because they are reacting to a real solution in front of them instead of trying to imagine a perfect, final solution in their head. Then you start the process of iterating on that initial solution: enhancing the good things, replacing the bad things, and adding more automation to make it faster, cheaper and more robust. As you iterate, it becomes the two parties iterating together, because you have a good sense of a shared goal.

That’s a big piece of what my role has become: talking to business leaders to really understand their business challenges in depth, to understand the problems well enough that data solutions start to present themselves. After I get feedback from business leaders, I work with the data and analytics teams to enhance the good parts and replace what doesn’t work with something that does.

 

Corinium: As an analytics and Business Intelligence (BI) expert, how do you effectively predict customer behaviour?

Bruce Rebhan: It might sound obvious, but before we build a forecast model, the first thing we need to do is decide exactly what we’re trying to predict. Also, how accurate do we have to be, and what is the margin for error? How far in advance do we have to predict it? And most importantly, what kinds of actions would I expect to take upon making successful predictions, and how much value would those actions deliver to the bottom line? These questions should lead us to an ROI value, where I understand both the value of predicting some aspect of customer behaviour and the high-level investment needed to build or enhance my forecasting discipline. I have seen several cases where a forecast was being asked for without any sense of what the recipients of that forecast would actually do with it.
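
To make that ROI framing concrete, here is a back-of-the-envelope sketch in Python. Every figure in it is a hypothetical assumption for illustration, not a number from the interview:

```python
# Back-of-the-envelope ROI for a forecasting investment.
# All figures below are hypothetical assumptions.
annual_value_of_action = 750_000  # revenue gained or protected by acting on the forecast
forecast_hit_rate = 0.70          # share of predictions accurate enough to act on
build_and_run_cost = 300_000      # annual cost to build and maintain the model

expected_value = annual_value_of_action * forecast_hit_rate
roi = (expected_value - build_and_run_cost) / build_and_run_cost
print(f"Expected value: ${expected_value:,.0f}, ROI: {roi:.0%}")
# -> Expected value: $525,000, ROI: 75%
```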

Target Stores got a lot of attention a few years ago for using customer purchase data to predict which of its customers might be pregnant: certain combinations of vitamins, new furniture and accessories, and new clothes turned out to be really good indicators. I bring up that story because Target picked a very high-ROI problem to solve. Pregnant women represent a real gold mine to retailers, since they are about to spend a great deal of money on products they haven’t historically purchased. So if Target can identify a pregnant woman early and give her coupons to capture more of that spend, Target not only gets the dollars from the end consumer but, in my opinion, also gains better footing with the consumer packaged goods companies (CPGs) who make all those baby products.

Now, to actually start forecasting behaviour, I would start with a really simple model, one that assumes the thing you are trying to forecast can be predicted by evaluating just a few variables, even if you know that what you’re trying to forecast is much more complex. The key is to make your forecast and compare your predicted values to the actual values that occur, so that you can measure the accuracy, or error rate, of the model. That error rate gives you a baseline for the performance of your model. If the performance is particularly good or bad on a certain day or for a certain segment, it should prompt you to want to understand why, and that will lead you to test ideas about new variables that drive performance and eventually improve your model.

If you are publishing a forecast of a discrete number (say revenue or number of units purchased), you should always publish a range of values. A range gives your audience a good idea of how confident you are in your prediction and helps put the prediction into context. Always keep testing new things, and keep in mind that reality changes over time. If customers are becoming more price sensitive, for example, then your model is going to start producing a worse error rate unless you modify it to keep it in tune with the changing nature of reality.
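
As a minimal sketch of that workflow (a deliberately simple model, a measured error rate, and a published range), consider the following Python example. The data and variable names (weekly bookings driven by price and a promotion flag) are hypothetical illustrations, not Amtrak's:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical history: two years of weekly bookings explained by
# just two variables (all numbers are made up for illustration).
weeks = 104
price = rng.uniform(40, 80, weeks)        # average ticket price that week
promo = rng.integers(0, 2, weeks)         # was a promotion running? (0/1)
bookings = 5000 - 30 * price + 400 * promo + rng.normal(0, 150, weeks)

# Fit a deliberately simple linear model: bookings ~ price + promo.
X = np.column_stack([np.ones(weeks), price, promo])
coef, *_ = np.linalg.lstsq(X, bookings, rcond=None)

# Baseline error rate: compare predictions to actuals (here, MAPE).
pred = X @ coef
mape = np.mean(np.abs((bookings - pred) / bookings))
print(f"Baseline MAPE: {mape:.1%}")

# Publish a range, not a point: +/- two standard deviations of the
# historical errors (roughly a 95% band if errors are near-normal).
resid_sd = np.std(bookings - pred)
next_week = np.array([1.0, 55.0, 1.0])    # assumed inputs for next week
point = next_week @ coef
print(f"Forecast: {point - 2 * resid_sd:,.0f} to {point + 2 * resid_sd:,.0f}")
```

Tracking that MAPE over time, and by day or segment, is what surfaces the "particularly good or bad" cases worth investigating.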

 


 

Corinium: How do you support the Chief Revenue Officer (CRO) in the ultimate goal of revenue generation across all channels?

Bruce Rebhan: I am very fortunate to work with great leadership in Amtrak’s revenue organization, particularly Jason Molfetas, our Chief Marketing and Business Development Officer, and Bob Dorsch, who heads up Product Management. I think the most important thing I can do for them is to help them find better ways to measure the performance of the organization. The word “better” can take many different meanings: it could mean more accurate, more interactive, a more detailed level, or fresher data.

For example, one company I worked with ran regular promotions for different channels and different markets. At the time, they measured the performance of a promotion by the amount of revenue generated from people clicking on the offer and using the discount to buy the product. I was able to convince the marketing leaders that we needed to compare promotions to control groups of customers who didn’t receive the promotion. The control group exists to help us understand what purchase behaviour looks like if we don’t run a promotion at all. With a promotion, maybe I get incremental purchases that I wouldn’t get without a discount, but maybe I’m also discounting purchases that customers would have made anyway. A control group lets me make a direct comparison: one promotion gives me a 10% lift in volume at a 5% drop in price, and that’s a winner; another gives me a 5% lift in volume at a 10% drop in price, and that’s no good. And what we found was that one of our promotions was capturing a lot of revenue, but only because it was targeting a really rich customer segment, not because it was a particularly well-designed campaign.
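
A minimal sketch of why one of those promotions is a winner and the other is not, using the hypothetical percentages from the example above (lift here means volume measured against the control group, so purchases that would have happened anyway are not counted as wins):

```python
def promo_revenue_impact(volume_lift: float, price_drop: float) -> float:
    """Net revenue multiplier versus the control group.

    volume_lift: fractional increase in units vs. control (0.10 = +10%)
    price_drop:  fractional discount applied (0.05 = 5% off)
    """
    return (1 + volume_lift) * (1 - price_drop) - 1

# 10% more volume at a 5% discount: roughly +4.5% revenue -- a winner.
print(f"{promo_revenue_impact(0.10, 0.05):+.1%}")
# 5% more volume at a 10% discount: roughly -5.5% revenue -- no good.
print(f"{promo_revenue_impact(0.05, 0.10):+.1%}")
```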

I also try to champion the customer-centric view as much as possible. Companies are usually organized around their own internal dynamics, not around their customer base. Most companies can very quickly tell you their numbers and trends for units sold and price per unit, because that data is easy to get. Then we slice and dice those numbers by store, or channel, or product family, because that’s how we collected the data. But many companies won’t be able to tell you the trend in number of customers over time, or how different customer segments have performed, or the purchase patterns of customers. And that is for a very simple reason: many companies only collect the data required to support sales transactions, namely product, price, store location, discount, date and time. To understand the customer, I need to know some basics like age, where they live, their lifestyle and spending, and their likely income. That is data I don’t usually need to complete a sales transaction, and so I might not collect it at the point of sale. So part of my job is to think about the data we need for analysis and make sure that the front-line systems will be collecting it, as in the sketch below.
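
To make that gap concrete, here is a minimal sketch contrasting the two kinds of records. The field names are hypothetical illustrations, not any real point-of-sale schema:

```python
# Hypothetical record types illustrating the gap between what a
# point-of-sale system collects and what customer analysis needs.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class SaleTransaction:
    """What front-line systems usually capture: enough to close the sale."""
    product: str
    price: float
    discount: float
    store_location: str
    timestamp: datetime

@dataclass
class CustomerProfile:
    """What analysis needs but the transaction itself never requires."""
    customer_id: str
    age: int
    home_region: str
    estimated_income_band: str
    lifetime_purchases: int
```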

 

Corinium: What is the most frequent reason that organisations fail in their attempts to become more data driven?

Bruce Rebhan: I think there are two big reasons behind a company’s failure to become more data driven. The first is refusing to acknowledge and address the sizable culture shift that has to occur. Many organizations focus on building out their big data and technology infrastructure and their data governance, and invest a lot of time hiring and/or developing the analytics skill sets they need. These are tangible, visible and important investments, but they are not enough by themselves, because at some point that analytics group is going to produce its first big ‘a-ha!’ moment of insight: a set of analyses that questions a major element of conventional wisdom. Will senior leadership be ready for that moment? Will they be able to set aside what they know about the business to consider whether the analysis might have some truth to it? For example, maybe one of the company’s legacy products turns out to play a very minor role in customers’ repeat purchase patterns. If the culture shift is not in place, those business leaders will ignore the findings, or declare the analysis to be wrong, and business as usual continues.

 


 

Another big obstacle to success is the failure to take action based on the analytics and insights that are produced. If the analysis says that certain products are unprofitable, what will we do about that? How do we ensure the organization connects the dots? I look for ways to institutionalize the actions we should be taking based on the results of the analytics. If we are spending the time and money to monitor the numbers and publish a dashboard, then we also need to be tracking the actions we take. If the analytics start to indicate an emerging issue with a given set of customers or channels, then the dashboard should tell its consumers what action needs to be taken: raise prices, push a new promotion, and so on. Organizations that publish dashboards or other analytics should also track which parts of the business are more likely to take action based on the numbers, and publicize the results those teams are getting. If senior leaders see one division of the company accelerating its progress against goals by leveraging the numbers, then other divisions will start to get asked about their numbers. And the next thing you know, your company starts to become a data-driven organization!

 

***

About Bruce Rebhan:

Bruce Rebhan is Sr. Director, Analytics & Business Intelligence at Amtrak. He is an executive with more than 20 years of experience leading global teams in corporate and management consulting environments, and drives cultural, strategic and technical change for enterprises seeking outsize returns from investments in data analytics and technology.

He has a track record of elevating the importance of data and analytics by authoring board-level strategies and roadmaps, combined with quick wins that highlight the potential value delivered.

He applies BI to drive action and measurable impact to the top and bottom line, bringing a multi-disciplinary focus that ties together revenue strategy, technology architecture, product management and data analysis.

Bruce holds an MBA with Honors from the Kellogg School of Management at Northwestern University, an MS in Signal Processing & Analysis from the University of Southern California, and simultaneous degrees (a BA in English Literature and a BS in Electrical Engineering) from the University of California, Berkeley.
