The Rise of Predictive Analytics: Inside Scoop with Eric Siegel

CustomerThink Founder/CEO Bob Thompson interviews Eric Siegel of Prediction Impact about the growth in predictive analytics applications and how companies can reap the rewards.

Interview recorded April 2, 2013. Transcript edited for clarity.

Bob Thompson:
Hello, this is Bob Thompson of CustomerThink, and welcome to another episode of my Inside Scoop interview series. This time, my guest is Eric Siegel, President of Prediction Impact, a firm that provides analytic services to businesses. He’s been an analytics expert for many, many years, and he is also the founding Conference Chair of a great conference called Predictive Analytics World and another conference, Text Analytics World. Both are excellent, and I highly recommend them.

Today, we’re going to be talking about Eric’s new book called – no surprise here – Predictive Analytics. The subtitle I find quite intriguing: “The Power To Predict Who Will Click, Buy, Lie or Die.” Very clever. Eric, welcome to Inside Scoop. It’s great to catch up with you again.

Eric Siegel:
Great to be here, thanks, Bob.

Bob Thompson:
All right, well first of all, I just want to say congrats on your new book. I have read through it, and it’s really a fine piece of work. This topic of predictive analytics or just analytics, in general, it’s a little bit geeky, let’s be honest.

Eric Siegel:
I hope so.

What is Predictive Analytics?

Bob Thompson:
It can be a little bit daunting for business people, but you did a very nice job of not only demystifying what it means, but providing some wonderful examples. We’ll get into that in just a moment here. Let’s start with something very, very basic for those who are still learning about predictive analytics. Could you give a quick definition of predictive analytics and maybe contrast it with some other forms of analytics that people might have heard of?

Eric Siegel:
Sure. Well Bob, you just mentioned the subtitle of the book. That’s the shortest definition: “The Power To Predict Who Will Click, Buy, Lie or Die.” Those are just four examples. This is technology that gives organizations the power not just to predict the future, but to influence the future, because it’s prediction on a per-individual basis. In most applications I discuss in the book, it’s per individual person. So you, I, everyone is being predicted by all kinds of organizations: companies, governments, law enforcement, universities, non-profits, even presidential campaigns.

By predicting whether you’re going to click, buy, comply, or steal something, whether you’ll thrive in healthcare applications, whether you’re going to donate, whether you’ll crash your car for insurance, or which candidate you may vote for, organizations can drive millions of per-person decisions and actions. It’s the most actionable form of business intelligence in that sense, and quite distinct from forecasting.

Everyone’s familiar with this idea of looking to the future overall: is the economy going to go up or down, is this presidential candidate or the other going to win a particular state of the United States? Predictive analytics instead makes a prediction for each individual voter. Nate Silver became very famous for forecasting the election outcome, and he won at that game, but the Obama campaign analytics team was competing to win the election itself, by driving campaign activities on a per-voter basis.

Bob Thompson:
The plain definition would seem to be simply that predictive analytics is about predicting, but doing it at an individual level. Don’t you find that there’s some confusion out in the market, Eric? For example, I see very commonly, probably 80 or 90 percent of the time, people will show a correlation, here are two things that are correlated, and then they’ll immediately leap to the conclusion that if you do one of these things, it will drive the other thing. In other words, they confuse correlation with causation. The reason I think predictive analytics is so cool and so powerful is that it should be truly predictive. What makes it predictive versus just an interesting correlation?

Eric Siegel:
It is predictive because it’s an interesting correlation. We generally don’t have the luxury of conclusions about causality, which is exactly the problem with most kinds of analytics in reports: they show you trends, and they inform intuition and hunches, that is to say, conjecture about what the causality might be. But the fact is, causality is a very elusive, philosophically profound thing, generally out of our grasp in any concrete way, whereas correlations that are predictive help us predict.

So, suppose a predictive model has been attained by an algorithm looking over the data – the data is the history, so you’re learning from history – and this model looks at an individual, say a voter or customer, considers several or dozens of attributes, and together they say, “These things indicate this person is much more likely than average to exhibit the behavior, buy something or what have you.” That’s a correlation. Or those individual attributes combine into an even stronger correlation. The correlation says, “We can predict and, therefore, act upon this expected outcome.” We’re not saying it’s causative. If the person is likely to behave this way because, let’s say, they’re female, it’s not necessarily because they’re female in terms of the way the world works, but the fact that we know they’re female can help us predict.
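To make that concrete, here is a minimal sketch of a model combining a few attributes into a per-individual prediction. It assumes scikit-learn and pandas, and the tiny dataset and attribute names are invented purely for illustration:

```python
# A minimal sketch: a model combines several attributes into one probability.
# The data and column names here are made up for illustration.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Each row is one individual; each column is one attribute (the "history").
data = pd.DataFrame({
    "is_female":     [1, 0, 1, 1, 0, 0, 1, 0],
    "visits_per_mo": [9, 2, 7, 1, 3, 8, 6, 2],
    "opened_email":  [1, 0, 1, 0, 0, 1, 1, 0],
    "bought":        [1, 0, 1, 0, 0, 1, 1, 0],  # the observed behavior
})

model = LogisticRegression()
model.fit(data.drop(columns="bought"), data["bought"])

# The model outputs a per-individual probability: a correlation put to work,
# with no claim about *why* these attributes track the behavior.
new_person = pd.DataFrame(
    {"is_female": [1], "visits_per_mo": [5], "opened_email": [1]}
)
print(model.predict_proba(new_person)[0, 1])  # estimated likelihood of buying
```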

Bob Thompson:
Right, but don’t these models need some real-world validation to make sure that, if you’re using them to predict the future, they actually do predict the future?

Eric Siegel:
They do predict the future, but we don’t know if it’s because of causation. We don’t know if it’s cause and effect, but we do know and we do validate. All models are always validated over data that’s held aside, not used to create the model. That’s the best practice.
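Here is a minimal sketch of that best practice, holdout validation, assuming scikit-learn and a synthetic stand-in dataset:

```python
# A minimal sketch of holdout validation: judge the model only on data it
# never saw during training. The dataset here is synthetic, for illustration.
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, random_state=0)  # stand-in history

# Hold a quarter of the data aside; it plays the role of "the future."
X_train, X_held, y_train, y_held = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# Validation measures predictive power, not cause and effect.
scores = model.predict_proba(X_held)[:, 1]
print("held-aside AUC:", roc_auc_score(y_held, scores))
```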

Bob Thompson:
I last interviewed you four years ago. A lot has changed. There’s a lot of hype right now about big data. Analytics is on the rise, there are lots of books out including your new book. What do you see as the big changes over the last few years?

Eric Siegel:
The biggest change is just the penetration, in terms of how much analytics is being used by organizations. The vast majority of large organizations, for example, are using it. There’s deep penetration in certain financial, marketing and fraud detection applications, where it’s really widespread. And then, in general, there’s greater awareness. More and more people are seeing the successful case studies and realizing that they need to start moving on it. In terms of actual deployments, there are always new surprises, new ways this can be used beyond target marketing and predicting whether you’re going to buy. There are also new advanced methods.

A couple are covered in the book; one is called an “ensemble model.” If you have a model that’s made relatively simply, like business rules, essentially, also known as a decision tree, with a bunch of rules in it, it can be a pretty effective model. But if you really want to supercharge the model and improve the precision, it turns out you don’t necessarily need to get more sophisticated mathematically. You just clump a bunch of models together and have them vote. That’s very much the same as the wisdom of a crowd of people. We find the same thing with a crowd of models instead of people.
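Here is a minimal sketch of that idea, comparing a single decision tree to a voting crowd of trees (a random forest). The data is synthetic and the numbers are only illustrative:

```python
# A minimal sketch of an ensemble: one decision tree versus a "crowd" of 100
# trees whose votes are aggregated. Synthetic data, for illustration only.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_informative=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

one_tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# 100 trees, each seeing a different resampled slice of the history, then
# voting; their individual errors tend to come out in the wash.
crowd = RandomForestClassifier(n_estimators=100, random_state=0).fit(
    X_train, y_train
)

print("single tree accuracy:", one_tree.score(X_test, y_test))
print("voting crowd accuracy:", crowd.score(X_test, y_test))
```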

Bob Thompson:
Didn’t Nate Silver do something like this by aggregating results from multiple polls?

Eric Siegel:
Yes.

Bob Thompson:
He didn’t do the original polling. He actually aggregated results from other organizations, right? Is this something like what you’re talking about?

Eric Siegel:
Yeah, that’s a great comparison. It’s totally analogous. In fact, I would say that both tap into the same principle. The wisdom of a crowd of people, a crowd of polls, a crowd of predictive models: three different things, but in any case the individual models are like people who are going to make errors in judgment. When you aggregate them together, it comes out in the wash; they compensate for one another’s weaknesses. The difference with Nate Silver is that it wasn’t predictive analytics, in that he was coming up with overall measures across all the voters in an individual state like Ohio, versus predictive analytics, as used by the Obama campaign, where they actually made predictions per individual voter.
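A toy illustration of that error-canceling principle, with invented numbers standing in for individual polls or models:

```python
# A toy illustration of the "crowd" principle: individual noisy estimates err,
# but their average tends to land closer to the truth. Numbers are invented.
import random

random.seed(0)
truth = 50.0
estimates = [truth + random.gauss(0, 10) for _ in range(200)]  # noisy "polls"

worst = max(estimates, key=lambda e: abs(e - truth))
avg = sum(estimates) / len(estimates)
print(f"worst single estimate off by {abs(worst - truth):.1f}")
print(f"average of the crowd off by {abs(avg - truth):.1f}")
```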

What’s the Value of Big Data?

Bob Thompson:
Right, right, I understand. So, we’ve talked a little bit about big data. I want to explore that just briefly here. When I look through the examples that you gave, some of them might suggest some big data. Many of them really look like applications for regular old data, which can be plenty big and complicated. In your view, is what I would call hype about big data really justified? Is this really a big opportunity for companies, or should companies be looking at more traditional sources for analytics in transactional data and other things that they’re already collecting but maybe not using?

Eric Siegel:
I would say all the hype around big data is sort of justified. The original definition of big data was pretty strict: the data had to be so large that it wasn’t manageable by a traditional database, so it was unwieldy, super big data. Tens of thousands of rows of customer data, or even hundreds of thousands, does not qualify for that definition, and the fact is there’s often more challenge in learning from a smaller amount of data. But most uses of “big data” that I see today are really just talking about data, the fact that there’s more of it today than there was yesterday and there will be more tomorrow, and all the excitement around the value of data. The problem is it’s sort of a silly buzzword, because it’s just a grammatically incorrect way to say a lot of data, and it does not speak to the value. The value of data is that it’s an experience from which to learn how to predict.

Predicting Mortgage Churn

Bob Thompson:
Let’s go into an example. Again, one of the things I love about your book is it covers a lot of territory. You’ve got examples in government and healthcare, in business, our personal lives. It’s really amazing. Let’s stick with business though, for this discussion. Can you share an example from your book of how predictive analytics helped a business solve a problem and get some kind of business benefit from it?

Eric Siegel:
Sure. There’s a juicy case study in the central chapter of the book, where you get to delve into predictive model details in an accessible form, and it’s Chase Bank. I mentioned a moment ago that one of the main changes is that we’re seeing lots of new uses of predictive modeling, both in terms of what gets predicted and how those predictions are used. For example, Google, beyond predicting what’s going to be a good search result – and they do use predictive modeling for that, as you might expect – also has a clever use of it for ads.

What Chase Bank did was predict, for mortgage holders, a certain kind of mortgage risk. It wasn’t whether the bank will never get paid back, which is the traditional use of the word “risk” in financial applications, but rather whether the bank will get paid back too fast: the customer pays off the whole mortgage all of a sudden, much more quickly than originally planned, and the bank loses out on all the expected future interest payments, which is why it was in the mortgage business in the first place. It usually indicates that the customer has defected to a neighboring bank to refinance the mortgage. It may also be that they’ve sold their house. Either way, you’ve lost the customer.

Bob Thompson:
So, the risk here is that they’re just going to refi and take their business elsewhere, and you’ve lost that income stream?

Eric Siegel:
Right, exactly. Now traditionally, that’s called attrition modeling or churn modeling, predicting which customers are leaving, and it’s used by marketing to target outreach for retention purposes. In this case, though, what Chase Bank actually did was use that prediction to inform the estimate of the value of the mortgage, to help make intelligent decisions, one mortgage at a time, as to whether to sell the mortgage to neighboring banks, and to make a better decision than the one dictated by the market-standard value based on a simpler profile of the mortgage. They attributed a major windfall to driving those sell-or-keep decisions over millions of mortgages, just in the first year of improving their decision-making with those predictions.
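To illustrate the shape of that decision, here is a minimal sketch; the valuation formula, probabilities and dollar amounts are all hypothetical, not Chase’s actual model:

```python
# A minimal sketch of using a churn-style prediction to value a mortgage and
# drive a per-mortgage sell/keep decision. All numbers and the prepayment
# probabilities below are hypothetical.
def mortgage_value(expected_interest: float, p_early_payoff: float) -> float:
    """Expected value of holding the mortgage: future interest, discounted
    by the predicted chance the customer pays it off early and the interest
    stream disappears."""
    return expected_interest * (1.0 - p_early_payoff)

market_offer = 12_000.0       # what a buyer would pay for this mortgage
expected_interest = 20_000.0  # interest earned if held to the planned term

for p in (0.1, 0.5, 0.9):     # the model's predicted prepayment probability
    hold = mortgage_value(expected_interest, p)
    decision = "SELL" if market_offer > hold else "KEEP"
    print(f"p(early payoff)={p:.1f}  hold value=${hold:,.0f}  -> {decision}")
```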

Adoption Challenges

Bob Thompson:
I want to spend a few minutes and really talk about the challenges of companies, in general, in getting these sorts of benefits. Do you feel like predictive analytics has really hit the mainstream? I read a blogger who’s been in the space for a while who made that statement, and I kind of questioned whether it was true. What do you think? Is it mainstream or is it still kind of coming into its own?

Eric Siegel:
Well, I’d say it’s still coming into its own. It’s mainstream if you mean most people have heard of it. I think most Fortune 500 managers have heard of it, and maybe half of them actually know how it’s specifically defined and how it differs from forecasting, so that’s pretty mainstream. In terms of the way organizations are using it: most larger organizations, although using it for certain applications within certain initiatives and generally seeing great success, would probably correctly self-assess that they’re really only scratching the surface of the potential. I do think we’re now in the process of not just a technology shift, but a culture shift. Embracing this technology is more than just a technical endeavor, very much so.

Bob Thompson:
So, what’s holding people back? The technology has been in the market for many years. There have been companies selling the technology. Services firms like yours have been around for 10, 15, 20 years. What’s holding back the mainstream adoption of this technology?

Eric Siegel:
I think the challenges to increased adoption are probably three-fold: culture, implementation and personnel. In terms of culture, there’s the matter of understanding what this stuff is. It’s not as complex as you might have thought, but it does take a little attention to detail to understand what the value proposition is.

Then there’s implementation, not just on the technical side, but in terms of operations. The per-individual predictions are generated by this great technology, which has now migrated well outside the lab and is being deployed. That’s great, but once you have these predictions, that’s not enough; you have to act on them. They have to be integrated into the marketing, risk management, fraud detection, presidential campaign or whatever the operation is. The operation has to take these per-case predictions into account, to change what it’s currently doing and essentially improve it in some way. There’s going to be certain resistance, certain trust issues, and there are simply changes to process.
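The integration step can be as plain as routing each scored individual into a different workflow. A minimal sketch, with hypothetical scores and a business-chosen threshold:

```python
# A minimal sketch of acting on per-individual predictions: scores only add
# value once they change an operation. IDs, scores and the threshold are
# hypothetical.
customers = [
    {"id": 101, "churn_score": 0.82},
    {"id": 102, "churn_score": 0.15},
    {"id": 103, "churn_score": 0.64},
]

RETENTION_THRESHOLD = 0.6  # set by the business, not by the model

for c in customers:
    if c["churn_score"] >= RETENTION_THRESHOLD:
        # Integration point: route into the retention campaign workflow.
        print(f"customer {c['id']}: enqueue retention offer")
    else:
        print(f"customer {c['id']}: business as usual")
```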

And the third one I mentioned is personnel. Oftentimes a great deal of this, the core technology, can be outsourced; there are lots of consultancies. But the value of having someone in-house who’s an expert, who has experience deploying these projects at other organizations or at your organization already, is very large, in part because of not only the technical side, but also having the vision of what needs to happen at the organization and helping drive it, seeing the carrot at the end of the stick and pushing things forward. People with those skills are in great demand at this time.

Role of Data Scientist

Bob Thompson:
It seems like you’re talking about a couple of different things that are important from an organizational point of view. One is leadership. Leadership needs to appreciate what the technology can do and create a culture where managers will look to data and to analytics to inform their decisions, or at least partly inform their decisions, rather than going on history and maybe some habits they’ve had. I’ve certainly heard in many interviews that if companies don’t encourage this from fairly high levels, it’s very easy to stick in a rut: we’ve always done it this way, why do we need to improve this process, we’re making enough money, and so on.

I’ve also been hearing a lot about this so-called data scientist over the last year or more. What’s your take on that, Eric? Is that the real problem – there are not enough data scientists out there? Or is it perhaps that maybe some of the technology is still too complicated, where you need somebody that’s very technical, very sophisticated in this area to make it work?

Eric Siegel:
That’s a great question. It’s so multi-faceted, and it depends on the context in the organization. The fact is, no matter how advanced a technology gets and how friendly the user interface that wraps it is, predictive analytics at some stage of the project does need expertise. I don’t necessarily mean a Ph.D. with a real research-focused background in the inner workings of the predictive model, but it’s still relatively technical just to figure out: what do I have to do with today’s data, sitting in today’s database, to put it in the right form and format so it’s ready to be used by this technology, so that it’s what’s called training data, data from which to learn, basically a bunch of rows with one example, an individual, per row? That transformation is not really rocket science, but it does take great attention to detail and a level of database programming.

Attention to detail means asking, “What do I need to predict about these people?” It’s not just responses or purchases; it’s who’s going to purchase more than three items within this time window, with a profit margin above this amount. I’m just making this up, but those are the levels of detail that define it. It’s often a yes/no prediction goal, and defining it is a quasi-technical thing. There are a bunch of things you can’t get around, and they’re specific to your organization. You need people who have the experience to help drive the process and make sure it stays on track.
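Here is a minimal sketch of that transformation, rolling raw transactions up to one row per individual and pinning down a hypothetical yes/no prediction goal; the table and column names are invented:

```python
# A minimal sketch of shaping raw transactions into training data: one row per
# individual, plus a yes/no label such as "bought more than three items."
# Table and column names are made up for illustration.
import pandas as pd

transactions = pd.DataFrame({
    "customer_id": [1, 1, 1, 1, 2, 2, 3],
    "item_price":  [10.0, 5.0, 8.0, 12.0, 30.0, 4.0, 7.0],
})

# One row per individual: aggregate each customer's history into attributes.
training = transactions.groupby("customer_id").agg(
    n_items=("item_price", "size"),
    total_spend=("item_price", "sum"),
).reset_index()

# The quasi-technical part: pinning down the exact yes/no prediction goal.
training["label"] = (training["n_items"] > 3).astype(int)
print(training)
```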

Bob Thompson:
So this adoption problem, in your view, is not going to be solved just by easier-to-use software. There still needs to be somebody, or a group of people, who can marry technology capabilities with business problems. And leadership, I would expect, can help make sure that change actually happens.

Eric Siegel:
Exactly, well put.

Bob Thompson:
I know you work with clients and you run these conferences and speak about this a lot and do research. Is there anything that really stands out in your mind aside from these resource issues we just discussed that you’ve seen as a common problem or obstacle that companies might want to watch out for?

Eric Siegel:
Sure. Probably the first common obstacle that most practitioners and consultants would name has to do with doing the technical side before the business side: figuring out what you’re going to predict, analyzing the data, and creating the predictive models before it’s become entirely clear exactly how that’s going to be valuable in action for the organization, and before you’ve gotten buy-in for the initiative. So, sort of putting the cart before the horse. I’d also refer people to a great list by John Elder. I think it’s called “Top 10 Data Miner Mistakes.” He’s mostly referring to predictive analytics; data mining can mean other things as well. There’s a YouTube video, there’s a chapter in one of his books, and so on, so that would be a way to see a more comprehensive view. There are a bunch of well-trodden pitfalls.

Bob Thompson:
OK, but putting the technology cart before the business horse is certainly one I’ve heard about in a lot of other disciplines, CRM and so on. So it makes sense here, as well.

Eric Siegel:
Yeah.

Bob Thompson:
Eric, thank you so much, it’s great talking with you. I think the leverage of predictive analytics is so huge that I would certainly like to see more companies use it. Since CustomerThink is about customer-centric business, there are tremendous applications in there, as well, to do things that will help companies provide the right products, services and experiences for customers. I know you’ve illustrated some of those in your book, as well. Again, congrats on your book, and I appreciate your time today on Inside Scoop.

Eric Siegel:
My pleasure. Thanks for the opportunity, Bob.

Eric Siegel
Eric Siegel, PhD, founder of Predictive Analytics World and Text Analytics World, author of "Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die," and Executive Editor of the Predictive Analytics Times, makes the how and why of predictive analytics understandable and captivating. Eric is a former Columbia University professor who used to sing educational songs to his students, and a renowned speaker, educator and leader in the field.
