Saving Lives With Analytics

Fortune has a brief article on aneurysm-spotting analytic software developed by IBM in collaboration with Mayo Clinic (HT to Satish Bhat for bringing this article to my attention).

To help in their aneurysm hunt, radiologists at Mayo Clinic use special software developed with IBM that analyzes a three-dimensional brain scan. Computer algorithms process information in the images, pick out abnormal areas where fragile blood vessels might be hiding, and flag the potential trouble spots for Mayo doctors. So far the results are promising. In trials the software found 95% of aneurysms; a typical radiologist would have found 70%.

95% vs. 70%. How many lives saved as a result? I couldn’t find anything in the article on this question, so I did some Googling.

Here’s what I found:

perhaps 25,000 to 50,000 people a year in the U.S. have a brain hemorrhage caused by a ruptured aneurysm.

Of these 25,000-50,000 people,

One-third to nearly half of patients have minor hemorrhages or “warning leaks” that later lead to a severe devastating brain hemorrhage days later.

So 8,000-25,000 people come in with a “warning leak”. Every one of their brain scans is presumably looked at by a radiologist. According to the Fortune article, radiologists have only a 70% success rate, so let’s assume that 30% of the scans (i.e., 2,500 to 7,500 people) are mistakenly read as normal and, therefore, left untreated. These patients return days later with a burst aneurysm. What happens next?

The overall death rate once the aneurysm ruptures is about 40%

So, between 1,000 and 3,000 patients will die because the aneurysm wasn’t caught during the first visit.

Now, let’s look at how the analytic software would perform. According to Fortune, the software yields a 95% success rate, so 5% of the scans (i.e., 400 to 1,200 people) will be mistakenly read as normal and left untreated. Of these patients, between 160 and 480 will die (using the same 40% death rate as before).

Incremental lives saved? Between 800 and 2,500 patients annually. Wonderful! Kudos to IBM and Mayo.
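For anyone who wants to poke at the assumptions, here is a minimal Python sketch of the same arithmetic. (Applying 5% to the unrounded 25,000 warning leaks gives 500 deaths rather than 480, which is where the tidy 800-2,500 range comes from.)

```python
# Back-of-the-envelope re-derivation of the numbers above.
def deaths(warning_leaks, miss_rate, death_rate=0.40):
    """Deaths among warning-leak patients whose scans are misread as normal."""
    return warning_leaks * miss_rate * death_rate

for leaks in (8_000, 25_000):                      # low and high estimates
    d_radiologist = deaths(leaks, miss_rate=0.30)  # 70% detection rate
    d_software    = deaths(leaks, miss_rate=0.05)  # 95% detection rate
    print(f"{leaks:>6,} warning leaks: {d_radiologist:,.0f} vs {d_software:,.0f} "
          f"deaths -> {d_radiologist - d_software:,.0f} lives saved annually")
```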

Here’s a little (hopefully) self-explanatory graphic. The blue box represents the incremental lives saved by the software; the red represents the lives that could be saved if the software’s accuracy goes to 100%.

p.s. I realize that numerous assumptions have been made in this back-of-the-envelope assessment. Feel free to criticize and improve. I just wanted to get a quick sense for how many lives would be impacted.

Factoids, Stories and Insights

Recently, The Economist had a special report titled “Data, data everywhere”. It examines the rapid increase in data volumes and their implications. The report got the attention of the blogosphere (example), and I recommend taking a look if you haven’t already.

When I read articles like these, I try to extract three categories of “knowledge” for future use: factoids, stories, and insights.

  • Factoids are simply data points that I feel might come in handy someday.
  • Stories are real-world anecdotes. The most memorable ones have an “aha!” element to them.
  • Insights are observations (usually at a higher level of abstraction than stories) that make me go “I never thought of that before. But it makes total sense.”

Think of this crude categorization as my personal approach to dealing with information overload. Of course, there’s a fair amount of subjectivity here: what I think of as an insight may be obvious to you and vice-versa.

So what did I make of The Economist article? There were numerous factoids that I cut and stored away (too many to list here, but email me if you want the list), a few memorable stories, and a couple of insights.

Let’s start with the stories.

In 2004 Wal-Mart peered into its mammoth databases and noticed that before a hurricane struck, there was a run on flashlights and batteries, as might be expected; but also on Pop-Tarts, a sugary American breakfast snack. On reflection it is clear that the snack would be a handy thing to eat in a blackout, but the retailer would not have thought to stock up on it before a storm.

Memorable and concrete. Neat.

Consider Cablecom, a Swiss telecoms operator. It has reduced customer defections from one-fifth of subscribers a year to under 5% by crunching its numbers. Its software spotted that although customer defections peaked in the 13th month, the decision to leave was made much earlier, around the ninth month (as indicated by things like the number of calls to customer support services). So Cablecom offered certain customers special deals seven months into their subscription and reaped the rewards.

Four months before the customer defected, early-warning signs were beginning to appear. Nice but not particularly unexpected.
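The article doesn’t describe Cablecom’s model, but the underlying pattern is easy to sketch: track a behavioral signal (such as support calls) by tenure month for eventual defectors versus loyal subscribers, and flag the month where the gap opens up. A hypothetical illustration in Python, with invented numbers:

```python
# Cablecom-style early-warning analysis (illustrative only; data is made up).
# Average support calls per subscriber, by tenure month, for the two groups.
defectors = [0.2, 0.2, 0.3, 0.3, 0.4, 0.5, 0.7, 1.1, 1.8, 1.9, 2.0, 2.1, 2.2]
stayers   = [0.2, 0.2, 0.2, 0.3, 0.3, 0.3, 0.3, 0.4, 0.4, 0.4, 0.4, 0.4, 0.4]

for month, (d, s) in enumerate(zip(defectors, stayers), start=1):
    flag = "  <- early-warning window" if d - s > 1.0 else ""
    print(f"month {month:2d}: defectors {d:.1f} calls, stayers {s:.1f} calls{flag}")
```

With these made-up numbers, the gap opens around month 9, well before the month-13 defection peak, which is exactly the window where a retention offer can still change the outcome.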

Airline yield management improved because analytical techniques uncovered the best predictor that a passenger would actually catch a flight he had booked: that he had ordered a vegetarian meal.

Hey, I knew this all along! Over 20 years, I have ordered vegetarian meals almost every time and have almost never missed a flight.

Just kidding. This came out of left field; I had never seen it before. While the claim that airline yield management improved substantially due to this single discovery feels like a stretch, the story is certainly memorable.

Sometimes those data reveal more than was intended. For example, the city of Oakland, California, releases information on where and when arrests were made, which is put out on a private website, Oakland Crimespotting. At one point a few clicks revealed that police swept the whole of a busy street for prostitution every evening except on Wednesdays, a tactic they probably meant to keep to themselves.

Worry-free Wednesdays! Great story, difficult to forget.

Let’s now turn to the two insights that stood out for me.

a new kind of professional has emerged, the data scientist, who combines the skills of software programmer, statistician and storyteller/artist to extract the nuggets of gold hidden under mountains of data.

This wasn’t completely new to me (I have friends whose job title is “Data Scientist”) but seeing the sentence in black-and-white crystallized the insight for me and made me appreciate the power of the trend. Particularly the point that the data scientist needs to be at the intersection of programming, stats and story-telling.

As more corporate functions, such as human resources or sales, are managed over a network, companies can see patterns across the whole of the business and share their information more easily.

What the author means by “managed over a network” is “managed in the cloud”. In my experience, data silos are all too common and this often leads to decisions being optimized one silo at a time, even though optimizing across silos can produce dramatic benefit.

I had not appreciated that, as data for more and more business functions gets housed in the cloud, data silos will naturally disappear and it will become increasingly easy to optimize across functions.

Well, that was what I gleaned from the article. If you “extract knowledge” in a different way than factoids/stories/insights, do share in the comments – I would love to know.

Applying Behavioral Economics To Retail

Recently, the McKinsey Quarterly published a brief article titled “A marketer’s guide to behavioral economics”. The author recommends four strategies for marketers, all inspired by research in behavioral economics.

Behavioral economics is, of course, a large and established field of academic research, complete with a Nobel Laureate (Daniel Kahneman). The academic work has been popularized in a number of books (examples: Nudge, The Winner’s Curse) over the past decade.

In my previous work at ProfitLogic/Oracle as well as my current consulting work with retailers, I have been on the lookout for opportunities to help my clients exploit these findings. Sadly, I have not come up with anything that isn’t already well-known or already being applied.

Against this backdrop, I was curious whether the McKinsey article had new insights to offer: something that retailers could put to use on Monday morning.

Let’s take a look at the four recommendations from McKinsey.

The Analytic Entrepreneur Series: Dr. Robert Phillips Interview Follow-up

My interview with Dr. Bob Phillips generated much interest and some great follow-on questions from readers. Bob responded to these questions in detail in a comment on the original post. I have reproduced the questions and Bob’s answers below to make them easier for readers to find.

When enumerating the reasons why companies don’t implement analytics, he listed “budget, organizational bandwidth, politics, or sheer inertia”. I wish you would have drawn him into expanding on that, particularly in the context of the current economy. One would suspect that feelings of uncertainty about the future would tend to hit this sector twice over: few companies have the stomach for investment, and fewer still believe in solutions that promise better decision-making when the future looks so murky.
-narayan

Bob: I definitely believe that the current financial situation has made some companies more reluctant to invest in new things. In many cases, the budget for innovative approaches has simply disappeared. Selling to financial services companies, I have seen plenty of that over the last year! Clearly, some financial services companies have also been rattled by the apparent failure of many of their current systems to anticipate the meltdown. On the other hand, these situations create opportunities to sell to companies who are looking for a new way of doing things. So, at least at Nomis, we are beginning to see a definite pickup in business and interest over the last four months.

Great interview with Bob! I would like to know from Bob if he has had customers tell him, "Prove to me that your analytics works by sharing both risks and rewards!" What has Bob’s experience been like in successfully establishing KPI-based compensation for his analytics solutions?

– Satish Bhat

Bob: Regarding the gain-share arrangements that Satish mentions, I have mixed feelings. I think it is important to offer gain-share and be willing to negotiate it if the customer insists, but I have had very few cases in which we have ended up concluding a gain-share agreement. Obviously, both parties need to agree on a very clear metric as a “fair” measure of benefit regardless of market changes, which has often been difficult. Secondly, in the areas in which I have worked, the potential upside has often been so large ($100s of millions in some cases) that companies have realized they would be better off with a fixed fee. Finally, I have been somewhat hesitant to enter gain-share arrangements because I don’t like to give customers a motivation, even an unconscious one, to try to minimize the benefits that one of our solutions has provided. I would rather have them out telling the world that my company’s solution made them $500 million than worrying about how large they can publicly say the benefits are.

But, I know that some companies have been quite successful with gain-sharing engagements, so it is probably partly just me.

Had a question for Bob – he alluded to budget, organizational bandwidth, politics, or sheer inertia being reasons why analytics isn’t implemented.

More often than not, when I run into cases where analytics isn’t embraced, there is a story that involves investments the company made, low subsequent adoption, and value that was hard to tease out and quantify, leading to a downward spiral that ends in a complete loss of faith. I suspect the reasons he mentioned all play a part, but can he comment on the notion of faith in analytics (as oxymoronic as it sounds), and what we in the analytical world can do better?

Vijay Subramanian
Engagement Director, Oracle Retail

Bob: I think there are two issues. One is “faith in analytics”, which varies from company to company and industry to industry, with financial services and airlines (to mention two industries with which I am quite familiar) ranking very high and, perhaps, hotels (especially ten years ago) ranking lower. Not much you can do about this except acknowledge that, in some cases, you are not only selling your solution, you are also selling the idea of analytics. The second issue is coming in in the wake of a failed engagement, in which someone else’s solution, for whatever reason, did not deliver the promised benefits and ended up abandoned. I guess the biggest reason that I see for this is a combination of a lack of appropriate senior commitment at the company initially and a lack of follow-up and true proactive support from the solution provider. The first six months following installation of a new analytic solution are critically important, and success often requires heroic efforts on the part of the provider that go beyond the traditional definitions of “support” and “maintenance”. The combination of a weakly committed management and an insufficiently active supplier is a recipe for rejection, no matter how “good” the solution might be in technical terms.

Sorry to be so long-winded; hope this is useful.

Rama: Bob, Thanks for the thoughtful and detailed answers. And thank you to the readers for great questions.

The Analytic Entrepreneur Series: Dr. Robert Phillips

Bob Phillips, a legend in revenue management and pricing circles, guest-lectured to my MIT Sloan analytics class last week. It was a treat for the students to listen to and interact with someone who I think of as the quintessential “Analytic Entrepreneur”.

As I listened to him, it occurred to me that it would be great to share his ideas and perspectives with the readers of this blog. More generally, there are great Analytic Entrepreneurs like him out there and it would be wonderful to hear about how they view the world.

So, that brings us to … drumroll, please … the Analytic Entrepreneur Series! Every so often, I will chat with an Analytic Entrepreneur and share the highlights of the conversation here. I am delighted to kick off the series with Bob Phillips.

First, some background on Bob (detailed bio).

Bob is currently Founder and Chief Science Officer of Nomis Solutions, a company that’s applying price optimization analytics to consumer financial services. In parallel, he is Professor of Professional Practice at Columbia University Graduate School of Business.

Before founding Nomis in 2002, Bob served as CTO of Manugistics and prior to that, he was Founder and CEO of Talus Solutions (I still remember reading the news reports when Manugistics announced its acquisition of Talus for $366m in the Fall of 2000).

Over the past 15 years, he has helped optimize price and revenue across a dizzying variety of industries including airlines, rental cars, hotels, automotive, electric power, freight transportation, and manufacturing. His 2005 book, Pricing and Revenue Optimization, is a model of clear writing and would be my hands-down first choice of textbook if I ever teach an MBA course on pricing.

Bob has a Ph.D. in Engineering-Economic Systems from Stanford and holds undergraduate degrees in Mathematics and Economics from Washington State.

OK, on to the Q&A.

What led you to the idea for Nomis?
When I was working at Manugistics, we did a small pilot project on pricing optimization for a bank. While this did not turn into a full-fledged sale, it convinced me that there was an opportunity in this industry. After I left Manugistics, I partnered with Simon Caufield, who had an extensive background in financial services. We did some research, talked to a number of different banks, and convinced ourselves that there was a major opportunity in helping banks better set and adjust rates for loans and deposits.
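Bob’s book lays out the machinery in detail, but the core of consumer-lending price optimization can be sketched in a few lines: the lower the offered rate, the more applicants accept, but the thinner the margin, so you pick the rate that maximizes expected profit. The logistic response curve and every parameter below are invented for illustration; this is emphatically not Nomis’s actual model.

```python
import math

COST_OF_FUNDS = 0.04  # bank's cost of funds (4%); invented for illustration

def p_accept(rate, a=12.0, b=150.0):
    """Hypothetical logistic price-response curve: take-up falls as rate rises."""
    return 1.0 / (1.0 + math.exp(b * rate - a))

def expected_profit(rate):
    # Expected profit per applicant = P(accept) x margin over cost of funds.
    return p_accept(rate) * (rate - COST_OF_FUNDS)

# Grid-search rates from 4.0% to 15.0% in 0.1% steps for the profit maximizer.
best = max((r / 1000 for r in range(40, 151)), key=expected_profit)
print(f"optimal rate: {best:.1%}, take-up: {p_accept(best):.0%}, "
      f"profit per applicant: {expected_profit(best):.4f}")
```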
