Monday Morning Analytics

The word “analytics” seems to have a million different meanings. Merely appending it to just about anything confers an instant halo that hints at intelligence, smarts, and numeracy. Naturally, vendors of reporting, OLAP, and BI software have been quick to do just that.

In my experience, when I come across the word “analytics”, it typically means data summaries of various stripes. These summaries may be presented in mind-numbingly dense reports; they may let users drill down into great detail, pivot back and forth, and so on. But at the end of the day, they are just summaries.

They are clearly necessary but far from sufficient. While they can point to where problems or opportunities may lie, they don’t usually indicate what to do next, what action to take.

I meet business decision-makers regularly as part of my work, and there is immense frustration at the lack of analytics that are actionable or prescriptive. In the course of a typical workday, the typical manager reads through numerous management reports chock-full of data. But very rarely can they immediately determine what actions to take in response to the numbers they see.

Fortunately, there are exceptions to this dismal state of affairs. In a growing number of business problems, analytics have been developed to recommend the best action to take. These analytics don’t just provide insights; they recommend actions and suggest decisions for the decision-maker to consider. In other words, they offer specific advice on what to do on Monday morning.

These Monday Morning Analytics will be a key theme of this blog.

Monday Morning Analytics very often involve analyzing data with models and algorithms drawn from math, statistics, econometrics, machine learning, or optimization. These models attempt to capture the essential aspects of the business situation in such a way that the effects of future actions can be predicted. Armed with this predictive capability, the decision-maker can pick the action or decision that has the best chance of meeting his objectives, i.e., he can optimize.
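To make that loop concrete, here is a minimal sketch in Python of the predict-then-optimize pattern, using an invented pricing example. The data, the linear demand model, and all names below are illustrative assumptions, not a real case:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical history: prices charged and units sold at each price.
prices = np.array([[9.0], [10.0], [11.0], [12.0], [13.0]])
units = np.array([520.0, 480.0, 430.0, 370.0, 300.0])

# Predict: fit a simple demand model to the historical data.
demand_model = LinearRegression().fit(prices, units)

# Optimize: score the candidate actions and recommend the best one.
candidate_prices = np.arange(8.0, 15.01, 0.25).reshape(-1, 1)
predicted_revenue = candidate_prices.ravel() * demand_model.predict(candidate_prices)
best_price = candidate_prices[predicted_revenue.argmax(), 0]

print(f"Recommended action: set price to ${best_price:.2f}")
```

The toy model is beside the point; what matters is the shape of the workflow. The output is a recommended action, not another summary.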

A slice-and-dice analysis of sales data that reveals the Western region is underperforming is food for thought, but it is not immediately actionable. An analysis of the same data that uses a model to rigorously identify the drivers of regional performance, and then sheds light on which specific factors are behind the Western region’s results, would be actionable, and would thus fall into the category of Monday Morning Analytics.
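As a sketch of what such a driver analysis might look like (the numbers and driver names here are invented purely for illustration), even a simple regression can move the analysis from “which region is behind” to “which levers explain it”:

```python
import pandas as pd
import statsmodels.api as sm

# Invented regional data; the columns are illustrative candidate drivers.
regions = pd.DataFrame({
    "sales":       [120.0, 95.0, 130.0, 80.0, 110.0, 70.0],
    "rep_count":   [10, 8, 11, 6, 9, 5],
    "promo_spend": [20.0, 15.0, 22.0, 9.0, 18.0, 8.0],
})

# Fit a linear model of sales against the candidate drivers.
X = sm.add_constant(regions[["rep_count", "promo_spend"]])
fit = sm.OLS(regions["sales"], X).fit()

# The coefficients suggest which levers move sales; comparing a lagging
# region's inputs against them turns "underperforming" into something
# specific, such as "understaffed" or "under-promoted".
print(fit.params)
```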

This blog will pay particular attention to Monday Morning Analytics: situations where data is analyzed with thoughtful (perhaps even complex) models and algorithms, and the result is a significantly better understanding of what’s really going on in our complex world, and therefore a higher chance of making good decisions. In short, what to do on Monday morning.

4 thoughts on “Monday Morning Analytics”

  1. Hi Rama,

    Very interesting blog, especially “The Analytic Edge Entrepreneur Series” interview with Dr. Robert Philips.

    Keep it rolling

    Thanks
    Cheers
    Kris

  2. Very thought-provoking article and follow-up discussion. It is encouraging to see businesses wanting reliable prescriptive answers based on analytics. Looking forward to reading more on such topics in the future.

  3. David,
    Thanks for your comments. You raise many excellent points.

    “Users typically either ignore black box output or use it selectively to support their preferences or biases”

    Very true. The reason, of course, is a lack of trust that the model is an accurate representation of the underlying business situation. Building this trust isn’t easy and requires time and effort from both the model-builder and the model-user. The builder can “cut windows” into the black box so that the user can get an intuitive sense of what’s going on. The user can ask the model questions to which she already knows the answers, as a way to validate it. All of these can and do help, but they require time.

    “I have noticed that users often care more about analyzing the input data to complicated models than looking at the output. In the end they revert back to looking at data! It feels like watching a child unwrap a gift and play with the box rather than the toy.”

    Nice analogy! I feel it again goes back to that lack of trust in the model. On the one hand, they have a model that they don’t quite buy into; on the other, they have data that they are comfortable with and find useful (in fact, the modeling exercise may have assembled the right data in one place for the first time). Which way will they turn?

    “The act of constructing a model can be the catalyst for thinking rationally and using the right data.”

    Couldn’t agree more. Unfortunately, doing so is time-consuming, and which manager today has the time? I feel this is one of the greatest challenges for the model-building community: how do we build models that managers can quickly learn to trust and use?

    “There must be some overriding principles for good model-based decision support system design.”

    There is some very interesting academic work on this going back to the 70s. John Little’s paper on decision calculus discusses a number of these issues and possible solutions. Note: the link to the paper above points to the download page; you need to be a member of INFORMS to get full-text access. I found an old pre-publication version here.
    Thanks,
    Rama

  4. Rama,

    Excellent post. I appreciate your characterization of models and algorithms as tools to *help* the decision-maker optimize. A lot of forecasting and optimization applications are built as black boxes that spit out “the” number. Users typically either ignore black box output or use it selectively to support their preferences or biases (e.g., risk models for mortgage-backed securities!).

    I have noticed that users often care more about analyzing the input data to complicated models than looking at the output. In the end they revert back to looking at data! It feels like watching a child unwrap a gift and play with the box rather than the toy.

    Yet, without going through the process of developing a rigorous model, they would not or could not have analyzed the data effectively. The act of constructing a model can be the catalyst for thinking rationally and using the right data.

    “What-if-alyzers” that enable users to quickly run model scenarios, with control over parameters, assumptions, etc., can greatly increase the adoption of prescriptive software. It is okay to give an optimized solution as well, but it is usually not sufficient and often not necessary.

    There must be some overriding principles for good model-based decision support system design.

    It would be great to hear your perspective.

    David Glenn, PhD
    Director of Science Services, SignalDemand
