Why do we rely on judgment when analytics outperforms it?


Wherever there is uncertainty there has got to be judgment, and wherever there is judgment there is an opportunity for human fallibility.

Donald Redelmeier, physician-researcher

Recently, I read a fascinating book titled The Undoing Project: A Friendship That Changed Our Minds by Michael Lewis (W.W. Norton & Company, 2017). Lewis also authored Moneyball: The Art of Winning an Unfair Game (2003). The Undoing Project is about two psychologists, Daniel Kahneman, a behavioral psychologist, and Amos Tversky, a mathematical psychologist, who conducted a series of breakthrough studies that led to the field of behavioral economics.

The book validated my years of practical experience with analytics-driven forecasting, which has outperformed judgment on a regular basis. What’s amazing is that Kahneman and Tversky’s research was conducted in the late 1960s and early 1970s. Unfortunately, not many people are aware of the significance of their research into why human judgment is flawed, and why mathematics tends to outperform human judgment close to 100 percent of the time.

The key takeaways of their findings are:

  • Predictions are made based on data samples that are too small.
  • Data that seem to have patterns are often, in fact, random, but are nevertheless used to make predictions.
  • Mathematical models built on an expert’s own stated cues tend to outperform that same expert’s judgmental predictions.

Kahneman and Tversky used data and analytics to find the true patterns of human behavior to replace the false ones that govern our lives. Similarly, demand analysts and planners need to find the true patterns of human judgmental overrides to replace the false ones that govern our demand forecasts and plans.

The challenge raised by Kahneman and Tversky is that people don’t want their “gut feelings” pinned down or restricted. Decision makers often don’t believe the numbers: the numbers seem too weak, they don’t communicate what the decision makers are seeking, or the story isn’t strong enough. Sound familiar?

Instead, decision makers use mental rules of thumb, called heuristics, to cope with uncertainty, an approach that often works poorly. Misjudging the problem is simply the availability heuristic in action, a form of systematic bias. The brain appears to be programmed to provide as much certainty as possible about a given situation, even in the face of great uncertainty.

In other words, we don’t choose between things; we choose between descriptions of things. The decisions we make are driven by the way the choices are presented. Many experts seem to think the cues they use to make decisions (future predictions, or demand forecasts) are too elusive to model statistically. That is not always true.

Flawed decision making

This book exposes the flaws in theories of human behavior regarding how we make decisions and use judgment to predict uncertainties (future forecasts). Experiments conducted by Kahneman and Tversky dramatized the weakness of human statistical intuitions. They called the force that leads human judgment astray representativeness: the similarity between whatever we’re judging and some model we have in our minds of that thing or past event.

This is further illustrated in the book with the example of “The Man versus Model of Man" (Lewis R. Goldberg, 1976). Goldberg built a simple model based on the factors doctors said they use to identify ulcers in patients. The model beat those same doctors at accurately predicting ulcers from simple X-rays; it even beat the best doctor. In practice, the doctors did not abide by their own idea of how best to diagnose an ulcer, and did not follow their own characteristic cues or definitions.
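The “model of man” effect is easy to reproduce in a toy simulation. The sketch below uses hypothetical numbers, not Goldberg’s actual data: an “expert” states the cue weights they claim to use but applies them inconsistently, while a model applies those same stated weights every time. The model ends up correlating better with the true outcome than the expert does.

```python
import random

random.seed(1)

# Hypothetical cue weights the expert *says* they use (not Goldberg's data)
STATED_WEIGHTS = [0.5, 0.3, 0.2]

def pearson(xs, ys):
    """Pearson correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

outcomes, expert, model = [], [], []
for _ in range(2000):
    cues = [random.gauss(0, 1) for _ in STATED_WEIGHTS]
    signal = sum(w * c for w, c in zip(STATED_WEIGHTS, cues))
    outcomes.append(signal + random.gauss(0, 0.5))  # true outcome is noisy
    expert.append(signal + random.gauss(0, 0.7))    # judgment drifts off the stated cues
    model.append(signal)                            # model applies the cues consistently

print("expert vs outcome:", round(pearson(expert, outcomes), 3))
print("model  vs outcome:", round(pearson(model, outcomes), 3))
```

The only difference between the expert and the model is consistency; that alone is enough for the model to win.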

What is actually happening when we use judgment to predict uncertain situations?    

When faced with a problem that has a statistically correct answer, people don’t think like statisticians. We do not use statistical reasoning even when the problem could be solved with statistical models. Instead, we use subjective probability: the odds you assign to any given situation when you are more or less guessing.

Kahneman and Tversky stress that the decisions we make, the conclusions we reach, and the explanations we offer are usually based on our judgments of the likelihood of uncertain events, such as the success of a project, the outcome of an election, or the state of a market. People replace the laws of chance with rules of thumb, or heuristics. When we make judgments, they argued, we compare whatever we’re judging to some model in our minds, not to a statistical model. As a result, we don’t always follow a systematic approach when making judgment-based predictions, and in many cases our emotions cloud our judgment.

Heuristics and judgmental bias are in many cases the key cause of inaccurate projections. According to Kahneman and Tversky’s research, there are four ways in which we make judgments when we don’t know the answer for sure: availability, representativeness, anchoring, and simulation. We make decisions and predictions based on what seems most representative of our mental model, or rule of thumb. We tend to take data at face value without inspecting the data more closely. Consider this quote from the book:

Man's inability to see the power of regression to the mean leaves him blind to the nature of the world around him.
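Regression to the mean is easy to demonstrate with a small simulation (hypothetical numbers): give each “forecaster” a fixed skill plus period-to-period luck, rank everyone on one period, and the top performers’ average falls back toward the population mean in the next period.

```python
import random

random.seed(7)

# Each forecaster has a fixed skill; each period adds independent luck.
n = 10_000
skill = [random.gauss(100, 10) for _ in range(n)]
period1 = [s + random.gauss(0, 10) for s in skill]
period2 = [s + random.gauss(0, 10) for s in skill]

# Take the top 10% of performers from period 1 ...
top = sorted(range(n), key=lambda i: period1[i], reverse=True)[: n // 10]

# ... and watch their average slide back toward the mean in period 2.
avg1 = sum(period1[i] for i in top) / len(top)
avg2 = sum(period2[i] for i in top) / len(top)
print(f"top decile, period 1:  {avg1:.1f}")
print(f"same people, period 2: {avg2:.1f}")
```

The period-1 leaders were partly skilled and partly lucky; only the skill carries over, which is exactly why last period’s stars look like they “declined.”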

Emotions tend to play a key role in judgmental projections (future forecasts)

In many cases, people don’t understand that random sequences sometimes appear to have patterns. We have an incredible ability to see meaning in patterns where none exists. In the book, Michael Lewis mentions the NBA player with a “hot hand” as an example: one night a player makes several consecutive baskets, and everyone starts saying he’s the hot shooter, so feed him the ball, when in fact that player is no hotter than in the last game. Kahneman and Tversky call this the rationale of “false patterns”, which leads to “selective matching”.
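Streaks like this fall straight out of chance. A short simulation (assuming a 50 percent shooter taking 100 shots per game, purely illustrative numbers) shows how often a completely random game contains a “hot” streak of five or more makes in a row:

```python
import random

random.seed(11)

def longest_make_streak(shots):
    """Length of the longest run of consecutive made shots."""
    best = run = 0
    for made in shots:
        run = run + 1 if made else 0
        best = max(best, run)
    return best

# Simulate 1,000 games of 100 independent 50/50 shots each.
games = 1000
hot = sum(
    longest_make_streak([random.random() < 0.5 for _ in range(100)]) >= 5
    for _ in range(games)
)
print(f"games with a 5+ make streak: {hot / games:.0%}")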

Another example is the old adage that there is a correlation between arthritis and the weather. Someone occasionally has a very bad arthritis attack on a day with severe weather, and then selectively matches arthritis with severe weather conditions, when in fact it has been shown statistically that there is no correlation between sudden arthritis attacks and severe weather. These are the pitfalls of the human mind when it is required to render judgments under conditions of uncertainty.

Kahneman and Tversky showed through the collection of data and analytics that when people form judgments, they don’t just make random mistakes; their errors are systematic. Most of us have a stereotype of "randomness" that differs from true randomness: it lacks the clusters and patterns that occur in truly random sequences. This happens more often with small samples. The smaller the sample size, the more likely it is to be unrepresentative of the wider population. As a result, we do not follow the correct rule when left to our own devices.
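The small-sample point can be verified in a few lines: draw repeated samples from one population (illustrative numbers below) and the average of a five-observation sample swings far more than the average of a hundred-observation sample.

```python
import random

random.seed(3)

population_mean, population_sd = 50, 10

def spread_of_sample_means(n, trials=2000):
    """Standard deviation of the sample mean across repeated samples of size n."""
    means = [
        sum(random.gauss(population_mean, population_sd) for _ in range(n)) / n
        for _ in range(trials)
    ]
    grand = sum(means) / trials
    return (sum((m - grand) ** 2 for m in means) / trials) ** 0.5

print(f"spread of means, n=5:   {spread_of_sample_means(5):.2f}")
print(f"spread of means, n=100: {spread_of_sample_means(100):.2f}")
```

A forecast anchored on five observations can easily land far from the truth even when nothing has changed; the same judgment made on a hundred observations rarely does.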

Memory also plays a key role in making judgments and decisions. Scenarios that can be easily recalled, because they are more available to our memories, appear more probable. Any fact or incident that happens to preoccupy a person is likely to be recalled with ease, and so be disproportionately weighted in any judgment.

People tend to use similarity judgments to make decisions even in the face of additional information that contradicts the similarity. We tend to ignore the objective data and go with our gut feeling. The very factors that cause us to become more confident in our predictions also lead those predictions to be less accurate.

This book, The Undoing Project by Michael Lewis, should be required reading for all demand analysts, planners, and data scientists, as well as executives. Consider this parting quote from the book:

  The handwriting might have been on the wall all along. The question is: was the ink visible?

And here's a parting question from me: When are we going to undo judgmental overrides to demand forecasts, and start using more data and analytics?


About the Author

Charlie Chase

Executive Industry Consultant/Trusted Advisor, SAS Retail/CPG Global Practice

Charles Chase is the executive industry consultant and trusted advisor for the SAS Retail/CPG global practice. He is the author of Next Generation Demand Management: People, Process, Analytics and Technology, author of Demand-Driven Forecasting: A Structured Approach to Forecasting, and co-author of Bricks Matter: The Role of Supply Chains in Building Market-Driven Differentiation, as well as over 50 articles in several business journals on demand forecasting and planning, supply chain management, and market response modeling. His latest book is Consumption-Based Forecasting and Planning: Predicting Changing Demand Patterns in the New Digital Economy. To learn more, please see his Author page.


