Why forecasts are wrong

This week brought big news of one of the most cruel and heartless tyrants of the 21st century.  This man is known for narcissistic behavior, surrounding himself with a cadre of beautiful women, sleeping in a different place every night, picking new favorites each week, and bringing tears and untold suffering to those who didn’t receive his approval. Of course, I’m referring to Simon Cowell (the man perhaps best known for his smug demeanor, oversized ego, and undersized t-shirts).

The big news was that Simon allowed Melanie Amaro, an 18-year-old student from Sunrise, Florida, to become one of the Top 17 X Factor USA contestants for the Live Shows next Tuesday.

Why Forecasts Are Wrong

Forecasts never seem to be as accurate as we would like them to be – or need them to be. As a result, we are tempted to throw money at the problem in hopes of making it go away. While there are plenty of consultants and software vendors willing to take that money in exchange for lots of promises, are these promises ever fulfilled?

How many organizations are you aware of – perhaps even your own – that have thrown thousands or even millions of dollars at the forecasting problem, only to end up with the same lousy forecasts?

The question boils down to: Why do forecasts always seem to be so wrong…and sometimes so terribly wrong?

The Four Reasons

There are at least four reasons why our forecasts are not as accurate as we would like them to be.

• Unsuitable software – software that lacks the necessary capabilities, contains mathematical errors, or uses inappropriate methods. It is also possible that the software is perfectly sound but is misused by untrained or inexperienced forecasters.

• Forecaster behavior – untrained, unskilled, or inexperienced forecasters can exhibit behaviors that degrade forecast accuracy. One example is over-adjustment, or as W. Edwards Deming put it, “fiddling” with the process. This happens when a forecaster repeatedly adjusts the forecast based on each new piece of information. Research suggests that much of this fiddling yields no improvement in forecast accuracy and is simply wasted effort.*

• Process contamination – forecasting should be a dispassionate and scientific exercise that seeks a “best guess” at what is really going to happen in the future. Too often, however, the process is contaminated by the biases, personal agendas, and ill intentions of forecasting participants. Instead of an unbiased best guess at what is going to happen, the forecast comes to represent what management wants to see happen – no matter what the marketplace is saying.

• Unachievable accuracy – finally, the desired level of accuracy may simply be unattainable for the behavior being forecast. Consider calling heads or tails in the tossing of a fair coin. It doesn’t matter whether we want to achieve 60, 70, or 90 percent accuracy: over a large number of tosses we will be right only half the time, and nothing can change that. The nature of the behavior determines how well we can forecast it – and this applies to demand for products and services just as it does to tossing coins.
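The coin-toss point above is easy to verify with a quick simulation (a minimal sketch, not from the original post; the function and strategy names are my own). No calling strategy, however clever, beats 50 percent accuracy on a fair coin:

```python
import random

def call_accuracy(strategy, n=100_000, seed=42):
    """Fraction of correct calls over n tosses of a fair coin."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        toss = rng.random() < 0.5      # True = heads, False = tails
        hits += (strategy() == toss)   # count a hit when the call matches
    return hits / n

# Two calling "strategies": always call heads, or guess at random.
always_heads = lambda: True
guess_rng = random.Random(7)
random_guess = lambda: guess_rng.random() < 0.5

for name, strat in [("always heads", always_heads),
                    ("random guess", random_guess)]:
    print(f"{name}: {call_accuracy(strat):.3f}")
```

Both strategies hover near 0.500 – the nature of the behavior caps the achievable accuracy, no matter how much we spend on software or consultants.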

The next series of The BFD posts will explore these reasons why forecasting is so often so poorly done.

----------

*See "Good and Bad Judgment in Forecasting: Lessons from Four Companies" by Fildes and Goodwin, published in Foresight: The International Journal of Applied Forecasting, Issue 8, Fall 2007, pp. 5-10. A more detailed and "academic" version of this research was published (along with several useful comments and replies) as "Effective forecasting and judgmental adjustments: an empirical evaluation and strategies for improvement in supply-chain planning" by Fildes, Goodwin, Lawrence, and Nikolopoulos, International Journal of Forecasting 25 (2009), pp. 3-23.

2 Comments

  1. Chris
    Posted October 22, 2011 at 5:36 am

    Are there a couple of other factors at play here, i.e. lack of domain knowledge, and the impact of 'unknown unknowns'? Taking the latter as an example, nobody predicted the depth of the recession, and that will have impacted on forecasts in all sorts of commercial organisations.

  2. Posted November 18, 2011 at 2:59 pm

    Check Chapter 6 of "Interpreting Economic and Social Data – A Foundation of Descriptive Statistics", Springer, 2009. It covers some important facts overlooked by nearly all forecasters.


  • About this blog

    Michael Gilliland is a longtime business forecasting practitioner and currently Product Marketing Manager for SAS Forecasting. He initiated The Business Forecasting Deal to help expose the seamy underbelly of the forecasting practice, and to provide practical solutions to its most vexing problems.