Tumbling dice


Mean Absolute Percent Error (MAPE) is the most commonly used forecasting performance metric and, for good reason, also the most disparaged.

When we compute the absolute percent error the usual way, as

APE = | Forecast - Actual | / Actual

this is undefined when Actual = 0.  It can also lead to huge percent error values when Actual is very small compared to the size of the error. (So definitely don't use MAPE to evaluate forecasting performance for intermittent demand.)
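To make that concrete, here is a quick Python sketch (the forecast and actual values are made up purely for illustration):

    # The usual APE, |Forecast - Actual| / Actual, misbehaves at small actuals.
    forecast, actual = 10, 1
    print(abs(forecast - actual) / actual)   # 9.0, i.e. a 900% error

    forecast, actual = 10, 0
    # abs(forecast - actual) / actual        # ZeroDivisionError: undefined when Actual = 0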

Since the usual calculation of APE has these flaws (and at least one more big one, to be discovered below), a veritable cottage industry has evolved around finding alternatives.

Percentage Errors Can Ruin Your Day

In their article "Percentage Errors Can Ruin Your Day" in the latest (Fall 2011) issue of Foresight,  Stephan Kolassa and Roland Martin examine four proposed alternatives to the APE calculation:

  • APEf (APE with respect to forecast): | Forecast - Actual | / Forecast
  • sAPE (Symmetric APE): | Forecast - Actual | / ((Forecast + Actual)/2)
  • maxAPE (Max of Actual and Forecast): | Forecast - Actual | / max {Forecast, Actual}
  • tAPE (Truncated APE): min {APE, 1}
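
Written out as code, the standard APE and these four alternatives look roughly like this (a sketch of the formulas above; the function names are my own):

    def ape(forecast, actual):
        """Standard APE: |Forecast - Actual| / Actual (undefined when actual = 0)."""
        return abs(forecast - actual) / actual

    def ape_f(forecast, actual):
        """APEf: error measured relative to the forecast rather than the actual."""
        return abs(forecast - actual) / forecast

    def sape(forecast, actual):
        """sAPE: error relative to the average of forecast and actual."""
        return abs(forecast - actual) / ((forecast + actual) / 2)

    def max_ape(forecast, actual):
        """maxAPE: error relative to the larger of forecast and actual."""
        return abs(forecast - actual) / max(forecast, actual)

    def t_ape(forecast, actual):
        """tAPE: standard APE capped at 100%."""
        return min(ape(forecast, actual), 1)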

The authors suggest a simple dice-tossing experiment to illustrate the implications for forecasting bias in each of these metrics.  Suppose you take a standard six-sided die (assumed to be fair and balanced, not right or left leaning) and use each roll to simulate demand.  Since the possible outcomes are 1, 2, 3, 4, 5, or 6, we should be able to agree that 3.5 is the "best" forecast.  (Over a large number of rolls, the "average demand" will be very close to 3.5, with over- and under-forecasts roughly equal.)  Our forecast of 3.5 for each roll is unbiased -- it won't be chronically too high or too low.
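
Here is a small simulation along those lines (my own sketch, not code from the article), confirming that a constant forecast of 3.5 carries no chronic bias:

    import random

    random.seed(42)
    rolls = [random.randint(1, 6) for _ in range(100_000)]   # each roll = one period of demand

    forecast = 3.5
    mean_demand = sum(rolls) / len(rolls)
    mean_error = sum(forecast - actual for actual in rolls) / len(rolls)

    print(f"average demand: {mean_demand:.3f}")   # very close to 3.5
    print(f"average error:  {mean_error:+.3f}")   # very close to 0, so no chronic bias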

Carry out the tumbling die experiment, measure the forecast errors and bias, and you'll find that each flavor of APE can encourage a biased forecast.  For example, if our company uses the standard MAPE to evaluate forecasting performance, we would minimize our forecast errors by always forecasting 2.  In fact, always forecasting 1, 2, or 3 will, over the long haul, give us a MAPE less than the MAPE of always forecasting 3.5 or anything above that.
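
You don't even need to roll the die to verify this: since each face is equally likely, the long-run MAPE of a constant forecast is just its average APE over the six faces. A quick check (again my own sketch):

    def expected_mape(forecast):
        """Long-run MAPE of a constant forecast against a fair six-sided die."""
        return sum(abs(forecast - actual) / actual for actual in range(1, 7)) / 6

    for f in (1, 2, 3, 3.5, 4, 5, 6):
        print(f"forecast {f}: expected MAPE = {expected_mape(f):.3f}")

    # Forecasting 2 scores best (about 0.52), while the unbiased forecast of 3.5
    # scores about 0.71 -- worse than always forecasting 1, 2, or 3.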

So why should management care?  Because if the sole job objective of their forecasters is to minimize MAPE, and those forecasters are smart enough to do the math (or eschew the math and instead read clever articles in Foresight), the forecasters will purposely forecast too low -- perhaps leading to chronic inventory shortages and lousy customer service.


About Author

Mike Gilliland

Product Marketing Manager

Michael Gilliland is a longtime business forecasting practitioner and formerly a Product Marketing Manager for SAS Forecasting. He is on the Board of Directors of the International Institute of Forecasters, and is Associate Editor of their practitioner journal Foresight: The International Journal of Applied Forecasting. Mike is author of The Business Forecasting Deal (Wiley, 2010) and former editor of the free e-book Forecasting with SAS: Special Collection (SAS Press, 2020). He is principal editor of Business Forecasting: Practical Problems and Solutions (Wiley, 2015) and Business Forecasting: The Emerging Role of Artificial Intelligence and Machine Learning (Wiley, 2021). In 2017 Mike received the Institute of Business Forecasting's Lifetime Achievement Award. In 2021 his paper "FVA: A Reality Check on Forecasting Practices" was inducted into the Foresight Hall of Fame. Mike initiated The Business Forecasting Deal blog in 2009 to help expose the seamy underbelly of forecasting practice, and to provide practical solutions to its most vexing problems.
