First, do no harm


Did Hippocrates really say this? Probably not, since among other reasons he spoke Ancient Greek, not Modern English. However, such a mere technicality should not distract us from the importance of this oath for forecasters. Please place your hand over your heart and say with me:

First, do no harm.

There is a certain level of satisfaction we achieve with touching and tinkering and changing things to our liking. But does this touching and tinkering and changing improve the quality of our work? And how would we ever know?

In forecasting, we have the opportunity to change things to our liking by manually overriding our system-generated forecasts. Many organizations have some manner of forecasting system. These can range from lowly trend-line functions in Excel all the way up to large-scale automatic forecasting with inputs and calendar events in SAS Forecast Server. Whatever forecast the software comes up with, we reserve the right to review it and change the number to something that meets our management's approval. But is this the right thing to do?

Forecast Value Added is a metric for evaluating this kind of forecasting process activity. FVA is defined as the change in a forecasting performance metric (MAPE, bias, or whatever you happen to be using) that can be attributed to a particular step or participant in the forecasting process. When FVA is positive, your efforts are adding value by making the forecast better. But when FVA is negative, your efforts are just making the forecast worse. By itself, a traditional forecasting performance metric (such as MAPE) does not tell you whether you are making things better or worse – you'd never know. The painful reality is that much of what we do in forecasting just makes it worse – in complete violation of our Hippocratic oath. (See my quarterly column "Worst Practices in Business Forecasting" in Supply Chain Forecasting Digest for examples of these missteps.)
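To make the definition concrete, here is a minimal sketch of an FVA calculation in Python. All the numbers are made up for illustration, and the three-step process (naive forecast, statistical forecast, analyst override) is just one common configuration: FVA for each step is simply the improvement in MAPE over the step before it.

```python
# Hypothetical FVA calculation: compare the MAPE of each process step
# against the step that preceded it. All data below is invented.

def mape(actuals, forecasts):
    """Mean absolute percent error, in percentage points."""
    return 100 * sum(abs(a - f) / a for a, f in zip(actuals, forecasts)) / len(actuals)

actuals  = [100, 120, 90, 110]
naive    = [95, 100, 120, 90]    # e.g. last period's actual
system   = [102, 115, 95, 105]   # software-generated statistical forecast
override = [110, 130, 85, 100]   # analyst-adjusted final forecast

mape_naive    = mape(actuals, naive)
mape_system   = mape(actuals, system)
mape_override = mape(actuals, override)

# FVA of a step = reduction in MAPE versus the prior step
fva_system   = mape_naive - mape_system     # value added by the software
fva_override = mape_system - mape_override  # value added by the manual override

print(f"System FVA:   {fva_system:+.1f} points of MAPE")
print(f"Override FVA: {fva_override:+.1f} points of MAPE")
```

With these invented numbers the statistical forecast adds value over the naive forecast, while the manual override makes things worse – exactly the "negative FVA" situation described above.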

For more details on FVA analysis and some short case studies on organizations that have applied it, see the recorded webcast Forecast Value Added Analysis: Step-by-Step and the accompanying white paper.

I am committed in this blog to being critical, but also good humored. For a good laugh (if it weren't so sad), please take a look at this minute-long demo: Base sales forecasts and trend lines on data. Warning: as you watch, keep in mind the children's game "What's Wrong with This Picture?" I've already compiled my own list, and am looking forward to hearing yours…


About Author

Mike Gilliland

Product Marketing Manager

Michael Gilliland is author of The Business Forecasting Deal (the book), and editor of Business Forecasting: Practical Problems and Solutions. He is a longtime business forecasting practitioner, and currently Product Marketing Manager for SAS Forecasting software. Mike serves on the Board of Directors of the International Institute of Forecasters, and received the 2017 Lifetime Achievement in Business Forecasting award from the Institute of Business Forecasting. He initiated The Business Forecasting Deal (the blog) to help expose the seamy underbelly of forecasting practice, and to provide practical solutions to its most vexing problems.

5 Comments

  1. Steve Siewert on

    I am trying to get across to my leadership the harm in attempting to do FVA on only one month of the demand plan. Our demand plans are typically 24 months long with monthly increments. What they are attempting to do is evaluate the effectiveness of a demand planner by looking at the system-generated statistical forecast and then comparing it to whatever the statistical forecast became with demand planner modifications to the algorithm parameters. This is done only by looking at the next month of the demand plan. Do you have any white papers or comments on how long the study should be?

  2. Michael Gilliland on

    Hi Steve -- As you are trying to communicate to your leadership, with just one point (or even a small number of points) any FVA results could be due entirely to chance. So you definitely wouldn't want to draw conclusions about "value added" by your forecasters with so little data. Donald Wheeler's short book "Understanding Variation" has a lot of good examples and guidance for management analysis and reporting, and I highly recommend it. I also have a white paper "Forecast Value Added Analysis: Step-by-Step" that goes into more detail on FVA data collection and reporting. This is available on the SAS Forecasting website: http://www.sas.com/technologies/analytics/forecasting/index.html#section=6.
    I don't have a hard rule about the number of observations necessary, but what you are trying to do is determine whether you can reject the null hypothesis that FVA = 0 (i.e., the forecaster override has no impact). For example, in one month the FVA could be positive or negative entirely by chance. If FVA is positive for 3 or even 6 straight months you may start to feel that value is being added, but I still wouldn't draw any firm conclusions, since it is still possible that this occurred by chance. However, if you track data for a year, and your FVA is positive for 11 or 12 of those months, that is much stronger evidence that the forecasting is making a positive difference. Wheeler's book can help you come up with good rules to use for your data.
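    To illustrate the "could be chance" point, here is a small sign-test sketch (the scenarios are hypothetical, not from any real FVA data): if overrides truly add no value, each month is as likely to show positive FVA as negative, like a fair coin flip, so we can compute how surprising a run of positive months would be.

    ```python
    # Hypothetical sign test: probability of seeing at least this many
    # positive-FVA months if the null hypothesis FVA = 0 is true (p = 0.5).
    from math import comb

    def sign_test_pvalue(positive_months, total_months):
        """P(at least `positive_months` successes) under a fair coin."""
        return sum(comb(total_months, k)
                   for k in range(positive_months, total_months + 1)) / 2 ** total_months

    # 3 positive months out of 3: could easily be chance (p = 0.125)
    print(sign_test_pvalue(3, 3))

    # 11 positive months out of 12: much stronger evidence (p ~ 0.003)
    print(sign_test_pvalue(11, 12))
    ```

    This matches the intuition above: a few good months prove little, while 11 of 12 would be very unlikely under pure chance.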

  3. Steve Siewert on

    Good Morning Mike,
    First of all, thanks for the quick reply. I am meeting with leadership again at 10:30 this morning, and I believe your response provides the last piece that I needed. It's good to know there is a forum like this that people have as a resource.
    Again thanks,
    Steve

  4. Pingback: Forecast quality in the supply chain (Part 2) - The Business Forecasting Deal

  5. Pingback: 5 steps to setting forecasting performance objectives (Part 2) - The Business Forecasting Deal

