High on complexity


Paul Goodwin's Hot New Research column is a must-read in each issue of Foresight. The current column, "High on Complexity, Low on Evidence: Are Advanced Forecasting Methods Always as Good as They Seem?" ends with this sage advice:

If the name of a method contains more words than the number of observations that were used to test it, then it's wise to put any plans to adopt the method on hold.

In professional forecasting circles, it has long been recognized that fancier methods are no guarantee of better forecasts (even though fancier methods can often provide a better fit to history!).  As Goodwin notes, "...an improved fit to past data may be associated with poorer forecasts because the method is falsely seeing systematic patterns in the random movements in past data, and assuming that these will continue into the future."

The main point of Goodwin's column is that forecasting researchers -- those publishing articles touting advanced new methods -- should provide reliable evidence that the methods they advocate will actually work. The accuracy of the new method should be compared to the accuracy of appropriate benchmarks (typically simpler methods), to see if the extra complexity of the new method is justified.
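The benchmarking idea can be sketched in a few lines. The simulation below is purely illustrative (simulated data and made-up methods, not anything from Goodwin's column): it pits a slightly "fancier" linear-trend extrapolation against the naive last-value benchmark on a random walk, a series with no exploitable pattern, using a rolling-origin holdout.

```python
import random

# Illustrative sketch only: compare a "fancier" method against a simple
# benchmark on out-of-sample data. Series and methods are hypothetical.
random.seed(42)

# Simulate a random walk -- a series with no systematic pattern to find.
series = [100.0]
for _ in range(99):
    series.append(series[-1] + random.gauss(0, 1))

def naive(history):
    """Benchmark: forecast = last observed value."""
    return history[-1]

def trend(history, k=5):
    """'Fancier' method: extrapolate a least-squares line through the last k points."""
    ys = history[-k:]
    xs = list(range(k))
    xbar, ybar = sum(xs) / k, sum(ys) / k
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
            sum((x - xbar) ** 2 for x in xs)
    return ys[-1] + slope  # one step ahead along the fitted line

# Hold out the last 30 points; forecast each one step ahead from prior history.
holdout_start = 70
errs_naive, errs_trend = [], []
for t in range(holdout_start, len(series)):
    history = series[:t]
    errs_naive.append(abs(series[t] - naive(history)))
    errs_trend.append(abs(series[t] - trend(history)))

mae_naive = sum(errs_naive) / len(errs_naive)
mae_trend = sum(errs_trend) / len(errs_trend)
print(f"MAE naive: {mae_naive:.3f}  MAE trend: {mae_trend:.3f}")
```

On a pattern-free series like this, the trend method's "improvement" over the naive benchmark is exactly the kind of claim that should be checked on a holdout rather than on the fit to history.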

Goodwin cites Len Tashman's 2000 article ("Out-of-sample tests of forecast accuracy: An analysis and review," International Journal of Forecasting, 16, 437-450), arguing that forecasting methods need to be tested on a sufficient number of out-of-sample observations. He points to several recently published examples where "grandiose claims of forecasting performance rest on scant evidence," where "big conclusions" are drawn from "small amounts of data."

New forecasting ideas are great -- the bigger and crazier the better. But any new forecasting method should be evaluated (just as we would evaluate a new drug or therapy) starting with a null hypothesis:

H0: The new method has no effect on forecasting performance

Without sufficient evidence to reject the null hypothesis, implementing the method might just be wasting our time.
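This drug-trial framing can be made concrete with an exact sign test (one simple choice among many; the numbers below are invented for illustration): under H0, the new method beats the benchmark on any holdout point with probability 0.5, so the count of "wins" follows a binomial distribution.

```python
import math

# Hypothetical sketch: test H0 ("the new method is no better") with an
# exact one-sided sign test on holdout wins. Numbers are made up.

def sign_test_p(wins, n):
    """Exact one-sided binomial p-value for observing >= wins successes
    out of n trials under H0: win probability = 0.5."""
    return sum(math.comb(n, k) for k in range(wins, n + 1)) / 2 ** n

# Suppose the new method beat the benchmark on 4 of 6 holdout observations.
p = sign_test_p(4, 6)
print(f"p-value: {p:.3f}")  # 0.344 -- far above 0.05, so H0 stands
```

Note that with only six holdout observations, even a perfect 6-for-6 record gives p = 1/64 (about 0.016) -- close to the limit of what such a tiny test set can ever demonstrate, which is exactly Goodwin's complaint about big conclusions drawn from small amounts of data.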



About Author

Mike Gilliland

Product Marketing Manager

Michael Gilliland is author of The Business Forecasting Deal (the book), editor of Business Forecasting: Practical Problems and Solutions, and Associate Editor of Foresight: The International Journal of Applied Forecasting. He is a longtime business forecasting practitioner, and currently Product Marketing Manager for SAS Forecasting software. Mike serves on the Board of Directors of the International Institute of Forecasters, and received the 2017 Lifetime Achievement award from the Institute of Business Forecasting. He initiated The Business Forecasting Deal (the blog) to help expose the seamy underbelly of forecasting practice, and to provide practical solutions to its most vexing problems.
