SAS author's tip: Planning versus forecasting and manual overrides


This week's SAS tip is from Applied Data Mining for Forecasting Using SAS by Tim Rey, Arthur Kordon, and Chip Wells. Whether you're a forecasting practitioner, engineer, statistician, or economist, you'll appreciate the many real-world examples in the book. And hopefully this free excerpt.

The following excerpt is from SAS Press authors Tim Rey, Arthur Kordon, and Chip Wells' book "Applied Data Mining for Forecasting Using SAS". Copyright © 2012, SAS Institute Inc., Cary, North Carolina, USA. ALL RIGHTS RESERVED. (Please note that results may vary depending on your version of SAS software.)

10.6 Planning Versus Forecasting and Manual Overrides

Manual overrides are adjustments made to statistical forecasts. Forecasts are usually adjusted to more closely align them with stakeholder group plans or goals over the lead forecast horizon. This topic has already received excellent coverage in several texts. The purpose of this section is to summarize how planning activities can impact the accuracy and usefulness of generated forecasts. A method for implementing manual overrides in a way that increases the likelihood of enhancing forecast usefulness is discussed.

Statistical forecasts are not usually implemented as provided in the supply chain or planning processes that they feed. Forecast users include C-level executives, non-statistician forecasters, production managers, planners, sales managers, and so on. Each of these people usually has an opinion about the accuracy of generated statistical forecasts and adjusts or ignores them accordingly. In many businesses, each of these people might be allowed to adjust (manually override) generated forecasts in a haphazard way.

Some of the ideas covered in reconciliation apply to this topic, too. The statistical forecast is an objective best guess, based on historical patterns, about what will happen in the future. Manually changing or overriding these forecasts has the potential to substantially degrade their accuracy and usefulness. However, this does not mean that forecasts should never be adjusted. Forecast users might have information about competitor activities or forthcoming restrictions on business processes that is not in the historical data. Under these circumstances, forecast accuracy can be enhanced by embedding the hypothesized effect of future events into lead forecasts.

Forecasting best practices include the systematic and supervised adjustment of generated forecasts (Chase 2010). Forecasts should only be adjusted based on reliable information that is not contained in the historical data. Forecasting worst practices include haphazard forecast adjustments from stakeholder groups that might have different and conflicting objectives, as well as small adjustments based on gut instincts (Gilliland 2010).

Some interesting studies provide evidence to confirm this categorization. Defining a small adjustment as around 5%, Fildes and Goodwin (2007) found that small manual overrides tend to diminish the accuracy of forecasts. (See also Armstrong 2001.) Manual adjustments in the range of 25% or more are likely to have the opposite effect. Fildes and Goodwin conclude that small adjustments tend to be based on factors like a stakeholder's need to appear to be contributing something to the planning process, not on evidence of future changes. Larger changes are more likely to be based on plausible evidence of future changes because an explanation for the override will likely be required by other members of the firm. This leads to another forecasting best practice: the requirement that all manual overrides be tracked and that reasons for changes to the baseline forecast be documented (Chase 2009).

Systematic overrides that are based on plausible evidence of future changes tend to enhance forecast accuracy and usefulness. Haphazard overrides that are based on other factors usually lead to worse forecasts. Chase (2009) provides a framework for conducting overrides in a systematic, supervised, and documented planning environment. His approach is called demand-driven forecasting, and his book is recommended if you are interested in this area of applied forecasting.
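The governance ideas above (require a documented reason for every override, and flag small adjustments that may reflect gut instinct rather than evidence) can be sketched in code. This is a minimal illustrative example, not the book's or Chase's actual framework; the record structure, function names, and review messages are all hypothetical, while the 5% and 25% thresholds come from the Fildes and Goodwin (2007) findings cited above.

```python
from dataclasses import dataclass

# Thresholds from the studies cited in the text (illustrative use only).
SMALL_ADJUSTMENT_PCT = 5.0   # small overrides tend to diminish accuracy
LARGE_ADJUSTMENT_PCT = 25.0  # large, evidence-backed overrides tend to help

@dataclass
class OverrideRecord:
    series_id: str
    baseline: float  # statistical forecast before the override
    adjusted: float  # value after the manual override
    reason: str      # documented justification (best practice: always required)

    @property
    def pct_change(self) -> float:
        # Size of the override as a percentage of the baseline forecast.
        return abs(self.adjusted - self.baseline) / abs(self.baseline) * 100

    def flag(self) -> str:
        # Route small overrides to supervisory review; accept larger ones
        # subject to verification of the documented evidence.
        if self.pct_change <= SMALL_ADJUSTMENT_PCT:
            return "review: small override, likely gut instinct"
        if self.pct_change >= LARGE_ADJUSTMENT_PCT:
            return "accept: large override, verify documented evidence"
        return "accept: mid-sized override"

override_log: list[OverrideRecord] = []

def record_override(series_id: str, baseline: float,
                    adjusted: float, reason: str) -> str:
    # Enforce the best practice that every override is tracked and explained.
    if not reason.strip():
        raise ValueError("every override must document a reason")
    record = OverrideRecord(series_id, baseline, adjusted, reason)
    override_log.append(record)
    return record.flag()

# A 3% bump with a thin rationale gets flagged for review.
print(record_override("SKU-1", 1000.0, 1030.0, "sales feels optimistic"))
# → review: small override, likely gut instinct
```

The point of the sketch is simply that tracking and thresholding make overrides auditable: every adjustment lands in a log with its size and rationale, so the haphazard small changes the studies warn about become visible instead of silently degrading the forecast.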


About the Author

Shelly Goodin

Social Media Specialist, SAS Publications

Shelly Goodin is SAS Publications' social media marketer and the editor of "SAS Publishing News". She’s worked in the publishing industry for over thirteen years, including seven years at SAS, and enjoys creating opportunities for fans of SAS and JMP software to get to know SAS Publications' many offerings and authors.

