Reminder: September 30 deadline for IIF/SAS Research Grants

I wanted to pass along this reminder from Pam Stroud at the International Institute of Forecasters:

Grant to Promote Research on Forecasting

Pam Stroud

For the twelfth year, the IIF, in collaboration with SAS®, is proud to announce financial support for research on how to improve forecasting methods and business forecasting practice. This year's award will be two (2) $5,000 grants. The application deadline for the 2014-2015 grant year is September 30, 2014. For more information, visit the IIF website.

Industry forecasting practitioners should also consider applying for these grants. As a practitioner, you have access to a wealth of your own company's data -- the kind of data that academics have difficulty obtaining. Analyses of volatility, FVA, and the real-life performance of particular models, forecasting systems, and forecasting processes would all be of interest.


Forecasting and supply chain blogs

In the summer heat, when The BFD alone isn't quite quenching your thirst for forecasting know-how, here are several other sources:


Steve Morlidge

CatchBlog -- by Steve Morlidge of CatchBull

From his 2010 book Future Ready (co-authored with Steve Player) to his recent four-part series in Foresight on the "avoidability" of forecast error, Steve is delivering cutting-edge content in the area of forecast performance management. He also contributed a Foresight/SAS Webinar available for on-demand review.

Eric Stellwagen

The Forecast Pro -- by Eric Stellwagen of Business Forecast Systems

Eric delivers great practical tutorials on various forecasting topics through his blog, as well as in conference presentations and articles (including two chapters in the Foresight Forecasting Methods Tutorials compilation).

IBF Blog -- various industry contributors

A good source for previews of upcoming Institute of Business Forecasting events, contributed by conference speakers, along with event recaps by conference attendees and previews of the Journal of Business Forecasting.

Shaun Snapp

SCM Focus Blogs -- by Shaun Snapp of SCM Focus

Shaun is a prolific writer of blogs (he maintains several on his website) and books (most recently Rethinking Enterprise Software Risk), and provides a sharp critical eye on IT, ERP, supply chain, and forecasting issues.

Lora Cecere

Supply Chain Shaman -- by Lora Cecere of Supply Chain Insights

Lora is a longtime industry analyst and, like Shaun, a prolific writer covering a wide range of topics. Her recent ebook The Supply Chain Shaman's Journal: A Focused Look at Demand includes some nice mentions of FVA.





IIF/SAS grants to support research on forecasting

Again this year (for the 12th time), SAS Research & Development has funded two $5,000 research grants, to be awarded by the International Institute of Forecasters.

  • Criteria for award of the grant will include likely impact on forecasting methods and business applications.
  • Consideration will be given to new researchers in the field and whether supplementary funding is possible.
  • The application must include: Project description, letter of support, C.V., and budget for the project.

Applications should be submitted to the IIF office by September 30, 2014 in electronic format to:

Pamela Stroud, Business Director.

For more information, see Pamela's IIF blog for details on 2013 grant winners Jeffrey Stonebraker (North Carolina State University, USA) and Yongchen (Herbert) Zhao (University at Albany, USA). The IIF website also provides details on past grant recipients, including such forecasting royalty as Robert Fildes, Dave Dickey, Sven Crone, Paul Goodwin (who delivered the inaugural Foresight/SAS Webinar last year), and Stavros Asimakopoulos, who is delivering the July 24 (10am ET) Foresight/SAS Webinar on the topic of forecasting with mobile devices.

Stavros Asimakopoulos

In addition to his recent article in Foresight, Stavros has also posted "Forecasting in the Pocket" on the SAS Insights Center.


Foresight/SAS webinar: forecasting on mobile devices

Stavros Asimakopoulos

On July 24, 10am ET, Stavros Asimakopoulos delivers the next installment of the Foresight/SAS Webinar Series, "Forecasting 'In the Pocket': Mobile Devices Can Improve Collaboration." Based on his article (co-authored with George Boretos and Constantinos Mourlas) in the Winter 2014 issue of Foresight, Stavros will discuss how smartphones, tablet computers, and other mobile devices offer new opportunities for communication and collaboration on business forecasts.

Stavros currently serves as Foresight Editor for Forecasting Support Systems, and Lecturer in User Experience (UX) at the University of Athens, Greece. Formerly a Director of UX at Experience Dynamics - USA, he has over 10 years of experience in user research, forecasting software-design improvements and UX strategy in business forecasting systems. Stavros is an active member of the British Human-Computer Interaction Group and regularly presents his work at the International Symposium on Forecasting. Earlier this week at ISF in Rotterdam, Stavros moderated a panel on forecasting user experience that included Ryan Chipley, Sr. User Experience Designer at SAS.

There is a preview of the presentation available on YouTube, and webinar registration is free.



Forecasting across a time hierarchy with temporal reconciliation

The BFD blog probably won't make you the best forecaster you can be. There are plenty of books, articles, and world-class forecasting researchers (many of them meeting next week in Rotterdam at the International Symposium on Forecasting) that can teach you the most sophisticated techniques to squeeze every last percent of accuracy out of your forecasts. Think of these as your offense in the game of forecasting.

The purpose of The BFD, in this thankless game, is to play defense. To identify the stupid stuff, the bad practices, the things we do that sound like good ideas -- but may just make the forecast worse. In short, to protect you from becoming the worst forecaster you can be.

But sometimes the best defense is a good offense, so let's examine a method known as "temporal reconciliation" that has been getting some recent attention.

Hierarchical Forecasting

Every business forecaster is familiar with forecasting hierarchies. For example, for products:


For manufacturing and distribution:


For sales management:


Some forecasting software packages maintain the hierarchies separately, while others (like SAS Forecast Server) combine the product and geographic/organizational levels into a single hierarchy, such as:


Forecasts at the most granular level of detail (e.g., ITEM) can then be aggregated to any higher level. SAS Forecast Server automatically builds customized models and generates forecasts for each node, at each level of the hierarchy. Using "hierarchical reconciliation" you choose the forecasts at one level and have them cascade down and up the hierarchy, apportioning the numbers so that the whole hierarchy adds up properly.

The value of reconciliation is that it allows you to take advantage of information at each level. For example, there may be consistent seasonal patterns that are observable at higher levels but difficult to discern at the most granular levels, where the data is very noisy. You'll often find that models at the most granular level are flat lines -- simply forecasting the average -- since there is too much randomness to detect a seasonal signal. But by applying a seasonal model at a higher level, the seasonality can be pushed down to the granular level through reconciliation. This can give you a better forecast at the granular level.
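SAS Forecast Server handles this reconciliation internally, but the core idea of top-down reconciliation can be sketched in a few lines. This is an illustration only (the function name and data are hypothetical, and real implementations offer more sophisticated apportionment schemes): a parent-level forecast is split across child nodes by their historical share of total demand, so the hierarchy adds up.

```python
import numpy as np

def top_down_reconcile(parent_forecast, child_history):
    """Apportion a parent-level forecast to child nodes using each
    child's historical share of the parent total, so the children
    sum back to the parent forecast."""
    history = np.asarray(child_history, dtype=float)  # shape: (n_children, n_periods)
    shares = history.sum(axis=1) / history.sum()
    return parent_forecast * shares

# Three noisy items; the seasonal model was fit at the parent level,
# which forecast 100 units for next period.
history = [[2, 0, 5, 1],
           [10, 8, 14, 9],
           [4, 3, 6, 2]]
print(top_down_reconcile(100.0, history))  # proportional split summing to 100
```

In practice, reconciliation can also push information up the hierarchy, or weight the apportionment by something other than simple historical proportions.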

Forecasting Across a Time Hierarchy

"Temporal reconciliation" is a less familiar approach, utilizing forecasts created in different time buckets. It has been getting a lot of attention lately, with two articles in the forthcoming issue of Foresight. Per Editor Len Tashman's preview:

The two articles in this section address temporal aggregation. In his own piece, Forecasting by Temporal Aggregation, Aris provides a primer on the key issues in deciding which data frequencies to forecast and how to aggregate/disaggregate from these to meet organizational requirements. Then, in Improving Forecasting via Multiple Temporal Aggregation, Fotios Petropoulos and Nikolaos Kourentzes offer a provocative new way to achieve reconciliation of daily through yearly forecasts while increasing the accuracy with which each frequency is forecast.

Petropoulos and Kourentzes provide a summary version in the Lancaster Centre for Forecasting's latest newsletter, with the assertion, "The key idea is that by combining various views/aggregation levels of the series we augment different features of it, helping forecasting models to better capture its structure. The same simple trick can be used to reduce or even remove intermittency in slow moving items making them easier to forecast."

If you are ready to try this, note that SAS Forecast Server has included temporal reconciliation since version 4.1 (released in July 2011). As explained in The BFD at its release:

Temporal Reconciliation: A powerful new tool for combining and reconciling forecasts generated at different time frequencies (e.g. hourly, daily, weekly).  Consider demand forecasting in the electric utility industry, where consumption varies by hour within the day (less demand in the middle of the night), by day within the week (business days compared to weekends), and by week within the year (higher peak demand in summer to drive air conditioners).  With the SAS Forecast Server Procedures engine inside SAS Forecast Server 4.1, you can model the behavior at each time frequency, and then combine the results to incorporate the various seasonal cycles.
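As a toy illustration of the idea (not SAS Forecast Server's actual algorithm -- the function and numbers below are made up), one simple form of temporal reconciliation scales forecasts at a finer frequency so they agree with a forecast generated at a coarser frequency, preserving the finer-grained seasonal profile:

```python
import numpy as np

def reconcile_to_annual(monthly_forecasts, annual_forecast):
    """Scale twelve monthly forecasts so they sum to an independently
    modeled annual forecast, keeping the monthly (seasonal) profile."""
    monthly = np.asarray(monthly_forecasts, dtype=float)
    return monthly * (annual_forecast / monthly.sum())

# The monthly model captures the seasonal shape but sums to 1275;
# the (less noisy) annual model says 1300.
monthly = [80, 75, 90, 100, 110, 130, 150, 145, 120, 100, 90, 85]
reconciled = reconcile_to_annual(monthly, 1300.0)
print(round(reconciled.sum(), 6))  # matches the annual forecast
```

The Petropoulos and Kourentzes approach goes further, combining model components estimated at multiple aggregation levels, but the goal is the same: forecasts at every frequency that agree with each other.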



Weather forecasts: deterministic, probabilistic or both?

In 1965's Subterranean Homesick Blues, Bob Dylan taught us:

You don't need a weatherman / To know which way the wind blows

In 1972's You Don't Mess Around with Jim, Jim Croce taught us:

You don't spit into the wind

By combining these two teachings, one can logically conclude that:

You don't need a weatherman to tell you where not to spit

But the direction of expectoration is the least of my current worries. What I really want to know is, WTH does it mean when the weatherman says there is a 70% chance of rain?

Greg Fishel, WRAL-TV Chief Meteorologist to the Rescue

Greg Fishel

Greg Fishel & Fan Club

I am pleased to announce that Greg Fishel, local celebrity and weather forecaster extraordinaire, will be speaking at the Analytics2014 conference at the Bellagio in Las Vegas (October 20-21). Greg (shown here with several groupies during a recent visit to the SAS campus) will discuss "Weather Forecasts: Deterministic, Probabilistic or Both?" Here is his abstract:

The use of probabilities in weather forecasts has always been problematic, in that there are as many interpretations of how to use probabilistic forecasts as there are interpreters. However, deterministic forecasts carry their own set of baggage, in that they often overpromise and underdeliver when it comes to forecast information critical for planning by various users. In recent years, an ensemble approach to weather forecasting has been adopted by many in the field of meteorology. This presentation will explore the various ways in which this ensemble technique is being used, and discuss the pros and cons of an across-the-board implementation of this forecast philosophy with the general public.

Please join us in Las Vegas, where I hope Greg will also answer that other troubling weather question: can I drive a convertible fast enough in a rainstorm to not get wet? (Note: Mythbusters determined this to be "Plausible but not recommended.")




Guest blogger: Len Tashman previews Summer 2014 issue of Foresight


Foresight editor Len Tashman

Our tradition from Foresight’s birth in 2005 has been to feature a particular topic of interest and value to practicing forecasters. These feature sections have covered a wide range of areas: the politics of forecasting, how and when to judgmentally adjust statistical forecasts, forecasting support systems, why we should (or shouldn’t) place our trust in the forecasts, and guiding principles for managing the forecasting process, to name a selection.

This 34th issue of Foresight presents the first of a two-part feature section on forecasting by aggregation, the use of product aggregates to generate forecasts and then reconcile the forecasts across the product hierarchy. Aggregation is commonly done for product and geographic hierarchies but less frequently implemented for temporal hierarchies, which depict each product’s history in data of different frequencies: daily, weekly, monthly, and so forth.

The entire section has been organized by Aris Syntetos, Foresight’s Supply Chain Forecasting Editor, who is interviewed as this issue's Forecaster in the Field.  In his introduction to the feature section, Aris writes that forecasting by aggregation can provide dramatic benefits to an organization, and that its merits need to be more fully recognized and supported by commercially available forecasting software.

The two articles in this section address temporal aggregation. In his own piece, Forecasting by Temporal Aggregation, Aris provides a primer on the key issues in deciding which data frequencies to forecast and how to aggregate/disaggregate from these to meet organizational requirements. Then, in Improving Forecasting via Multiple Temporal Aggregation, Fotios Petropoulos and Nikolaos Kourentzes offer a provocative new way to achieve reconciliation of daily through yearly forecasts while increasing the accuracy with which each frequency is forecast.

Also in this issue, we review two new books that will interest those who take the long-range view of forecasting, both its past and its future: Fortune Tellers: The Story of America’s First Economic Forecasters by Walter A. Friedman, and In 100 Years: Leading Economists Predict the Future, edited by Ignacio Palacios-Huerta.

The first book, writes reviewer Ira Sohn, Foresight’s Editor for Long-Range Forecasting, provides a “historical overview of the pioneers of forecasting, of the economic environments in which they worked, and of the tool sets and methodologies they used to generate their forecasts.” These trailblazers include familiar names and some not so well known: Roger Babson, John Moody, Irving Fisher, C. J. Bullock, Warren Persons, Wesley Clair Mitchell, and, surprisingly, Herbert Hoover. Why these? According to Friedman, they were the first to envision the possibility that economic forecasting could be a field, or even a profession; that the systematic study of a vast range of statistical data could yield insights into future business conditions; and that a market existed in business and government for weekly economic forecasts.

For the second book, editor Palacios-Huerta invited some of the “best brains in economics” – three of them already awarded Nobel prizes – to speculate on the state of the world and material well-being in 2113. Here they address some big issues: how will population, climate, social and economic inequality, strife, work, and education change in the next 100 years, and what are our prospects for being better off then than we are now?

Our section on Forecasting Principles and Methods turns to Walt Disney Resorts’ revenue managers McKay Curtis and Frederick Zahrn for a primer on Forecasting for Revenue Management. The essential objective, they write, is to adjust prices and product/service availability to maximize firm revenue from a given set of resources. We see revenue management in operation most personally when we watch airline ticket-price movements and need to know hotel room availability. It is the forecasts that drive these systems, and the authors show how they are used in revenue management.

Our concluding article is the fourth and final piece in Steve Morlidge’s series of discussions on forecast quality, a term Steve defines as forecast accuracy in relation to the accuracy of naïve forecasts, which he measures by the Relative Absolute Error (RAE). The previous articles demonstrated realistic boundaries to the RAE: an RAE above 1.0 is a cause for change since the method employed is no more accurate than a naïve – no change from last period – forecast, while an RAE below 0.5 occurred very rarely and thus represented a practical lower limit to forecast error.

Steve now deals with the natural follow-up question of how we should be Using Relative Error Metrics to Improve Forecast Quality in the Supply Chain. What actions should the business take in response to particular values for the RAE? His protocols will help identify those items that form a “sweet spot” for efforts to upgrade forecasting performance.

Meet us in Columbus, Ohio in October

The lively learning environment, easy camaraderie among the presenters and delegates, and very practical program made last year's Foresight Practitioner Conference at Ohio State University’s Fisher College of Business a great success. We're looking forward to this year's event, From S&OP to Demand-Supply Integration: Collaboration Across the Supply Chain. The FPC offers a unique blend of practitioner experience and scholarly research within a vendor-free environment — I hope we'll see you there!

Registration is discounted $100 (to $1,295) for Foresight subscribers and IIF members (use registration code 2014FORESIGHT when you register).


Upcoming forecasting conferences

We're entering the busy season for forecasting events, and here is the current calendar:

Analytics2014 - Frankfurt

The European edition of Analytics2014 kicks off tomorrow in Frankfurt, Germany. Five hundred of the leading thinkers and doers in the analytics profession hook up for two full days of interaction and learning. I'll try to get reports on the several forecasting-related presentations, including "The New Analytical Mindset in Forecasting" by Nestlé Iberia, and the latest on "Forecast Value Added and the Limits of Forecastability" by Steve Morlidge of CatchBull. (Steve's compelling work has been the subject of several BFD postings over the last year.)

For those (like me) who were unable to travel to Germany for this event, the US edition of Analytics2014 is back in Las Vegas (October 20-21).

APICS & IBF Best of the Best S&OP Conference - Chicago

For Sales & Operations Planning junkies, next week (June 12-13) is the first of two upcoming S&OP-focused conferences.

Hosted by APICS and the Institute of Business Forecasting, Best of the Best speakers include two winners of IBF's Excellence in Business Forecasting & Planning award, Patrick Bower of Combe, and Alan Milliken of BASF. Also, Dr. Chaman Jain, editor of Journal of Business Forecasting, will speak on S&OP for new products.

IBF Webinar: Risk Mitigation and Demand Planning Segmentation Strategies

Even agoraphobics, shut-ins, those without travel budgets, and those under house arrest can get in on the forecasting knowledge fest, with this June 18 (11:00am EDT) IBF webinar by Eric Wilson of Tempur Sealy. Per the abstract:

To improve forecast accuracy, companies are constantly seeking demand planning practices and solutions that best utilize their planners' expertise. This presentation on product segmentation will provide the best use of segmentation using attributes and FVA to create differentiated demand planning approaches. The general purpose of segmenting planning strategies is to mitigate risk. The purpose of FVA is to minimize forecasting resources, while reaching accuracy goals. In this session, we will offer basic strategies and conceptual ideas that you may utilize in any industry. Plus, we will provide examples of how we applied these practices ourselves that led to reduction in inventory, while maintaining or improving service levels that our customers love.

International Symposium on Forecasting - Rotterdam

SAS is again a sponsor of the ISF (June 29 - July 2), which brings together the world's elite in forecasting research and practice.

For industry practitioners who may have shied away from the ISF as too "academic," I would encourage you to reconsider. I felt the same way before attending my first ISF (San Antonio, 2005), yet found a vibrant interest in real-life forecasting issues and the opportunity to build relationships with academic researchers. There is also strong interest in practitioner presentations. So even if it's too late to enjoy Rotterdam, start planning now for ISF 2015 in Riverside, California.

Foresight Practitioner Conference - Columbus, OH

This year's second big S&OP event is "From S&OP to Demand-Supply Integration" (October 8-9 at Ohio State University). Limited to 100 attendees, with no vendors permitted to present or exhibit, it is hosted by Len Tashman, editor of Foresight: The International Journal of Applied Forecasting. The day-and-a-half conference concludes with a 90-minute panel discussion moderated by Len.



A naive forecast is not necessarily bad

As we saw in Steve Morlidge's study of forecast quality in the supply chain (Part 1, Part 2), 52% of the forecasts in his sample were worse than a naive (random walk) forecast. This meant that over half the time, these companies would have been better off doing nothing and just using the naive model rather than using the forecast produced through their systems and forecasting process.

Of course, we don't know in advance whether the naive forecast will be more accurate, so this doesn't help with decision making in particular instances. But the findings provide further evidence that in the real-life practice of business forecasting, we are disturbingly mediocre.

In private correspondence with Steve last week, he brought out some important points that I failed to mention in my damnation of forecasting practice:

Very often what we find is that the non-value adding forecasts aren’t necessarily poor forecasts; in the conventional sense of the word at least…in fact they might have low MAPE. It is just that they are worse than the naïve…which is not the same.

As usual, it is obvious when you think about it…if you have a stable demand pattern it might be easy to construct a forecast with good MAPE numbers but it is actually very difficult to beat the naïve forecast because last period's actual is a good predictor. And if you manually intervene in the process in any way then you will almost certainly make things worse. Ironically it is the very messy forecast series which offer the greatest potential to add value through good forecasting.

Two conclusions to draw from this:

  1. Sometimes the naive model provides a perfectly appropriate forecast (so a naive forecast is not necessarily a "bad" forecast).
  2. A forecast that fails to beat a naive forecast is not necessarily a "bad" forecast.

Illustrate with a Comet Chart

By creating a "comet chart" (find instructions in The Accuracy vs. Volatility Scatterplot), you may be able to illustrate these observations with your own company data.

Comet Chart

This comet chart displays Forecast Accuracy (here scaled 0-100%) on the vertical axis, and coefficient of variation (here truncated at 160%) on the horizontal axis, for the 5000 Item / DC combinations at a food manufacturer. The line represents the approximate accuracy of a very simple forecasting model (here a moving average) at each level of volatility.

As you would expect, forecast accuracy tends to be quite high when volatility of sales patterns is quite low, so even a very simple model (like a moving average or random walk) would perform well, and be difficult to beat. For more volatile sales patterns, there is more opportunity to "add value" to the forecast by judgmental overrides or more sophisticated models.
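The data prep behind the chart is straightforward. As a rough sketch (the function name is hypothetical, and accuracy is defined here as 100% minus weighted absolute percent error -- one common choice, not necessarily the exact formula used in the chart above), each item contributes one point:

```python
import numpy as np

def comet_chart_point(actuals, forecasts):
    """One point on the comet chart for a single item:
    x = coefficient of variation of sales (%), y = forecast accuracy (%)."""
    a = np.asarray(actuals, dtype=float)
    f = np.asarray(forecasts, dtype=float)
    cov = a.std() / a.mean()                                  # volatility (x-axis)
    accuracy = max(0.0, 1.0 - np.abs(a - f).sum() / a.sum())  # floored at 0 (y-axis)
    return cov * 100, accuracy * 100

# A stable item: low volatility, high accuracy
x, y = comet_chart_point([100, 105, 98, 102], [101, 102, 100, 103])
print(round(x, 1), round(y, 1))
```

Plot one such point for every item, overlay the accuracy of a simple model (moving average or random walk) at each volatility level, and the "comet" shape typically emerges.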

The comet chart is easy to construct and provides a new way to look at your company's forecasting challenge. Create yours today.


Forecast quality in the supply chain (Part 2)

As we saw last time with Steve Morlidge's analysis of the M3 data, forecasts produced by experts under controlled conditions with no difficult-to-forecast series still failed to beat a naive forecast 30% of the time.

So how bad could it be for real-life practitioners forecasting real-life industrial data?

In two words: Pretty bad.

The New Study

Morlidge's nine sample datasets covered 17,500 products, over an average of 29 (weekly or monthly) periods. For these real-life practitioners forecasting real-life data, 52% of forecasts had RAEs above 1.0.


As he puts it, "This result distressingly suggests that, on average, a company's product forecasts do not improve upon naive projections."

Morlidge also found that only 5% of the 17,500 products had RAEs below 0.5, which he has posited as a reasonable estimate of the practical lower limit for forecast error.
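Morlidge's metric is easy to compute yourself. Here is a minimal sketch (with made-up numbers) of the Relative Absolute Error, where the naive benchmark is the random walk -- last period's actual used as the forecast:

```python
import numpy as np

def relative_absolute_error(actuals, forecasts):
    """RAE = MAE of the forecast / MAE of the naive (no-change) forecast.
    forecasts[i] is the one-step-ahead forecast for actuals[i+1]."""
    a = np.asarray(actuals, dtype=float)
    f = np.asarray(forecasts, dtype=float)
    forecast_mae = np.mean(np.abs(a[1:] - f))
    naive_mae = np.mean(np.abs(a[1:] - a[:-1]))  # last actual as forecast
    return forecast_mae / naive_mae

rae = relative_absolute_error([100, 120, 110, 130, 125], [112, 118, 118, 122])
print(round(rae, 2))  # below 1.0: this forecast beats the naive
```

An RAE above 1.0 means the naive forecast would have been more accurate; Morlidge found that for 52% of the products in this study.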

The Implications

What are we to make of these findings, other than to gnash our teeth and curse the day we got ourselves suckered into the forecasting profession? While Morlidge's approach continues to receive vetting on a broader variety of datasets, he itemizes several immediate implications for the practical task of forecasting in the supply chain:

1. RAE of 0.5 is a reasonable approximation to the best forecast that can be achieved in practice.

2. Traditional metrics (e.g. MAPE) are not particularly helpful. They do not tell you whether the forecast has the potential to be improved. And a change in the metric may indicate a change in the volatility of the data, not so much a change in the level of performance.

3. Many forecasting methods add little value.

On the positive side, his findings show that there is significant opportunity for improvement in forecast quality. He found the weighted average RAE to be well above the lower bound for forecast error (RAE = 0.5). And roughly half of all forecasts were worse than the naive forecast -- error which should be avoidable.

Of course, we don't know in advance which forecasts will perform worse than the naive forecast. But by rigorous tracking of performance over time, we should be able to identify those that are problematic. And we should always track separately the "statistical forecast" (generated by the forecasting software) and the "final forecast" (after judgmental adjustments are made) -- a distinction that was not possible in the current study.
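That kind of tracking is the essence of FVA analysis. A minimal sketch (illustrative numbers, hypothetical function name) comparing the naive, statistical, and final forecasts for one item:

```python
import numpy as np

def fva_report(actuals, statistical, final):
    """MAE of the naive (random walk) forecast vs. the system's statistical
    forecast vs. the judgmentally adjusted final forecast.
    statistical[i] and final[i] are one-step-ahead forecasts for actuals[i+1]."""
    a = np.asarray(actuals, dtype=float)
    def mae(f):
        return float(np.mean(np.abs(a[1:] - np.asarray(f, dtype=float))))
    return {"naive": float(np.mean(np.abs(a[1:] - a[:-1]))),
            "statistical": mae(statistical),
            "final": mae(final)}

report = fva_report([100, 120, 110, 130, 125],
                    statistical=[112, 118, 118, 122],
                    final=[130, 100, 145, 115])
print(report)  # if final MAE exceeds statistical MAE, the overrides subtracted value
```

In this made-up example the statistical forecast beats the naive, but the judgmental overrides made it worse -- exactly the pattern a separate tracking of the two forecasts is meant to expose.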

Morlidge concludes that "it is likely that the easiest way to make significant improvement is by eliminating poor forecasting rather than trying to optimise good forecasting" (p. 31).

[You'll find a similar sentiment in this classic The BFD post, "First, do no harm."]

Hear More at Analytics 2014 in Frankfurt

Join over 500 of your peers at Analytics 2014, June 4-5 in Frankfurt, Germany. Morlidge will be presenting on "Forecast Value Added and the Limits of Forecastability." Among the 40 presentations and four keynotes, there will also be forecasting sessions on:

  • Forecasting at Telefonica Germany
  • Promotion Forecasting for a Belgian Food Retailer, Delhaize
  • The New Analytical Mindset in Forecasting: Nestle's Approach in Europe and a Case Study of Nestle Spain
  • Big Data Analytics in the Energy Market: Customer Profiles from Smart Meter Data
  • Shopping and Entertainment: All the Figures of Media Retail

In addition, my colleague Udo Sglavo will present "A New Face for SAS Analytical Clients" -- the forthcoming web interface for SAS Forecast Server.

(Full Agenda)

  • About this blog

    Michael Gilliland is a longtime business forecasting practitioner and currently Product Marketing Manager for SAS Forecasting. He initiated The Business Forecasting Deal to help expose the seamy underbelly of the forecasting practice, and to provide practical solutions to its most vexing problems.