Highlights from IBF Orlando

Last week I had the pleasure of attending (with six of my SAS colleagues) the IBF's Best Practices Forecasting Conference in Orlando. Some of the highlights:

Worst Practices Roundtable

In addition to the SCB interviews, Charlie and I hosted roundtable discussions and delivered regular sessions (mine co-presented with Erin Marchant of Moen).

  • The "worst practices" roundtable had participants confessing their forecasting sins -- or at least reporting ones committed by "friends" or colleagues.

Erin Marchant did a fabulous job at our "Applying Forecast Value Added at Moen" presentation. While the company is still early in its FVA efforts, Erin shared some extremely valuable insights from Moen's journey:

  • FVA analysis takes the emotion out of forecast discussions
    • Data now supports the conversation about how to improve the process
  • Sometimes you can't do better than the naive forecast
    • FVA helps you focus your forecast improvement efforts in the right places
  • FVA is a measure of your forecast process
    • NOT an "us vs. them" proposition
    • NOT a catapult to throw others "under the bus"
    • NOT an excuse to create the sales forecast in a silo
  • FVA starts a conversation that improves your forecasting process!

Erin also noted that because of FVA:

  • We are able to better pinpoint the causes of forecast error and start dialogues about those parts of our process.
  • We are able to begin conversations about our supply chain structure that could lead to more accurate demand signals and better service for our customers.
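For readers new to FVA, the core calculation is just a comparison of your process's accuracy against a naive benchmark. A minimal sketch in Python, with made-up numbers (not Moen's data):

```python
def mape(actuals, forecasts):
    """Mean absolute percent error, in percent."""
    return 100.0 * sum(abs(a - f) / abs(a)
                       for a, f in zip(actuals, forecasts)) / len(actuals)

# Illustrative numbers only
actuals = [100, 120, 110, 130]
naive   = [110, 100, 120, 110]   # e.g. a random-walk forecast (last period's actual)
final   = [105, 115, 118, 124]   # the process's final, approved forecast

# FVA: how much accuracy the process adds over the naive benchmark
fva = mape(actuals, naive) - mape(actuals, final)
print(round(fva, 1))
```

A positive FVA means the process is beating the naive forecast; a negative FVA is the signal to focus improvement efforts, as Erin described.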

Learn more in Erin's prior interview on the IBF blog, and in her forthcoming SupplyChainBrain interview.

Special Thanks

Thanks to Eric Wilson of Tempur Sealy and the folks at Arkieva, who contributed the Tempur-Cloud® Breeze Dual Cooling Pillow that I won in the raffle at Eric's session.

My long-term forecast is for good sleeping.

And as always, IBF's Director of Strategic Relationships, Partnerships, Memberships & Events, Stephanie Murray, gave SAS the royal treatment. For me personally, this included a stash of M&Ms (brown ones removed per contract) and mini-Tabasco sauce with every meal.

Stephanie's energy and behind-the-scenes organization are a huge (and under-appreciated) part of the success of these events.



Probabilistic load forecasting competition

Dr. Tao Hong

So you think you know how to forecast?

Now is your chance to prove it, by participating in a probabilistic load forecasting competition run by my friend (and former SAS colleague), Dr. Tao Hong.

Currently a professor at UNC Charlotte and director of the Big Data Energy Analytics Laboratory (BigDEAL), Tao is opening his class competition to students and professionals outside of his Energy Analytics course. See his October 12 Energy Forecasting blog for information, including these competition rules:

  • The competition will start on 10/22/2015, and end on 11/25/2015.
  • The historical data will be released on 10/22/2015.
  • The year-ahead hourly probabilistic load forecast is due by 11:45am ET each Wednesday, starting 10/28/2015.
  • The competition is an individual effort. Each student forms a single-person team; no collaboration is allowed.
  • Students cannot use any data other than what is provided by Dr. Tao Hong, plus the U.S. federal holidays.
  • The pinball loss function is the error measure in this competition.
  • The benchmark will be provided by Dr. Tao Hong. A student receives no credit unless they beat the benchmark or rank in the top 6 of the class.
  • No late submission is allowed.

If you are interested in joining the competition, please email Dr. Tao Hong (at hongtao01@gmail.com) for more details.
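For those unfamiliar with the error measure, pinball loss scores a set of quantile forecasts asymmetrically: under-forecasting is penalized in proportion to the quantile, over-forecasting in proportion to one minus the quantile. A minimal sketch in Python (an illustration only, not the competition's official scoring code):

```python
def pinball_loss(actual, quantile_forecasts, quantiles):
    """Average pinball (quantile) loss for one observation.

    quantile_forecasts[i] is the forecast for quantiles[i] (e.g. 0.01..0.99).
    """
    losses = []
    for q, f in zip(quantiles, quantile_forecasts):
        if actual >= f:
            losses.append(q * (actual - f))        # under-forecast penalty
        else:
            losses.append((1 - q) * (f - actual))  # over-forecast penalty
    return sum(losses) / len(losses)

# Toy example: three quantiles of next-hour load; actual turns out to be 103
print(pinball_loss(103.0, [90.0, 100.0, 110.0], [0.25, 0.50, 0.75]))
```

Lower is better; in the competition this would be averaged over all hours of the forecast horizon.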


Guest blogger: Len Tashman previews Fall issue of Foresight

Editor Len Tashman's Preview of the Fall 2015 issue of Foresight

Len Tashman

This 39th issue of Foresight features a special section on forecasting support systems (FSS) developed by our FSS Editor Fotios Petropoulos. His article Forecasting Support Systems: Ways Forward highlights three main areas for improvement: better utilization of open-source software and Web-based features, adoption of important methods not currently in the FSS arsenal, and much greater support for interaction between statistical output and managerial judgment. His “ways forward” make me think that if you’re satisfied with the capabilities of your current FSS, you don’t know what you’re missing.

That said, not all of Fotios’ vision finds support from the authors of the six commentaries following his article. A majority of these “critics” are developers and experts on systems for forecasting and planning. While they do endorse much of what Fotios recommends and expand his list with their own, they express particular reservations about the role of open-source software as well as anxiety over the challenges organizations will face in implementing an upgraded FSS. As one commentator expressed it:

In the hands of an uneducated user, a more powerful FSS may simply provide
the tools to be wrong faster and on a larger scale. The key aspect in the future
of FSS is user education.

The success of collaboration in forecasting and planning nearly always faces “inside threats” from employee and executive (mis)behaviors. In Collaborative Culture: The New Workplace Reality—the first of two articles in our Collaborative Forecasting and Planning section—Neill Wallace and John Mello examine the need to sustain effective internal collaboration, and urge management to be aware of practices that work against this goal:

The challenge for leaders will be to let go of conventional practices and instead embrace an approach that places trust in teams.

Their joint authorship itself is a victory for collaboration: Neill is a business executive and writer on workplace trends, and John is an academic and Foresight’s S&OP Editor.

In the second article in this section, Jack Harwell presents An Executive Guide to Hiring Successful Demand Planners. Management, according to Jack, has to engage in broader thinking about this role and its growing importance:

Demand planning has evolved into more than a numbers game. Today’s demand planner is often the leader of the S&OP process and must guide an organization through collaboration and conflict resolution.

Contributors Wallace and Harwell then stick around a bit longer to be the subjects in this issue’s double dose of our Forecaster in the Field feature.

The Fall issue concludes with a review of the newly revised edition of the well-known pharmaceutical forecasting book by Arthur G. Cook, Forecasting for the Pharmaceutical Industry: Models for New Product and In-Market Forecasting and How to Use Them. Our reviewer is Christian Schäfer, former strategist for Boehringer Ingelheim and author of an earlier Foresight article on pharma forecasting, “How to Separate Risk from Uncertainty in Strategic Forecasting,” which appeared in our Winter 2013 issue.


Fall 2015 forecasting education opportunities

You may not be in London on October 7 to take advantage of the Lancaster Centre for Forecasting's free workshop on promotional forecasting. However, there are still plenty of forecasting educational opportunities coming up this fall:

SAS Business Knowledge Series

My colleague Charlie Chase (author of Demand-Driven Forecasting), delivers this new course over two full days at the SAS offices in downtown Chicago. Use the code BDDF-CHI for a 25% discount on the registration fee.

My colleague Chip Wells is teaching this online course, held on the afternoons (1:00-4:30pm ET) of September 28-29. Chip expanded my original half-day course with lots of new examples and exercises. This is a great chance to get up to speed on FVA analysis from the convenience of your own office.

IBF Business Planning & Forecasting Conference (Orlando, October 18-21)

IBF's biggest event of the year includes a free Forecasting & Planning Tutorial for IBF members on October 18, and a Leadership Forum with VIP Dinner Reception on the 19th. The conference continues October 20-21, with Sara Park, Group Director of Sales & Operations Planning at Coca-Cola, delivering the Keynote address.

Among the over two dozen sessions, I hope you'll join me and Erin Marchant, Demand Planning and Analytical Systems Power user at Moen, for a look at "Applying Forecast Value Added at Moen." And check out Erin's interview on the IBF blog.

Analytics 2015 - Las Vegas (October 26-27)

The SAS Analytics events cover the full range of advanced analytics, including statistical analysis, data mining, text analytics, optimization and simulation, and forecasting. Hot topics like machine learning, cybersecurity, and the internet of things will also be covered.

Forecasters will be particularly interested in the session by Len Tashman, editor-in-chief of Foresight: The International Journal of Applied Forecasting, on the "Do's and Don'ts of Measuring Forecast Accuracy."

Also check out the SAS Talks Keynote by Udo Sglavo, Senior Director of the data mining and forecasting R&D teams at SAS, for the latest news on SAS forecasting software.

Here are the top 3 reasons to attend Analytics 2015 at the Bellagio in Las Vegas.

Analytics 2015 - Rome (November 9-11)

Regardless of your role – whether data scientist, industry expert, analyst, statistician, business professional, leading researcher or academic – Analytics 2015 in Rome will give you practical and strategic insights on all emerging analytics technologies and approaches.

In addition, there is a dedicated path for executives and C-levels who want to enhance their company's analytics culture by learning from international best practices.

Among the forecasting topics, Udo Sglavo will present "A New Face for SAS Forecast Server" -- demonstrating the SAS Forecast Server Client web interface, and Snurre Jensen will show how "The World's Best Large-Scale Forecasting Product Gets Even Better." Olivier Gleron, AVP of Demand & Supply Planning at Nestlé, will deliver a success story.


Free practitioner workshop - October 7 (London, UK)

Our friends at the Lancaster Centre for Forecasting have announced their next free practitioner workshop. It will be held Wednesday, October 7, 1:00-5:15pm, at BMA House, Tavistock Square, London WC1H. This session's topic is:

Promotional Modelling and Forecasting

Understanding and forecasting the effects of a promotion for both retailers and manufacturers remains a major challenge with important financial and supply chain consequences. But promotional modelling is developing rapidly and new approaches and software are becoming available.

The event will bring together well-known practitioners (SAS, Nielsen, Marketing QED) and academics to present, hear about and discuss new actionable insights on promotions and demand forecasting within the realm of marketing analytics.

Drawing on their expertise in data-driven promotion strategies, the participants will share their newest and most impactful ideas, experiences, and tools with business attendees.

The workshop will provide market analysts and demand forecasting professionals with the opportunity to turn sophisticated analytical promotional models into real-life, useful insights delivering more effective forecasts.

Places are limited, so booking is essential. Sign up here: Forecasting Events Registration Form.


Judgmental adjustments to the forecast

So you think you can outsmart your statistical forecast? Apparently, lots of people do.

In "Judgmental Adjustments to Forecasts in the New Economy" (Foresight, Issue 38 (Summer 2015), 31-36), Manzoor Chowdhury and Sonia Manzoor argue that forecasters are becoming more dependent on judgmental adjustments to a statistical forecast.

Sometimes this is because there isn't sufficient data to generate a trustworthy statistical forecast. For example, there may be no history for a new item, or limited history for items with a short product lifecycle. Or volume may be fragmented across complex and interconnected distribution channels. Or the immediate impact of social media (favorable or unfavorable) cannot be reliably determined.

Of course, the old standby reason for judgmental adjustments is when the statistical forecast does not meet the expectations of management. Executives may have "proprietary information" they can't share with the forecasters, so it cannot be included in the statistical models. Or they may be (mis-)using the forecast as a target or stretch goal (instead of what the forecast should be -- a "best guess" at what is really going to happen).

Do You Have a Good Reason?

Does your boss (or higher level management) make you adjust the forecast? If so, that is probably a good enough reason to do so. But if they insist you make small adjustments, consider pushing back with the question, "What is the consequence of this adjustment -- will it change any decisions?"

Even if directionally correct, a small adjustment that results in no change of actions is a waste of everyone's time.

A large adjustment, presumably, will result in different decisions, plans, and actions. But will it result in better decisions, plans, and actions? In a study of four supply chain companies, Fildes and Goodwin ("Good and Bad Judgment in Forecasting," Foresight, Issue 8 (Fall 2007), 5-10) found that any benefits to judgmental adjustments are "largely negated by excessive intervention and over-optimism." In their sample, negative adjustments (lowering the forecast) tended to improve accuracy more than positive adjustments.

A Simple Test

As a simple test of your forecasting abilities, it should be easy to determine whether your adjustments are at least directionally correct.

Take a look at your historical forecasting performance data. At the very least, every organization should be recording the statistical forecast (generated by the forecasting software) and the final forecast (after adjustments and management approval), to compare against the actual that occurred. Much better is to also record the forecast at each sequential step in the forecasting process, such as the statistical forecast, forecaster's adjustment, consensus adjustment, and final (management-approved) forecast.

What percentage of the adjustments were directionally correct? If more than half, then congratulations -- you are doing better than flipping a coin!

Warning: Just be aware that you can make a directionally correct adjustment and still make the forecast worse. For example, statistical forecast=100, adjusted forecast=110, actual=101.
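A quick way to run this test on your own history is to score each adjustment for direction and for its impact on accuracy. A minimal sketch (the record layout is an assumption):

```python
def adjustment_report(records):
    """Score forecast adjustments. records: (statistical, final, actual) triples."""
    adjusted = right_direction = made_worse = 0
    for stat, final, actual in records:
        if final == stat:
            continue                      # no adjustment, nothing to score
        adjusted += 1
        if (final - stat) * (actual - stat) > 0:
            right_direction += 1          # adjustment moved toward the actual
        if abs(final - actual) > abs(stat - actual):
            made_worse += 1               # ...but may still have overshot
    return adjusted, right_direction, made_worse

# The example above: statistical=100, adjusted=110, actual=101
# -> directionally correct, yet the adjustment made the forecast worse
print(adjustment_report([(100, 110, 101)]))
```

Counting both statistics separately is the point: a high "right direction" rate can coexist with a high "made worse" rate when adjustments routinely overshoot.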

If you don't keep or have access to historical data on your forecasts and actuals, then test yourself this way: Every morning before the opening bell, predict whether the Dow Jones Industrial Average (or your own favorite stock or stock market index) will end higher or lower than the previous day. It may not be as easy as you think.


Forecast tracking by iteration with SAS Forecast Server Client

My colleague Evan Anderson completes this week's blog series with a post on the enhanced forecast tracking capabilities built into SAS Forecast Server's new web client. Evan is R&D's Principal Software Developer for the tracking capabilities.

Guest Blogger Evan Anderson on Forecast Tracking by Iteration

Evan Anderson

The best measures of forecast accuracy depend on hindsight. When the future arrives, we want to compare it with our previously computed forecasts. This should be a part of any iterative forecasting process, that is, any forecasting project where new observed values of the data become available periodically, and the forecast horizon gets extended accordingly.

By archiving each iteration of the forecast data set before it is replaced with new actuals and new forecasts, you acquire a source of forecast accuracy data which grows over time. You can use it to answer questions such as:

  1. How accurate are my forecasts, overall?
  2. How many periods into the future can I forecast before accuracy drops below an acceptable level?
  3. Has my accuracy been stable over time, or do one or more of my iterations stand out from the rest?

If you are forecasting across a hierarchy of products, SKUs, forecasters, etc., you can find out how accuracy varies across the hierarchy. An exception report showing which nodes of the hierarchy are associated with unacceptable levels of accuracy can help you focus your efforts where they are most needed to improve forecasts.

These capabilities are built into the Forecast Tracking feature of Forecast Server Client.

When input data are updated with newer observations, the feature automatically archives the existing forecasts and creates a new iteration. It matches actual values for a given date with previously forecasted values for that date, and computes accuracy in terms of Mean Error, Mean Absolute Percent Error (MAPE), Mean Absolute Deviation (MAD), or the MAD/Mean ratio.
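For reference, these four measures can be computed from one iteration's matched actual/forecast pairs along the following lines (the sign convention for Mean Error, forecast minus actual, is an assumption, not necessarily the one Forecast Server uses):

```python
def tracking_measures(actuals, forecasts):
    """The four accuracy measures over one iteration's matched pairs."""
    n = len(actuals)
    errors = [f - a for a, f in zip(actuals, forecasts)]
    mean_error = sum(errors) / n
    mad = sum(abs(e) for e in errors) / n
    mape = 100.0 * sum(abs(e) / abs(a) for e, a in zip(errors, actuals)) / n
    mad_mean = mad / (sum(actuals) / n)   # MAD/Mean ratio, a "weighted MAPE"
    return {"ME": mean_error, "MAPE": mape, "MAD": mad, "MAD/Mean": mad_mean}

print(tracking_measures([100, 200], [110, 190]))
```

Note how the two error measures disagree in this toy case: the errors cancel in Mean Error (a bias measure) while MAD and MAPE still register them.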

For exception reporting, rules can be defined in terms of acceptable thresholds for the values of one or more of these measures.

Reports show, in graphical and tabular format,

  1. Current forecasts alongside actuals and forecasts from previous iterations
  2. The forecast graph from any selected iteration overlaid with actuals from the most recent iteration
  3. Accuracy measures by lead-time (number of periods)
  4. Accuracy measures by iteration

Here are some examples, from a project having five forecasting iterations.

Examples of tracking reports


Multistage modeling with SAS Forecast Server Client (Part 2)

In this post, my colleague Pu Wang takes us through an example of the multistage modeling strategy in the new web client in SAS Forecast Server, showing the potential forecast accuracy improvement of this method compared to traditional time series modeling.

Guest Blogger Pu Wang on Multistage Modeling

Pu Wang

In this example we use sample data from a department store: weekly sales from fall 2009 to fall 2012, at store-SKU level. The last 12 months of data are held out as out-of-sample data to evaluate the accuracy of the final forecast.

In the retail industry, it is commonly known that price is a key factor affecting sales of a product. Therefore, modeling price elasticity becomes crucial for a good forecasting model. Figure 1 plots a typical demand (black) and price (green) series in the retail industry at store-SKU level. It is difficult to observe any salient features of the demand series due to its sparsity:

1. Demand occurs intermittently at store-SKU level; no seasonal pattern is observed
2. It is inappropriate to apply traditional time series models to the demand series
3. No solid conclusion can be drawn about the relationship between demand and price


Figure 1. Demand vs. Price series at store-sku level

(Note: Click on Figures for full resolution view of the SAS Forecast Server screenshots.)

Data aggregation is a natural solution for data sparsity, and can often reveal hidden features of the data. Figure 2 plots the aggregated demand and price:

1. The demand series is continuous, so time series models can be applied to estimate the trend and seasonality
2. Price elasticity can be estimated through ARMAX or regression models

Forecasting models can be built to generate forecasts for the aggregated demand series using the techniques mentioned above.


Figure 2. Aggregated data

In order to obtain the disaggregation factors needed in the reconciliation process, we generate the low-level forecast for each demand series in the second stage.

Various feature extraction techniques can be applied to obtain the predicted values, such as moving averages, simple smoothing, and regression. With the advantage of pooling information across multiple series, regression is a preferable tool for feature extraction. In this example, the regression model pools price information across multiple products, which gives a better estimate of price elasticity at the low level.

In the last stage, a top-down reconciliation combines the forecasts from the previous two stages to generate the final forecast at the low level.
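The proportional top-down split can be sketched in a few lines of Python; this is an illustration of the idea, not SAS Forecast Server's implementation:

```python
def top_down_reconcile(aggregate_forecast, low_level_forecasts):
    """Split an aggregate forecast across series, using the low-level
    forecasts from stage 2 as disaggregation factors."""
    total = sum(low_level_forecasts)
    if total == 0:
        # degenerate case: nothing to apportion on, fall back to an even split
        n = len(low_level_forecasts)
        return [aggregate_forecast / n] * n
    return [aggregate_forecast * f / total for f in low_level_forecasts]

# Stage-1 model forecasts 1000 units for the aggregate;
# stage-2 regressions forecast 300/100/200 for three store-SKUs
print(top_down_reconcile(1000.0, [300.0, 100.0, 200.0]))
```

By construction, the reconciled low-level forecasts sum exactly to the aggregate forecast, while inheriting the relative proportions estimated at the low level.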

Figure 3 illustrates the difference between the time series model and the regression model with pooling effect. In the plot, the black series is the actual demand. The red series shows the final forecast from the time series model, which is essentially a smoothed value of the actual demand. The blue series is the final forecast from the regression model, which captures the demand occurrences most of the time.


Figure 3. Actual demand vs. Predicted demand

Finally, Table 1 presents accuracy measures for both in-sample and out-of-sample data. The multistage modeling strategy shows superior prediction performance compared to the traditional time series model.


Table 1. Forecast accuracy measures for in-sample and out-of-sample data

*multistage: proposed multistage modeling strategy
*timeseries: traditional time series model without applying feature extraction techniques.


Multistage modeling with SAS Forecast Server Client (Part 1)

Pu Wang is a Sr. Research Statistician in SAS R&D, and has contributed this post on multistage modeling in the new SAS Forecast Server web client.

Guest Blogger Pu Wang on Multistage Modeling

Pu Wang

The rapid development of information technologies in the recent decade provides forecasters with huge amounts of data, as well as massive computing capabilities. However, “sufficient” data and strong computing power do not necessarily translate into good forecasts.

Different industries and products all have their unique demand patterns. There is not a one-size-fits-all forecasting model or technique.

For example, in the consumer packaged goods (CPG) industry, demand at store-SKU level is usually sparse and noisy, which makes it difficult to extract price and promotional effects. For high-frequency data such as hourly grocery basket transactions, it is inappropriate and inefficient to apply traditional time series models. A good forecasting model must be tailored to the data to capture its salient features and satisfy the business needs.

Hierarchical Forecasting

A hierarchy based multistage modeling strategy can be used to provide tailored forecasting models.

This strategy provides a general framework to build a forecasting system in three stages. The system determines a forecast reconciliation level, which is typically some higher level in the hierarchy.

  • In the first stage, data aggregation is applied to eliminate noise and reveal hidden features. Feature extraction techniques are combined with time series models to generate forecasts for aggregated data.
  • In the second stage, feature extraction techniques are applied again to pool salient features across multiple time series, and generate forecasts for each individual time series at low level.
  • In the third stage, the forecasts obtained from the previous two stages are combined, and a top-down reconciliation is conducted to generate the final forecast.
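The three stages can be sketched end-to-end as follows; the "last value" aggregate model and the pooled low-level predictions are stand-in assumptions for whatever models the plugin actually fits:

```python
def multistage_forecast(low_level_history, low_level_preds, agg_model):
    """Three-stage sketch: aggregate-and-forecast, pooled low-level
    predictions, then top-down reconciliation."""
    # Stage 1: aggregate the series and forecast at the higher level
    aggregate = [sum(values) for values in zip(*low_level_history)]
    agg_forecast = agg_model(aggregate)

    # Stage 2: low-level predictions become disaggregation factors
    total = sum(low_level_preds)
    weights = [p / total for p in low_level_preds]

    # Stage 3: top-down reconciliation back to the low level
    return [agg_forecast * w for w in weights]

# Two series of weekly demand; a naive "last value" aggregate model
history = [[1, 2, 3], [3, 4, 9]]
print(multistage_forecast(history, low_level_preds=[1.0, 2.0],
                          agg_model=lambda series: series[-1]))
```

The aggregate model supplies the overall level and trend, while the pooled low-level predictions supply only the relative proportions, which is exactly the division of labor the three stages describe.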

This multistage modeling strategy is available as a plugin in the new SAS Forecast Server Client.

Diagram of Multistage Modeling

In Part 2, we will walk through an example to show the philosophy of the multistage modeling strategy, and the performance of this method compared to traditional time series model in terms of forecasting accuracy.

[Note: For additional discussion of hierarchical forecasting, see previous BFD posts on Mistakes in the forecasting hierarchy and Forecasting across a time hierarchy with temporal aggregation.]


Time series segmentation with SAS Forecast Server Client (Part 2)

Jessica Curtis

Recall from yesterday that time series segmentation is the method of dividing time series data into distinct types or segments based on the underlying properties of the time series. It is one of the most important first steps in the forecasting process, because it allows you to apply a customized forecast modeling strategy to each time series segment.

Today my colleague Jessica Curtis, Solutions Architect at SAS, continues her discussion...

Guest Blogger Jessica Curtis on Time Series Segmentation

There are many different approaches for segmenting your time series; we’ve listed some of the more prevalent ones here. You can segment time series based on their demand patterns, take a business rules-based approach, apply predictive modeling with time series clustering, or segment series based on their percent contribution to revenue.

Modeling Strategies

Let’s return to our example. With time series segmentation, we can now apply customized forecast modeling strategies to each of our segments.

For the stable time series, we can apply robust forecasting models, such as ARIMA models. For the seasonal time series, we can apply seasonal forecasting models. For the time series with a level shift in demand, we can use ARIMAX models that include a level shift event variable. For the short history items, we can forecast using similarity analysis techniques. Intermittent demand forecast models can be applied to the time series with a lot of sparsity in demand. We can forecast the holiday items with custom time intervals.
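To make the idea concrete, a rule-based segmenter might look like the following sketch; the thresholds and the lag-52 seasonality check are illustrative assumptions, not Forecast Server's actual segmentation logic:

```python
def autocorr(x, lag):
    """Autocorrelation of a series at a given lag."""
    n, mean = len(x), sum(x) / len(x)
    num = sum((x[t] - mean) * (x[t - lag] - mean) for t in range(lag, n))
    den = sum((v - mean) ** 2 for v in x)
    return num / den if den else 0.0

def classify_series(demand, season=52, min_history=104):
    """Assign a weekly demand series to a rough segment."""
    if len(demand) < min_history:
        return "short history"
    if sum(1 for d in demand if d == 0) / len(demand) > 0.3:
        return "intermittent"
    if autocorr(demand, season) > 0.5:
        return "seasonal"
    return "stable"

print(classify_series([5, 8, 6] * 40))     # long, dense, no yearly cycle
print(classify_series([0, 0, 7, 0] * 30))  # mostly zeros
```

Real segmentation schemes would add checks for level shifts, holiday items, and other patterns, but the structure is the same: route each series to the modeling strategy suited to its segment.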

Types of models to use with each type of time series

The value here is that we have a structured approach to our large-scale forecasting challenge, and a methodology for creating high-quality forecasts across many different time series patterns. Now let's go beyond the methodology behind time series segmentation and talk through a few real-world examples of where our customers are using it to improve their forecasting processes.

Customer Stories

SAS has many customers leveraging time series segmentation to improve the forecasting process. Let's talk about two customer success stories today:

The first customer example is a Fortune 100 food production company. Their goal was to improve the SKU replenishment forecasting process in order to drive increased sales and customer satisfaction. Before using SAS, they lacked a scalable, manageable solution for implementing efficient forecasting models. By applying time series segmentation to their replenishment forecasting process, this company saw increased sales, reduced stock-outs, improved brand management and forecast accuracy.

Our second customer example is a multinational technology company. This company wanted to improve their product sales forecast, which was being created manually and one product at a time. They also wanted to incorporate search intent data into their forecast to drive better marketing efforts. By using SAS time series segmentation and forecasting, this company saw improved forecast accuracy, a more efficient forecasting process, and better data management through reducing the number of search intent forecast variables to only the most relevant to each segment.

SAS Forecast Server Client

We’ve talked a lot about the methodology behind time series segmentation and the value it adds to our customers’ forecasting challenges. Now, let’s look at how SAS makes time series segmentation an integrated part of the forecasting process. Laura Ryan, the Product Manager for SAS Forecast Server, has recorded a short overview of the new Forecast Server Client. As you’ll see in the demonstration, one of the new features of Forecast Server Client is the incorporation of time series segmentation techniques into the forecasting process.

Wrap Up

We reviewed an approach for successfully forecasting a wide range of time series patterns through time series segmentation. We talked about the value time series segmentation has provided to our customers, and we showed how SAS’s new Forecast Server Client makes it an integrated part of the forecasting process. For more information, please visit us at sas.com.


Time Series Segmentation Recorded Webinar

If you missed this yesterday, watch the 15-minute webinar Time Series Segmentation by Jessica, which includes a 6-minute demonstration of the new SAS Forecast Server Client by SAS forecasting product manager Laura Ryan.


  • About this blog

    Michael Gilliland is a longtime business forecasting practitioner and currently Product Marketing Manager for SAS Forecasting. He initiated The Business Forecasting Deal to help expose the seamy underbelly of the forecasting practice, and to provide practical solutions to its most vexing problems.