Worst practices in forecasting conference (October 5-6, 2016)

In conjunction with the International Institute of Forecasters and the Institute for Advanced Analytics at North Carolina State University, the 2016 Foresight Practitioner conference will be held in Raleigh, NC (October 5-6, 2016) with the theme of:

Worst Practices in Forecasting:

Today's Mistakes to Tomorrow's Breakthroughs

This is the first-ever conference dedicated entirely to the exposition of bad forecasting practices. I am co-chairing the event along with Len Tashman, editor-in-chief of Foresight. Our "worst practices" theme reflects an essential principle:

The greatest leap forward for our forecasting functions lies not in squeezing an extra trickle of accuracy from our methods and procedures, but rather in recognizing and eliminating practices that do more harm than good.

As discussed so many times in this blog, we often shoot ourselves in the foot when it comes to forecasting. We spend vast amounts of time and money building elaborate systems and processes, while almost invariably failing to achieve the level of accuracy desired. Organizational politics and personal agendas contaminate what should be an objective, dispassionate, and largely automated process.

At this conference you'll learn what bad practices to look for at your organization, and how to address them. I'll be delivering the introductory keynote: Worst Practices in Forecasting, and the rest of the speaker lineup already includes:

  • Len Tashman - Foresight Founding Editor and Director of the Center for Business Forecasting: Forecast Accuracy Measurement: Pitfalls to Avoid, Practices to Adopt.
  • Paul Goodwin - coauthor of Decision Analysis for Management Judgment and Professor Emeritus of Management Science, University of Bath: Use and Abuse of Judgmental Overrides to Statistical Forecasts.
  • Chris Gray - coauthor of Sales & Operations Planning - Best Practices: Lessons Learned and principal of Partners for Excellence: Worst Practices in S&OP and Demand Planning.
  • Steve Morlidge - coauthor of Future Ready: How to Master Business Forecasting and former Finance Director at Unilever: Assessing Forecastability and Proper Benchmarking.
  • Wallace DeMent - Demand Planning Manager at Pepsi Bottling, responsible for forecast, financial, and data analysis: Avoiding Dangers in Sales Force Input to the Forecasts.
  • Anne Robinson - Executive Director, Supply Chain Strategy and Analytics, Verizon Wireless and 2014 President of the Institute for Operations Research and the Management Sciences (INFORMS): Forecasting and Inventory Optimization: A Perfect Match.
  • Erin Marchant - Senior Analyst in Global Demand Management, Moen: Worst Practices in Forecasting Software Implementation.

Stay tuned to the Foresight website for forthcoming specifics on registration, sponsorship, and a detailed agenda.

About the Conference Hosts

The conference will be held at the Institute for Advanced Analytics at North Carolina State University. Founded in 2007, the Institute prepares practitioners of analytics for leadership roles in our digital world. Its flagship program is the Master of Science in Analytics (MSA), which covers a wide spectrum of skills including data management and quality, mathematical and statistical methods for data modeling, and techniques for visualizing data in support of enterprise-wide decision making.

Now celebrating its 10th anniversary of publication, Foresight: The International Journal of Applied Forecasting is the practitioner journal of the nonprofit International Institute of Forecasters (IIF), a cross-disciplinary institute including analysts, planners, managers, scholars, and students across business, economics, statistics, and other related fields. Through its journals and conferences, the IIF seeks to advance and improve the practice of forecasting.

 


Why forecasting is important (even if not highly accurate)

"Why forecasting is important" gets searched over 100 times monthly on Google. Search results include plenty of rah-rah articles touting the obvious benefits of an "accurate forecast," but are of little help in the real life business world where high levels of forecast accuracy are usually not achieved. This is a troubling disconnect.

When can we forecast accurately?

Why is forecasting important? So we know when to get up in the morning.

We can forecast the exact time of sunrise tomorrow, and indefinitely into the future.

Why is this? Because we have good observational data (the relative positions and velocities of the sun and planets in our solar system). And we seem to have a pretty good understanding of the laws of physics that guide the behavior being forecast.

[For some interesting background, see How is the time of sunrise calculated?]

Climate and weather are guided by the same laws of physics (and also chemistry) that we seem to have a pretty good understanding of. Yet we can't expect to forecast long-term climate (or even near-term weather) as accurately as sunrise, for a number of reasons:

Why is forecasting important? So we avoid lightning.

  • The climate is a much more complex system. It is not nearly so easy to model as a dozen or so heavenly bodies hurtling through space.

  • Our observational data is limited to a finite number of monitors and weather stations. So there are gaps in our knowledge of the complete state of the system at any given time. (We cannot monitor every single point of land, sea, and atmosphere.)

In addition, the assumptions upon which a long-term forecast is based can change over time: decisions or actions taken in response to the original forecast may themselves change the outcome. [Snurre Jensen examines this phenomenon in "When poor forecast accuracy is a good thing."] For example:

  • A business increases advertising when the demand forecast falls short of plan.
  • Politicians take action to reduce CO2 levels based on a forecast of warming.

[Despite the implausibility of the second example, I want to commend Nate Silver's The Signal and the Noise for its extremely informative chapter on climate forecasting (and on forecasting in general).]

Why forecasting is important

Highly accurate forecasting, while always desirable, is rarely necessary. In fact, if your organizational processes NEED highly accurate forecasts to function properly, I suspect they don't function very well.

As long as forecasting can get you "in the ballpark," and thereby improve your decision making, it has demonstrated its value. Remember the objective:

To generate forecasts as accurate and unbiased as can reasonably be expected -- and to do this as efficiently as possible.

Why is forecasting important? Because it at least gives us a chance for a better future.


Know your forecastability (and maybe save your job)

Larry Lapide receives the 2012 Lifetime Achievement Award from Anish Jain of the IBF

Journal of Business Forecasting columnist Larry Lapide is a longtime favorite of mine. First as an industry analyst at AMR, and more recently as an MIT Research Affiliate, Larry has made his quarterly column a perpetual source of guidance for the practicing business forecaster. No wonder he received IBF's 2012 Lifetime Achievement in Business Forecasting award.

In the Fall 2015 issue, Larry takes another look at the hot topic of "forecastability" -- something he first touched on in his Winter 1998/99 column "Forecasting is About Understanding Variations."

In the earlier article, Larry introduced the metric Percent of Variation Explained (PVE), where

PVE = 100 x (1 - MAPE/MAPV)

In this formula, the Mean Absolute Percent Variation (MAPV) is simply the MAPE that would have been achieved had you forecast the mean demand every period. So when you compare the MAPE of your real forecasts to the MAPV, this provides an indication of whether you have "added value" by forecasting better than just using the mean.
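
As a quick illustration, here is a minimal sketch in Python of how PVE can be computed after the fact. The demand numbers are made up for illustration; they are not from Larry's article.

```python
import numpy as np

def mape(actuals, forecasts):
    """Mean Absolute Percent Error, in percent."""
    actuals = np.asarray(actuals, dtype=float)
    forecasts = np.asarray(forecasts, dtype=float)
    return 100 * np.mean(np.abs(actuals - forecasts) / actuals)

# Hypothetical weekly demand, and the forecasts that were made for it
actuals = [100, 120, 90, 110, 130, 95]
forecasts = [105, 115, 100, 105, 120, 100]

# MAPV: the MAPE you would have gotten by forecasting mean demand every period
mapv = mape(actuals, np.full(len(actuals), np.mean(actuals)))

pve = 100 * (1 - mape(actuals, forecasts) / mapv)
print(f"MAPE = {mape(actuals, forecasts):.1f}%, MAPV = {mapv:.1f}%, PVE = {pve:.1f}%")
```

A positive PVE indicates your forecasts beat the (retrospectively known) mean demand; zero or negative indicates they did not.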

In spirit this approach is very similar to Forecast Value Added (FVA) analysis (which compares your forecasting performance to a naive model). As I discussed in The Business Forecasting Deal (the book), PVE is analogous to conducting FVA analysis over some time frame and using the mean demand over that time frame as the naive or "placebo" forecast.

The benefit of Lapide's approach is that it provides a quick and easy way to answer a question like "Would we have forecast better [last year] by just using the year's average weekly demand as our forecast each week?" However, this method is not meant to replace full and proper FVA analysis, because it does not make a fair comparison: the forecaster does not know in advance what mean demand for the year is going to be. As I put it in the book, "Mean demand is not an appropriate placebo forecast for FVA analysis because we don't know until the year is over what mean demand turns out to be. This violates a principle for selection of the placebo, that it should be a legitimate forecasting method that the organization could use." (p. 108)

In short, we could never use the mean demand for the year as our real-life operating forecast, because we don't know what the mean demand is until the year is over!
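
By contrast, a legitimate placebo for FVA is one that was available in real time, such as a naive "no-change" (random walk) forecast. A minimal sketch, reusing the mape helper and the made-up numbers from the PVE example above:

```python
# Naive "no-change" forecast: each period's forecast is the prior period's actual.
# Unlike the yearly mean, this forecast was available in real time.
naive = actuals[:-1]

fva = mape(actuals[1:], naive) - mape(actuals[1:], forecasts[1:])
print(f"FVA = {fva:.1f} MAPE percentage points vs. the naive forecast")
```

A positive FVA means the forecasting process added value over a forecast that could have been produced for free.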

Larry's Fall 2015 JBF column looks at how forecastability can be used to segment an organization's product portfolio, and to guide the efforts of the forecaster or planner. A number of different segmentation schemes have been proposed.

SAS forecasting customers may also wish to view the webinar Time Series Segmentation by Jessica Curtis of SAS, to see how to segment time series using SAS Forecast Server Client.

Why Knowing Forecastability Might Save Your Job

Management is fond of handing out performance goals, and inappropriate goals can get a forecaster in a lot of trouble. So it is essential for the forecaster to understand what forecast accuracy is reasonable to expect for any given demand pattern, and be able to push back when necessary.

Lapide argues that "sustained credibility" is the most important part of a forecaster's job review. This means management is willing to trust your analysis and judgment, and believes you are delivering the most accurate forecast that can reasonably be expected, even if that accuracy is not as high as they would like.

Being able to explain what is reasonable to expect -- even if it is not what management wants to hear -- can establish that credibility.

(For more information, see 5 Steps to Setting Forecasting Performance Objectives.)

 

 


FOBFD Greg Fishel makes headlines on climate change

WRAL Chief Meteorologist (and Friend Of the Business Forecasting Deal) Greg Fishel garnered national attention recently with a thoughtful (yet to some, provocative) blog post on climate change.

Greg Fishel with Fan Club on 2014 visit to SAS

In the post, Fishel chronicled his evolving thought on the subject. He argued for an end to the political partisanship that stifles meaningful discussion. And he appealed for us to approach the issue through science rather than ideology.

(May I have an "Amen"?)

Attention to the post blew up when the Washington Post's Capital Weather Gang covered it, and Fishel has since been on the cover of the local Indy Week newspaper and interviewed on NPR.

The Fallout

Fishel's follow-up Facebook posting garnered some choice comments, as might be expected when one takes a stand against dogma and fanaticism. My favorite accuses him of holding "Communist views."

(Readers, I know Greg Fishel. I have eaten lunch with Greg Fishel. Greg Fishel is a friend of mine. Greg Fishel is no Communist.)

Yet, encouragingly, there were far more words of support for reason, science, and the pursuit of truth. (Some even called for Fishel himself to enter politics -- although, in the opinion of this blogger, that would be a considerable step down from his current position at WRAL.)

When it comes to understanding complex systems, like the earth's climate or the impact of a promotion on the demand for our products, the reality is we may never know for certain. So with science, critical thought, and an analytical approach to our problems, we are forced into a position of humility (instead of a position of OxyContin-induced bellicosity).

With science (in contrast to religion or partisan politics), no supposition can be taken as fact. Instead, we must constantly refine, test, and assess our beliefs. That's how we make progress. And that's how, through science, we enjoy such wonderful technological advances as the Avocado Saver and the Car Exhaust Grill. (The latter of which, by the way, makes practical use of an automobile's greenhouse gas emissions.)


Highlights from IBF Orlando

Last week I had the pleasure of attending (with six of my SAS colleagues) the IBF's Best Practices Forecasting Conference in Orlando. Some of the highlights:

In addition to the SupplyChainBrain (SCB) interviews, Charlie and I hosted roundtable discussions and delivered regular sessions (mine co-presented with Erin Marchant of Moen).

  • The "worst practices" roundtable had participants confessing their forecasting sins -- or at least reporting ones committed by "friends" or colleagues.

Erin Marchant did a fabulous job at our "Applying Forecast Value Added at Moen" presentation. While Moen is still early in its FVA efforts, Erin shared some extremely valuable insights from the journey:

  • FVA analysis takes the emotion out of forecast discussions
    • Data now supports the conversation about how to improve the process
  • Sometimes you can't do better than the naive forecast
    • FVA helps you focus your forecast improvement efforts in the right places
  • FVA is a measure of your forecast process
    • NOT an "us vs. them" proposition
    • NOT a catapult to throw others "under the bus"
    • NOT an excuse to create the sales forecast in a silo
  • FVA starts a conversation that improves your forecasting process!

Erin also noted that because of FVA:

  • We are able to better pinpoint the causes of forecast error and start dialogues about those parts of our process.
  • We are able to begin conversations about our supply chain structure that could lead to more accurate demand signals and better service for our customers.

Learn more in Erin's prior interview on the IBF blog, and in her forthcoming SupplyChainBrain interview.

Special Thanks

Thanks to Eric Wilson of Tempur Sealy and the folks at Arkieva, who contributed the Tempur-Cloud® Breeze Dual Cooling Pillow that I won in the raffle at Eric's session.

My long-term forecast is for good sleeping.

And as always, IBF's Director of Strategic Relationships, Partnerships, Memberships, & Events, Stephanie Murray, gave SAS the royal treatment. For me personally, this included a stash of M&Ms (brown ones removed per contract), and mini-Tabasco sauce with every meal.

Stephanie's energy and behind-the-scenes organization are a huge (and under-appreciated) part of the success of these events.

 


Probabilistic load forecasting competition

Dr. Tao Hong

So you think you know how to forecast?

Now is your chance to prove it, by participating in a probabilistic load forecasting competition run by my friend (and former SAS colleague), Dr. Tao Hong.

Currently a professor at UNC Charlotte and director of the Big Data Energy Analytics Laboratory (BigDEAL), Tao is opening his class competition to students and professionals outside of his Energy Analytics course. See his October 12 post on the Energy Forecasting blog for information, including these competition rules:

  • The competition will start on 10/22/2015 and end on 11/25/2015.
  • The historical data will be released on 10/22/2015.
  • The year-ahead hourly probabilistic load forecast is due by 11:45am ET each Wednesday, starting 10/28/2015.
  • The competition is an individual effort: each student forms a single-person team, and no collaboration is allowed.
  • Students cannot use any data other than what is provided by Dr. Tao Hong, plus the U.S. federal holidays.
  • The pinball loss function is the error measure in this competition (see the sketch after this list).
  • The benchmark will be provided by Dr. Tao Hong; a student receives no credit without either beating the benchmark or ranking in the top 6 of the class.
  • No late submissions are allowed.
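
For readers unfamiliar with the pinball (quantile) loss, here is a minimal sketch in Python. The numbers are made up for illustration and are not from the competition data.

```python
import numpy as np

def pinball_loss(actual, quantile_forecasts, quantiles):
    """Average pinball loss of a set of quantile forecasts against one actual."""
    losses = []
    for q, f in zip(quantiles, quantile_forecasts):
        if actual >= f:
            losses.append(q * (actual - f))        # under-forecast: weight by q
        else:
            losses.append((1 - q) * (f - actual))  # over-forecast: weight by 1-q
    return np.mean(losses)

# Toy example: one hour's load comes in at 105 MW, forecast as 99 quantiles
quantiles = np.arange(0.01, 1.00, 0.01)
quantile_forecasts = 100 + 10 * (quantiles - 0.5)  # spread around 100 MW
print(f"Pinball loss: {pinball_loss(105.0, quantile_forecasts, quantiles):.3f}")
```

The competition score would be this loss averaged over all forecast hours and quantiles; lower is better.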

If you are interested in joining the competition, please email Dr. Tao Hong (at hongtao01@gmail.com) for more details.


Guest blogger: Len Tashman previews Fall issue of Foresight

Editor Len Tashman's Preview of the Fall 2015 issue of Foresight

Len Tashman

This 39th issue of Foresight features a special section on forecasting support systems (FSS) developed by our FSS Editor Fotios Petropoulos. His article Forecasting Support Systems: Ways Forward highlights three main areas for improvement: better utilization of open-source software and Web-based features, adoption of important methods not currently in the FSS arsenal, and much greater support for interaction between statistical output and managerial judgment. His “ways forward” make me think that if you’re satisfied with the capabilities of your current FSS, you don’t know what you’re missing.

That said, not all of Fotios’ vision finds support from the authors of the six commentaries following his article. A majority of these “critics” are developers and experts on systems for forecasting and planning. While they do endorse much of what Fotios recommends and expand his list with their own, they express particular reservations about the role of open-source software as well as anxiety over the challenges organizations will face in implementing an upgraded FSS. As one commentator expressed it:

In the hands of an uneducated user, a more powerful FSS may simply provide the tools to be wrong faster and on a larger scale. The key aspect in the future of FSS is user education.

Collaboration in forecasting and planning nearly always faces "inside threats" from employee and executive (mis)behaviors. In Collaborative Culture: The New Workplace Reality—the first of two articles in our Collaborative Forecasting and Planning section—Neill Wallace and John Mello examine the need to sustain effective internal collaboration, and urge management to be aware of practices that work against this goal:

The challenge for leaders will be to let go of conventional practices and instead embrace an approach that places trust in teams.

Their joint authorship itself is a victory for collaboration: Neill is a business executive and writer on workplace trends, and John is an academic and Foresight’s S&OP Editor.

In the second article in this section, Jack Harwell presents An Executive Guide to Hiring Successful Demand Planners. Management, according to Jack, has to engage in broader thinking about this role and its growing importance:

Demand planning has evolved into more than a numbers game. Today’s demand planner is often the leader of the S&OP process and must guide an organization through collaboration and conflict resolution.

Contributors Wallace and Harwell then stick around a bit longer to be the subjects in this issue’s double dose of our Forecaster in the Field feature.

The Fall issue concludes with a review of the newly revised edition of the well-known pharmaceutical forecasting book by Arthur G. Cook, Forecasting for the Pharmaceutical Industry: Models for New Product and In-Market Forecasting and How to Use Them. Our reviewer is Christian Schäfer, former strategist for Boehringer Ingelheim and author of an earlier Foresight article on pharma forecasting, “How to Separate Risk from Uncertainty in Strategic Forecasting,” which appeared in our Winter 2013 issue.


Fall 2015 forecasting education opportunities

You may not be in London on October 7 to take advantage of the Lancaster Centre for Forecasting's free workshop on promotional forecasting. However, there are still plenty of forecasting educational opportunities coming up this fall:

SAS Business Knowledge Series

My colleague Charlie Chase (author of Demand-Driven Forecasting) delivers this new course over two full days at the SAS offices in downtown Chicago. Use the code BDDF-CHI for a 25% discount on the registration fee.

My colleague Chip Wells is teaching this online course, held on the afternoons (1:00-4:30pm ET) of September 28-29. Chip expanded my original half-day course with lots of new examples and exercises. This is a great chance to get up to speed on FVA analysis from the convenience of your own office.

IBF Business Planning & Forecasting Conference (Orlando, October 18-21)

IBF's biggest event of the year includes a free Forecasting & Planning Tutorial for IBF members on October 18, and a Leadership Forum with VIP Dinner Reception on the 19th. The conference continues October 20-21, with Sara Park, Group Director of Sales & Operations Planning at Coca-Cola, delivering the Keynote address.

Among the over two dozen sessions, I hope you'll join me and Erin Marchant, Demand Planning and Analytical Systems Power User at Moen, for a look at "Applying Forecast Value Added at Moen." And check out Erin's interview on the IBF blog.

Analytics 2015 - Las Vegas (October 26-27)

The SAS Analytics events cover the full range of advanced analytics, including statistical analysis, data mining, text analytics, optimization and simulation, and forecasting. Hot topics like machine learning, cybersecurity, and the internet of things will also be covered.

Forecasters will be particularly interested in the session by Len Tashman, editor-in-chief of Foresight: The International Journal of Applied Forecasting, on the "Do's and Don'ts of Measuring Forecast Accuracy."

Also check out the SAS Talks Keynote by Udo Sglavo, Senior Director of the data mining and forecasting R&D teams at SAS, for the latest news on SAS forecasting software.

Here are the top 3 reasons to attend Analytics 2015 at the Bellagio in Las Vegas.

Analytics 2015 - Rome (November 9-11)

Regardless of your role – whether data scientist, industry expert, analyst, statistician, business professional, leading researcher or academic – Analytics 2015 in Rome will give you practical and strategic insights on all emerging analytics technologies and approaches.

In addition, there is a dedicated path for executives and C-levels who want to enhance their company's analytics culture by learning from international best practices.

Among the forecasting topics, Udo Sglavo will present "A New Face for SAS Forecast Server" -- demonstrating the SAS Forecast Server Client web interface, and Snurre Jensen will show how "The World's Best Large-Scale Forecasting Product Gets Even Better." Olivier Gleron, AVP of Demand & Supply Planning at Nestlé, will deliver a success story.


Free practitioner workshop - October 7 (London, UK)

Our friends at the Lancaster Centre for Forecasting have announced their next free practitioner workshop. It will be held Wednesday, October 7, 1:00-5:15pm, at BMA House, Tavistock Square, London WC1H. This session's topic is:

Promotional Modelling and Forecasting

Understanding and forecasting the effects of a promotion for both retailers and manufacturers remains a major challenge with important financial and supply chain consequences. But promotional modelling is developing rapidly and new approaches and software are becoming available.

The event will bring together well-known practitioners (SAS, Nielsen, Marketing QED) and academics to present, hear about and discuss new actionable insights on promotions and demand forecasting within the realm of marketing analytics.

Drawing on their expertise in data-driven promotion strategies, the participants will share their newest and most impactful ideas, experiences, and tools with business attendees.

The workshop will provide market analysts and demand forecasting professionals with the opportunity to turn sophisticated analytical promotional models into useful, real-life insights that deliver more effective forecasts.

Places are limited, so booking is essential. Sign up here: Forecasting Events Registration Form.


Judgmental adjustments to the forecast

So you think you can outsmart your statistical forecast? Apparently, lots of people do.

In "Judgmental Adjustments to Forecasts in the New Economy" (Foresight, Issue 38 (Summer 2015), 31-36), Manzoor Chowdhury and Sonia Manzoor argue that forecasters are becoming more dependent on judgmental adjustments to a statistical forecast.

Sometimes this is because there isn't sufficient data to generate a trustworthy statistical forecast. For example, there may be no history for a new item, or limited history for items with a short product lifecycle. Or volume may be fragmented across complex and interconnected distribution channels. Or the immediate impact of social media (favorable or unfavorable) cannot be reliably determined.

Of course, the old standby reason for judgmental adjustments is when the statistical forecast does not meet the expectations of management. Executives may have "proprietary information" they can't share with the forecasters, so it cannot be included in the statistical models. Or they may be (mis-)using the forecast as a target or stretch goal (instead of what the forecast should be -- a "best guess" at what is really going to happen).

Do You Have a Good Reason?

Does your boss (or higher level management) make you adjust the forecast? If so, that is probably a good enough reason to do so. But if they insist you make small adjustments, consider pushing back with the question, "What is the consequence of this adjustment -- will it change any decisions?"

Even if directionally correct, a small adjustment that results in no change of actions is a waste of everyone's time.

A large adjustment, presumably, will result in different decisions, plans, and actions. But will it result in better decisions, plans, and actions? In a study of four supply chain companies, Fildes and Goodwin ("Good and Bad Judgment in Forecasting," Foresight, Issue 8 (Fall 2007), 5-10) found that any benefits to judgmental adjustments are "largely negated by excessive intervention and over-optimism." In their sample, negative adjustments (lowering the forecast) tended to improve accuracy more than positive adjustments.

A Simple Test

As a simple test of your forecasting abilities, it should be easy to determine whether your adjustments are at least directionally correct.

Take a look at your historical forecasting performance data. Every organization should be recording, at the very least, the statistical forecast (generated by the forecasting software) and the final forecast (after adjustments and management approval), to compare against the actuals that occurred. Even better is to also record the forecast at each sequential step in the forecasting process, such as statistical forecast, forecaster's adjustment, consensus adjustment, and final (management-approved) forecast.

What percentage of the adjustments were directionally correct? If more than half, then congratulations -- you are doing better than flipping a coin!

Warning: Just be aware that you can make a directionally correct adjustment and still make the forecast worse. For example, statistical forecast=100, adjusted forecast=110, actual=101.
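
Here is a minimal sketch of this test in Python (pandas), using a hypothetical history; the column names are made up for illustration, not from any particular system. Note how the first row reproduces the warning example above: directionally correct, yet worse.

```python
import pandas as pd

# Hypothetical history: statistical forecast, final (adjusted) forecast, actual
df = pd.DataFrame({
    "statistical": [100, 200, 150, 300],
    "final":       [110, 190, 170, 290],
    "actual":      [101, 210, 180, 250],
})

adjusted = df["final"] != df["statistical"]
right_direction = ((df["final"] - df["statistical"]) *
                   (df["actual"] - df["statistical"])) > 0
reduced_error = ((df["final"] - df["actual"]).abs()
                 < (df["statistical"] - df["actual"]).abs())

print(f"Directionally correct adjustments: {right_direction[adjusted].mean():.0%}")
print(f"Adjustments that reduced error:    {reduced_error[adjusted].mean():.0%}")
```

Comparing the two percentages shows at a glance how often a "correct" adjustment actually paid off.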

If you don't keep or have access to historical data on your forecasts and actuals, then test yourself this way: Every morning before the opening bell, predict whether the Dow Jones Industrial Average (or your own favorite stock or stock market index) will end higher or lower than the previous day. It may not be as easy as you think.

About this blog

Michael Gilliland is a longtime business forecasting practitioner and currently Product Marketing Manager for SAS Forecasting. He initiated The Business Forecasting Deal blog to help expose the seamy underbelly of forecasting practice, and to provide practical solutions to its most vexing problems.

Mike is also the author of The Business Forecasting Deal (the book), and co-editor of Business Forecasting: Practical Problems and Solutions. He also edits the Forecasting Practice section of Foresight: The International Journal of Applied Forecasting.