The Institute of Business Forecasting's FVA blog series continued on March 2, with my interview of Steve Morlidge of CatchBull. Steve's research (and his articles in Foresight) has been a frequent subject of BFD blog posts over the last couple of years (e.g. the "Avoidability of Forecast Error" series (4 parts),
Sports provide us with many familiar clichés about playing defense, such as: Defense wins championships. The best defense is a good offense. Or my favorite: The best defense is the one that ranks first statistically in overall defensive performance, after controlling for the quality of the offenses it has faced. Perhaps not
The Institute of Business Forecasting's FVA blog series continued in January, with my interview of Shaun Snapp, founder and editor of SCM Focus. Some of Shaun's answers surprised me, for example, that he doesn't compare performance to a naïve model (which I see as the most fundamental FVA comparison). But he went
There are some things every company should know about the nature of its business. Yet many organizations don't know these fundamentals -- either because they are short on resources, or their resources don't have the analytical skills to do the work. The summer research projects offered by the Lancaster Centre for Forecasting,
And now for the five steps: 1. Ignore industry benchmarks, past performance, arbitrary objectives, and what management "needs" your accuracy to be. Published benchmarks of industry forecasting performance are not relevant. See the prior post "The Perils of Forecasting Benchmarks" for explanation. Previous forecasting performance may be interesting to know, but
In the summer heat, when The BFD alone isn't quite quenching your thirst for forecasting know-how, here are several other sources: CatchBlog -- by Steve Morlidge of CatchBull From his 2010 book Future Ready (co-authored with Steve Player), to his recent 4-part series in Foresight dealing with the "avoidability" of forecast
We're entering the busy season for forecasting events, and here is the current calendar: Analytics2014 - Frankfurt The European edition of Analytics2014 kicks off tomorrow in Frankfurt, Germany. Five hundred of the leading thinkers and doers in the analytics profession hook up for two full days of interaction and learning.
As we saw in Steve Morlidge's study of forecast quality in the supply chain (Part 1, Part 2), 52% of the forecasts in his sample were worse than a naive (random walk) forecast. This meant that over half the time, these companies would have been better off doing nothing and
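The naive comparison above is straightforward to compute yourself. Here is a minimal sketch, using made-up demand numbers (illustrative only, not Morlidge's data): the naive (random walk) forecast for each period is simply the previous period's actual, and a forecast "adds value" only if its total absolute error comes in below the naive benchmark's.

```python
import numpy as np

# Hypothetical demand history and the company's forecasts (illustrative values).
actuals   = np.array([100, 104,  98, 110, 107, 115])
forecasts = np.array([ 97, 109, 101, 103, 112, 111])

# Naive (random walk) forecast: each period's forecast is the prior actual.
naive = actuals[:-1]                          # forecasts for periods 2..6

errors_fc    = np.abs(forecasts[1:] - actuals[1:])
errors_naive = np.abs(naive        - actuals[1:])

# Relative absolute error: below 1.0 means the forecast beat "doing nothing";
# above 1.0 means the company would have been better off with the naive forecast.
rae = errors_fc.sum() / errors_naive.sum()
print(round(rae, 2))  # prints 0.73 for these made-up numbers
```

In Morlidge's sample, the equivalent of this ratio exceeded 1.0 for 52% of the forecasts, which is what "worse than doing nothing" means in practice.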
As we saw last time with Steve Morlidge's analysis of the M3 data, forecasts produced by experts under controlled conditions with no difficult-to-forecast series still failed to beat a naive forecast 30% of the time. So how bad could it be for real-life practitioners forecasting real-life industrial data? In two words:
The Spring 2014 issue of Foresight includes Steve Morlidge's latest article on the topic of forecastability and forecasting performance. He reports on sample data obtained from eight businesses operating in consumer (B2C) and industrial (B2B) markets. Before we look at these new results, let's review his previous arguments: 1. All