Excerpt from Steve Morlidge's new book Present Sense (Matador, 2019). Part 3: Not Storytellers But Reporters This is not the place to explore the role and ethics of performance reporting in detail, but I think there are at least four key duties. The duty of clarity Performance reports should be
Excerpt adapted from Steve Morlidge's new book Present Sense (Matador, 2019). Part 2: Not Storytellers But Reporters News and Evidence If we ever doubted the importance of having a capability to assimilate mountains of detail, synthesize it and present it in an accessible and balanced way, the storm around the
It's no secret that The BFD is a huge fan of Steve Morlidge. Morlidge has contributed work of fundamental importance to our understanding of the practice of business forecasting. His studies of forecast quality exposed the abysmal nature of the practice, including the startling statistic that perhaps 30-50% of real-life
Note: The following concludes an eight-part serialization of selected content from Steve Morlidge's The Little (Illustrated) Book of Operational Forecasting. Good forecasts don’t always ‘look right’ Many forecasters believe that they can tell how good a forecast is by ‘eyeballing’ it. Good forecasts just ‘look right’ or so they would
Note: Following is an eight-part serialization of selected content from Steve Morlidge's The Little (Illustrated) Book of Operational Forecasting. The measurement challenge So here is the forecaster's dilemma: there will always be forecast error. The challenge is to work out the cause of the error and to take the appropriate
Note: Following is an eight-part serialization of selected content from Steve Morlidge's The Little (Illustrated) Book of Operational Forecasting. The forecasting challenge It is not possible to forecast any future outcomes precisely. Only the signal is potentially forecastable – noise is unforecastable in principle. And all forecasts assume that the
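That last point is easy to see numerically. The sketch below is my own illustration, not from the book: it simulates a demand history as a known seasonal signal plus random noise, and shows that even a forecast reproducing the underlying signal exactly is left with an error floor set by the noise, while a naive forecast fares worse still. The signal shape and all numbers are hypothetical.

```python
# Minimal illustration (mine, not from the book): simulate demand as signal plus noise.
import numpy as np

rng = np.random.default_rng(42)
n = 500
t = np.arange(n)

signal = 100 + 10 * np.sin(2 * np.pi * t / 12)   # hypothetical seasonal demand pattern
noise = rng.normal(0, 5, size=n)                  # unforecastable randomness
actuals = signal + noise

perfect_forecast = signal                 # a forecaster who somehow knows the true signal
naive_forecast = actuals[:-1]             # random walk: forecast = last period's actual

mae_perfect = np.mean(np.abs(actuals - perfect_forecast))
mae_naive = np.mean(np.abs(actuals[1:] - naive_forecast))

print(f"MAE of the 'perfect' forecast: {mae_perfect:.2f}")   # roughly mean |noise|
print(f"MAE of the naive forecast:     {mae_naive:.2f}")     # noticeably higher
```

No amount of modeling effort can push the error of the "perfect" forecast below the noise level; only the signal component is forecastable.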
Note: Following is an eight-part serialization of selected content from Steve Morlidge's The Little (Illustrated) Book of Operational Forecasting. Data Series are different – and it matters to forecasters The nature of demand that is to be forecast, as represented by patterns in the historic data series, that is to
Note: Following is an eight-part serialization of selected content from Steve Morlidge's The Little (Illustrated) Book of Operational Forecasting. Forecasting is not compulsory Operational forecasting is important, but it is not mandatory. Operational forecasts are used to make sure that a business can respond effectively to customer demand for its
Note: Following is an eight-part serialization of selected content from Steve Morlidge's The Little (Illustrated) Book of Operational Forecasting. The quality of forecasts matters…a lot It is difficult to precisely estimate the business impact of forecast quality partly because it impacts so many variables in ways that are not easy
Note: Following is an eight-part serialization of selected content from Steve Morlidge's The Little (Illustrated) Book of Operational Forecasting. Different kinds of forecasts This book is focused on operational forecasting – the stuff you do to determine what you need to buy, produce, hold in stock or otherwise give your customers
Note: Following is an eight-part serialization of selected content from Steve Morlidge's The Little (Illustrated) Book of Operational Forecasting. What IS a forecast? First of all, we need to be absolutely clear what a forecast is – and what it isn’t. A forecast is a best estimate of future
The Little (Illustrated) Book of Operational Forecasting Steve Morlidge's latest work, The Little (Illustrated) Book of Operational Forecasting, is a unique contribution to the field. It is a guide to short-term operational forecasting, delivered in a pocket-sized format through 79 brief (two-page) illustrated lessons. As I stated in my
When you realize your organization has a forecasting problem, what do you do to solve it? In particular, if you realize you need new forecasting software, how do you begin to find it? All too often, the first step in a software selection process is the Request for Proposal (RFP)
In this guest blogger post, Chris Gray of Gray Research weighs in on the discussion of forecasting vs. budgeting. Chris Gray on Forecasting vs. Budgeting Generally I agree with Steve Morlidge’s points about the differences between forecasts (or sales plans) and budgets, and the fact that they are unlikely to
Chris Gray of Gray Research is a longtime contributor to the practice of Sales and Operations Planning. He is author of several books on S&OP, software selection, and other supply chain related areas, including Sales and Operations Planning Standard System (2007). In 2006 he co-authored Sales & Operations Planning –
Yesterday I recommended Steve Morlidge’s The Little Book of Beyond Budgeting, for helping to illuminate the troubling usage of business forecasts in the traditional management / budgeting process. Steve reached out to me overnight with some additional points that he shares in this guest blogger post. Steve Morlidge on Forecasts
Is There Something Beyond Budgeting? Forecasting is an integral part of the business planning and budgeting process. Presumably the forecast (which should be an "unbiased best guess" at what is really going to happen in the future) can provide a reasonable foundation upon which the annual budget and operating plans
The Means of the Defensive Paradigm The Defensive paradigm pursues its objective by identifying and eliminating forecasting process waste. (Waste is defined as efforts that are failing to make the forecast more accurate and less biased, or are even making the forecast worse.) In this context, it may seem ridiculous
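To make the notion of waste concrete, here is a minimal Forecast Value Added-style check. This is my own sketch with made-up numbers and step names, not a procedure prescribed by the Defensive paradigm itself: measure error at each stage of the forecasting process and flag any stage that fails to improve on the one before it (or on the naive forecast).

```python
# Hedged FVA-style "waste" check (hypothetical data and process step names):
# a step that fails to reduce error relative to the prior step is a candidate
# for elimination.
import numpy as np

def mape(actuals, forecasts):
    """Mean absolute percentage error."""
    actuals, forecasts = np.asarray(actuals, float), np.asarray(forecasts, float)
    return 100 * np.mean(np.abs(actuals - forecasts) / np.abs(actuals))

actuals = [100, 120, 90, 110, 105, 95]

# Forecasts recorded at successive steps of a hypothetical process
steps = {
    "naive (last value)": [ 98, 100, 120,  90, 110, 105],
    "statistical model":  [102, 115,  95, 108, 104,  98],
    "analyst override":   [110, 125,  85, 115, 100,  90],  # does this step add value?
}

previous_error = None
for name, forecast in steps.items():
    error = mape(actuals, forecast)
    flag = ""
    if previous_error is not None and error >= previous_error:
        flag = "  <-- no value added (waste?)"
    print(f"{name:20s} MAPE = {error:5.1f}%{flag}")
    previous_error = error
```

In this made-up example the analyst override increases error over the statistical model, which is exactly the kind of effort the Defensive paradigm would treat as waste.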
"The Role of Model Interpretability in Data Science" is a recent post on Medium.com by Carl Anderson, Director of Data Science at the fashion eyeware company Warby Parker. Anderson argues that data scientists should be willing to make small sacrifices in model quality in order to deliver a model that
Editor Len Tashman's preview of the Winter 2016 issue of Foresight This 40th issue of Foresight begins with a review of the new book by Philip Tetlock and Dan Gardner with the enticing title Superforecasting: The Art and Science of Prediction. Reviewer Steve Morlidge explains that …the “superforecasters” of the
The Institute of Business Forecasting's FVA blog series continued on March 2, with my interview of Steve Morlidge of CatchBull. Steve's research (and his articles in Foresight) has been a frequent subject of BFD blog posts over the last couple of years (e.g., the "Avoidability of Forecast Error" series (4 parts),
Sports provide us with many familiar clichés about playing defense, such as: Defense wins championships. The best defense is a good offense. Or my favorite: The best defense is the one that ranks first statistically in overall defensive performance, after controlling for the quality of the offenses it has faced. Perhaps not
The Institute of Business Forecasting's FVA blog series continued in January, with my interview of Shaun Snapp, founder and editor of SCM Focus. Some of Shaun's answers surprised me, for example, that he doesn't compare performance to a naïve model (which I see as the most fundamental FVA comparison). But he went
There are some things every company should know about the nature of its business. Yet many organizations don't know these fundamentals -- either because they are short on resources, or their resources don't have the analytical skills to do the work. The summer research projects offered by the Lancaster Centre for Forecasting,
And now for the five steps: 1. Ignore industry benchmarks, past performance, arbitrary objectives, and what management "needs" your accuracy to be. Published benchmarks of industry forecasting performance are not relevant. See the prior post "The perils of forecasting benchmarks" for explanation. Previous forecasting performance may be interesting to know, but
In the summer heat, when The BFD alone isn't quite quenching your thirst for forecasting know-how, here are several other sources: CatchBlog -- by Steve Morlidge of CatchBull From his 2010 book Future Ready (co-authored with Steve Player), to his recent 4-part series in Foresight dealing with the "avoidability" of forecast
We're entering the busy season for forecasting events, and here is the current calendar: Analytics2014 - Frankfurt The European edition of Analytics2014 kicks off tomorrow in Frankfurt, Germany. Five hundred of the leading thinkers and doers in the analytics profession hook up for two full days of interaction and learning.
As we saw in Steve Morlidge's study of forecast quality in the supply chain (Part 1, Part 2), 52% of the forecasts in his sample were worse than a naive (random walk) forecast. This meant that over half the time, these companies would have been better off doing nothing and
As we saw last time with Steve Morlidge's analysis of the M3 data, forecasts produced by experts under controlled conditions with no difficult-to-forecast series still failed to beat a naive forecast 30% of the time. So how bad could it be for real-life practitioners forecasting real-life industrial data? In two words:
The Spring 2014 issue of Foresight includes Steve Morlidge's latest article on the topic of forecastability and forecasting performance. He reports on sample data obtained from eight businesses operating in consumer (B2C) and industrial (B2B) markets. Before we look at these new results, let's review his previous arguments: 1. All
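The comparisons running through the last few excerpts rest on a simple ratio: the forecast's absolute error divided by the absolute error of a naive (last-period, random walk) forecast, often reported as a relative absolute error (RAE). A value above 1.0 means the forecast was worse than doing nothing. Below is a minimal sketch with hypothetical numbers; the function name is mine, not Morlidge's.

```python
# Hedged sketch (hypothetical data): the relative-error ratio behind the
# "worse than naive" comparisons above. A value above 1.0 means the forecast
# added error relative to simply using last period's actual.
import numpy as np

def relative_absolute_error(actuals, forecasts):
    """Sum of absolute forecast errors divided by sum of absolute naive errors."""
    actuals = np.asarray(actuals, dtype=float)
    forecasts = np.asarray(forecasts, dtype=float)
    forecast_error = np.abs(actuals[1:] - forecasts[1:])
    naive_error = np.abs(actuals[1:] - actuals[:-1])   # random walk: forecast = last actual
    return forecast_error.sum() / naive_error.sum()

actuals   = [100, 120,  90, 110, 105,  95]
forecasts = [105, 112,  98, 104, 108, 101]

rae = relative_absolute_error(actuals, forecasts)
print(f"RAE = {rae:.2f}  ({'worse' if rae > 1 else 'better'} than the naive forecast)")
```

Computing this ratio series by series and counting how often it exceeds 1.0 is, in essence, how figures like the 52% and 30% cited above are arrived at.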