Editor Len Tashman's Preview of Foresight
Foresight has always presented its methods-based articles either as tutorials, which introduce and illustrate a methodology in nontechnical language, or as case studies, which focus on the practical issues and challenges in generating forecasts. We lead off this issue with two articles of the latter, practical kind. First, Stephan Kolassa and Roland Martin team up again to examine major challenges of Forecasting to Meet Demand. Then Christian Schäfer presents a case study on the use of simulation and scenarios to forecast the market success of new pharmaceutical products. His article shows How to Separate Risk from Uncertainty in Strategic Forecasting.
Foresight’s feature article in the Fall 2012 issue, “Why Should I Trust Your Forecasts?,” examined the main determinants of trust, including “open communication between forecast users and providers.” In this issue, Joe and Simon Sez expands on the theme with our dynamic duo’s ingredients for Fostering Communication that Builds Trust.
Two very special articles appear in our S&OP section. Jane Lee offers a good dose of reality therapy in her discussion of The Role of S&OP in a Sluggish Economy. Jane argues that, rather than maintaining the pretense of “business as usual,” firms should face the necessity of rapid downward adjustments and utilize the S&OP process for gaining buy-in on new forecasts, new plans, and new production and inventory targets.
Jason Boorman then takes us through the key Five Steps to Gaining Necessary and Appropriate Buy-In for a new S&OP process. It’s a formidable task, and success, he says, requires gaining commitment from top management and executing a successful pilot project.
With so many software solutions out there, it’s hard to imagine room for yet another – but here’s one that really is different. Jeff Greer makes a motivating case for adding Geographic Information Systems software to the supply-chain toolbox in GIS: The Missing Tool for Supply-Chain Design.
The United States has just emerged from a presidential election in which the populace was besieged – nay, bludgeoned – with predictions of the outcome from polls, betting markets, expert opinions, regression models, aggregators, and blogs such as Nate Silver’s FiveThirtyEight for the New York Times. Many polls were wrong beyond statistical expectation (e.g., Gallup), but one clear winner was the aggregator approach, which averages the forecasts from different methods. Andreas Graefe and his team at Pollyvote.com had it consistently right throughout the campaign; how this was accomplished is the subject of Combined Forecasts of the 2012 Election: The PollyVote.
And speaking of Nate Silver – who outperformed Polly and almost every other predictor in the final two weeks before the election and has achieved superstar status for his campaign-outcome acumen – he tells all in his new book, The Signal and the Noise: Why So Many Predictions Fail – But Some Don’t. David Orrell’s review in this issue shows the breadth of Silver’s forecasting approach, which reaches well beyond politics to economics, athletics, and climate.