Show your forecasting skills in Good Judgment Open


Ever wondered how good you are at forecasting? As a business forecaster, you can do the usual comparison against a naive model (and hopefully you are beating it!). You might also compare your forecast accuracy to published industry benchmarks -- although I would strongly recommend against this. (See the section below for reasons why.)

But if you're willing to test your performance in a big pond, against the big fish, in a wide variety of forecasting challenges, sign up for Good Judgment Open. Brought to you by the authors of Superforecasting, here is the description:

Start keeping score.

Are you a Superforecaster®? Can you predict the future better than the pundits in the media? Join Good Judgment Open, the site for serious forecasting.

GJ Open is more than a game. It’s the best possible place to hone your forecasting skills. We’ve designed GJ Open specifically to help you improve your forecasting abilities. Make a forecast, explain your reasoning (and be challenged by others), and find out how you stack up against the crowd.

From the future of US politics to international affairs, from technology to sports and entertainment, there's bound to be something in your wheelhouse.

Still not sure? Check out our active challenges, our featured questions, or browse a list of all questions on GJ Open.

In the book you can find out more about The Good Judgment Project, and learn what it takes to be a recognized superforecaster.

Why Not to Compare to Industry Benchmarks

The BFD has previously discussed this issue in The Perils of Forecasting Benchmarks and More on Forecasting Benchmarks.

My argument against benchmarks rests on the trustworthiness of the data, the consistency of measurement across benchmark participants, and, most important, the relevance of comparing performance between organizations that probably have very different levels of forecastability in their data. I suspect that "best in class" forecasters earn that distinction because they have the easiest-to-forecast demand -- not necessarily because they have the most admirable forecasting processes.

For a thorough and definitive discussion of the topic of benchmarking, see Stephan Kolassa's article "Can We Obtain Valid Benchmarks from Published Surveys of Forecast Accuracy?", originally published in Foresight (Fall 2008), and reprinted in Business Forecasting: Practical Problems and Solutions.


About Author

Mike Gilliland

Product Marketing Manager

Michael Gilliland is author of The Business Forecasting Deal (the book), and editor of Business Forecasting: Practical Problems and Solutions. He is a longtime business forecasting practitioner, and currently Product Marketing Manager for SAS forecasting software. Mike serves on the Board of Directors of the International Institute of Forecasters, and received the 2017 Lifetime Achievement in Business Forecasting Award from the Institute of Business Forecasting. He initiated The Business Forecasting Deal (the blog) to help expose the seamy underbelly of forecasting practice, and to provide practical solutions to its most vexing problems.