Speed Review: Superforecasting

The Art and Science of Prediction

by Philip E. Tetlock & Dan Gardner

In Superforecasting, Tetlock and coauthor Dan Gardner offer a masterwork on prediction, drawing on decades of research and the results of a massive, government-funded forecasting tournament. The Good Judgment Project involves tens of thousands of ordinary people—including a Brooklyn filmmaker, a retired pipe installer, and a former ballroom dancer—who set out to forecast global events. Some of the volunteers have turned out to be astonishingly good. They’ve beaten other benchmarks, competitors, and prediction markets. They’ve even beaten the collective judgment of intelligence analysts with access to classified information. They are "superforecasters."

Review

Rethinking Expert Predictions

Between 1984 and 2004, Wharton professor Philip Tetlock conducted one of the most in-depth forecasting experiments in history. Thousands of expert forecasts were collected and then tracked over time for accuracy. The results of the experiment were published in his 2005 book, Expert Political Judgment, and its conclusions would be popularized in a simple, colorful phrase: the average expert's predictions were about as accurate as those of a chimpanzee throwing darts at a dartboard.

At first, Tetlock embraced the metaphor, which, he admits, he had used himself. But over time the joke grew tired, because it oversimplified the results. His experiment was being cited as proof that expert predictions are useless, which is not, Tetlock argues, what the results showed. In his new book, Superforecasting: The Art and Science of Prediction, he eloquently describes how predictions can be made more accurate if forecasters adopt certain mindsets and behaviors.

Superforecasting, co-authored with journalist Dan Gardner, is based on empirical evidence drawn from the Good Judgment Project (GJP), which Tetlock describes as the second phase of his forecasting research. The GJP was part of a larger experiment sponsored by the Intelligence Advanced Research Projects Activity, or IARPA, a government agency tasked with sponsoring research that improves the effectiveness of American intelligence.

IARPA launched a multiyear forecasting competition among five research teams, including Tetlock's. Each team could conduct its research in any way it wanted; the only requirement was to submit predictions on a common set of questions every day. Tetlock set about winning the tournament by recruiting ordinary people who made a hobby of forecasting. Eventually, he would attract a total of 20,000 intellectually curious amateur forecasters to his team, people such as Bill Flack, a retired Department of Agriculture manager, and Devyn Duffy, who lost his job when a factory closed down (he is currently employed by the state).

The tournament was scheduled to last from 2011 to 2015, but after two years, Tetlock’s forecasters and “superforecasters” — a sub-category of the volunteers with significantly better-than-average results — were dominating to such an extent that IARPA dropped the other teams, including research teams from the University of Michigan and MIT.

From Active Open-Mindedness to Teams

How was Tetlock's team of forecasters able to dominate the tournament? The answers to this question are detailed in a fascinating book filled with historical examples and often-surprising insights into the mistakes and assumptions that undermine even the best of minds.

One important, seemingly obvious, and yet elusive element required for accurate predictions is active open-mindedness: not letting one's assumptions and biases dictate the answers. “For superforecasters,” Tetlock and Gardner explain, “beliefs are hypotheses to be tested, not treasures to be guarded.”

Another element required for superior predictions is probabilistic thinking. Many of us ignore probability more than we realize. When the weatherman says there is an 80 percent chance of rain the next day and the next day is sunny, we conclude that the weatherman was wrong. In truth, he also forecast a 20 percent chance of no rain, and the sunny day simply fell within that 20 percent. Superforecasters think in terms of percentages, not yes, no, or maybe.
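This is also how the tournament judged its forecasters: every prediction was graded with the Brier score, which measures how far predicted probabilities land from what actually happened. Here is a minimal sketch in Python of that scoring logic; the five-day rain forecast data is invented purely for illustration.

```python
# Minimal sketch: scoring probabilistic forecasts with the Brier score,
# the measure used in Tetlock's tournaments. This uses the original
# two-category formulation described in the book: 0.0 is perfect,
# 0.5 matches always guessing 50-50, and 2.0 is certainty that is
# always wrong. Lower is better.

def brier_score(forecasts, outcomes):
    """Average squared error over both outcome categories.

    forecasts: probabilities assigned to the event occurring (0.0-1.0)
    outcomes:  1 if the event happened, 0 if it did not
    """
    return sum(2 * (f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# The "80 percent chance of rain" forecaster over five days,
# four of which turned out rainy (made-up data):
forecasts = [0.8, 0.8, 0.8, 0.8, 0.8]
outcomes = [1, 1, 1, 1, 0]

print(brier_score(forecasts, outcomes))  # 0.32: well-calibrated, far from perfect
```

Note that the sunny fifth day raises the score but does not make the forecaster "wrong"; over many forecasts, the score rewards exactly the calibrated, percentage-based thinking the book describes.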

Two other lessons drawn from the experiment are that superforecasters constantly update their information, adjusting their predictions (and their beliefs) accordingly, and that they work better in teams. The chapter on teams includes a fascinating account of the advisors around President John F. Kennedy, whose counsel prompted him to order the disastrous Bay of Pigs invasion but who, after Kennedy overhauled how the group deliberated, were also instrumental in guiding him through the Cuban Missile Crisis.
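The book presents Bayes' theorem as the logic behind this habit of frequent, measured revision: new evidence should shift a probability estimate in proportion to how strongly that evidence discriminates between outcomes. A minimal sketch of one such update, with all numbers made up for illustration:

```python
# Minimal sketch of Bayesian belief updating, the ideal the book holds up
# behind superforecasters' frequent small revisions. Numbers are invented.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) given a prior and two likelihoods."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# Start at 30% that an event will occur, then read news that is twice as
# likely to appear if the event is coming (60%) as if it is not (30%).
belief = 0.30
belief = bayes_update(belief, 0.60, 0.30)
print(round(belief, 2))  # 0.46: a meaningful revision, not a wholesale reversal
```

The design point matches the book's advice: the update is proportionate, moving the estimate substantially without lurching from 30 percent to near-certainty on a single piece of news.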

Leaders across the spectrum, from world leaders in the capitals of nations to business leaders at all levels of their organizations, are making decisions every day based on what they believe is going to happen. Given the dramatic consequences that can follow from wrong predictions, Superforecasting is perhaps one of the few books this year that should be required reading for all of us, and especially for our leaders.
