Staff Picks

Superforecasting: The Art and Science of Prediction

September 23, 2015


Philip Tetlock pulls back the curtain on his Good Judgment Project, teaching us all to be "superforecasters."

My gut reaction to the words "forecasting" and "prediction" is a bit sour. When I think about the idea of prediction, I envision an amalgam of palm readers and cable news talking heads, neither of whom ranks highly in this average citizen's opinion. But then I don't think so badly of my financial planner, with whom my wife and I just met today to discuss all kinds of numbers, buckets, and funnels. I balk at a handsome stranger with no apparent qualification for prediction who would like to tell me which candidate will win the Republican presidential nomination, because I tend to think that the real outcome of such a thing is too complicated for most people to predict. (A few have proven significantly more qualified than most to make such predictions, but most of us are not among that qualified few.) Yet when it comes to where I put money for things like retirement and insurance, I hardly give a thought to the chance that the systems on which I will later rely for income and livelihood will not keep performing as I hope they will in thirty or forty years. In neither of these cases do I count myself even remotely qualified to make my own predictions of outcomes, so I simply don't make those predictions. But could I?

I had not considered how much I take predictions—large and small—for granted until I opened Superforecasting: The Art and Science of Prediction. Philip Tetlock's new book explores prediction as an acquired skill, not a birthright bestowed upon a lucky few. Superforecasting is a natural byproduct of Tetlock's Good Judgment Project, an ongoing research project-cum-contest in which participants are filtered through a series of tests that help draw conclusions about what makes the best forecasters—or superforecasters, as the project calls them. One outcome of GJP is that Tetlock and his team have made a great deal of measurable progress toward finding out what makes a good forecaster, so much so that GJP forecasters have frequently outperformed intelligence analysts with access to classified information. I would like those skills, please, so I continued reading.

Throughout Superforecasting, Tetlock and co-author Dan Gardner steadily unveil the characteristics of a good forecast and the traits that make up a superforecaster. One case the book cites is the US Intelligence Community's misjudgment of Iraq and that country's supposed possession of weapons of mass destruction (WMDs). Ignoring one of the most fundamental of Tetlock's points—to welcome uncertainty in a high-stakes decision or recommendation—the IC told the public flatly that Iraq had WMDs.

The eventually obvious fact that Iraq did not have WMDs draws Tetlock to another key point, really the keystone of Superforecasting and the work of GJP: carefully analyzing the outcomes of predictions and adjusting future forecasts based on what was measured. Connecting the Iraq WMD debacle to the birth of the Intelligence Advanced Research Projects Activity, Tetlock contrasts the old IC thinking—in which bad forecasts were criticized with political rather than constructive motives—with newer thinking dedicated to learning from previous forecasts in the hope of making more accurate ones in the future.

In another chapter, Tetlock demonstrates how exchanging an obviously unknowable question for smaller, hopefully easier questions can allow a forecaster to suss out at least partial knowledge of the greater question. He cites the physicist Enrico Fermi's question to his students: how many piano tuners are there in Chicago? The question, especially at the time Fermi posed it, seemed impossible—there was no stored means by which to simply "look up" the answer (e.g., the internet). But what Tetlock shows (crediting Dr. Daniel Levitin's earlier presentation of the same Fermi question) is that this seemingly unknowable question actually contains many sub-questions for which answers can be found. Unknowable pieces do remain, and Tetlock makes "black-box" guesses at those. But even black-box guesses at the smaller questions are much more likely to be close to the mark than a black-box guess at the big question would be. With the small questions answered, Tetlock pieces the big question back together to conclude there are sixty-three tuners in Chicago—not far from Levitin's own eighty-three.
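To see how the decomposition works in practice, here is a minimal sketch of a Fermi estimate in Python. The specific numbers are my own illustrative guesses, not figures from the book; the point is that each sub-question gets an explicit, checkable estimate that can be revised on its own.

```python
# Fermi estimation: how many piano tuners are in Chicago?
# Every value below is an illustrative guess, not a figure from the book.

chicago_population = 2_700_000    # people in Chicago (rough guess)
people_per_household = 2.5        # average household size (guess)
households_with_pianos = 1 / 20   # fraction of households owning a piano (guess)
tunings_per_piano_per_year = 1    # how often a piano gets tuned (guess)

tunings_per_day = 4               # tunings one tuner manages daily, travel included (guess)
working_days_per_year = 250       # a tuner's working year (guess)

# The sub-answers combine back into the big answer.
pianos = chicago_population / people_per_household * households_with_pianos
tunings_needed = pianos * tunings_per_piano_per_year
tuner_capacity = tunings_per_day * working_days_per_year

tuners = tunings_needed / tuner_capacity
print(f"Estimated piano tuners in Chicago: {tuners:.0f}")
```

The final answer matters less than the audit trail: anyone who disagrees can point at the one guess they think is wrong (say, the fraction of households with pianos), change it, and rerun the estimate.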

Later in the book, Tetlock touches on something that should underscore many professions—a person's view of his or her own abilities and a receptiveness to growth. He cites psychologist Carol Dweck's "growth mindset," the belief that a person's abilities can be developed through effort. Looking at the growth mindset and how it might benefit forecasters, Tetlock delves into the importance of objective measurement and adjustment after a forecast. He illustrates with a story from his own earlier research:


In 1988, when the Soviet Union was implementing major reforms that had people wondering about its future, I asked experts to estimate how likely it was that the Communist Party would lose its monopoly on power in the Soviet Union in the next five years. In 1991 the world watched in shock as the Soviet Union disintegrated. So in 1992-93 I returned to the experts, reminded them of the question in 1988, and asked them to recall their estimates. On average, the experts recalled a number 31% higher than the correct figure. [...] There was even a case in which an expert who pegged the probability at 20% recalled it as 70%.

Hindsight bias is an inevitable part of remembering our own forecasts when no objective record exists to remind us what we actually predicted. Tetlock's recommendation, then, is rigorous language in making forecasts and an objective eye in measuring their accuracy.
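The yardstick Tetlock's research uses for that objective eye is the Brier score: the mean squared difference between the probabilities a forecaster assigned and what actually happened. What follows is a minimal sketch of the simple binary variant in Python (the book works with the original formulation, which ranges from 0 to 2; the version here runs from 0, perfect, to 1, maximally wrong). The sample forecasts are made-up illustrations.

```python
# Brier score: mean squared error between forecast probabilities and outcomes.
# In this binary variant, 0.0 is perfect, 0.25 is what perpetual fence-sitting
# at 50% earns, and 1.0 is confidently, completely wrong.

def brier_score(forecasts, outcomes):
    """forecasts: probabilities in [0, 1]; outcomes: 1 if the event happened, else 0."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# The expert from the quote above pegged the probability at 20%, and the
# Communist Party did lose its monopoly, so that forecast scores poorly.
print(brier_score([0.2], [1]))  # 0.64
print(brier_score([0.7], [1]))  # 0.09  (the number he later "remembered")
print(brier_score([0.5], [1]))  # 0.25  (hedging everything at 50%)
```

With a record like this there is nothing left to misremember: the 20% is written down, scored, and available to learn from.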

On the surface, Superforecasting is an extremely useful list of qualities and practices that Tetlock and the Good Judgment Project have identified as essential to good forecasting. True to his tendency toward optimistic skepticism, Tetlock keeps pushing readers toward skepticism of their own, providing plenty of good case material to convince us of the value in these pages. The flap copy predicts Superforecasting is "destined to become a modern classic." Thanks to his own book, I don't have to wonder what Tetlock would think of such a forecast.

