The Denver International Airport opened 16 months late, at a cost overrun of $2 billion. (I’ve also seen $3.1 billion asserted.) The Eurofighter Typhoon, a joint defense project of several European countries, was delivered 54 months late at a cost of $19 billion instead of $7 billion. The Sydney Opera House may be the most legendary construction overrun of all time, originally estimated to be completed in 1963 for $7 million, and finally completed in 1973 for $102 million.1
Are these isolated disasters brought to our attention by selective availability? Are they symptoms of bureaucracy or government incentive failures? Yes, very probably. But there’s also a corresponding cognitive bias, replicated in experiments with individual planners.
Buehler et al. asked their students for estimates of when they (the students) thought they would complete their personal academic projects.2 Specifically, the researchers asked for estimated times by which the students thought it was 50%, 75%, and 99% probable their personal projects would be done. Would you care to guess how many students finished on or before their estimated 50%, 75%, and 99% probability levels?
- 13% of subjects finished their project by the time they had assigned a 50% probability level;
- 19% finished by the time assigned a 75% probability level;
- and only 45% (less than half!) finished by the time of their 99% probability level.
As Buehler et al. wrote, “The results for the 99% probability level are especially striking: Even when asked to make a highly conservative forecast, a prediction that they felt virtually certain that they would fulfill, students’ confidence in their time estimates far exceeded their accomplishments.”3
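The gap between stated confidence and actual performance is easy to check mechanically: count how often projects finish by the predicted time. A minimal sketch in Python, with made-up illustrative numbers (the 13%/19%/45% figures above come from the study, not from this data):

```python
def coverage(predicted_days, actual_days):
    """Fraction of projects finished on or before the predicted time.
    A calibrated 99% estimate should yield coverage near 0.99."""
    hits = sum(a <= p for p, a in zip(predicted_days, actual_days))
    return hits / len(predicted_days)

# Hypothetical data: each student's "99% sure it's done by then"
# estimate, in days, alongside how long the project actually took.
p99 = [30, 45, 20, 60, 25]
actual = [28, 50, 35, 55, 40]

print(coverage(p99, actual))  # 0.4 -- far short of the promised 0.99
```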
More generally, this phenomenon is known as the “planning fallacy.” The planning fallacy is that people think they can plan, ha ha.
A clue to the underlying problem with the planning algorithm was uncovered by Newby-Clark et al., who found that
- Asking subjects for their predictions based on realistic “best guess” scenarios; and
- Asking subjects for their hoped-for “best case” scenarios…
… produced indistinguishable results.4
When people are asked for a “realistic” scenario, they envision everything going exactly as planned, with no unexpected delays or unforeseen catastrophes—the same vision as their “best case.”
Reality, it turns out, usually delivers results somewhat worse than the “worst case.”
Unlike most cognitive biases, we know a good debiasing heuristic for the planning fallacy. It won’t work for messes on the scale of the Denver International Airport, but it’ll work for a lot of personal planning, and even some small-scale organizational stuff. Just use an “outside view” instead of an “inside view.”
People tend to generate their predictions by thinking about the particular, unique features of the task at hand, and constructing a scenario for how they intend to complete the task—which is just what we usually think of as planning.
When you want to get something done, you have to plan out where, when, how; figure out how much time and how many resources are required; visualize the steps from beginning to successful conclusion. All this is the “inside view,” and it doesn’t take into account unexpected delays and unforeseen catastrophes. As we saw before, asking people to visualize the “worst case” still isn’t enough to counteract their optimism—they don’t visualize enough Murphyness.
The outside view is when you deliberately avoid thinking about the special, unique features of this project, and just ask how long it took to finish broadly similar projects in the past. This is counterintuitive, since the inside view has so much more detail—there’s a temptation to think that a carefully tailored prediction, taking into account all available data, will give better results.
But experiment has shown that the more detailed subjects’ visualization, the more optimistic (and less accurate) they become. Buehler et al. asked an experimental group of subjects to describe highly specific plans for their Christmas shopping—where, when, and how.5 On average, this group expected to finish shopping more than a week before Christmas. Another group was simply asked when they expected to finish their Christmas shopping; the average response was four days before Christmas. Both groups finished an average of three days before Christmas.
Likewise, Buehler et al., reporting on a cross-cultural study, found that Japanese students expected to finish their essays ten days before deadline. They actually finished one day before deadline. Asked when they had previously completed similar tasks, they responded, “one day before deadline.”6 This is the power of the outside view over the inside view.
A similar finding is that experienced outsiders, who know less of the details, but who have relevant memory to draw upon, are often much less optimistic and much more accurate than the actual planners and implementers.
So there is a fairly reliable way to fix the planning fallacy, if you’re doing something broadly similar to a reference class of previous projects. Just ask how long similar projects have taken in the past, without considering any of the special properties of this project. Better yet, ask an experienced outsider how long similar projects have taken.
You’ll get back an answer that sounds hideously long, and clearly reflects no understanding of the special reasons why this particular task will take less time. This answer is true. Deal with it.
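Put as a procedure, the outside view amounts to reference-class forecasting: predict from the distribution of past outcomes and ignore this project’s special features. A sketch in Python, with illustrative (invented) durations for a reference class of past projects:

```python
# Outside view as reference-class forecasting: the forecast comes from
# how long broadly similar past projects actually took, not from any
# detailed plan for this one. Durations below are invented for illustration.
from statistics import median, quantiles

past_durations_days = [12, 18, 25, 9, 30, 22, 14, 40, 16, 28]

# Inside view: whatever your detailed, step-by-step plan says -- say, 7 days.
inside_view_estimate = 7

# Outside view: the typical past outcome, plus a pessimistic quartile
# to account for the delays and catastrophes you can't foresee.
outside_view_estimate = median(past_durations_days)
q = quantiles(past_durations_days, n=4)  # quartiles of past outcomes

print(f"inside view: {inside_view_estimate} days")
print(f"outside view (median): {outside_view_estimate} days")   # 20.0
print(f"outside view (75th percentile): {q[2]} days")           # 28.5
```

The design choice is deliberate: the only input is the list of past outcomes, so there is nowhere for the special reasons why *this* project will be faster to sneak back in.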
Roger Buehler, Dale Griffin, and Michael Ross, “Inside the Planning Fallacy: The Causes and Consequences of Optimistic Time Predictions,” in Gilovich, Griffin, and Kahneman, Heuristics and Biases, 250–270.
Roger Buehler, Dale Griffin, and Michael Ross, “Exploring the ‘Planning Fallacy’: Why People Underestimate Their Task Completion Times,” Journal of Personality and Social Psychology 67, no. 3 (1994): 366–381, doi:10.1037/0022-3514.67.3.366; Roger Buehler, Dale Griffin, and Michael Ross, “It’s About Time: Optimistic Predictions in Work and Love,” European Review of Social Psychology 6, no. 1 (1995): 1–32, doi:10.1080/14792779343000112.
Buehler, Griffin, and Ross, “Inside the Planning Fallacy.”
Ian R. Newby-Clark et al., “People Focus on Optimistic Scenarios and Disregard Pessimistic Scenarios While Predicting Task Completion Times,” Journal of Experimental Psychology: Applied 6, no. 3 (2000): 171–182, doi:10.1037/1076-898X.6.3.171.
Buehler, Griffin, and Ross, “Inside the Planning Fallacy.”