(and why public works never seem to be completed on time)
By Pedro G. Del Carpio.
If I had to guess how fed up you are with traffic jams in your city on a scale from 1 to 10, I would bet that you are probably close to a ten.
Enduring another 6 pm traffic jam in Lima, I read with feigned surprise that, once again, a public infrastructure project won’t be completed on time. At the time of this publication, the remodelling of an overpass on the city’s most important highway has been at a standstill for about a year over a land dispute, with only 20% of the project completed.
Again? What a bunch of idiots! Couldn’t they have planned better? Although our indignation towards the people in charge feels justified, I am sorry to tell you that, barring intentional mismanagement of the project, we are no better at making accurate forecasts than the people running our infrastructure department.
The human inability to produce accurate forecasts has been extensively studied in psychology. The Planning Fallacy describes the phenomenon in which people systematically err when predicting how projects will turn out. When we face a forecasting challenge about some task we want to accomplish, our mental machinery can’t help but overlook relevant information about past or similar scenarios that could serve as a baseline, and focuses overwhelmingly on the particularities of the task at hand. Discarding that external information means ignoring the most objective source we have for making any kind of forecast.
Thus, when we estimate how long a project will take, we tend to move away from reality and lean toward the “best possible” scenario. As if previous experiences didn’t matter, and even knowing that similar tasks have taken longer than expected, we fall into the same trap of believing they will require fewer resources, tangible and intangible, than they actually need.
But what is going on in our minds that makes us repeat these systematic mistakes? In the world of metaphors, it’s our System 1 (the fast, intuitive and automatic thinking) shamelessly getting its way over System 2 (the slow, analytic and deliberate reasoning). Psychology suggests three cognitive phenomena [1] are at work:
Optimism bias
It’s our tendency to overestimate the likelihood of positive results and underestimate the probability that something bad could happen. Research has shown our propensity to believe we are at less risk than other people under the same circumstances [2] or to think that our performance and skills are better than they actually are [3].
Attribution error
We tend to believe that, regardless of their actual causes, positive results stem disproportionately from our own ability, while negative results are due to some external element working against us. For example, researchers found evidence of something that hardly seems a secret: people in managerial positions tend to overestimate the effect of their actions on their organization’s success and underestimate the role played by circumstances beyond their control [4][5].
We neglect the role of randomness
The human need to find coherent explanations for our experiences (event X produces a direct effect on Y) makes us reject the ever-present role of chance. This illusion comes from ignoring the impact that the interaction of unknown elements has on outcomes, assigning our actions greater importance than they really have. Imagining that the events which shape our reality could easily not have happened, owing to unexpected turns along the way, is a titanic task for our minds. Obviously, this doesn’t mean that everything is determined by luck, but randomness is definitely more present in our lives than we tend to believe [6].
Certainly, if the Sydney Opera House opened roughly ten years later than originally expected, with a cost overrun of over 100 million Australian dollars, the prospect of public works delays in our region shouldn’t surprise us. That said, this doesn’t excuse the people in charge of designing and approving those projects. Even setting aside Black Swan events [7], anomalies virtually impossible to forecast but with extreme consequences when they occur, everyone who deals with any form of planning should internalize the fact that tasks have a high probability of taking longer than estimated, and forecasts should be adjusted accordingly.
If you’ve gotten this far, it’s probably because you have fallen prey to the Planning Fallacy more than once. After all, you don’t have to be a project manager to want to make better forecasts in any aspect of your life that requires decisions under uncertainty.
Reducing the effect of the Planning Fallacy
The psychologists Daniel Kahneman and Amos Tversky suggested taking an external approach, the so-called outside view, [8] to attenuate the effect of our cognitive biases. In brief, they recommend the following steps:
Identify the reference class in which the problem you want to solve can be located. For example, if the challenge is estimating the time required to complete your graduate thesis, the reference class could be the time it takes other students to write their theses.
Gather data and statistical information about that reference class and determine a baseline. Ask: how long do such projects usually last? For example, determine the mean time students take to complete a graduate thesis.
Make an intuitive estimation based on the specific characteristics of your case, then adjust it toward the mean of the reference class to produce the final forecast. For example, suppose you believe that writing your thesis will take two months because you know a lot about the topic and have a track record of handing in coursework before your classmates. The external information you compiled, however, shows that the average student takes five months to complete this kind of project. A forecast that takes the reference class into account will fall somewhere between two and five months; three and a half months, the midpoint, is most likely a more accurate estimate of the final timeline (a small sketch of this arithmetic follows below).
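To make the adjustment concrete, here is a minimal sketch in Python of the arithmetic behind that last step. The function name, the linear blending rule, and the 50/50 default weight are illustrative assumptions of mine, not something prescribed in [8]; the only point is that the final forecast gets pulled from the intuitive estimate toward the reference-class mean.

# A minimal sketch: blend the intuitive ("inside view") estimate with the
# reference-class ("outside view") mean. The 0.5 default weight is an
# illustrative assumption, not a value prescribed by Kahneman and Tversky.
def adjust_forecast(intuitive_estimate: float,
                    reference_class_mean: float,
                    weight_on_reference: float = 0.5) -> float:
    # 0 = trust only your intuition, 1 = trust only the reference class.
    return ((1 - weight_on_reference) * intuitive_estimate
            + weight_on_reference * reference_class_mean)

# Thesis example from the text: your gut says 2 months, the average student needs 5.
print(adjust_forecast(intuitive_estimate=2, reference_class_mean=5))  # prints 3.5

With equal weight on both views, the two-month intuition and the five-month baseline blend into the three-and-a-half-month estimate mentioned above; putting more weight on the reference class simply makes the forecast more conservative.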
Main takeaway
Our bounded capacity to make accurate forecasts is not exclusive to the professionals in charge of managing important projects. The human mind makes everybody fall into the same estimation mistakes because of an excess of optimism, errors of attribution, and our aversion to acknowledging the role of chance in the events that shape our lives. Although it’s not possible to completely eliminate the biases of our intuitive predictions, including external information allows us to reduce their influence and helps us make better forecasts.
References
[1] Delusions of Success: How Optimism Undermines Executives’ Decisions. By Dan Lovallo and Daniel Kahneman
[2] Optimism Following a Tornado Disaster. By Jerry Suls, Jason P. Rose, Paul D. Windschitl and Andrew R. Smith
[3] The optimism bias and traffic accident risk perception. By David DeJoy
[4] Causal Ambiguity, Management Perception, and Firm Performance. By Powell, Lovallo and Caringal
[5] The justification of organizational performance in annual report narratives. By Izabella Frinhani, Marcelo Sanches and Antonio Thadeu Mattos da Luz
[6] Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets. By Nassim Taleb
[7] The Black Swan: The Impact of the Highly Improbable. By Nassim Taleb
[8] Intuitive prediction: Biases and Corrective Procedures. By Daniel Kahneman and Amos Tversky
Exploring the “Planning Fallacy”: Why People Underestimate Their Task Completion Times. By Roger Buehler, Dale Griffin, and Michael Ross
Thinking, Fast and Slow. By Daniel Kahneman
The Optimism Bias. By Tali Sharot