Keeping projects on track: Overcoming cognitive biases in project planning and delivery

August 15, 2017 Toby Park and Ariella Kristal

Much of our work focuses on how policymakers can use a more sophisticated understanding of behaviour to make better policy, and this includes thinking about the cognitive biases that affect all of us. However, these same psychological quirks also affect policymakers in their own work – so how can we improve these organisational decisions?

We undertook a project with the Department for Transport (DfT) to explore how biases affect judgment and decision-making in project planning and delivery. DfT are responsible for some of the biggest infrastructure projects in the UK, and so optimising these human aspects of their work can have enormous benefits.

We focused on three biases in particular, each explored in detail in our accompanying literature review:

Planning Fallacy – our tendency to make overly optimistic predictions about the time, cost and likelihood of success when planning a future task or project.

There are several causes of overoptimistic planning, including:

  • overconfidence and illusory superiority (in a hubristic quirk of mathematical impossibility, most of us think we’re better drivers than average, and the same seems to be true of performance at work);
  • overestimation of personal control (financial traders have been shown to believe they are manipulating stock values, even when it’s random, with those demonstrating greater delusion in their control being worse performers on financial measures);
  • motivated reasoning by project teams (contractors tend to want to win bids, and project planners like to see projects they are passionate about materialise, and so project planning can develop a degree of wishful thinking, often subconsciously); and
  • simple neglect: the causes of overspend, overrun or failure often don’t get adequately accounted for in a project plan because, well, they’re not part of the plan.

Groupthink – our tendency to be influenced by the opinions and actions of others when operating within a group. This can easily lead to false consensus as dominant voices drown out quieter team members, similar views are reinforced, and the desire to reach agreement risks overriding the need to reach the best outcome.

President Kennedy famously sought to avoid groupthink during the Cuban Missile Crisis, inviting outside experts to share their viewpoints, and encouraging group members to question and challenge those viewpoints carefully. He also encouraged group members to discuss possible solutions outside of the group to broaden their perspectives, and he even divided the group up into various sub-groups, to partially break the cohesion. Kennedy was also deliberately absent from some meetings, to diminish his own influence over the decision.

Sunk Cost Fallacy – our tendency to make decisions based on past costs (including time, effort and money) that have already been incurred, cannot be recovered and have no impact on future outcomes.

The sunk cost fallacy often leads to an escalation of commitment, making it increasingly difficult, psychologically, to change our course. In lay terms, we press on to finish something we have started, to save face, or because we feel that cutting our losses is wasteful or defeatist. The problem is, aborting or changing our current trajectory is sometimes the best thing to do, with any further resource better spent elsewhere.
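The rational rule here can be made concrete with a small worked comparison. The figures below are purely illustrative (they are not from the report): because money already spent is the same whichever option is chosen, it cancels out of the comparison and only future costs and benefits matter.

```python
# Hypothetical figures, in arbitrary units. The 60 already spent is
# unrecoverable under either option, so it plays no part in the decision.
sunk_cost = 60          # already spent; irrelevant to what happens next

finish_cost = 50        # additional cost to complete the project
finish_benefit = 40     # benefit if completed

abandon_cost = 50       # cost of redirecting the resources elsewhere
abandon_benefit = 70    # benefit of the alternative use

net_if_finish = finish_benefit - finish_cost     # -10
net_if_abandon = abandon_benefit - abandon_cost  # +20

# Abandoning is better, despite the 60 already invested.
best = "abandon" if net_if_abandon > net_if_finish else "finish"
print(best)  # abandon
```

A planner in the grip of the sunk cost fallacy would add the 60 to the "abandon" side as a loss to be avoided, flipping the (incorrect) conclusion.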

After exploring the academic literature, reviewing DfT planning documents, interviewing project delivery teams, and understanding the issues the department faces, we came up with several solutions that we proposed testing. Here are just a few of them:

Pre-mortem – Intended to partially address the planning fallacy, a pre-mortem requires decision makers and project teams to imagine that their project has failed, overrun or overspent, and to work backwards to conceive of all the possible reasons why this happened. Explicitly including contingencies for these risks in project plans helps set more realistic goals, and an important part of this is creating a work environment in which recognising the likelihood of setbacks is not seen as defeatist or incompetent (this is no easy task – none of us wants to start a project admitting that it will probably go wrong at some point).

Red Teaming – Many of the biases underlying the planning fallacy and sunk cost fallacy are introspective in nature – in other words, we have distorted views of our own abilities, we are motivated to get our own projects over the line, and our decisions are swayed by the amount of effort and emotional investment we have put into our own projects. By comparison, outsiders are relatively unclouded by these personal attachments to the project, and so tend to have a more pessimistic (and realistic) perspective. ‘Red Teams’, who are otherwise completely detached from the project, act solely as devil’s advocates, systematically challenging a project team’s assumptions and plans, and providing independent critical thought to improve decision-making.

Decision trees – Decision trees are decision aids which use tree-like flow-charts identifying key decision points and consequences. They are useful because they help decision-makers focus more clearly on pertinent aspects of the situation, and remove extraneous ‘noise’ – such as sunk costs – from the process. They can also help identify where and when key decisions need to be made, prompting delivery teams to pause for thought and question the status quo. Such a process could be used in project delivery, as well as to set the course of a project.
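The mechanics of such a decision aid can be sketched in a few lines. The tree structure and the numbers below are our own illustration (not a DfT tool or real project data): a decision node picks the branch with the highest expected value, a chance node weights its outcomes by probability, and – crucially for the sunk cost fallacy – only net future payoffs appear at the leaves.

```python
def expected_value(node):
    """Evaluate a tree of ('decision', branches), ('chance', branches)
    nodes and numeric leaves, where leaves are net FUTURE payoffs only."""
    if isinstance(node, (int, float)):      # leaf: net future payoff
        return node
    kind, branches = node
    if kind == "decision":                  # decision node: pick the best option
        return max(expected_value(b) for b in branches)
    if kind == "chance":                    # chance node: probability-weighted average
        return sum(p * expected_value(b) for p, b in branches)
    raise ValueError(f"unknown node kind: {kind}")

# Illustrative choice: continue as planned (60% chance of +30, 40% chance
# of -50, so expected value -2) versus re-scoping for a certain +5.
tree = ("decision", [
    ("chance", [(0.6, 30), (0.4, -50)]),    # continue as planned
    5,                                       # re-scope the project
])
print(expected_value(tree))  # 5
```

Because the leaves hold only future payoffs, whatever has already been spent on the "continue" branch simply cannot influence the answer – the structure of the aid does the de-biasing.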

The next step is to implement and evaluate some of these ideas (and others explored in the full report), to find out what really works in the context of DfT’s project delivery.
