Is Psychology or Politics Behind Project Failures?
How Big Things Get Done (Ch2): Exploring the forces that drive us to think fast and act slow

- David Gérouville-Farrell
- 7 min read

A key theme from Chapter 2 of “How Big Things Get Done” is that while we know we should think slowly and act fast, people often do the exact opposite. The chapter explores the psychological and political forces driving this behaviour.
The Rush to Commit
The chapter introduces the concept of the commitment fallacy - a behavioural bias where we rush into decisions without proper consideration. The age-old advice “Act in haste, repent at leisure” captures this perfectly.
Big projects typically come together with a rush to commit:
- Purposes and goals are not carefully considered
- Alternatives are not explored
- Difficulties and risks are not investigated
- Solutions are not found
Instead, shallow analysis leads to quick lock-in decisions that sweep aside all other possibilities. This lock-in then becomes self-reinforcing - even when costs or risks grow beyond what would originally have been acceptable, organisations behave as if they have no choice but to push forward.
Amazon’s leadership principles provide an interesting contrast here. They promote a “bias for action” with statements like “Speed matters in business” and “Many decisions and actions are reversible and do not need extensive study”, but Bezos (and co) specifically limit this to reversible decisions. If you can undo something easily, there’s no need to overanalyse. Large projects, however, are often effectively irreversible - you can’t build the Pentagon, knock it down, and rebuild it after realising it ruins the view. Yet “speed matters in business” often becomes part of corporate culture without Bezos’s caveat that it applies only where decisions and actions are reversible.
Strategic Misrepresentation: Politics by Design
One concept that caught me by surprise is “strategic misrepresentation” - defined as “the tendency to deliberately and systematically distort or misstate information for strategic purposes.”
I have seen this, although I don’t think I had a way to frame it or recognise it before.
When working on big projects, there is an unwritten rule that we have to “get behind” what we’re trying to do. And I do think there’s some truth to the idea that people will distort reality in order to “make it happen.” I’ve been in departments where nobody really believed the quoted numbers, but (without ever being told) all agreed to pretend they did.
Think about it: if you want to win a contract or get funding for a particular project, superficial planning is actually quite handy. It glosses over major challenges, keeping costs and timelines down on paper. The challenges don’t disappear, of course - they just resurface later when the project is too far along to turn back.
French architect Jean Nouvel, a Pritzker Prize winner, puts it bluntly: “In France, there is often a theoretical budget that is given because it is the sum that politically has been released to do something. In 3 out of 4 cases, the sum does not correspond to anything in technical terms… The real price comes later. The politicians make the real price public where they want and when they want.” These estimates aren’t meant to be accurate - they’re meant to sell the project.
The author argues this is politics resulting in failure by design. While people might not consciously admit this is what they’re doing, it happens beneath the surface.
Psychology vs Politics
The author shares an anecdote about their back-and-forth with Daniel Kahneman, which began in the Harvard Business Review and led to the two meeting in person to try to find common ground on whether psychology or politics is primarily responsible for bad decision making. This reminds me of a podcast I heard shortly after Danny Kahneman’s death, where previous collaborators shared their experience of working with him (it’s a great listen and shows how willing Kahneman was to engage with people who held differing opinions; he certainly seems to have had an incredible work ethic and a commitment to understanding where differences of opinion came from).
Where the author and Kahneman ended up is an understanding that both psychology and politics play a role in bad decision making, and which of the two dominates seems to depend on the stakes. In experimental contexts there is:
typically no jockeying for position, no competition for scarce resources, no powerful individuals or organisations, no politics of any kind.
As the stakes increase, as the money and the potential personal gains increase, the psychology becomes slightly less of a factor and the politics becomes more of a factor.
The Optimism Trap
We are, as a species, deeply optimistic:
- Most drivers believe their skills are above average
- Most small business owners are confident of success (despite most businesses failing)
- Most smokers believe they’re less at risk than other smokers
- etc…
The analogy used in the book is about whether you want your flight attendant or your pilot to be optimistic. It’s good for the flight attendant to be optimistic, because they’re trying to create a comfortable atmosphere and a sense of confidence and calm in the back of the plane. But you don’t want to hear the pilot saying, “I’m optimistic we have enough fuel.” You want your pilot to be hard-nosed and analytical and to live as close to reality as possible.
This is a key distinction the chapter makes. Yes, optimism has a place. Yes, a positive, can-do attitude is important. But at the top level of the project, at the planning and key decision stages, you really want somebody who is as analytical, hard-nosed, and realistic as possible. We can’t afford optimism-driven surprises when the stakes are this high.
Hofstadter’s Law and the Planning Fallacy
Hofstadter’s Law captures our persistent optimism about timelines: “It always takes longer than you expect, even when you take into account Hofstadter’s Law.” The meta-joke here is telling - we underestimate task completion time even when we know we tend to underestimate task completion time.
When asked for a “best guess” timeline, people typically provide something indistinguishable from a “best case” scenario. We imagine ourselves executing the task without interruption, overlooking the messy reality of competing priorities, illness, dependencies, and the general chaos of real-world projects.
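To make that gap concrete, here’s a minimal Monte Carlo sketch in Python (my own illustration, not from the book - the task estimates, overrun distribution, and interruption odds are all invented) comparing a plan summed from best-case estimates with simulated delivery times once skewed overruns and interruptions are allowed for:

```python
import random

# Hypothetical illustration (not from the book): each task's real duration is
# its best-case estimate inflated by a right-skewed factor, plus an occasional
# interruption. All numbers below are made up for the sake of the sketch.

BEST_CASE_DAYS = [3, 5, 2, 8, 4]   # per-task "if nothing goes wrong" estimates
INTERRUPTION_PROB = 0.3            # chance a task gets derailed by something else
INTERRUPTION_DAYS = (1, 5)         # cost of a derailment, drawn uniformly

def simulate_project() -> float:
    total = 0.0
    for best_case in BEST_CASE_DAYS:
        # Right-skewed overrun: usually a bit over best case, occasionally a lot
        # (lognormal with mu=0.2, sigma=0.4).
        duration = best_case * random.lognormvariate(0.2, 0.4)
        if random.random() < INTERRUPTION_PROB:
            duration += random.uniform(*INTERRUPTION_DAYS)
        total += duration
    return total

runs = sorted(simulate_project() for _ in range(10_000))
best_case_total = sum(BEST_CASE_DAYS)
print(f"Best-case plan:   {best_case_total} days")
print(f"Median simulated: {runs[len(runs) // 2]:.1f} days")
print(f"90th percentile:  {runs[int(len(runs) * 0.9)]:.1f} days")
```

Even with modest per-task skew, the simulated median lands well above the best-case total, and the tail is far worse - which is roughly what Hofstadter’s Law predicts.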
Moving Forward: Commit to Not Commit
The chapter concludes with an interesting paradox: when you feel the urge to commit to a decision, the author suggests committing instead to keeping an open mind. The aim is to shift how we approach big decisions and to help us resist the temptation to lock in too early.
In my chapter one review, I mentioned my interest in seeing how the author handles big up-front planning versus surprises, and the way Agile has responded to those surprises.
The author says here that they’re going to lay out a process that identifies flaws and opportunities. So I’m looking forward to getting into that too.
thingsithinkithink
- The distinction between politics and psychology in decision-making is interesting. There’s a lot of focus online on the role that psychological biases play in poor decision making, but I haven’t seen as much focus on systemic failures driven by institutional incentives, power dynamics, and other forms of politics.
- There’s something quite profound about the idea that strategic misrepresentation - essentially, planned failure - might be a feature rather than a bug. It suggests that fixing project management isn’t just about better processes, but about fundamentally realigning incentives.
- There’s a quote in the book that I don’t agree with: “we all know, at least when we are thinking coolly, that strong emotions are not necessarily logical or supported by evidence and are therefore an unreliable basis for judgment.” Having studied emotion during my PhD, I see this as a naive interpretation. Emotions aren’t irrational - they’re rational signals about things we’re invested in. Anger signals we care about something and believe we can change it; sadness might signal we care but feel powerless to act. Rather than dismissing emotions as irrational distractions, we should recognise them as valuable signals that can help us make better decisions.