A few days ago, I attended a conference on an example of a project management disaster: the Titanic, the British ship which sank on its maiden voyage from Southampton to New York in 1912, causing the death of over 1,500 people, more than two thirds of those aboard.
[The conference was part of the same series as another one I attended about two months ago, on the successful management of the project to deliver the infrastructure, design and construction of buildings, transport, and legacy of the London 2012 Olympic Games (I wrote about it here).]
Learning from a well-known disaster, as opposed to a success, made the audience all the more eager to listen to Ranjit Sidhu, a consultant who has done extensive research on the Titanic and has written the book “Titanic Lessons in Project Leadership”.
She focused the conference on three aspects of project management: communication, leadership and teamwork (1), and the problems each of them caused in the disaster.
Sidhu started by introducing some of the characters involved in the project, to show the kind of power plays and conflicts that took place when decisions were made. Among them: Bruce Ismay (chairman of the White Star Line), Lord Pirrie (chairman of the shipbuilding company Harland and Wolff), J.P. Morgan (the American banker who financed the formation of the International Mercantile Marine Company, parent company of the White Star Line), Alexander Carlisle (chief engineer of the project at Harland and brother-in-law of Pirrie), Thomas Andrews (Carlisle’s successor) and Captain Smith (captain of the Titanic).
From the beginning of the project, the mantra that the Olympic class of liners was unsinkable took hold, thanks to features which indeed made the ships safer than others at the time, as well as the largest and most luxurious. From that point onwards, several psychological flaws prevented perceptions from being re-evaluated, messages from getting across, decisions from being questioned, etc.
For some of the characters (Carlisle and Andrews) safety was the top objective, to the point that when the number of lifeboats was reduced against the engineers’ criteria, Carlisle resigned as chief engineer of the project and left Harland despite being a relative of the chairman.
For other characters in the story the emphasis was on size or luxury: an ample dining room, unobstructed views from the cabins (not blocked by lifeboats, for instance), etc.
The power play, the financial pressure on the project, the deadlines for both the departure and the arrival in New York, the image to keep up before the press: all of this led to decisions that compromised technical features (the reduction and placement of the lifeboats), manufacturing operations (ever longer shifts to recover the delay caused by repairing the Olympic at the same shipyard) and operational choices (a short sea trial for the ship, misaligned priorities and incentives for the radio operators…), all adding to the diminished safety of the voyage.
Some of the psychological flaws at work in those decisions include: the anchoring effect (the image of the Titanic as unsinkable stayed fixed in people’s minds despite decisions that compromised safety), the bandwagon effect, confirmation bias (negative signals were filtered out while supporting evidence was acknowledged), conformity to the norm, the framing effect, normalcy bias (denial and underestimation of the consequences once the disaster had occurred), etc.
Last-minute misfortunes added to the disaster: the lookouts’ missing binoculars (held by a crew member who had left the ship before departure), a rope too short to perform ice tests, radio messages from the Californian not being treated as a priority by the operators and relayed to the bridge…
The end of the story is well-known.
Have we progressed as a society since then?
Today we like to think so. More safety requirements are built into projects. Regulations are passed to ensure safety. Risk management is used as part of project management so that the kind of decisions taken at the time of the Titanic are now taken without overlooking the risks behind them.
However, I would like to bring up three questions raised by colleagues in the Q&A session that followed the presentation:
- Of the characters cited, who could have been more proactive in preventing the disaster? Bear in mind that Carlisle, the chief engineer, went as far as resigning, with (seemingly) no major effect on the fate of the ship.
- How can we react to pressure from a powerful sponsor? We can try to find allies, framing the situation as “us” as a group rather than as opposition to the sponsor.
- If the Titanic hadn’t sunk, would it be seen as an example of project management success rather than disaster? You may dismiss the point too quickly by thinking “oh, yes, but it did sink!”.
Here I remembered the model of safety in systems as layers of defence stacked one after another. Each layer may have holes in it, like a slice of cheese (the typical image used in aerospace projects). With several layers, accidents are prevented in most cases. From time to time, however, the holes in the layers line up perfectly and the accident happens (lack of sea trials, radio messages not passed on, the urgency to reach New York, lookouts without binoculars, improper ice tests, the power-versus-authority struggle on that precise voyage, with the chairman of the company travelling alongside the captain…).
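The intuition behind the layered model can be sketched numerically. Under the simplifying (and purely illustrative) assumption that each safety layer fails independently with some small probability, the chance of all the holes lining up at once shrinks rapidly with each layer added, and grows back just as fast with each layer removed; the probabilities below are invented for the sketch, not taken from the talk:

```python
# Illustrative sketch of the layered ("Swiss cheese") safety model.
# Each layer independently has a "hole" with probability p; an
# accident requires the holes in ALL layers to line up at once.
# The hole probabilities are made-up numbers for illustration.

def accident_probability(hole_probs):
    """Probability that every layer fails simultaneously,
    assuming the layers are independent."""
    result = 1.0
    for p in hole_probs:
        result *= p
    return result

# A single unreliable layer: accident in 10% of cases.
print(accident_probability([0.1]))

# Five independent layers, each with a 10% hole:
# the aligned-holes case becomes roughly a 1-in-100,000 event.
print(accident_probability([0.1] * 5))

# Strip three layers away (skipped sea trials, missing binoculars,
# unrelayed radio messages...) and the risk multiplies back up
# to roughly 1 in 100.
print(accident_probability([0.1] * 2))
```

The real lesson of the model is the reverse reading: every compromised safeguard on the Titanic removed one layer, and independence between layers is itself an optimistic assumption, since the same power plays and biases punched holes in several layers at once.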
My takeaways from the conference are:
- to continuously remind ourselves of the flaws in our mental processes (I recommend a couple of books in that respect: “Thinking, Fast and Slow” by Daniel Kahneman and “Poor Charlie’s Almanack” by Charlie Munger),
- to sharpen our perception of risks (both at work and in daily life),
- to understand that we are ourselves a layer (with our own holes) in the safety system (both at work and in daily life).
(1) She did not go much into risk management, despite acknowledging that it had not worked (or rather had been overlooked).