Nobody is perfect, and no system is perfect, so we just have to do the best we can and hope that it is good enough. But how good is ‘good enough’? Or how bad do things have to be before they are not ‘good enough’? And is this culture of ‘good enough’ wise? It’s certainly expedient but, if we trawl through the news headlines, we can find many instances when ‘good enough’ was not good enough.
Challenger – the poster boy for normalisation of deviance
The term ‘normalisation of deviance’ was coined by sociologist Diane Vaughan, who led a 10-year research study into the Space Shuttle Challenger disaster of 28 January 1986. It’s interesting that the most comprehensive study of this technical catastrophe was led by a sociologist, not an engineer. That was probably because – by engineering standards – nothing should have gone wrong. According to the technical wisdom of the day, all systems were fine. Well, they weren’t, and what Vaughan found was that, while the technical cause of the explosion was faulty O-rings, the ultimate cause was human. And not the age-old scapegoat of human error. The fault was a tendency she labelled normalisation of deviance.
The short version of the story is that the engineers at NASA knew the O-rings were not 100% perfect, but they had worked well enough in previous launches, so – well – they’d gotten used to them. They normalised the problem.
History tells us what the result of that little shortcut was – seven people dead and billions of dollars of high-tech gizmos spectacularly up in smoke.
And in the world of real people
You’d probably like to think that you wouldn’t make that kind of mistake, and neither would anyone you know. But think about it. Have you ever driven after ‘just one or two’ drinks, because you know it’s safe – you’ve done it many times, and it’s always been fine? Even though study after study shows that even one small drink slows your reaction time and reduces your ability to judge distance and speed.
Have you ever watched while one lone voice in a meeting tried to convince management that it’s not a good idea to cut costs by using a cheaper material or a cheaper building method? And then they used that cheaper material, and it was fine. So next time they did it again. And again. Until – one day – it wasn’t fine. Here’s a scary list of incidents that happened just like that:
- On 29 June 2003, a balcony collapsed during a party in an apartment building in Chicago, killing 13 people and seriously injuring 57 others. The building managers blamed the fact that there were too many people on the balcony because – hey – just because a balcony is big enough to hold twenty people, it doesn’t mean it’s strong enough to do so.
- On 23 May 2004, shortly after the inauguration of terminal 2E of Paris’s Charles de Gaulle Airport, part of the roof collapsed, killing four people and injuring three more.
- On Monday 2 January 2006, in Bad Reichenhall, Bavaria, Germany, the roof of a popular ice rink collapsed, killing 15 people and injuring 32 more. Many survivors were trapped for more than 24 hours in the rubble in sub-zero temperatures, while rescuers worked to free them.
- On 28 January 2006, the roof of the Katowice Trade Hall in Poland collapsed during the 56th National Exhibition of Carrier Pigeons, killing 65 people and injuring many more. The number of avian fatalities is unrecorded, but two pigeons were found alive after being trapped for 22 days. The designers and management of the building spent a bit more time than that ‘trapped’ in prison.
- Two people died, and 21 were injured when a pedestrian bridge collapsed onto the M1 in Johannesburg on 14 October 2015. Remember that?
- On Valentine’s Day 2004, the roof collapsed onto Moscow’s Africa-themed Transvaal Park entertainment centre and water park, killing 28 people and injuring more than 190. (I confess, I am intrigued, and would have loved to see the Russian interpretation of ‘the Transvaal’. The mind fairly boggles.)
- On 27 May 2017, a parking garage at Eindhoven Airport – the second-largest airport in the Netherlands – collapsed due to a construction error. Fortunately, there were no injuries or fatalities – let’s end this list on a relatively positive note.
The engineering and technical specifics differ, but the ultimate cause of all these disasters was the same: the normalisation of deviance.
So how do we un-normalise deviance?
What made Vaughan’s study of the Challenger disaster so meaningful is that she concentrated not on what decisions had been made, and who made them, but on how those decisions came to seem inevitable. She emphasised that the ‘fault’ lay with the organisational culture, rather than with any one person or group of people.
Sadly, NASA did not learn much from the Challenger accident. Seventeen years later the Space Shuttle Columbia burned up as it re-entered the earth’s atmosphere, killing all seven astronauts on board. And, yes, NASA knew that there was a problem with the heat shield, because insulating foam had broken off and struck the shuttle on previous missions, but – well – it all turned out fine in the end, didn’t it? Not really.
Since the Columbia disaster, NASA has drawn up a list of recommendations that can be paraphrased as:
- Don’t use past success to redefine acceptable performance.
- For any system, procedure, material or process, insist on ‘proof of safety’, rather than being satisfied with ‘no evidence of risk’ (which is not the same as ‘evidence of no risk’).
- Value diversity – appoint people with opposing views or ask everyone to voice their opinion before discussion. And then pay heed. One way of ensuring that lone voices of caution get heard is to make decisions by consensus, and to make it clear that it is safe to hold up a decision if you believe there is a good reason.
- Avoid conflicts of interest – keep safety programmes independent from the activities they evaluate.
- Set the standard – executives breaking or bending rules will set the tone for the company’s culture, and make it easier for others to justify their actions.
Of course, NASA’s list was too late to save the shuttle programme – and also too late to save Rick D. Husband, William C. McCool, Michael P. Anderson, Ilan Ramon, Kalpana Chawla, David M. Brown and Laurel Blair Salton Clark. Yes, it’s often real people who end up paying the price for corporate deviance.