BP Deepwater Horizon Arraignments: A Culture That “Forgot to be Afraid”
By Peter Kelly-Detwiler
Today, three BP employees are expected to be arraigned in connection with the explosion and sinking of the BP/Transocean rig Deepwater Horizon, in which 11 workers were killed and an estimated 205 million gallons of oil were leaked into the Gulf of Mexico. So justice is served. But the entire judicial process probably doesn’t address the real underlying issue. It is the same issue that led to the 1988 Piper Alpha North Sea oil rig explosion that killed 167 workers; the 2009 Montara blowout off Australia’s northwest coast, a major oil platform spill that took over ten weeks to contain; the explosions of the space shuttles Challenger in 1986 and Columbia in 2003; and the Fukushima disaster, as well as many others of smaller scale.
The real issue is this: our operational cultures need to change. Technologies continue to get bigger, more complex, and more dangerous, so their failures can be catastrophic. At the same time, these technologies and their underlying processes are run by people under pressure to deliver economic performance and meet deadlines, in environments where time costs money and shortcuts can potentially save millions. Much of the work is routine, repetitive, and divided into small tasks that limit the ability to see the bigger picture. And unlike the military (which is not without its failings) or air traffic control (also not immune to problems), the culture of safety is weaker than the imperative to meet deadlines, a condition the Center for Catastrophic Risk Management’s Deepwater report described as an “imbalance between production and protection.” Furthermore, those cultures of safety tend to degrade over time. Success leads to complacency.
Let’s start with the 1990 Lord Cullen report on the Piper Alpha disaster. It made 106 recommendations and was highly critical of management’s training and safety culture. It’s not that different from the report of the Commission of Inquiry into Australia’s Montara blowout, whose summary is as follows: “The Inquiry has concluded that PTTEP Australia did not observe sensible oilfield practices at the Montara oilfield. Major shortcomings in the company’s procedures were widespread and systemic, directly leading to the blowout.”
That summary is not very different from that of the inquiry into the space shuttle Columbia failure, which states:
“It is our view that complex systems almost always fail in complex ways, and we believe it would be wrong to reduce the complexities and weaknesses associated with these systems to some simple explanation. Too often, accident investigations blame a failure only on the last step in a complex process, when a more comprehensive understanding of that process could reveal that earlier steps might be equally or even more culpable. In this Boardʼs opinion, unless the technical, organizational, and cultural recommendations made in this report are implemented, little will have been accomplished to lessen the chance that another accident will follow.”
“Cultural traits and organizational practices detrimental to safety were allowed to develop, including: reliance on past success as a substitute for sound engineering practices (such as testing to understand why systems were not performing in accordance with requirements); organizational barriers that prevented effective communication of critical safety information and stifled professional differences of opinion; lack of integrated management across program elements; and the evolution of an informal chain of command and decision-making processes that operated outside the organizationʼs rules.”
These findings are largely echoed by The Japanese Diet’s investigation into the Fukushima catastrophe:
“It was a profoundly manmade disaster – that could and should have been foreseen and prevented. And its effects could have been mitigated by a more effective human response.”
“The direct causes of the accident were all foreseeable prior to March 11, 2011. But the Fukushima Daiichi Nuclear Power Plant was incapable of withstanding the earthquake and tsunami that hit on that day. The operator (TEPCO), the regulatory bodies (NISA and NSC) and the government body promoting the nuclear power industry (METI), all failed to correctly develop the most basic safety requirements—such as assessing the probability of damage, preparing for containing collateral damage from such a disaster, and developing evacuation plans for the public in the case of a serious radiation release.”
For those wanting to get a solid understanding of the events leading to the Deepwater Horizon disaster, there is no better place to start than the Center for Catastrophic Risk Management’s report entitled Final Report on the Investigation of the Macondo Well Blowout. The Center focuses on the investigation into and prevention of just this type of incident, and it’s pretty damning of BP:
The second progress report (July 15, 2010) concluded: “…these failures (to contain, control, mitigate, plan, and clean-up) appear to be deeply rooted in a multi-decade history of organizational malfunction and shortsightedness. There were multiple opportunities to properly assess the likelihoods and consequences of organizational decisions (i.e., Risk Assessment and Management) that were ostensibly driven by the management’s desire to “close the competitive gap” and improve bottom-line performance. Consequently, although there were multiple chances to do the right things in the right ways at the right times, management’s perspective failed to recognize and accept its own fallibilities despite a record of recent accidents in the U.S. and a series of promises to change BP’s safety culture.”

The third progress report (December 1, 2010) concluded: “Analysis of the available evidence indicates that when given the opportunity to save time and money – and make money – tradeoffs were made for the certain thing – production – because there were perceived to be no downsides associated with the uncertain thing – failure caused by the lack of sufficient protection. Thus, as a result of a cascade of deeply flawed failure and signal analysis, decision-making, communication, and organizational – managerial processes, safety was compromised to the point that the blowout occurred with catastrophic effects.” (emphasis added).
“It is the underlying “unconscious mind” that governs the actions of an organization and its personnel. Cultural influences that permeate an organization and an industry and manifest in actions that can either promote and nurture a high reliability organization with high reliability systems, or actions reflective of complacency, excessive risk-taking, and a loss of situational awareness.”
The report further commented: “It has been observed that BP’s system ‘forgot to be afraid.’”