The big-budget Hollywood take on the 2010 explosion of the offshore drilling rig known as Deepwater Horizon opens today. Directed by Peter Berg and featuring Mark Wahlberg, the film arrives as a star-driven disaster movie, suspensefully depicting events that included the deaths of eleven men, the destruction of a half-billion-dollar drilling platform, and the worst human-made ecological disaster in Gulf Coast history.
As is frequently the case with big productions, where the film ended up isn’t where it was always headed, and initial director J. C. Chandor had a different vision for the story before being replaced over “creative differences” with the studio. Chandor’s plan was apparently to tell the story from the perspective of “everyone that was on the rig,” with an ensemble cast of over one hundred people. It’s worth noting that, according to a new analysis of the disaster by two senior systems engineers, Chandor’s version was on the right track.
Earl Boebert and James Blossom use a systems model to investigate what happened at the Macondo well, focusing on the complex interactions of technology, people, and procedures. From their book, Deepwater Horizon: A Systems Analysis of the Macondo Disaster:
A system is a collection of components developed more or less independently—plus the people who operate them. Two things make it a system: it has an intended purpose, and it exhibits what systems engineers call emergent properties. An emergent property is something that arises from the interaction between components rather than from the behavior of a single one. Emergent properties are things like safety, security, and reliability. They are typically important and hard to quantify.
Emergent properties suggest the possibility of emergent events: events that result from a combination of decisions, actions, and attributes of a system’s components, rather than from a single act or from the failure of a single piece of equipment. The Horizon disaster was the very model of an emergent event.
A systems analysis resists the tendency to focus on a single cause, an urge so common, and so misleading, that it has a name: “root cause seduction.” That seduction, write Boebert and Blossom, “is typically at the heart of efforts to assign blame or liability rather than prevent future accidents.”
In an interview with Fresh Air, Deepwater Horizon director Peter Berg indicates that the Macondo disaster may indeed have had a central fatal trigger. When asked for a short version of what went wrong on the rig, Berg notes that it’s impossible to answer the question conclusively and that there are differences of opinion among those who were there and those who’ve investigated. But then he goes on:
However, the one area that almost everyone seems to agree on—everyone I talked to and I—I’ve asked that question maybe to 150 people who are all very connected with the rig, you know. What went wrong? Why did it happen? You know, what was the genesis of the disaster? And almost everyone seems to believe—and now at the bottom of these wells, deep underground, they pump cement. They call it see-ment, but it’s cement—concrete. They pump it deep down into the core of the Earth. And that cement forms sort of a protective casing that’s designed to keep the oil from all flooding up out of that hole. It’s a very thick, heavy wall of concrete that’s poured deep underneath the ocean and the Earth’s surface. And, you know, if that cement is poured properly, it hardens, obviously. And that’s the beginning of the management of that intense pressure. If that cement is not poured properly, and if that cement starts to break apart or, you know, is compromised, you can get too much oil streaming to the hole that takes the oil up out of the Earth. It’s very widely believed that there was something wrong with that cement job.
And one of the reasons why people point a finger at BP is that before you declare a well safe, you have to do something called a cement bond log. And I don’t want to bore people, but it’s actually kind of interesting. A cement bond log is like a final test where you send sonar images deep, deep down into the Earth to test the solidity of that concrete. And that test costs a couple of hundred thousand dollars. BP didn’t do that test. They sent the team home that’s supposed to do the test, saving themselves a couple of hundred thousand.
And, you know, it’s many people’s belief that they didn’t do the test because they just didn’t want to get any bad news. They knew if the test came up bad, they’re going to have to rip it all out and do it again. They were in a hurry to get out of there. Not doing that test, not checking the cement saved them $200,000 in the short term, and cost them $60 billion and 11 men who lost their lives in the long term. So that to me is, you know, most people seem to agree that that’s at the heart of what went wrong.
These things may all be true, and—even if they weren’t—Hollywood can be forgiven for simplifying complex stories. But it’s notable that what Berg presents as a sort of expert consensus centers on a remarkably narrow element of what was an extremely complex operation involving multiple companies, multiple crews, multiple vessels, and hundreds of workers, bosses, and executives. Was the Deepwater Horizon even the right rig for the job? Were the right people present at the right time? Did everyone have the training and information they needed? Were lines of communication open? Did the pressure to move on to the next well contribute to a collective case of “go fever”?
These are among the questions Boebert and Blossom ask, and they pointedly reject the notion that run-of-the-mill cost-cutting can be singled out for blame, because oil companies are always under pressure to cut costs. There must have been more.
Efforts to curb Macondo’s cost do not explain the performance of BP and the Horizon before Macondo. Together they had drilled almost fifty wells without incident, including one that set a record for depth of water and depth of drilling. Any plausible explanation for the Horizon blowout must show why it occurred at Macondo but not at any of the previous wells. Our evidence shows that the explanation lay 200 miles to the southwest, far out in the Gulf of Mexico.
Oil companies must find new oil to replace the oil they sell or they are in danger of going out of business. BP’s strategy was to concentrate that effort on a few high-risk “supergiant” deposits. One was the Horizon’s next assignment: a monster discovery called Kaskida—a once-in-a-lifetime opportunity sixty times as large as Macondo.
BP was at risk of losing its lease to Kaskida if the company did not meet a regulatory deadline for drilling there, and the Horizon was the only rig BP had available for that task. BP executives have steadfastly denied that Kaskida was a factor in the blowout, and both official and unofficial investigations have accepted their denials. Despite that, we believe that such an extreme and widespread epidemic of go fever could arise only from the fear of losing a prize the size of Kaskida.
“Systems analysis” may not scream “page-turner,” but Boebert and Blossom’s book is flat-out compelling. In a foreword, scientist Peter Neumann calls it “an incisive parable for almost everyone involved in risky endeavors.” Drilling for oil is inherently risky, and as we go further and deeper to reach ever more of it, the risks inevitably grow. This risky endeavor involves us all.