What We Should Learn From the Tragic Failure of the Titan

Thought Sparks

Rita McGrath

Introduction

My last “Thought Spark” was published before the dreadful implosion of the Titan submersible. In it, I was talking about intelligent failures. This time around, let’s focus on what my colleague Amy Edmondson would call a preventable complex failure.

In her brilliant forthcoming book, “Right Kind of Wrong,” Amy Edmondson discusses three kinds of failures. There are the potentially intelligent ones, which teach us things in the midst of genuine uncertainty, when you can’t know the answer before you experiment. There are the basic ones, which we wish we had never experienced and would gladly take a ‘do-over’ on. And then there is the class of failure that the doomed Titan submersible falls into – the preventable complex systems failure.

Three kinds of failures and why this one has lessons for us all

Hubris, failing to heed warnings, and more

Amy and her colleagues have written about organizations that managed to overlook or underplay “ambiguous” warnings that bad things were afoot. To counteract such threats, organizations need to be able to envision a negative outcome, prepare as teams to take coordinated action, and be open to receiving information that such a negative outcome might be unfolding. But the Titan? Here, the threat wasn’t even ambiguous.

Acknowledging small failures before they become big ones…

While ignoring the warnings of what Andy Grove might have called “helpful Cassandras” is bad enough, OceanGate CEO Stockton Rush also violated one of the cardinal rules of preventing large-scale complex systems failures: recognize and account for small-scale failures before they build up into large ones. And by all accounts, the troubled Titan submersible experienced plenty of smaller failures before the one that was ultimately catastrophic. The first submersible the company created didn’t survive testing and was scrapped.

Normal accidents, failure and high reliability organizations

Charles Perrow, in his insightful 1984 book Normal Accidents: Living with High-Risk Technologies, pointed out that unless they are very carefully designed, complex systems will be failure-prone because their components interact in ways people find difficult to anticipate. Since the publication of that book, we’ve learned a tremendous amount about operating complex systems safely.

Want to spark some thinking in your own organization?

Book Now

https://thoughtsparks.substack.com/
