Editor's Choice

Meltdown: Why Our Systems Fail and What We Can Do About It

March 23, 2018


Chris Clearfield and András Tilcsik have written a book about the world’s increasing complexity that makes that complexity easy to understand, and is oddly enjoyable to read.

Meltdown: Why Our Systems Fail and What We Can Do About It by Chris Clearfield & András Tilcsik, Penguin Press, 304 pages, Hardcover, ISBN 9780735222632

To understand Chris Clearfield and András Tilcsik’s new book, Meltdown, you have to go back to the work of sociologist Charles Perrow and his 1984 book Normal Accidents, which grew out of the President’s Commission on the Accident at Three Mile Island. Asked for a ten-page report on “the reliability of industry,” Perrow instead produced a forty-page organizational analysis of the causes of the partial meltdown that occurred there and of other accidents in “high risk, complex and highly interdependent operator-machine systems.” His conclusion was that “The system caused that accident, not the operators,” and that such accidents were inevitable because they are “a consequence of the complexity and interdependence that characterize the system itself.” As Clearfield and Tilcsik put it:


The failure was driven by the connections between different parts, rather than the parts themselves.  


At the time Perrow wrote his book, most industries were less complex and less tightly coupled—indeed, most of life was. There was more slack in the system. But the world keeps growing more complex and more connected:


When Perrow published Normal Accidents in 1984, the danger zone was sparse: it included systems like nuclear facilities, chemical plants, and space missions. Since then, all kinds of systems—from universities and Wall Street firms to dams and oil rigs—have become more complex and tightly coupled.


“No system,” the authors state, “is immune to this shift.” And that makes those systems more vulnerable. You can see it in the flash crashes that have hit stock exchanges as algorithms have taken over trading from human beings, or in the Deepwater Horizon disaster, which spilled nearly five million barrels of oil into the Gulf of Mexico after BP drilled an oil well deeper into the Earth than any company before it.

Not all of these catastrophes are accidental. As our consumer purchases migrate online (which, because cash registers are now computers connected to the internet, they are even if we’re shopping in the analog world), hackers are engaged in a constant effort to get at companies’ customer information. Indeed, the authors note, “billions of new devices are now part of a network often called the Internet of Things, a huge, complex system vulnerable to both accidents and attacks.” Andy Greenberg wrote an article for Wired magazine about how hackers can take control of a car and all of the systems within it through its cellular connection—something he experienced firsthand when he let “Hackers Remotely Kill a Jeep on the Highway—With [Him] In It.” Even more bizarre is the authors’ account of an episode of Homeland in which terrorists hack the vice president’s pacemaker to kill him: not only is the scenario not far-fetched, but hacker Barnaby Jack thought it unrealistic only because the show made the attack seem too difficult to pull off. In fact:


Vice-President Dick Cheney had his cardiologist disable the wireless function in Cheney’s implanted device to avoid just such an attack.


The authors explore all these scenarios and more to explain how our systems can fail or be manipulated as they become more complex and connected. The story of Facebook and Cambridge Analytica, in the news this week, puts the evidence before us anew.

Complexity also makes it easier to cheat—for companies themselves to perpetrate and mask fraud. Enron was named by Fortune magazine as the most innovative company in America six years in a row because it was able to hide its fraud in the complexity of modern corporate financial accounting. Volkswagen’s ability to cheat on emissions tests rested on embedded software that gamed complex monitoring systems.

The first service the authors provide in Meltdown is to explain all this complexity and the risks inherent in it. The second, and much lengthier, service is to explain how we can manage that risk. And, while the solutions, tools, and techniques they offer may not be easy (indeed, some of them are effective precisely because they’re hard), most of them are remarkably simple. One of those tools, invented by Gary Klein, author of The Power of Intuition, is the premortem, a way to reframe potential risk. Instead of imagining what might go wrong, you imagine that something has already gone wrong and write down all the reasons why the failure occurred. It is based on a concept called prospective hindsight.


If an outcome is certain, we come up with more concrete explanations for it—and that’s a tendency the premortem exploits. It reframes how we think about causes, even if we just imagine the outcome.


They discuss why we should give priority to a set of predetermined criteria, rather than to our intuition, when making decisions on complex matters. They stress the importance of dissent, how we can surface it productively in conversations and meetings, and how leaders can soften power cues to put themselves on a more equal footing so that they are more approachable and can hear alternative opinions. They write about the importance of diversity in helping stop the spread of groupthink and conformity by bringing in other viewpoints. Most people want to work with a group of like-minded people, to be able to avoid conflict and reach consensus quickly. But “[h]omogeneity,” the authors insist, “makes things too easy.”


It leads to too much conformity and not enough skepticism. It makes it easier for us all to fall for bad ideas.


Those homogeneous groups do reach consensus more easily and are more likely to feel confident in their decisions. But in complex environments, that is a bug, not a feature. Diversity is important because it makes our work harder, not easier. As the authors put it:


Diversity works because it makes us question the consensus.


There is so much more to explore, and the authors use a diverse array of examples—from Target’s expansion into Canada and the Aviation Safety Reporting System (ASRS) run by NASA to the Devil’s Advocate in the Catholic Church and even a modern family’s morning routine—to illustrate the techniques and tools they offer. In a world that can sometimes seem exasperating in its complexity and change, they offer guidance that is not only easy to understand but actually fun to read.

