Meltdown: What Plane Crashes, Oil Spills, and Dumb Business Decisions Can Teach Us about How to Succeed at Work and at Home - PDF Books Online

Meltdown: What Plane Crashes, Oil Spills, and Dumb Business Decisions Can Teach Us about How to Succeed at Work and at Home

Availability: Ready to download

NAMED A BEST BOOK OF 2018 BY THE FINANCIAL TIMES A groundbreaking take on how complexity causes failure in all kinds of modern systems--from social media to air travel--this practical and entertaining book reveals how we can prevent meltdowns in business and life. "Endlessly fascinating, brimming with insight, and more fun than a book about failure has any right to be, Meltdown will transform how you think about the systems that govern our lives. This is a wonderful book."--Charles Duhigg, author of The Power of Habit and Smarter Faster Better A crash on the Washington, D.C. metro system. An accidental overdose in a state-of-the-art hospital. An overcooked holiday meal. Surprising new research shows that all these events--and the myriad failures that dominate headlines every day--share similar causes. By understanding what lies behind these failures, we can design better systems, make our teams more productive, and transform how we make decisions at work and at home. Weaving together cutting-edge social science with riveting stories that take us from the frontlines of the Volkswagen scandal to backstage at the Oscars, and from deep beneath the Gulf of Mexico to the top of Mount Everest, Chris Clearfield and András Tilcsik explain how the increasing complexity of our systems creates conditions ripe for failure and why our brains and teams can't keep up--with an emphasis on practical solutions. It's an eye-opening, empowering, and entirely original book--one that will change the way you see our complex world and your own place in it.


30 reviews for Meltdown: What Plane Crashes, Oil Spills, and Dumb Business Decisions Can Teach Us about How to Succeed at Work and at Home

  1. 5 out of 5

    C

    I gave this 5 stars because of the shocking insight, research, and ways we're learning more and more about how to avoid a "Meltdown." We rely heavily on computers, but it seems most catastrophes happen due to human error, like the tool left inside the engine of a plane, the scandal at the Oscars, etc. We need more people to speak up if they think they messed up. I enjoyed that part of this book. I commend those individuals. It's a shame that people have died to find out what went wrong and why. It should have been discovered earlier, whether it was laziness or cost-cutting. I agree about over-worked nurses. I didn't want to skip any part of this book because I really wanted to learn and actually wanted more; I felt like I didn't know about a lot of these things that happened. I was shocked by the ATM, Starbucks, and especially by the way people can hack into those things, cars, really. That's a very scary thought. Another part was about the airplanes at night and relying on people on the ground and their instruments in the cockpit. This book was not at all how I thought it would be, and I'm glad I read it because it is an eye-opening book and very well written, the research, just everything. I'm glad there are people who take this seriously and do all they can to fix the problem and do tests to make it a better world for us. I highly recommend it. Thank you NetGalley, the authors, and Penguin Press for this copy. Looking forward to reading more like this. It gives a new perspective. This would make a great documentary if there isn't already one. I just want to grab someone and tell them all about this book. Cherie' #netgalley #penguinpress #meltdown

  2. 4 out of 5

    Mehrsa

    There are some interesting insights in this book, but most of them are derivative of other people's work--either scholars of complexity theory or just management and diversity gurus. Basically, the idea here is that systems that are complex and tightly coupled will experience meltdowns, but there are signs that should not be ignored. Fair enough. And how do you deal with it? Basically by listening to diverse voices, doing a pre-mortem, getting rid of hierarchies and letting all team members speak, etc. I was hoping for more theoretical work on complexity and not just a bland management book.

  3. 5 out of 5

    Chris

    Meltdown is an excellent book for anyone curious about making lives, communities, and the world more resilient. The stories are relevant, authentic, and engaging, and lead directly to lessons worth trying out in our own organizations and systems. In a world that at times can feel replete with disaster, Chris and András remind us that we can take small steps in any place to improve the robustness of our systems and stop meltdowns large and small. In my field of global environmental health, one of the great failures of development funding was the drilling of thousands of wells in Bangladesh to reduce water-borne disease, only to belatedly discover that this “solution” introduced a wholly new disaster from arsenic in the groundwater. Thus, as noble-missioned an organization as UNICEF inadvertently perpetrated “the largest poisoning in history” by not recognizing systemic risk. This sort of unintended consequence and systemic failure, even by well-intentioned actors, is the type of problem that Chris Clearfield and András Tilcsik aim to prevent through better system design in their award-winning book, Meltdown: Why Our Systems Fail and What We Can Do About It. The book starts off with a litany of system failures across industries and scales, from airplanes and nuclear power plants to Starbucks coffee and cooking our Thanksgiving meal. In our modern world, as both systems and problems become more complex and more intertwined (“coupled”), the possibility and scope of disaster grows. Fortunately for us, the majority of Meltdown is oriented towards solutions, relevant from our own kitchen all the way to a war room, for preventing problems from becoming disasters. Chris and András offer specific, actionable advice and tools to improve our systems and reduce disaster.

  4. 4 out of 5

    Alex

    Chris Clearfield brings clarity to complexity by looking at the little things that make for big problems. He goes through the catastrophes in our everyday systems, breaking down why complex systems are prone to failure, accidents, and intentional wrongdoing. Grounded in examples most of us would know from the news, we learn about patterns across failure-prone systems (non-linear cause and effect as well as tight "coupling" across the parts), why our solutions to these patterns tend to fail (we unintentionally patch complexity by adding even more complexity), and how our very human nature leads us to construct these vulnerable systems (through confirmation bias and conformity tendencies, for example). Each part of the book feels anticipatory. The common pitfalls address the inadequate solutions we might come up with if we had just learned about the patterns, while learning about our nature is important for us to be reflective as we execute and maintain these systems. Overall I found the book quite helpful: it arms me with frameworks to label problematic complexity that I see, stories and examples to use as metaphors and parallels (my favorite might be the idea of a VP of Common Sense), as well as a call to reflection for myself (there's a need for humility and willingness to listen, not just knowledge and information, to address these problems).

  5. 5 out of 5

    RSM

    Was lucky to get this in a giveaway. I had high expectations because of the glowing quotes on the back from authors I like (Dan Pink and Charles Duhigg) and I really liked this book. It is very readable and often entertaining, and it is actually much broader than it seems. It uses a single framework to explain a great range of things, from disasters in your kitchen to meltdowns on Twitter to airplane crashes. Some interesting thoughts on where we're headed as a society and a lot of good tips for managing teams/companies no matter what your industry or field is. And like the authors say, you don't need to be a CEO to make a difference.

  6. 5 out of 5

    Ola Rabba

    I got this book pre-release (Amazon Vine) and really enjoyed it. The topic is super-interesting and important. Despite the serious subject matter, this book still manages to be a surprisingly entertaining read. It is a mix of case studies (ranging from very funny to very sad), interesting social science research, and a framework that brings it all together. My favorite parts were the sections about how diversity helps teams avoid failure and how we can use small failures to anticipate big disasters. The book also provides some of the clearest and most accessible analysis (that I have read) of events like the Volkswagen scandal, the lead/water crisis in Flint and the Oscars gala envelope mistake (La La Land!). There is lots of food for thought about where we are headed as a society (some depressing conclusions but also some optimistic ones), some good business advice and even some good life and career advice (for example, about how to avoid bad decisions when buying a new house or launching a project at work). If you enjoy books by Daniel Pink, Adam Grant, Malcolm Gladwell, Jonah Berger, Charles Duhigg, Chip and Dan Heath, and similar authors, you won't be disappointed. It is accessible, entertaining and still quite practical. I learned a lot, and once I got into it, I couldn't put it down.

  7. 5 out of 5

    Meaghan Johns

    "All too often, when we deal with a complex system, we assume that things are working just fine and discard evidence that conflicts with that assumption." This is one of those books that is so relevant to my work in project delivery and the world I (we) live in that, even after borrowing this book from the library, I decided to straight up buy it because of the sheer amount of notes I took. Our world has become increasingly complex, and our systems have become less transparent and more tightly cou "All too often, when we deal with a complex system, we assume that things are working just fine and discard evidence that conflicts with that assumption." This is one of those books that is so relevant to my work in project delivery and the world I (we) live in that, even after borrowing this book from the library, I decided to straight up buy it because of the sheer amount of notes I took. Our world has become increasingly complex, and our systems have become less transparent and more tightly coupled. Meltdown argues that when systems become both complex and tightly coupled, they land directly in the danger zone - the place that causes these catastrophic failures and meltdowns (and also the hit track by Kenny Loggins). Clearfield and Tilcsik offer some much needed wisdom for dealing with complexity and finding ways to avoid these failures. This is an especially great book for project management professionals and anyone who works for a company where safety is paramount, but I think it's also useful for anyone who's interested in better understanding the way system failure works. As a bonus, the book offers up excellent arguments for increased diversity in the workplace and in the boardroom, as well as suggestions for getting employees to speak up when they spot a potential problem. (Plus, plenty of Toronto shout-outs! Yeah boi.)

  8. 4 out of 5

    Fred Hughes

    A great insight into complex systems and the simple humans who try to control them. Communication seems to be the magic solution, as it is for most problems, in an age when texting is the standard mode of communication. A good read.

  9. 4 out of 5

    Dee Eisel

    A few years ago, when I still had Scribd, I found a book by Charles Perrow called “Normal Accidents.” My Goodreads review of it is here. As it turns out, that wasn’t the kind of book I was looking for. “Meltdown” is exactly what I was looking for. It takes Perrow’s theories and provides a more modern and digestible framework. Perrow’s thesis is that in systems with sufficient complexity and tight coupling (not a lot of time or room for error), accidents are inevitable. He calls them normal accidents. “Meltdown” takes this and applies it to more recent accidents - everything from Wall Street crashes to Enron to software bugs to potential issues with dams and nuclear power plants. Perrow was writing in the ’80s - the thing I remember most from his book - while Clearfield and Tilcsik have the advantage of everything he knew and everything that has happened since. This doesn’t make me feel any better on a global scale, because if anything normal accidents have become more normal and expanded into more areas of life. “Meltdown” makes it clear that areas that were formerly loosely coupled are now tightening, such as dam safety. It does also point out areas where active work to decrease issues has been successful, such as cockpit resource management (a philosophy of flight decks where first officers feel more empowered to challenge potentially dangerous actions by their captains). Overall, though, I don’t feel like my world is any safer than it was before. That’s not to say it can’t become safer. Taking lessons from Perrow and other systems analysts can help and has helped many businesses. It’s too bad this wasn’t around before Target rolled out in Canada. Instead of being an object lesson in failure for Clearfield and Tilcsik, they could have been a lesson in success. Five of five stars.

  10. 5 out of 5

    Wendy

    “Meltdown: Why Our Systems Fail and What We Can Do About It,” which I won through Goodreads Giveaways, is a compelling look at system failures and the solutions to avoid a meltdown. Broken into two parts, the first half of the book gives insight into why systems which today are more capable, complex, and less forgiving can be problematic, killing people accidentally, bankrupting companies, and even giving rise to the innocent being jailed. As the authors also point out, even a small change to one of these systems can make them surprisingly more vulnerable to accidental failures, hacking, and fraud. The second part offers useful solutions like learning from small errors, being cognizant of warning signs, listening to the input of skeptics, and using the diversity in a company to avoid big mistakes like the one Target made when management decided to expand into the Canadian market. A serious topic but presented in an entertaining manner, Chris Clearfield and András Tilcsik blend case studies with social science research into a convincing argument that’s utterly captivating and doesn’t let you go until the end. I enjoyed their insights into crises like Three Mile Island, Enron, Washington Metro Train 112, and aircraft disasters, which fuel the positive solutions they provide to stop future failures. “Meltdown: Why Our Systems Fail and What We Can Do About It” is a fascinating book that’s well-written, interesting, and should be at the top of everyone’s reading list.

  11. 5 out of 5

    Daniel

    This book is about catastrophic events, but it takes a totally different approach from The Black Swan. Tight coupling + complexity = meltdown. Fukushima. Long Island. Target in Canada. Enron. Flint water. Washington Metro. Aircraft disasters. The Oscar mix-up. You cannot think of all the potential problems, because complex systems interact with each other and create unforeseeable problems. Each person in the system can only see a small part of it. However, warning signs do appear first. With the rise of automation and algorithms, the world is getting more complex, so disasters are going to happen more often. How to avoid them?
    1. Reduce complexity: not usually possible. So reduce coupling by allowing more time. Try to get feedback.
    2. Be systematic in decision making. Agree on criteria first and then score each option against them (a sketch of this follows the review).
    3. Do a premortem.
    4. Set up an anonymous reporting and improvement system. Of course, this did not work for the UK NHS's Dr Bawa-Garba.
    5. Never ignore warning signs and close calls. But how?
    6. Allow dissenters to voice problems and solutions easily, in a structured manner. Soften power cues.
    7. Have a diverse group, because its members will not blindly trust each other's judgment, avoiding herd mentality. Its members are also able to resist the famous social pressure of the wrong-answer experiment. But this is hard: mandatory diversity programs make things worse! Fortunately, voluntary ones work. Formal rotation and mentoring programs involving everyone also reduce bias, and a diversity tracking program is effective too. Bring amateurs onto the board so every assumption is debated.
    8. Bring in an outsider.
    9. Resist whoever is applying pressure on you, even if it is Steve Jobs. Stop everything and change course completely if necessary. For intense situations that are not going well, pause now and then and take a look at the big picture.
    I learnt so much from this book!
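
    To make the "agree on criteria first, then score" idea in point 2 concrete, here is a minimal Python sketch. The criteria, weights, options, and scores are invented for illustration; they are not from the book.

        # Hypothetical decision matrix: weights are agreed on before anyone scores an option.
        criteria_weights = {"cost": 0.4, "safety": 0.4, "speed": 0.2}

        options = {
            "Option A": {"cost": 7, "safety": 9, "speed": 5},
            "Option B": {"cost": 9, "safety": 6, "speed": 8},
        }

        def weighted_score(scores, weights):
            # Combine 0-10 scores on each criterion into a single number.
            return sum(scores[c] * w for c, w in weights.items())

        for name, scores in sorted(options.items()):
            print(name, round(weighted_score(scores, criteria_weights), 2))
        # Fixing the weights up front keeps a favourite option from quietly bending the criteria.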

  12. 5 out of 5

    Joao Felipe

    The book is brilliant. It starts by describing how our systems work nowadays and how they have evolved from simpler systems with more slack in their structures into ones that can suffer a Meltdown. One of the reviews I'd read before reading it mentioned that the authors quote a lot of research and do very little original work. What I experienced, in fact, made that seem like a compliment, because there is nothing wrong with compiling a ton of research as long as you mention the sources and explain (which the authors do very well) how they are connected. The book, though, is not for every "type" of reader, because it goes into great detail on very specific explanations. For example, to explain the cooling system of a nuclear power plant, the authors use a lot of detail, and if you are not very curious about "how everything works," it might be better to pass on this book. A big pro for this book is that they try to suggest solutions (some managerial tools) and ways to prevent Meltdowns from happening in any situation, from choosing which house to buy to how pilots should behave when landing a plane and the charts don't show detailed altitudes. Do I recommend it? Yes. It is a great book, but it depends on who is reading. If you want a compiled guide about failures in modern systems and how to prevent them, go forward. But if you don't like detailed explanations about various topics or just want a more extensive approach, with more data from the studies they quote, it might be better just to research the studies in the index.

  13. 5 out of 5

    Fiona

    A fascinating gloss of how systems break down: it's all about complexity and coupling, a simple concept with infinite applications. I really wish this book had been longer, a phrase I don't often utter. Thank you to Penguin/Random House for the free copy for review. It was delicious.

  14. 5 out of 5

    Rayfes Mondal

    We increasingly rely on complex systems that fail in unforeseen ways. This book describes many of them and steps we can take to reduce failure. An enjoyable, informative read.

  15. 4 out of 5

    Peter Immanuel

    Great book! Using several fascinating stories, the authors show how a combination of complexity (i.e. when parts of a system interact in non-linear, hidden, and unexpected ways) and tight coupling (i.e. when there is little slack among the parts of a system and the failure of one part easily affects the others) causes meltdowns. The authors then offer suggestions on how to make meltdowns less likely, while talking about the research on which these suggestions are built. Some of these suggestions include: making systems more transparent; removing unnecessary bells and whistles (even warning systems that are more complex than they need to be!); conducting pre-mortems before embarking on a project; making teams more diverse; and welcoming dissent (even if it hurts). It's a fun read, with insights that could be useful at work and maybe even in daily life.

  16. 4 out of 5

    Daryl Moad

    I have never received this book; when are you planning on sending it?

  17. 5 out of 5

    Matthew Wynd

    Overcoming system failure in an increasingly complex world is a daunting topic. This book lays out how every one of us can contribute to building simple and successful systems to stay ahead of complex system meltdown. Favorite topics within this book include pre-mortems and how Charles Perrow's technology classification matrix can help us. Don't skip the epilogue. There is an excellent reference to how Yeats' The Second Coming is used inaccurately to describe why world news seems to be getting progressively worse. One of the most useful and strangely comforting books I have read in the past few years.

  18. 5 out of 5

    Ricky Duncan

    I entered (and won) a giveaway for the book based on the description given. And I was not disappointed: a fascinating account of how simple things cause complex systems to fail, and sort of an explanation of human hubris and of why we need to think out all possible effects but can't, as some of them we can't even conceptualize.

  19. 5 out of 5

    Melissa T

    *I won a copy of this via Goodreads Giveaways* This is an interesting look at different types of meltdowns and failures throughout history. It covers topics ranging from stock market and business failures to retail and medical failures. It seems that the main causes of meltdowns are the intricacy of systems and organizations and the combination of small errors that can add up and lead to big problems. There are a lot of concrete, well-placed examples of these different types of meltdowns. I appreciated the digestibility of these examples. This book is full of research, which can sometimes lead to dry reading and boredom. The research is solid, and the execution is as well, which makes this easy to take in. The book also highlights the true importance of diversity in business and the workplace. The true importance is not the numbers, or the feel-good quota that diversity can bring to a company, but the variety of ideas and, therefore, more innovation. One thing that stood out to me in particular was a section on mandatory workplace programs. The book talked about mandatory diversity programs. When programs are mandatory, people are less receptive to them, and they actually made diversity in these workplaces worse, not better. I applied the concept to a mandatory workplace program that I had to go through, on bullying. Everyone grumbled and groused about it, making all of us less receptive. After the implementation of the mandatory sessions, I actually noticed more instances of what I would consider bullying/disrespectful behavior. This does get a little oversimplified and repetitive in places, but overall a solid book.

  20. 5 out of 5

    Eddie Choo

    Understanding Complex Systems: A wonderful intellectual successor to Charles Perrow's Normal Accidents, which forms the intellectual spine of the book. It doesn't look at the broader economic and social causes of these meltdowns along the way, though, but it is still a sufficient read.

  21. 4 out of 5

    Anna

    **I received a copy of this book through the Goodreads giveaway program.** I don't think I'm the intended audience for this book, but I still enjoyed it. Probably someone with a background in engineering or manufacturing would get more out of this book. Really good use of case studies and anecdotes, and the book covers a lot of ground in a very accessible way. It was a little dry at times, but again, take my perspective with a grain of salt. To learn more about everything that can go wrong with nuclear power systems and Three Mile Island, I would recommend reading Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety.

  22. 4 out of 5

    Kevin

    This is a thought-provoking and highly readable book that was a source of fresh, new ideas for me. I knew some of the stories quite well (VW, Flint, the New York Times serial plagiarist scandal, the Academy Awards debacle, etc.) and was familiar with some of the research, but this book offered the excitement of looking at these events through a different lens.

  23. 5 out of 5

    Alex

    I enjoyed the book - some reviewers say it is heavy on case studies and light on takeaways. I agree, although the case studies are good. Below are my notes of takeaways:
    Meltdowns are more often caused by a number of small component failures that interact in unpredictable ways than by a particular component or operator failure. Failures are more likely to happen when systems are both:
    · Complex: many components are joined by complex (non-linear) interactions, so that the failure of one affects many others. Complexity can cause such rare interactions that it's impossible to predict most of the error chains that will emerge. Complex systems are characterised by indirect or inferential information sources (not directly observable) and unfamiliar or unintended feedback loops.
    · Tightly coupled: when the components of a complex system are tightly coupled, failures propagate through the system quickly. Tightly coupled systems are characterised by: delays are "not possible"; the sequence of events is invariant; alternative paths are not available; there is little opportunity for substitution or slack; redundancies are designed in and deliberate.
    In the last few decades complexity and coupling have increased in most systems, shifting many of them into the 'danger zone'. We are in the 'golden age' of meltdowns, so what should organisations do?
    1) Build better systems and improve decision making. Organisations need to consider complexity and coupling as key variables when planning and building systems. For instance, sometimes redundancy, intended as a safety feature, adds complexity, creating the potential for unexpected interactions. In fact, safety systems are the biggest single source of catastrophic failure in complex, tightly coupled systems. When building systems, a way to reduce complexity is to add transparency. Transparency makes it hard for us to do the wrong thing and makes it easy to realise when we have made a mistake. A number of tools can also be used to make decisions in wicked environments (where there is no feedback), such as:
    · SPIES (subjective probability interval estimates) - a decision-making method shown to produce more precise estimates within a desired probability threshold. Useful for estimating project lengths or anti-tsunami wall heights. (A toy sketch follows this review.)
    · All-pairs testing - using a set of predetermined criteria for decision making. With many criteria the decision becomes too complex; using paired comparison is a way of bringing everything together into one whole picture.
    · Premortem - a managerial strategy in which a project team uses prospective hindsight: imagining that a project has failed, and then working backward to determine what could potentially have led to that failure.
    2) Detect and learn from the early warnings that complex systems often provide. To manage complexity we need to learn from the information our systems throw at us in the form of weak signals of failure: small errors, close calls and other warning signs. Anomalising: organisations such as airlines have figured out a process for learning from small lapses and near misses: gather (e.g. by collecting close-call reports); fix; find root causes (dig deeper); share (sometimes not just within a company but, as with Callback, across the entire aviation industry - a regulator could play a role in that); and audit (ensure solutions actually work). An organisation's culture sits at the centre of all this - a culture must be created where mistakes and incidents that have occurred in the system are openly shared.
    3) Make effective teams. Features of teams that effectively prevent catastrophic failures:
    · Members speak up - when they know about hidden risks or have a suspicion that something is not right. Sceptical voices are crucial. The conditions for this are that 1) people know how to raise concerns, 2) power cues are softened, and 3) leaders speak last.
    · Diversity - diversity creates less familiar and even less comfortable environments that make people more sceptical, critical and vigilant, all of which makes them more likely to catch errors.
    · Outsiders' input - input from other parts of the organisation, or from outside it, is very valuable, as outsiders' position is more objective and lets them see different things than insiders do.
    · Monitor and diagnose - a common source of failure is plan continuation bias: the failure to stop performing a task even when circumstances change. In very tightly coupled systems stopping is not an option (critical surgery, a runaway nuclear reactor), but the teams that best prevent failures are the ones that have learned to rapidly cycle through 'perform task', 'monitor', 'suggest diagnosis'.
    · Role shifting - when a team without much cross-training faces a surprise in a complex system, a meltdown can result. Role shifting requires several people to know how to accomplish a particular task, but it also means everyone needs to understand how the various tasks fit into the bigger picture.
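
    As a toy illustration of the SPIES idea in the notes above, here is a Python sketch. The bins and probabilities are made up, and trimming equal tails is just one simple way to read an interval out of the bin estimates.

        # Hypothetical SPIES estimate for a project length, in months: assign a subjective
        # probability to every bin across the whole plausible range, then derive the interval.
        bins = [(0, 2), (2, 4), (4, 6), (6, 8), (8, 10), (10, 12)]
        probs = [0.05, 0.15, 0.30, 0.25, 0.15, 0.10]  # must sum to 1.0

        def spies_interval(bins, probs, coverage=0.9):
            # Trim roughly (1 - coverage) / 2 of the probability mass from each tail.
            tail = (1.0 - coverage) / 2.0
            lo, lo_mass = 0, 0.0
            while lo_mass + probs[lo] <= tail:
                lo_mass += probs[lo]
                lo += 1
            hi, hi_mass = len(bins) - 1, 0.0
            while hi_mass + probs[hi] <= tail:
                hi_mass += probs[hi]
                hi -= 1
            return bins[lo][0], bins[hi][1]

        print(spies_interval(bins, probs))  # (2, 12) here: wider than most gut-feel "90%" ranges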

  24. 5 out of 5

    Mark Mitchell

    Clearfield's book is a compendium of disaster tales, a collection of ghost stories for the modern age. He tells of train crashes, plane crashes, oil spills, and nuclear meltdowns. In each case, Clearfield shows how "complex, tightly-coupled" systems (defined as those whose inner workings are opaque and where interactions between the parts can cause the failure of the system as a whole) can result in calamity. Clearfield's work references Charles Perrow's research and thinking extensively. Perrow's Normal Accidents: Living with High-Risk Technologies is an older book covering some of the same ground. Clearfield provides an excellent exposition, in easily readable form, of Perrow's thesis: that complex, tightly-coupled systems fail in unpredictable, often-catastrophic ways. Clearfield leads the reader through each actual failure, showing how a cascade of failures caused the whole thing to come crashing down. Then, in the second portion of the book, he attempts to show how better practices can mitigate the risks of such systems. While the ideas are valuable (make it hard for operators to do the wrong thing, learn from near-failure situations, encourage dissent), I am skeptical that they are sufficient. Nassim Nicholas Taleb's Antifragile: Things That Gain from Disorder discusses some of the same ideas. One of Taleb's deep insights is that one should not build systems whose failure is truly catastrophic. A system with a small probability of an unacceptable failure is not a well-designed system. To be fair, in this framework, a plane crash is an acceptable failure, as it kills hundreds and costs merely tens of millions of dollars. In contrast, an unacceptable failure causes tremendous damage over a wide area or to a huge number of people. Clearfield is more concerned with how to reduce the risk of what I have termed an "acceptable" failure. Nonetheless, those interested in Clearfield's book should also read Taleb's work. Those of us trained as engineers tend to lack the humility to refrain from building systems that we cannot possibly make safe, and Clearfield's work helps to show how difficult it is to create a truly safe system. Clearfield's book makes breezy reference to other popular works in related fields, including the outstanding Thinking, Fast and Slow by Daniel Kahneman, books by Malcolm Gladwell, and Smarter Faster Better: The Secrets of Being Productive in Life and Business by Charles Duhigg. Readers who have already digested those titles will find parts of Clearfield's book repetitive. Still, those who have not will receive the benefit of a quick overview of other recent contributions to psychology and habit research. All in all, Clearfield has done readers a service by providing a clear view of the risks of modern systems and important risk-mitigation ideas.

  25. 4 out of 5

    David

    The space shuttle Challenger exploded shortly after its launch on a chilly January morning in 1986. The story of the accident is now well known. O-rings, intended to seal joints in the solid rocket boosters that helped propel the shuttle into orbit, didn’t work because of the cold. Engineers knew that low temperatures affected O-rings, but after a tense conference call the night before the launch, they decided to proceed anyway.
    ---
    In the danger zone, our systems are so complex that it’s hard to predict exactly what will go wrong ahead of time. But there are warning signs—the writing is often on the wall. We just need to read it.
    Rating: 5/5
    meltdown / 'mεlt∙daʊn / noun 1: an accident in a nuclear reactor in which the fuel overheats and melts the reactor core; may be caused by earthquakes, tsunamis, reckless testing, mundane mistakes, or even just a stuck valve 2: collapse or breakdown of a system
    We struggle in complex systems, but adding a little bit of structure to our decisions gives us a fighting chance. One reason people resist these ideas is that they assume that avoiding failure means taking fewer risks; they assume that preventing meltdowns requires sacrificing innovation and efficiency. Indeed, there are trade-offs. Adding slack to a system or redesigning it to cut complexity can mean higher costs and fewer capabilities. And there’s a lot of value in talking about these trade-offs explicitly—and using complexity and coupling as basic parameters when we consider costs, benefits, and risks. But the things that help us manage complex systems don’t always involve painful trade-offs. In fact, there’s now plenty of research showing that many of the solutions we’ve seen in this book—structured decision tools, diverse teams, and norms that encourage healthy skepticism and dissent—tend to fuel, rather than squelch, innovation and productivity. Adopting these solutions is a win-win.
    —
    That’s why we decided to write this book. We wanted people to realize that preventing meltdowns was within their grasp.
    IN THE MIDDLE AGES, humanity faced a grave threat. In October 1347, a fleet of trading ships arrived in Sicily. Most of the sailors were dead; the rest were coughing and vomiting blood. Other ships ran aground before reaching port because everyone aboard had died. It was the start of the Black Death, an epidemic that would go on to kill tens of millions. The disease, which originated in Asia, moved west along the Silk Road with traders and Mongol soldiers. The Mongol army used it as a weapon, catapulting infected bodies over the walls of a trading city it had besieged. The epidemic soon spread to Africa and the Middle East.
    —
    The world was ripe for the spread of the disease. New trade routes connected cities and encouraged movement. People lived in closer quarters than ever before. But humanity wouldn’t develop antibiotics, epidemiology, sanitation, or the germ theory of disease for centuries. It was what one historian called “the golden age of bacteria.” We were vulnerable to epidemics, but our ability to understand—let alone prevent—them lagged far behind. Today, we are in the golden age of meltdowns. More and more of our systems are in the danger zone, but our ability to manage them hasn’t quite caught up. The result: things fall apart. But times are changing. We now know how to bring the golden age of meltdowns to a close. We just need the conviction to try.

  26. 4 out of 5

    Volodya

    You would think that a book about systems failures would be technically difficult to read, and you'd be wrong. This was a page turner. I would have devoured this book in days but chose to make highlights and notes in the margin to apply to our own business. The lessons presented seemed applicable and easy to implement. A few spoilers that I found interesting are below. A review cannot do the book justice; you should read the book. But I'll list some of my highlights and scribbles, because these were surprising and useful to me. In a complex system it is impossible to monitor the inner workings; we rely on external indicators. And if there is little to no slack - tight coupling - then a failure is likely. How much slack is in your business? Such accidents are normal according to Perrow. Not frequent: like it is normal for us to die, but we do it only once. The biggest contributor to catastrophic failures is safety systems. Think of the fixes to complex issues that make the system even more complex and tightly coupled. We are bad at forecasting. When we feel 90% sure of the outcome, we are right less than half the time. Instead of thinking of two endpoints, it is better to have an entire range of possibilities and probabilities. Wicked environments are such that it's hard to check our predictions. It's like learning to cook without being able to taste the food. Complex systems are wicked environments, and our intuition often fails us. The authors gave a great example of a study that showed that during a busy day of hearing parole cases, judges are much more likely to grant parole right after meal breaks. They don't get much independent feedback, and their intuition clearly fails them. I liked that the book provided a tool to avoid such issues in our own lives, the pairwise wiki survey (a toy sketch follows this review). The idea of imagining that it's two years from now and the idea failed - a pre-mortem - was already known to me from other texts, but it is important. Prospective hindsight boosts our ability to identify reasons for an outcome. Our brain fills in the gaps in information by making things up. There is an example of a plane crash that happened because the pilot filled in missing information on approach altitude by making it up, without realizing he had done it. How many times have we done that without such a catastrophe? Since a fundamental problem with complex systems is that we cannot find all problems just by thinking of them, the authors rightly stress the importance of rewarding staff for voicing and showing abnormalities they see. It's rarely done; we usually shush people who bring us problems. My favourite part was about outcome bias. We tend to assume that our system is working well even if our success is due to dumb luck. Since we are mostly judged by the outcomes, not the processes, of our decisions, this is scary. To manage complexity we need to pay attention to hints such as small errors and close calls. This means openly sharing stories of failures without blame.
    Research shows that having power, even perceived power, is a bit like having brain damage: insensitive and impulsive behaviour, dismissing the opinions of others, even experts. That's why the airline industry had more accidents while the captain was flying and not the copilot - 3/4 of all accidents between 1979 and 1990. Thankfully the industry made changes. If you are not encouraging people to speak up, you are discouraging them!
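
    The pairwise wiki survey the reviewer mentions can be approximated in a few lines of Python. The votes below are invented, and ranking by raw share of contests won is a simplification; real implementations use a more careful statistical score.

        # Toy pairwise tally: each vote is (winner, loser) from a randomly shown pair of ideas.
        from collections import defaultdict

        votes = [("A", "B"), ("A", "C"), ("B", "C"), ("A", "B"), ("C", "B")]

        wins = defaultdict(int)
        appearances = defaultdict(int)
        for winner, loser in votes:
            wins[winner] += 1
            appearances[winner] += 1
            appearances[loser] += 1

        # Rank ideas by the share of pairwise contests they won.
        ranking = sorted(appearances, key=lambda o: wins[o] / appearances[o], reverse=True)
        print(ranking)  # ['A', 'C', 'B'] for the toy votes above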

  27. 4 out of 5

    Gail

    Let’s say you wanted to create the most boring-sounding field possible. You might call it “systems science” and choose topics of study like dams, oil rigs, and water treatment plants. But Chris Clearfield and András Tilcsik will have thwarted your plans, producing as they have a page-turner about the paradox of progress: “as our systems have become more capable, they have also become more complex and less forgiving, creating an environment where small mistakes can turn into massive failures.” “Meltdown” covers “large-scale meltdowns like BP’s oil spill in the Gulf of Mexico, the Fukushima nuclear disaster, and the global financial crisis” as well as smaller failures that “seem to stem from very different problems,” but have similar underlying causes and means of prevention. “That shared DNA means that failures in one industry can provide lessons for people in other fields: dentists can learn from pilots, and marketing teams from SWAT teams.” In prose that’s both gripping and easily digested, “Meltdown” summarizes research on “why diversity helps us avoid big mistakes and what Everest climbers and Boeing engineers can teach us about the power of simplicity” as well as “how film crews and ER teams manage surprises—and how their approach could have saved the mismanaged Facebook IPO and Target’s failed Canadian expansion.” Clearfield and Tilcsik demonstrate a knack for choosing fascinating subjects, like hackers who can use an antenna and a laptop to control your insulin pump and the “La La Land” flub at the 2017 Oscars. They also abide by their own lessons in the imparting. Since systems are ripe for failure when they’re (1) complicated and (2) tightly coupled (meaning lots of stuff is closely tied together in a way that begs for a dominoes-style reaction), the authors dumb down the material covered as much as possible (e.g., “TEPCO’s engineers worked in what psychologists call a wicked environment. In such environments, it’s hard to check how good our predictions and decisions are. It’s like trying to learn how to cook without being able to taste the food. Without feedback, experience doesn’t make us into better decision makers.”). Then they reformulate key points so as to add in a little slack for the reader to catch up. The result? A failure-free work of nonfiction.

  28. 4 out of 5

    Paiman Chen

    Learn from small failures to avoid big ones. Diversity feels strange. It’s inconvenient. But it makes us work harder and ask tougher questions. Build diverse teams and listen to skeptics. Training those in power to listen to dissent is crucial, as is encouraging subordinates to speak out. Diversity moves people out of their comfort zone, thus increasing vigilance and avoiding big mistakes. Diversity is critical to catching bad ideas before they manifest in bad consequences. Diverse groups push people outside their comfort zone, which makes for skepticism and vigilance. Those qualities help identify errors. Encouraging dissenting views helps identify problems. Human brain structure reinforces conformity with groupthink. When a person goes against the group, his or her brain activity mirrors that experienced during a highly emotional event. The brain doesn’t want the pain of being an outsider. This explains, in part, why it is difficult to hear dissenting voices. People who feel they are in a position of power tend to dismiss other opinions and are less willing to accept even expert advice. Acknowledging small problems and learning from them helps you find larger threats. “Tight coupling” – a situation where the margin for error is slim – can make small problems in complex systems spiral out of control. No one can see everything going on in a complex system. Its components link in a web, not a linear chain. A tightly coupled system has no tolerance for failure, and the sequence of events is critical. Complex web systems rely on all components functioning correctly at all times. If one thing goes wrong, people have no time to correct it. As a result, multiple failures create a chain reaction.

  29. 4 out of 5

    TheCosyDragon

    This review has been crossposted from my blog at The Cosy Dragon. Please head there for more in-depth reviews by me, which appear on a timely schedule. Why do failures happen in huge industrial and nuclear plants? How do we avoid aeroplane crashes with thousands of planes landing at LAX each day? These ideas are extensively explored in this non-fiction book, which provides a base for you to apply these principles to your own business. I freely admit that business management and so forth is not usually in my personal interest area. However, this book caught my fancy because I knew the stories would interest me. I’ve always enjoyed the Hudson River plane landing, and I knew this book would explore why that actually happened and how it could be prevented. Some of the case studies here are actually talking about how large businesses can hide illegal practices under legal-looking paperwork. My partner (who does have the business management background and a piece of paper to prove it) hasn’t read this book yet but is planning on doing so. I was going to wait to review until she had read it… but it’s been a month. Let’s just say I think it’s appropriate for a range of experience levels. The authors don’t dumb it down, but they do explain the terms that they use, so even a novice like me can understand. It’s sensibly laid out with helpful chapter headings that aren’t obtuse. I’d recommend this book for anyone who runs a business or has an interest in how big ones work. I enjoyed it and would read it again, which is high praise from me for a non-fiction. It’s traveled with me overseas and it is going to travel back home with me.

  30. 4 out of 5

    Eric Lawton

    I was a little disappointed because I'd read an excerpt in a newspaper - probably the best pages. Still, an important subject. I know quite a bit about this subject because my profession was complex IT system design, and I found the book overly simplistic. This is not because some of the tools they suggest, such as those for structured decision making on a small scale, are too simple; they're appropriate at small scales, and often such things seem so obvious that people think they don't need emphasizing. Atul Gawande's book The Checklist Manifesto: How to Get Things Right is a great example. But these authors are not Atul Gawande, and they did not go into detail on how and when the several ideas they introduced are useful. In the end, I think they did an OK layman's job on Why Our Systems Fail, giving lots of anecdotes about failures and how they arose, but less so on What We Can Do About It. There is a lot more to the subject than they hinted at. However, I gave it four stars because they did give a few useful ideas, such as the pre-mortem and structured decision making, that are useful tools to have in your box. It also links examples of what we now know about the blind spots in human psychology to how to use teams to compensate. In summary, not a bad book, but I was hoping for better.
