J**I
Must-read for NASA fans or critics
This book will likely change your perspective on the popular-culture narrative about the Challenger accident. Anyone wanting to think about how a complex workplace creates its own structure and operates within it is also encouraged to read it. It's a thick text, but readable by the non-expert, and it is worth your time.
K**H
Reliability/Maintenance/Refinery Engineering Application
I started reading this book to improve my Root Cause Failure Analysis skills after hearing that it covers, in fine detail, a failure that cost the lives of seven astronauts and destroyed a multi-billion-dollar asset. We are first presented with the popular media viewpoint, which describes how performance-driven NASA administrators aggressively pursued production, political, and economic goals at the expense of personal safety. How a mechanical flaw formally designated as a potentially catastrophic anomaly by NASA and Thiokol engineers became a normal flight risk on the basis of previous good launches. How a last-minute plea from subject matter experts to halt the countdown on an uncommonly cold day in January 1986 was ignored by engineering managers in the decision chain so that the launch schedule would not be compromised.

I remember an early feeling of relief in knowing that while similar performance, production, and scheduling pressures exist in my career, the attitudes that were mostly at fault for the Challenger incident are absent from my refinery and violate all 10 of my parent company's business principles, starting with #1 (conduct all business lawfully and with integrity).

The author then proceeds to shatter every element of this popular emotional impression by presenting a credible account of the failure based on the public record. This is an important point because, unlike with Enron's collapse, there was no shredding of pertinent documents behind the Challenger incident. And it is this matter of public record that can benefit anyone having reliability or production engineering responsibilities within a refinery. Here we find evidence that NASA's best friend - a reliable system built to assure the utmost safety in engineering - was to blame for the tragedy. A system that encourages the challenging of engineering data to validate its meaning. A system that prioritizes safety above any other initiative. A system that requires operation within specified safety limits in order to function. A system that requires vendor/customer interaction. A system with multiple departments, requiring effective communication between each.

I soon realized that the book I was reading was not a book about a tragic point in American history, but a book about managing the risks we routinely encounter in a refinery, using the Challenger incident as the case history to relate them to. Like so many case histories in industry, we benefit by understanding what went wrong and taking proactive measures to prevent it from happening again.

If I owned this refinery and someone came to me saying, "Hey, I'd really like to work here," I would send him or her off with a copy of this book. If that person returned still interested, chances are he or she would get the job.
R**S
Normalization Of Deviance
As a sociological explanation of disastrous decision making in high-risk applications, this book is without peer, exceeding even Charles Perrow's work by a fair measure. Vaughan, a sociologist, obviously worked very hard at understanding the field-joint technology that caused the Challenger accident, and even harder at understanding the extremely complex management and decision-making processes at NASA and Morton Thiokol.

The book ultimately discards the "amoral calculation" school of thought (which she was preconditioned to believe at the outset of her research by media coverage of the event) and explains how an ever-expanding definition of acceptable performance (despite prior joint issues) led to the "normalization of deviance" that allowed the faulty decision to launch to be made. The sociological and cultural analyses are especially enlightening and far surpass the technical material presented about the actual physical cause of the accident.

This is a masterful book, and it is impeccably documented. The reference portion at the back is especially useful, in that she reproduces several key original documents pertinent to the investigation which are difficult to obtain elsewhere. My only objections to the book are the extreme use of repetition, which I think needlessly lengthened it in several areas, and the obfuscating sociological terminology, like "paradigm obduracy", which not only fails to illuminate the non-sociologists among us but makes for somewhat tortured prose.

In praise of the book, however, it is a brilliant analysis of how decisions are made in safety-critical programs in large institutions. Chapter ten, "Lessons Learned," is particularly noteworthy in its analysis and recommendations. It's a shame that managerial turnover ensured that few of the Challenger-era managers were still at the agency during the Columbia accident era. Those who forget history are doomed to repeat it.

This book makes for very weighty and difficult reading. Having said that, I highly recommend it to technical professionals, particularly engineers and managers involved with high-risk technologies. Likewise, it is absolutely imperative reading for safety professionals, consultants, and analysts.
M**O
Thought Provoking
I have purchased this book several times in different formats. It goes into the culture behind NASA's Challenger launch. While that is the primary focus, the deeper focus is on why many modern companies fail because their culture dooms them. When companies become so focused on moving forward that they will not listen to any dissenting voices in the ranks, disasters happen. It is especially significant that when a disaster happens, many companies immediately "react" to it; however, their culture brings them right back into another disaster because change cannot really take place. Many companies talk about the need to change or grow, yet often look on people who have different opinions or ideas as outsiders who "poison" the company culture. Instead of getting rid of these people or alienating them, companies should look at ways to investigate even the wildest claims with open minds instead of instant dismissal. In the case of NASA, the culture fell right back into its old ways, resulting in the loss of a second shuttle, something this book made clear was a possibility before it happened.
M**R
The Blame Game
A book with ideas totally relevant to today, tomorrow and the day after.
S**M
A great read - very thorough
Bought this as part of my Criminology degree and an investigation into the shortcomings of some organisations in making appropriate decisions. A great read - very thorough.
T**T
Five Stars
Great book.
E**R
Risk Management for New and Inherently Risky Technology
On the night before January 28, 1986, after a memorable teleconference with Morton Thiokol (the manufacturer of the solid rocket boosters, SRBs), NASA managers declared the Space Shuttle Challenger "ready for flight". The seals between the segments of the SRBs were classified as an "acceptable risk". What happened the following day is well known.

How could this happen? It is of course tempting and easy to blame the NASA managers involved for misjudging the risk and to accuse them of irresponsible behavior. In fact, however, the situation in the teleconference looked different than it does to us in hindsight.

The teleconference had come about because some Morton Thiokol engineers had raised concerns about the unusually low temperatures at the Cape Canaveral launch site. In a hastily assembled presentation, they tried to portray the low temperatures as unusual and as a risk to the launch. This line of argument went thoroughly wrong, because similarly low temperatures had occurred at a launch a year earlier and no problems had arisen. Qualitative evidence, such as the extreme degree of damage to the SRB O-rings a year before, was not presented. Worse: the "burden of proof" was reversed; that is, the default assumption was "cleared for launch", and it was up to the engineers to refute that assumption. They did not succeed.

In her book, Diane Vaughan demonstrates that the course of the teleconference and of the launch-clearance decision process was not a singular event but the consequence of a multitude of factors. First there was political pressure, which presented the shuttle flights as routine and expected NASA to turn a commercial profit from them. As a consequence, the fundamentally poor design of the seal between the SRB segments was never corrected, for lack of time. Then there was the creeping expansion of the "acceptable risk"; in other words, what has worked so far will also work in the future (even when the boundary conditions have changed). A further factor was redundancy (two seals in series) and the associated fallacy that if the first seal is blown through by the hot gases, the second will hold. One must also know that the seals were one critical component among hundreds of other critical components of the shuttle. Then there was NASA's "can do" mentality: whatever we at NASA undertake will succeed. Finally, NASA's communication structures and hierarchy were a contributing factor, since they prevented essential technical details from being communicated upward.

Is there a solution to these serious, system-immanent dangers in risk assessment in large (and perhaps also small) organizations? Possibly, but it is certainly not simple. Vaughan does make some suggestions, but this is presumably a topic for another book.

The value of this book lies, of course, in its meticulous account of the events, down to the "why" of the statements in the teleconference, and of the development of NASA's culture. Above all, however, it lies in leading the reader to think about risk not only as a technical factor but also as a (corporate) cultural factor. Precisely at a time when much is said about "functional risk" and "defect-free products", everyone who has anything to do with risk assessment should read this book.
P**N
Excellent, serious book.
This book deserves its reputation as a classic in the field of decision-making. Superb. This is not a light treatment of the topic; the length and detail are best suited to "serious" students of the subject.