Grand Guignol was a Parisian theater that, between 1897 and 1962, staged macabre plays known for their cartoonish horror and violence. LIFE shares with us vintage photos of this splatterpunk paradise. Above: "Burned by vitriol thrown at him by his girl who comes to seek forgiveness, her lover turns slowly to reveal his elaborately blighted face. Then he strangles her." "Shock Value: Inside Paris' Grand Guignol Theater, 1947"
Right now, I'm reading a book about why catastrophic technological failures happen and what, if anything, we can actually do about them. It's called Normal Accidents by Charles Perrow, a Yale sociologist.
I've not finished this book yet, but I've gotten far enough into it that I think I get Perrow's basic thesis. (People with more Perrow-reading experience, feel free to correct me here.) Essentially, it's this: When there is inherent risk in using a technology, we try to build systems that take into account obvious, single-point failures and prevent them. The more single-point failures we try to prevent through system design, however, the more complex the systems become. Eventually, you have a system where the interactions between different fail-safes can, ironically, cause bigger failures that are harder to predict and harder to spot as they're happening. Because of this, we have to make our decisions about technology from the position that we can never truly make technology risk-free.
I couldn't help but think of Charles Perrow this morning, while reading Popular Mechanics' gripping account of what really happened on Air France 447, the jetliner that plunged into the Atlantic Ocean in the summer of 2009.
As writer Jeff Wise works his way through the transcript of the doomed plane's cockpit voice recorder, what we see, on the surface, looks like human error. Dumb pilots. But there's more going on than that. That's one of the other things I'm picking up from Perrow: what we call human error is often a mixture of simple mistakes and the confusion inherent in working with complex systems.