Except where indicated, Boing Boing is licensed under a Creative Commons License permitting non-commercial sharing with attribution
When bombs exploded at the Boston Marathon on Monday, my Facebook feed was immediately filled with urgent messages. I watched as my friends and family implored their friends and family in Boston to check in, and lamented the fact that nobody could seem to get a solid cell phone connection. Calls were made, but they got dropped. More often, they were never connected to begin with. There was even a rumor circulating that all cell phone service to the city had been switched off at the request of law enforcement.
That rumor turns out not to be true. But it is a fact that, whenever disaster strikes, it becomes difficult to reach the people you care about. Right at the moment when you really need to hear a familiar voice, you often can't. So what gives?
To find out why it's frequently so difficult to successfully place a call during emergencies, I spoke with Brough Turner, an entrepreneur, engineer, and writer who has been working with phone systems (both wired and wireless) for 25 years. Turner helped me understand how the behind-the-scenes infrastructure of cell phones works, and why that infrastructure gets bogged down when lots of people are suddenly trying to make calls all at once from a single place. He says there are some things that can be done to fix this issue, but, ultimately, it's more complicated than just asking what the technology can and cannot do. In some ways, service failures like this are a price we pay for having a choice and not being subject to a total monopoly.
Read the rest
A meteor has exploded over Chelyabinsk, a remote part of Russia 150 km north of Kazakhstan. The meteor's descent was captured by many video cameras (largely the ubiquitous Russian dashboard cams, it seems). There are no reports of deaths, but apparently there are now 400 reported injuries. At least one large building, a zinc factory, had its roof demolished by the explosion.
A witness in Chelyabinsk reported hearing a huge blast early in the morning and feeling a shockwave in a 19-storey building in the town centre.
The sounds of car alarms and breaking windows could be heard in the area, the witness said, and mobile phones were working intermittently. "Preliminary indications are that it was a meteorite rain," an emergency official told RIA-Novosti. "We have information about a blast at 10,000-metre altitude. It is being verified."
"I was driving to work, it was quite dark, but it suddenly became as bright as if it was day," said Viktor Prokofiev, a 36-year-old resident of Yekaterinburg in the Urals mountains.
"I felt like I was blinded by headlights," he told Reuters.
Meteorite explosion over Russia injures hundreds [The Guardian]
Now this is how you do multimedia.
At The New York Times, John Branch tells the amazing, terrifying story of 16 backcountry skiers and snowboarders caught in an avalanche in the Cascade mountains in February 2012. The article, by itself, is a must-read. But you should also take a look at the absolutely fantastic way that Branch and his editors put the online medium to good use — embedding interactive maps, photos that move like something out of Harry Potter, and more standard videos into a lovely, fluid design.
Alissa Walker, who pointed me toward this piece, said that she felt cold just reading it. And you really do get that feeling. All the elements of Branch's article are brought together in a way that enhances the urgency and amplifies your sense of experiencing somebody else's story. It's really, really, really fantastic.
We have this idea that physical crowds are stupid herds. Give them half a chance, and they'll form a stampeding riot mob driven by emotion. Look at history, though, and you'll see many examples of large groups of people being perfectly well-behaved. In fact, in disaster situations, like on 9/11, crowds can even organize themselves in practical ways to help others to safety.
Meanwhile, we tend to talk about virtual crowds — the kind that form online, or between physically distant members of a professional community — as smart. But if that's always true, why do these groups get caught up in financial bubbles and why isn't Twitter a more reliable place to pick up breaking news?
Physical crowds and virtual crowds are different things. But our stereotypes about them stem from a common problem. In both cases, we tend to treat "the crowd" as if it's a distinct entity — as if, at some point, individuals in a group stop being themselves and start to become limbs of a crowd creature. In my latest column for The New York Times magazine, I learned that that's not the way people work in real life. As Clark McPhail, emeritus professor of sociology at the University of Illinois at Urbana-Champaign, told me, "Crowds don't have a central nervous system."
Gustave Le Bon was one of the first people to write about crowds as entities separate from the people in them. His 1895 book, “The Crowd: A Study of the Popular Mind,” shaped academic discussions of human gatherings for half a century and encouraged 20th-century fascist dictators, including Benito Mussolini, to treat crowds as emotional organisms — something to be manipulated and controlled. (Perhaps a Le Bonian understanding of crowds makes us feel more comfortable about the atrocities of the 20th century.) But “The Crowd” was more a work of philosophy than of science, McPhail told me. Le Bon’s ideas were based on armchair analysis of past events, not on carefully documented studies of crowds in action. In the 1960s, sociologists began to study protests and public gatherings, and they realized that the things they believed about crowd behavior didn’t align with what took place in the real world.
This image, made using a laser mapping technology called LIDAR, was taken on September 17, 2001. It shows a 3-D model of the rubble left behind in lower Manhattan following the attacks on the World Trade Center.
Minnesota Public Radio's Paul Tosto has a really interesting peek into the way mapping techniques like LIDAR were used to help rescuers and clean-up crew understand the extent of the damage, look for survivors, and rehabilitate the area around the disaster zone.
The Library of Congress work also includes data from a thermal sensor flown at 5,000 feet over Ground Zero that provided images to track underground fires that burned for weeks at the site.
It's worth remembering that Google Earth didn't exist back then. The ancient science of cartography has been reborn with the technology of the last decade. Let's hope it's not called on again to map destruction.
Via Peter Aldhous
On February 1, 2003, the space shuttle Columbia broke up in the sky over Texas, bits and pieces falling onto at least two states. All seven astronauts on board died. As we close in on the 10-year anniversary of the disaster, you can expect lots of media outlets and experts to start offering their take on what happened and what we've learned from it. But there's one voice that you should really be listening to ... and he's speaking already.
Wayne Hale was a flight director on the space shuttle for 40 or 41 missions (his blog says 40; his NASA bio says 41). Flight controllers are the people who manage a space flight—they deal with the logistics, monitor all the various systems of the vehicle, make the decision to launch or abort, and handle trouble-shooting. In other words, they play a key role in safety, and the flight director is the person in charge of all the flight controllers.
More importantly, Wayne Hale is one of the people who suspected something might be wrong with Columbia before its fatal reentry, and tried to get his superiors at NASA to pay attention to the risks. Here's Dwayne Day writing at The Space Review:
During the Columbia accident investigation I was one of over 100 staff members who worked for the CAIB (not all of them worked simultaneously, and for the many months I was there, the staff probably numbered no more than 50–60). There were so many aspects to the investigation that it was impossible to follow them all, and my responsibility was for policy, history, and budget, and later, some of the issues concerning schedule pressure. But I remember one afternoon when I was talking with an Air Force colonel skilled in aircraft accident investigations when Hale’s name came up and I asked how Hale had been involved in the accident. The colonel explained how Hale had been one of the people who had been concerned about the foam strike during the flight and had tried to obtain on-orbit imagery of the orbiter during its mission, only to be rebuffed by upper level managers. Then, after a short pause, the colonel added: “Hale was one of the good guys.”
But being one of the good guys doesn't mean you don't feel guilty when something goes horribly wrong. On Tuesday, Hale posted on his blog about the Columbia disaster and what is going on in his head as the anniversary creeps closer. It's a sad, poignant post, and Hale promises it's just the beginning of a series of articles addressing his experiences before, during, and after the Columbia disaster:
All of this has brought the searing memories from a decade ago into the forefront of my mind. Not that those memories have ever left me; the memories of early 2003. I was intimately involved in the events leading up to the Columbia tragedy so maybe that is to be expected. But often in the wee hours of the morning when sleep fails, the questions return: why did it happen, how did we allow it to happen, and what could I have done to prevent it.
Some others who lived through those days remember things from different perspectives, they had different experiences, but – somewhat frighteningly – remember events we shared in common in different ways. The passage of time, too, is riddling my memories with holes like Swiss cheese. Names escape me, details are getting fuzzy, and though concentrated thought can bring some things back from the recesses, others are gone forever. Some memories stand out like a lightning bolt in a dark night; many others of those events are gone into the darkness. If I am ever to write down my experience, the time is now.
Basically, you should bookmark Wayne Hale's blog and check it frequently. He'll be posting regularly over the next several months, and I am certain you'll want to read the full series.
Via Alexandra Witze
It began with a few small mistakes.
Around 12:15 on the afternoon of August 14, 2003, a software program that helps monitor how well the electric grid is working in the American Midwest shut itself down after it started getting incorrect input data. The problem was quickly fixed. But nobody turned the program back on again.
A little over an hour later, one of the six coal-fired generators at the Eastlake Power Plant in Ohio shut down. An hour after that, the alarm and monitoring system in the control room of one of the nation’s largest electric conglomerates failed. It, too, was left turned off.
Those three unrelated things—two faulty monitoring programs and one generator outage—weren’t catastrophic, in and of themselves. But they would eventually help create one of the most widespread blackouts in history. By 4:15 pm, 256 power plants were offline and 55 million people in eight states and Canada were in the dark. The Northeast Blackout of 2003 ended up costing us between $4 billion and $10 billion. That’s “billion”, with a “B”.
But this is about more than mere bad luck. The real causes of the 2003 blackout were fixable problems, and the good news is that, since then, we’ve made great strides in fixing them. The bad news, say some grid experts, is that we’re still not doing a great job of preparing our electric infrastructure for the future.
Read the rest
Image: A worker at Rocky Flats handles a piece of plutonium using gloves built into a sealed box. The plutonium was bound for the innards of a nuclear bomb. National Archives via Wikipedia.
Kristen Iversen grew up in the shadow of two big secrets. The first was private. Her father was an alcoholic, and his problem grew bigger and harder to ignore or hide as Iversen got older. But the other secret didn't belong to just her and her family. Instead, it encompassed whole Colorado communities, two major corporations, and the US government.
Iversen grew up near Rocky Flats, a nuclear weapons plant near Denver. In much the same way as Iversen's family related to her father's alcoholism, Rocky Flats presented risks that nearly everyone involved preferred to ignore or cover up. In fact, years after several public exposés had made it very clear that Rocky Flats made nuclear bombs and that the corporate and government entities that ran the facility had cut corners and allowed massive amounts of plutonium to escape into the surrounding environment, people who lived in Iversen's neighborhood near the plant still refused to give up their long-held belief that it produced nothing more than Scrubbing Bubbles and dishwashing detergent.
Full Body Burden: Growing Up in the Nuclear Shadow of Rocky Flats is a memoir—albeit one that captures documented history as well as a family's private struggles. It's not really meant to be a book about science. But I think it's a powerful, well-written memoir that science buffs should read.
Read the rest
This photo, taken by the National Oceanic and Atmospheric Administration, shows a pair of shoes resting on the sea floor amidst the wreck of the Titanic. It's a powerful image, on its own. But with the background information, it becomes downright tear-jerking.
Note the position of those shoes. Now think about the position of your feet if you were to lie on the floor, on your side, with your ankles crossed. This is not a coincidence. On his blog, NPR's Robert Krulwich quotes Titanic explorer Robert Ballard:
"There used to be bodies in those shoes. The body parts deteriorated, and the skeletal remains decalcified. The only thing left are the shoes, and the leather is perfectly preserved." The tannin in the shoe leather had apparently resisted the bacteria.
Read the rest of Robert Krulwich's post on shoes at sea, including information about the Titanic, as well as shoes spilled from lost shipping containers.
Yesterday, I got to host an eye-opening Q&A with Dan Edge, a PBS FRONTLINE producer who just finished a documentary about what happened at Fukushima during the first few days of the nuclear crisis there.
During that discussion, we touched a bit on the psychological impact all of this—the earthquake, the tsunami, the nuclear meltdowns—has had on the Japanese people. From studies of what's happened to the people who lived near Chernobyl and Three Mile Island, we know that the fear and stress associated with these kinds of disasters can have complex and long-lasting health effects.
Today, Paul Voosen, a journalist with Greenwire, emailed me a story he wrote last year, during the first month of the Fukushima crisis, that delves into some of the science behind how disasters (and especially nuclear disasters) affect the human psyche. If you've already read it, it's worth reading again.
Certainly, lasting scars of emotional distress -- which, at its worst, can manifest itself as serious depression or post-traumatic stress, among other symptoms -- are what researchers found in young mothers and others directly affected by past nuclear accidents at Three Mile Island in 1979 and seven years later at the much more serious Chernobyl meltdown in Ukraine.
"What's most striking," Bromet said, "both about Three Mile Island and Chernobyl, which are obviously completely different events with different environmental consequences, is that the emotional consequences just never end."
The Fukushima crisis is, of course, an incredibly difficult situation for Japan's authorities and residents. Caution is more than justifiable when it comes to radiation, and the fear and stress that could stem from radiation risk warnings would be difficult to prioritize over immediate health concerns, said Johan Havenaar, a Dutch psychiatrist who has worked with Chernobyl evacuees.
"It is an understandably frightening situation for [the Japanese]," he said, "even if the risk is small and the measure predominantly precautionary. ... It would be unfair to suggest that the psychological effects -- i.e. their fears -- are unjustified."
What authorities should do, and often fail to do, is treat mental and physical health problems with equal respect, understanding that the two go hand in hand, Bromet said. They must respect the persistent fears about radiation exposure that will form in Japan, no matter how low the actual exposure, and recognize that those fears can take a permanent toll on people's lives, she said.
If you want to know more about this, there are several other links I'd recommend:
• Charles Q. Choi wrote a great piece during his tour of Chernobyl last year about the health effects of that disaster, and why it's actually easier to spot the mental health impacts than the effects of radiation exposure.
• The Centers for Disease Control and Prevention has a primer that explains how disasters affect the mental health of different groups of people, and how the impacts vary a lot based on how close you were to the tragedy.
• Chernobyl's Legacy is a document produced by a study group made up of the United Nations, the World Health Organization, the International Atomic Energy Agency and others. It summarizes a lot of the research showing both the mental health impact of that disaster, and how authorities have failed to respond to it.
• Another good paper, if you can find a full, free copy of it: Psychological and Perceived Health Effects of the Chernobyl Disaster: A 20-year Review.
Last night, PBS FRONTLINE aired a new documentary about what happened at the Fukushima nuclear power plant during the crucial first days of that crisis. Using amateur video shot during the earthquake and tsunami, interviews with power plant workers who were on the scene, and some astounding footage taken inside the power plant itself, the documentary is extremely powerful. It feels weird to say this, given the effect the meltdowns have had on Japan's energy situation and the lives of the people who lived and worked near the plant ... but it seems as though Fukushima could have been a lot worse. The documentary shows us the valiant risks taken by firemen and plant workers. It also shows us the moments where, in the midst of the Japanese government and utility company TEPCO doing a lot of things very wrong, individuals stepped up to make decisions that saved lives. Without those things, this would have been a very different (and much darker) story.
In about ten minutes, I'm going to be moderating a live Q&A with Dan Edge, the producer of Inside Japan's Nuclear Meltdown. I'll be asking him some questions about the story, and the process of filming a documentary like this. There will also be opportunities for you to ask Edge some questions. (And I already know y'all are good at coming up with interview questions.)
You can follow along, or join in on the discussion, using the chat box embedded in this post. Hope to see you there!
Right now, I'm reading a book about why catastrophic technological failures happen and what, if anything, we can actually do about them. It's called Normal Accidents by Charles Perrow, a Yale sociologist.
I've not finished this book yet, but I've gotten far enough into it that I think I get Perrow's basic thesis. (People with more Perrow-reading experience, feel free to correct me, here.) Essentially, it's this: When there is inherent risk in using a technology, we try to build systems that take into account obvious, single-point failures and prevent them. The more single-point failures we try to prevent through system design, however, the more complex the systems become. Eventually, you have a system where the interactions between different fail-safes can, ironically, cause bigger failures that are harder to predict, and harder to spot as they're happening. Because of this, we have to make our decisions about technology from the position that we can never, truly, make technology risk-free.
I couldn't help but think of Charles Perrow this morning while reading Popular Mechanics' gripping account of what really happened on Air France 447, the jetliner that plunged into the Atlantic Ocean in the summer of 2009.
As writer Jeff Wise works his way through the transcript of the doomed plane's cockpit voice recorder, what we see, on the surface, looks like human error. Dumb pilots. But there's more going on than that. That's one of the other things I'm picking up from Perrow. What we call human error is often a mixture of simple mistakes, and the confusion inherent in working with complex systems.
Read the rest