IEEE Spectrum has a big special feature online now about the Fukushima nuclear disaster and its after-effects. It includes an interactive map showing the impact that Fukushima has had on evacuation of residents, contamination of soil, and contamination of food and water supplies.
It also includes a blow-by-blow account of what happened during the first 24 hours of the disaster. This solid investigative reporting by Eliza Strickland highlights several key points where simple changes could have led to a very different outcome than the one we got.
True, the antinuclear forces will find plenty in the Fukushima saga to bolster their arguments. The interlocked and cascading chain of mishaps seems to be a textbook validation of the "normal accidents" hypothesis developed by Charles Perrow after Three Mile Island. Perrow, a Yale University sociologist, identified the nuclear power plant as the canonical tightly coupled system, in which the occasional catastrophic failure is inevitable.
On the other hand, close study of the disaster's first 24 hours, before the cascade of failures carried reactor 1 beyond any hope of salvation, reveals clear inflection points where minor differences would have prevented events from spiraling out of control. Some of these are astonishingly simple: If the emergency generators had been installed on upper floors rather than in basements, for example, the disaster would have stopped before it began. And if workers had been able to vent gases in reactor 1 sooner, the rest of the plant's destruction might well have been averted.
The world's three major nuclear accidents had very different causes, but they have one important thing in common: In each case, the company or government agency in charge withheld critical information from the public.
I've talked here before about how difficult it is to attribute any individual climatic catastrophe to climate change, particularly in the short term. Patterns and trends can be linked to a rise in global temperature, which in turn is linked to rising greenhouse gas concentrations in the atmosphere. But a heatwave, or a tornado, or a flood? How can you say which would have happened without a rising global temperature, and which wouldn't?
Some German researchers are trying to make that process a little easier, using a computer model and a whole lot of probability power. They published a paper about this method recently, using their system to estimate an 80% likelihood that the 2010 Russian heatwave was the result of climate change. Wired's Brandon Keim explains how the system works:
The new method, described by Rahmstorf and Potsdam geophysicist Dim Coumou in an Oct. 25 Proceedings of the National Academy of Sciences study, relies on a computational approach called Monte Carlo modeling. Named for that city’s famous casinos, it’s a tool for investigating tricky, probabilistic processes involving both defined and random influences: Make a model, run it enough times, and trends emerge.
“If you roll dice only once, it doesn’t tell you anything about probabilities,” said Rahmstorf. “Roll them 100,000 times, and afterwards I can say, on average, how many times I’ll roll a six.”
Rahmstorf and Coumou's "dice" were a simulation made from a century of average July temperatures in Moscow. These provided a baseline temperature trend.
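Rahmstorf's dice analogy can be sketched in a few lines of Python. This is only an illustration of the general Monte Carlo idea quoted above, not the researchers' climate model; the function name and the fixed seed are invented for the example:

```python
import random

def estimate_six_probability(rolls: int, seed: int = 42) -> float:
    """Estimate the probability of rolling a six by simulating many rolls."""
    rng = random.Random(seed)  # fixed seed so the run is reproducible
    sixes = sum(1 for _ in range(rolls) if rng.randint(1, 6) == 6)
    return sixes / rolls

# One roll tells you nothing about probabilities; 100,000 rolls
# converge on the true value of 1/6 (about 0.167).
print(estimate_six_probability(100_000))
```

Run the model enough times and the trend emerges from the noise, which is exactly the property the researchers exploit with their simulated temperature series.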
"A Taxonomy of Operational Cyber Security Risks" by CMU's James J. Cebula and Lisa R. Young is a year-old paper that attempts to classify all the ways that technology goes wrong, and the vulnerabilities that ensue. It's fascinating reading, a great primer on technology and security, and as a bonus, there's a half-dozen science fiction/technothriller plots lurking on every page.
This report presents a taxonomy of operational cyber security risks that attempts to identify and organize the sources of operational cyber security risk into four classes: (1) actions of people, (2) systems and technology failures, (3) failed internal processes, and (4) external events. Each class is broken down into subclasses, which are described by their elements. This report discusses the harmonization of the taxonomy with other risk and security activities, particularly those described by the Federal Information Security Management Act (FISMA), the National Institute of Standards and Technology (NIST) Special Publications, and the CERT Operationally Critical Threat, Asset, and Vulnerability Evaluation (OCTAVE) method.
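The report's top level is simple enough to sketch as a data structure. A minimal illustration in Python, using only the four classes named in the abstract (the report's subclasses and elements are omitted here):

```python
# The four top-level classes of operational cyber security risk from the
# Cebula/Young taxonomy; each is further broken into subclasses in the report.
RISK_CLASSES = {
    1: "actions of people",
    2: "systems and technology failures",
    3: "failed internal processes",
    4: "external events",
}

def risk_class_name(class_number: int) -> str:
    """Look up a top-level risk class by its number in the taxonomy."""
    return RISK_CLASSES[class_number]

print(risk_class_name(4))  # "external events"
```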
A Taxonomy of Operational Cyber Security Risks (PDF)