I've been following the story about the scientists who have been working to figure out how H5N1 bird flu might become transmissible from human to human, the controversial research they used to study that question, and the federal recommendations that are now threatening to keep that research under wraps. This is a pretty complicated issue, and I want to take a minute to help you all better understand what's going on, and what it means. It's a story that encompasses not just public health and science ethics, but also some of the debates surrounding free information and the risk/benefit ratio of open-source everything.

H5N1, the famous bird flu, is deadly to humans. Of the 566 people known to have contracted this form of influenza, 332 have died, a fatality rate of nearly 60%. But, so far, the people who have caught bird flu don't seem to have contracted the disease from other humans, or passed it on. Instead, they got it from birds, often farm animals with which the victims were living in close contact. H5N1 was first identified 14 years ago, and there's never been a documented case of it being passed from person to person.

But that doesn't mean such a leap is impossible.

That's because of how the influenza virus works. Influenza is made up of eight pieces of RNA, containing 10 genes, which all replicate independently of one another, with no system for error correction*. That means more opportunities for mutations to arise that change what the virus does and who it can infect. Think of it like dice. Genetic replication is like putting a die in a jar, shaking it up and seeing what you get. Every organism does that. But influenza has eight dice, not one. So it accumulates mutations faster. As a bonus, influenza viruses that infect the same host can share genes, essentially creating a baby virus that carries traits from different parents.
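For readers who like to see the dice analogy in action, here's a toy simulation. It's purely illustrative, not real virology: the per-segment mutation rate is a made-up number, and real mutation biology is far messier. The point is just that a genome split into eight independently replicating, error-prone pieces racks up changes faster than a single piece would.

```python
import random

random.seed(42)  # fixed seed so the toy run is repeatable

# Hypothetical per-segment chance of a mutation per replication cycle.
# The specific value is invented for illustration.
MUTATION_RATE = 0.05

def mutations_after(generations, segments):
    """Count mutations accumulated across all segments over many replications.

    Each segment independently "rolls the dice" every generation, so more
    segments means more chances per generation for something to change.
    """
    total = 0
    for _ in range(generations):
        total += sum(1 for _ in range(segments) if random.random() < MUTATION_RATE)
    return total

one_segment = mutations_after(1000, 1)     # a single die in the jar
eight_segments = mutations_after(1000, 8)  # influenza's eight pieces of RNA

print(one_segment, eight_segments)
```

Run it and the eight-segment virus ends up with roughly eight times as many accumulated changes, which is the whole dice-in-a-jar point.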

That's why, despite 14 years of relatively low-risk behavior, scientists are still concerned about what H5N1 might do in the future. All it would take, theoretically, is the right roll of the dice, and suddenly you have a flu virus with a 60% kill rate that can pass from person to person.

At least, theoretically. Could that actually happen? And, if so, how likely is it that the "right" bad combination of genes will come up? You can see why these are important questions to ask, and that brings us to the controversy.

Studying the genetics of H5N1 is nothing new. Its genome was fully sequenced back in 2004, for instance. But, until 2011, nobody had ever tested a pretty fundamental idea in the control and management of H5N1: the theory that its genetics prevented it from being both super deadly and transmissible from person to person.

That theory hinged on what we know about one of the proteins in H5N1, specifically the hemagglutinin protein designated H5. Here's the LA Times:

Strains carrying the H5 type of a key influenza protein that helps the virus bind to cells in a host had never evolved to travel through the air from person to person. Even if H5N1 did evolve such an ability, some researchers reasoned that it might do so at the expense of its ability to take hold deep in the lung. And that would make it less lethal.

As one scientist described it to the LA Times, this theory was, basically, "We've not seen this happen before so it can't happen." But that's not a particularly strong basis on which to pin all your fears about a global pandemic.

That's why researchers in Europe and the U.S. decided to try something risky: see whether they could prompt existing H5N1 viruses to mutate into the very thing everybody's been dreading. Not much is publicly known about this research, but, at Slate.com, Carl Zimmer explains what is known:

They've carried out their experiments on ferrets, which respond to flu viruses much like humans do. What few details we know of the unpublished research comes from a talk Dutch virologist Ron Fouchier gave in August at a virology conference, along with subsequent news reports. Fouchier began the experiment by altering the H5N1 virus's genes in two spots. Then he passed the virus from one ferret to another, allowing the virus to mutate and evolve on its own inside the animals. After several rounds, Fouchier ended up with an H5N1 virus that could spread through the air from one ferret to the other. If unleashed—and if proven capable of spreading from human to human with the same high mortality rate—it could make the deadly 1918 pandemic look like a pesky cold.

So that's one part of the controversy. Was this a responsible thing to do?

On the one hand, zomgwe'reallgonnadierunhide, right? On the other, this research has already taught us something really, really important. Not only can H5N1 make the leap to mammal-to-mammal transmission, but it did so faster and more easily than the researchers had guessed. Knowing that matters, because it could help public health officials make better plans for where to use limited resources, and it could help other scientists figure out a way to fight a human-transmissible H5N1 pandemic if one did arise in nature. But, if I may flip the waffle back over again, some legitimate scientists don't think the benefits outweigh the risks of creating this thing. Carl Zimmer again:

Ian Lipkin, the director of the Center for Infection and Immunity at Columbia University, believes there's no reason to assume that the mutations that arose in Fouchier's experiments would be the ones that would arise out in the real world. "On the other hand," Lipkin says, "publishing this information would give people a roadmap to creating Frankenstein viruses."

And that brings us to the other part of the controversy: What to do with Fouchier's research.

This is where the government gets involved. These studies were funded by the National Institutes of Health. When the NIH got the papers, it passed them on to the National Science Advisory Board for Biosecurity. On December 20, the NSABB recommended that Fouchier's study, and a similar one conducted by the University of Wisconsin's Yoshihiro Kawaoka, be published only once key data and details were removed, effectively rendering the studies un-reproducible.

The board can't technically force this. But the board is also a big deal, and so Science, Nature, the NIH, and the papers' authors are all listening. That's why the papers haven't actually been published yet. The people involved are still figuring out how to handle them.

This matters a lot. Reproducibility—being able to read another scientist's research paper and independently test out their conclusions—is a key part of how science works. Remove that element, and it becomes harder to verify claims like this, not to mention much harder to actually get the benefits out of this risky research. The people involved are trying to work out a system under which qualified scientists could have access to the full data, but others say that isn't good enough. Especially considering the fact that H5N1 wouldn't make the best bioterrorism tool, anyway. Peter Christian Hall writing for Reuters:

[No one in the history of biological weapons] ever tried to weaponize a flu strain—for good reason.

Influenza in general is an equal-opportunity menace, particularly dangerous when a strain is so unfamiliar that humanity lacks immunity to it. This would put at great risk anyone trying to assemble a pandemic H5N1 to launch at "target" populations. Indeed, such an attack would unleash global contagion that would swiftly and inevitably incapacitate an aggressor's own people. Influenza doesn't respect borders.

Even arguably irrational terrorists like Aum Shinrikyo never got into anything nearly as notoriously unpredictable and uncontrollable as the flu, Hall writes. Of course, his argument is pretty similar to the one scientists once used to reassure themselves that H5N1 couldn't be both deadly and human-transmissible: We've not seen this happen before, so it won't.

Of course, it's also worth pointing out that these experiments were a lot more technologically complex than the short description here makes them sound. This isn't just about taking a bunch of ferrets and making them sick. It required some serious lab equipment that not just anybody has access to.

Moreover, this isn't the first time scientists have made a deadly flu virus in the lab. Back in 2005, a team reverse-engineered the 1918 pandemic flu virus. After a lot of debate, their research was eventually published in full, reproducible form. Peter Palese was one of the scientists on that team, and he's written an essay in Nature about his experience, as part of a plea to publish the H5N1 research in full, too.

He makes a case both for the importance of risky research, and for why all science (even kind of scary science) needs to remain open source.

During our discussions with members of the NSABB, we explained the importance of bringing such a deadly pathogen back to life. Although these experiments may seem dangerously foolhardy, they are actually the exact opposite. They gave us the opportunity to make the world safer, allowing us to learn what makes the virus dangerous and how it can be disabled. Thankfully, the discussions were largely constructive — within a week, the NSABB recommended that we continue to study the virus under biocontainment conditions, and publish the results so that other scientists could participate in the research. After we published our full paper in 2005, researchers poured into the field who probably would not otherwise have done, leading to hundreds of papers about the 1918 virus. As a result, we now know that the virus is sensitive to the seasonal flu vaccine, as well as to the common flu drugs amantadine (Symmetrel) and oseltamivir (Tamiflu). Had we not reconstructed the virus and shared our results with the community, we would still be in fear that a nefarious scientist would recreate the Spanish flu and release it on an unprotected world. We now know such a worst-case scenario is no longer possible.

I make the same argument today that we made in 2005 — publishing those experiments without the details is akin to censorship, and counter to science, progress and public health. … Giving the full details to vetted scientists is neither practical nor sufficient. Once 20–30 laboratories with postdoctoral fellows and students have such information available, it will be impossible to keep the details secret. Even more troublesome, however, is the question of who should decide which scientists are allowed to have the information. We need more people to study this potentially dangerous pathogen, but who will want to enter a field in which you can't publish your most scientifically interesting results?

*This passage has been changed from the original. Thanks to Carl Zimmer for the corrections.