There's been a lot of news freakout over Defense Distributed (previously) and "3D printed guns" (a term that confusingly encompasses milled guns, 3D printed guns, and files that describe the shapes of guns).
But there's been precious little about the actual legal principles in play and the stakes they raise. That's a pity, because however you feel about guns (disclosure: I don't own a gun, have never owned a gun, and favor strong gun control), the legal fight is nothing short of bizarre and implicates all technical knowledge, not just (or even primarily) guns.
When Defense Distributed posted its first 3D printable gun files, the State Department ordered the files to be taken down, arguing that export control rules gave it the power to remove any technical information, with no legal oversight, no objective standards for what could be removed, and no deadlines to explain the order or rescind it.
These export controls were longstanding, but they had always applied to devices, not information. Under the US government's theory, they could censor virtually any online speech simply by declaring the censorship orders to be "advisable."
There are obvious problems with this legal theory, problems that have nothing to do with guns. Presumably, that's how Defense Distributed was able to get the State Department to walk back its actions (and arguably overstep in the other direction by failing to solicit public comment on a major shift in policy).
In the meantime, Defense Distributed's files had become widely distributed, mirrored across the web in multiple jurisdictions — meaning that even if the US government hadn't caved on its export control gambit, taking down Defense Distributed's files would have had an unmeasurably tiny effect on the availability of files for 3D printing guns.
Now that Defense Distributed has won a license to distribute those files — in addition to all the other people, inside and outside the USA, who were already hosting them — the states have gotten in on the act with several attorneys general suing the US government. They've advanced a number of legal theories about why the feds are in the wrong: the strongest argument is that the State Department didn't publish a notice and solicit public comment before changing its rules. A weaker argument — and again, a potentially dangerous one — is that technical information about guns is a "nuisance" and thus should be censored.
This is a dangerous argument for the same reason that the federal government's export control theory was dangerous: it gives the government virtually unlimited power to censor speech, without any objective standards, legal oversight, or due process rights.
I'm personally not very happy about any of this. I think that the American gun control debate is distorted by the monied gun lobby and by the years of disinformation and paranoia — so much of it openly racist — that the lobby has fed to gun advocates.
I'm also keenly aware that "hard cases make bad law": when stretching a law would give the government the power to do something politically popular, officials are tempted to stretch it to score political points (or, more charitably, to stop some undesirable thing by any available means). The problem is that the next time the state decides to censor something on the internet in the name of export controls or creating a nuisance, we'll be operating on the precedent set by the most emotive, polarizing circumstances (that is, 3D printed guns).
I think that Defense Distributed is making the same bet, but they calculate the odds differently. While I fear that tying the future of the free and open internet to guns puts the internet at risk, Defense Distributed seems to be betting that there are so many diehard firearms partisans that tying an internet freedom question to the Second Amendment will create an unstoppable coalition that will lock the internet open.
Meanwhile, the entire debate continues to be mischaracterized as "stopping 3D printed guns." That debate was settled in 2012 when the files were first published and mirrored all over the world, outside of the US government's control. Whether or not Defense Distributed hosts the files in question, people who want to 3D print guns will 3D print guns.
And, of course: 3D printing a gun is arguably the most expensive, least convenient way to get a gun in America, which is awash in guns, a fact that (once again) I keenly regret and want to do something about. But if the problem is real, that's all the more reason to reject solutions that are unreal — incoherent political grandstanding and legal overreach inevitably have a price tag of their own, and with 3D printed guns, it's all price, no benefit, because none of these measures will have any impact on the availability of guns — 3D printed or otherwise.
Even when heady interests such as national security or physical harm are potentially at stake, the government has a heavy burden to prove the urgency of the harm and the appropriateness of a speech restriction as the proper remedy. It's generally not appropriate to order one person not to publish material that is readily available elsewhere.
The government has a history of characterizing technologies and ideas as dangerous in an attempt to suppress speech about them. First Amendment standards ensure that speech cannot be suppressed as an easy measure of first resort, or where those speech constraints aren't necessary to address a proven harm or effective at addressing that harm.
If the states in this case are successful, they will bypass legal doctrines that we rely on to protect your right to encrypt and your right to advocate for social change. The arguments from the states are clear on this point: the states argue that the government should be required to prevent publication because foreigners abroad might do things that the U.S. opposes, and that the courts themselves should order the designs to be kept offline because people might make the guns and use them in domestic crimes.
These arguments are dangerous because they threaten to empower current (and future) U.S. government officials to play pre-publication gatekeeper of what information you can publish online based on the barest, unproven claim of national interest or the possibility that others might use your information to further crimes. It could bar us from publishing and discussing artificial intelligence technologies, something that has increasing importance to our online lives and even how the government makes decisions about bail and sentencing. It could censor information about how to survive a chemical weapons attack. It could force us to compromise our secure communications technologies, making our personal information vulnerable to unlawful surveillance and identity theft.