Australian attorney general wants the power to launch man-in-the-middle attacks on secure Internet connections


The Australian attorney general has mooted a proposal to require service providers to compromise their cryptographic security in order to assist in wiretaps. The proposal is given passing mention in a senate submission from the AG's office, where it is referenced as "intelligibility orders" that would allow "law enforcement, anti-corruption and national security agencies" to secure orders under which providers like Google, Facebook and Yahoo would have to escrow their cryptographic keys with the state in order to facilitate mass surveillance.

Edward Snowden referenced this possibility in his SXSW remarks, pointing out that any communications that are decrypted by service providers are vulnerable to government surveillance, because governments can order providers to reveal their keys. This is why Snowden recommended the use of "end-to-end" security, in which only the parties to the conversation -- and not the software vendor or service provider -- hold the keys, so no intermediary is in a position to spy on users.
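The difference can be illustrated with a toy Diffie-Hellman key agreement (a sketch only: the numbers below are tiny demonstration values, not usable cryptography). When endpoints negotiate keys this way, a provider that merely relays the traffic has no key to escrow or reveal:

```python
import secrets

# Toy Diffie-Hellman key agreement. Real systems use 2048-bit-plus
# groups or elliptic curves; these tiny values are for illustration only.
p, g = 23, 5  # public prime modulus and generator

# Each endpoint picks a private exponent it never transmits.
alice_secret = secrets.randbelow(p - 2) + 1
bob_secret = secrets.randbelow(p - 2) + 1

# Only these public values cross the wire -- this is everything a
# relaying provider (or anyone wiretapping it) ever sees.
alice_public = pow(g, alice_secret, p)
bob_public = pow(g, bob_secret, p)

# Both endpoints derive the same shared key; the provider cannot,
# because it holds neither private exponent.
alice_key = pow(bob_public, alice_secret, p)
bob_key = pow(alice_public, bob_secret, p)
assert alice_key == bob_key
```

A provider served with a key-escrow order for traffic like this has nothing useful to hand over, which is exactly why proposals of this kind tend to target the endpoints instead.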

The "intelligibility order" is the same kind of order that led to the shutdown of Lavabit, the secure email provider used by Snowden, whose creator shut the service down rather than compromising his users' security.


Robots and the law: what's after cyberlaw?


Ryan Calo, the organizer of the annual Stanford conference on Robots and the Law, has written a new paper called Robotics and the New Cyberlaw, examining the new legal challenges posed by the presence of robots in our public spaces, homes and workplaces, as distinct from the legal challenges of computers and the Internet.

I'm not entirely convinced that there is such a thing as a robot, as distinct from "a computer in a special case" or "a specialized peripheral for a computer" -- at least inasmuch as mandating that a robot must (or must not) do certain things is a subset of the problem of mandating that computers must (or must not) run certain programs.

It seems to me that a lot of the areas where Calo identifies problems with "cyberlaw" as it applies to robots are actually just problems with cyberlaw, period. Cyberlaw isn't very good law, by and large, having been crafted by self-interested industry lobbyists and enacted on the basis of fearmongering and grandstanding, so it's not very surprising that it isn't very good at solving robot problems.

But the paper is a fascinating one, nevertheless.


Update: The organizer of Robots and the Law is Michael Froomkin; Ryan Calo is the person who sent it in to Boing Boing. The conference isn't held at Stanford every year; next year it will be in Miami. Sorry for the confusion!


Facial recognition mobile app


A new mobile app called "Nametag" adds facial recognition to phone photos; take a pic of someone and feed it to the app, and it will search Facebook, Twitter, sex offender registries and (if you'd like) dating sites to try to put a name to the face. Kevin Alan Tussy, speaking for Facialnetwork (which makes Nametag), promises that this won't be a privacy problem, because "it's about connecting people that want to be connected."


High-end CNC machines can't be moved without manufacturers' permission


On Practical Machinist, there's a fascinating thread about the manufacturer's lockdown on a high-priced, high-end Mori Seiki NV5000 A/40 CNC mill. The person who started the thread owns the machine outright, but has discovered that if he moves it at all, a GPS-and-gyro sensor package in the machine automatically shuts it down and will not allow it to restart until the manufacturer supplies an unlock code.

Effectively, this means that machinists' shops can't rearrange their very expensive, very large tools to improve their workflow from job to job without getting permission from the manufacturer (which can take a month!), even if they own the gear.


NSA: National Insecurity

Tom sez, "This clip takes aim at the NSA and their spying, snooping ways - it's made by somegreybloke, and features Jeremiah McDonald (who clocked up 11 million views on YouTube with a conversation with his younger self) & Max Koch, another US-based comedian, cartoon maker and funnyman."

This is pretty good, but moves into "inspired" territory around 2:01.

NSA: National Insecurity / somegreybloke | MASHED (Thanks, Tom!)

Pwning a house


Badly configured home automation systems are easy to locate using Google, and once you discover them, you can seize control of a stranger's entire home: "lights, hot tubs, fans, televisions, water pumps, garage doors, cameras, and other devices." The manufacturers blame their customers for not following security advice, but even "enthusiast" customers who think they've locked down their networks are sometimes in for a nasty surprise.

Insteon chief information officer Mike Nunes says the systems that I’m seeing online are from a product discontinued in the last year. He blamed user error for the appearance in search results, saying the older product was not originally intended for remote access, and that setting it up required some savvy on the users’ part. The devices had come with an instruction manual telling users how to put the devices online, which strongly advised them to add a username and password to the system. (But, really, who reads instruction manuals closely?)

“This would require the user to have chosen to publish a link (IP address) to the Internet AND for them to have not set a username and password,” says Nunes. I told Nunes that requiring a username/password by default is good security-by-design to protect people from making a mistake like this. “It did not require it by default, but it supported it and encouraged it,” he replied.

In Thomas Hatley’s case, he created a website that acted as the gateway for a number of services for his home. There is a password on his website, but you can circumvent that by going straight to the Insteon port, which was not password protected. “I would say that some of the responsibility would be mine, because of how I have my internal router configured,” says Hatley who describes himself as a home automation enthusiast. “But it’s coming from that port, and I didn’t realize that port was accessible from the outside.”

The company’s current product automatically assigns a username and password, but it did not during the first few months of release — and one of those early units is what Trustwave’s Bryan got. If you have one of those early products, you should really go through with that recall. Bryan rated the new authentication as “poor,” saying that cracking it would “be a trivial task for most security professionals.”
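The security-by-design point here is that authentication should fail closed: reject any request that doesn't present valid credentials, rather than waving through the ones that present none at all. A minimal sketch of that default-deny posture (the function and the credentials are hypothetical, not Insteon's code):

```python
import base64
import hmac

def authorized(auth_header, username, password):
    """Default-deny check for an HTTP Basic Authorization header.

    Anything other than a well-formed header carrying the right
    credentials -- including a missing header -- is rejected.
    """
    if not auth_header or not auth_header.startswith("Basic "):
        return False
    try:
        decoded = base64.b64decode(auth_header[len("Basic "):]).decode("utf-8")
    except Exception:
        return False  # malformed base64 or encoding: deny
    user, sep, pw = decoded.partition(":")
    if not sep:
        return False  # no "user:password" structure: deny
    # Constant-time comparison avoids leaking how many characters matched.
    return hmac.compare_digest(user, username) and hmac.compare_digest(pw, password)
```

The important paths are the early `return False` branches: a device that simply skipped the header check, as the unprotected Insteon port effectively did, answers every request.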

When 'Smart Homes' Get Hacked: I Haunted A Complete Stranger's House Via The Internet [Kashmir Hill/Forbes]

Why the FBI's plan to require weak security in all American technology is a terrible, terrible idea

Bruce Schneier's editorial on CALEA-II is right on. In case you missed it, CALEA II is the FBI's proposal to require all American computers, mobile devices, operating systems, email programs, browsers, etc, to have weak security so that they can eavesdrop on them (as a side note, a CALEA-II rule would almost certainly require a ban on free/open source software, since code that can be modified is code that can have the FBI back-doors removed).

The FBI believes it can have it both ways: that it can open systems to its eavesdropping, but keep them secure from anyone else's eavesdropping. That's just not possible. It's impossible to build a communications system that allows the FBI surreptitious access but doesn't allow similar access by others. When it comes to security, we have two options: We can build our systems to be as secure as possible from eavesdropping, or we can deliberately weaken their security. We have to choose one or the other.

This is an old debate, and one we've been through many times. The NSA even has a name for it: the equities issue. In the 1980s, the equities debate was about export control of cryptography. The government deliberately weakened U.S. cryptography products because it didn't want foreign groups to have access to secure systems. Two things resulted: fewer Internet products with cryptography, to the insecurity of everybody, and a vibrant foreign security industry based on the unofficial slogan "Don't buy the U.S. stuff -- it's lousy."

In 1994, the Communications Assistance for Law Enforcement Act mandated that U.S. companies build eavesdropping capabilities into phone switches. These were sold internationally; some countries liked having the ability to spy on their citizens. Of course, so did criminals, and there were public scandals in Greece (2005) and Italy (2006) as a result.

In 2012, we learned that every phone switch sold to the Department of Defense had security vulnerabilities in its surveillance system. And just this May, we learned that Chinese hackers breached Google's system for providing surveillance data for the FBI.

The Problems with CALEA-II

Canada's business groups want to hack your computer even more than the creeps at the Commission on the Theft of American Intellectual Property

Michael Geist writes,

The Internet is buzzing over a new report from the Commission on the Theft of American Intellectual Property that recommends using spyware and ransomware to combat online infringement. The recommendations are shocking as they represent next-generation digital locks that could lock down computers and even "retrieve" files from personal computers:

"Software can be written that will allow only authorized users to open files containing valuable information. If an unauthorized person accesses the information, a range of actions might then occur. For example, the file could be rendered inaccessible and the unauthorized user's computer could be locked down, with instructions on how to contact law enforcement to get the password needed to unlock the account."

While many of the recommendations sound outrageous, it is worth noting that earlier this year Canadian business groups led by the Canadian Chamber of Commerce recommended that the Canadian government introduce a regulation that would permit the use of spyware for these kinds of purposes.

The proposed regulation would remove the need for express consent for:

"a program that is installed by or on behalf of a person to prevent, detect, investigate, or terminate activities that the person reasonably believes (i) present a risk or threatens the security, privacy, or unauthorized or fraudulent use, of a computer system, telecommunications facility, or network, or (ii) involves the contravention of any law of Canada, of a province or municipality of Canada or of a foreign state;"

This provision would effectively legalize spyware in Canada on behalf of these industry groups. The potential scope of coverage is breathtaking: a software program secretly installed by an entertainment software company designed to detect or investigate alleged copyright infringement would be covered by this exception. This exception could potentially cover programs designed to block access to certain websites (preventing the contravention of a law as would have been the case with SOPA), attempts to access wireless networks without authorization, or even keylogger programs tracking unsuspecting users (detection and investigation).

The Canadian Link to Copyright Enforcement Spyware Tools

Impossible Programs: a great lecture on some of computer science's most important subjects

Here's a 40-minute video in which Tom Stuart gives a talk summarizing one of the chapters from his new book Understanding Computation, describing the halting problem and how it relates to bugs, Turing machines, Turing completeness, computability, malware checking for various mobile app stores, and related subjects. The halting problem -- which concerns the impossibility of deciding, in general, whether a program will ever finish -- is one of the most important and hardest-to-understand ideas in computer science, and Stuart does a fantastic job with it here. You don't need to be a master programmer or a computer science buff to get it, and even if you only absorb 50 percent of it, it's so engagingly presented, and so blazingly relevant to life in the 21st century, that you won't regret it.

At Scottish Ruby Conference 2013 I gave a talk called Impossible Programs, adapted from chapter 8 of Understanding Computation. It’s a talk about programs that are impossible to write in Ruby — it covers undecidability, the halting problem and Rice’s theorem, explained in plain English and illustrated with Ruby code. The slides are available.
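The flavor of the result can be sketched in a few lines (Python here, where the talk uses Ruby; this is an illustration, not code from the book): running a program under a step budget can confirm that it halts, but an exhausted budget proves nothing about a program that hasn't stopped yet.

```python
def halts_within(make_program, max_steps):
    """Run a program (modeled as a generator) for at most max_steps steps.

    Returns True if it halted within the budget, and False if the budget
    ran out -- which proves nothing: the program might halt on step
    max_steps + 1, or never. Halting is only semi-decidable: we can
    confirm "yes," but no finite budget can confirm "no."
    """
    program = make_program()
    for _ in range(max_steps):
        try:
            next(program)  # advance the program by one step
        except StopIteration:
            return True    # it halted within the budget
    return False           # inconclusive, not a proof of looping

def countdown():
    # Halts after ten steps.
    n = 10
    while n > 0:
        n -= 1
        yield

def forever():
    # Never halts -- but no observation ever establishes that.
    while True:
        yield
```

No budget, however large, distinguishes `forever` from a program that halts one step later, and Turing's diagonal argument shows that no cleverer test can close the gap. That is the bind app-store malware checkers are in: the strongest verdict computability allows is "no misbehavior observed so far."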

Impossible Programs

US entertainment industry to Congress: make it legal for us to deploy rootkits, spyware, ransomware and trojans to attack pirates!


The hilariously named "Commission on the Theft of American Intellectual Property" has finally released its report, an 84-page tome that's pretty bonkers. But amidst all that crazy, there's a bit that stands out as particularly insane: a proposal to legalize the use of malware in order to punish people believed to be copying illegally. The report proposes that software would be loaded on computers that would somehow figure out if you were a pirate, and if you were, it would lock your computer up and take all your files hostage until you call the police and confess your crime. This is the mechanism that crooks use when they deploy ransomware.

It's just more evidence that copyright enforcers' network strategies are indistinguishable from those used by dictators and criminals. In 2011, the MPAA told Congress that they wanted SOPA and knew it would work because it was the same tactic used by governments in "China, Iran, the UAE, Armenia, Ethiopia, Saudi Arabia, Yemen, Bahrain, Burma, Syria, Turkmenistan, Uzbekistan, and Vietnam." Now they've demanded that Congress legalize an extortion tool invented by organized criminals.

Additionally, software can be written that will allow only authorized users to open files containing valuable information. If an unauthorized person accesses the information, a range of actions might then occur. For example, the file could be rendered inaccessible and the unauthorized user’s computer could be locked down, with instructions on how to contact law enforcement to get the password needed to unlock the account. Such measures do not violate existing laws on the use of the Internet, yet they serve to blunt attacks and stabilize a cyber incident to provide both time and evidence for law enforcement to become involved.

It gets better:

While not currently permitted under U.S. law, there are increasing calls for creating a more permissive environment for active network defense that allows companies not only to stabilize a situation but to take further steps, including actively retrieving stolen information, altering it within the intruder’s networks, or even destroying the information within an unauthorized network. Additional measures go further, including photographing the hacker using his own system’s camera, implanting malware in the hacker’s network, or even physically disabling or destroying the hacker’s own computer or network.

USA Intellectual Property Theft Commission Recommends Malware! (Thanks, Adam!)

(Image: [211/365] Off with her head!, a Creative Commons Attribution (2.0) image from pasukaru76's photostream)

Cory's Sense About Science lecture

I gave the annual Sense About Science lecture last week in London, and The Guardian recorded and podcasted it (MP3). It's based on the Waffle Iron Connected to a Fax Machine talk I gave at Re:publica in Berlin the week before. Cory

Black Code: how spies, cops and crims are making cyberspace unfit for human habitation


I reviewed Ronald Deibert's new book Black Code in this weekend's edition of the Globe and Mail. Deibert runs the Citizen Lab at the University of Toronto and has been instrumental in several high-profile reports that outed government spying (like Chinese hackers who compromised the Dalai Lama's computer and turned it into a covert CCTV) and massive criminal hacks (like the Koobface extortion racket). His book is an amazing account of how cops, spies and crooks all treat the Internet as the same kind of thing: a tool for getting information out of people without their knowledge or consent, and how they end up in a kind of emergent conspiracy to erode the net's security to further their own ends. It's an absolutely brilliant and important book:

Ronald Deibert’s new book, Black Code, is a gripping and absolutely terrifying blow-by-blow account of the way that companies, governments, cops and crooks have entered into an accidental conspiracy to poison our collective digital water supply in ways small and large, treating the Internet as a way to make a quick and dirty buck or as a snoopy spy’s best friend. The book is so thoroughly disheartening for its first 14 chapters that I found myself growing impatient with it, worrying that it was a mere counsel of despair.

But the final chapter of Black Code is an incandescent call to arms demanding that states and their agents cease their depraved indifference to the unintended consequences of their online war games and join with civil society groups that work to make the networked society into a freer, better place than the world it has overwritten.

Deibert is the founder and director of The Citizen Lab, a unique institution at the University of Toronto’s Munk School of Global Affairs. It is one part X-Files hacker clubhouse, one part computer science lab and one part international relations observatory. The Citizen Lab’s researchers have scored a string of international coups: Uncovering GhostNet, the group of Chinese hackers taking over sensitive diplomatic computers around the world and eavesdropping on the private lives of governments; cracking Koobface, a group of Russian petty crooks who extorted millions from random people on the Internet, a few hundred dollars at a time; exposing another Chinese attack directed at the Tibetan government in exile and the Dalai Lama. Each of these exploits is beautifully recounted in Black Code and used to frame a larger, vivid narrative of a network that is global, vital and terribly fragile.

Yes, fragile. The value of the Internet to us as a species is incalculable, but there are plenty of parties for whom the Internet’s value increases when it is selectively broken.

How to make cyberspace safe for human habitation

Black Code: Inside the Battle for Cyberspace

Computer scientists to FBI: don't require all our devices to have backdoors for spies

In an urgent, important blog post, computer scientist and security expert Ed Felten lays out the case against rules requiring manufacturers to put wiretapping backdoors in their communications tools. Since the early 1990s, manufacturers of telephone switching equipment have had to follow a US law called CALEA, which says that phone switches have to have a deliberate back-door that cops can use to secretly listen in on phone calls without having to physically attach anything to them. This has already been a huge security problem -- through much of the 1990s, AT&T's CALEA controls went through a Solaris machine that was thoroughly compromised by hackers, meaning that criminals could listen in on any call; and in the mid-2000s, spies used the lawful-intercept backdoors in the Greek phone company's switches to listen in on the highest levels of government.

But now, thanks to the widespread adoption of cryptographically secured messaging services, law enforcement is finding that its CALEA backdoors are of declining utility -- it doesn't matter if you can intercept someone else's phone calls or network traffic if the data you've captured is unbreakably scrambled. In response, the FBI has floated the idea of "CALEA II": a mandate to put wiretapping capabilities in computers, phones, and software.

As Felten points out, this is a terrible idea. If your phone is designed to secretly record you or stream video, location data, and messages to an adverse party, and to stop you from discovering that it's doing this, it puts you at huge risk when that facility is hijacked by criminals. It doesn't matter if you trust the government not to abuse this power (though, for the record, I don't -- especially since anything mandated by the US government would also be present in devices used in China, Belarus and Iran) -- deliberately weakening device security makes you vulnerable to everyone, including the worst criminals:

Our report argues that mandating a virtual wiretap port in endpoint systems is harmful. The port makes it easier for attackers to capture the very same data that law enforcement wants. Intruders want to capture everything that happens on a compromised computer. They will be happy to see a built-in tool for capturing and extracting large amounts of audio, video, and text traffic. Better yet (for the intruder), the capability will be stealthy by design, making it difficult for the user to tell that anything is amiss.

Beyond this, the mandate would make it harder for users to understand, monitor, and fix their own systems—which is bad for security. If a system’s design is too simple or its operation too transparent or too easy to monitor, then wiretaps will be evident. So a wiretappability mandate will push providers toward complex, obfuscated designs that are harder to secure and raise the total cost of building and operating the system.

Finally, our report argues that it will not be possible to block non-compliant implementations. Many of today’s communication tools are open source, and there is no way to hide a capability within an open source code base, nor to prevent people from simply removing or disabling an undesired feature. Even closed source systems are routinely modified by users—as with jailbreaking of phones—and users will find ways to disable features they don’t want. Criminals will want to disable these features. Ordinary users will also want to disable them, to mitigate their security risks.

Felten's remarks summarize a report [PDF] signed by 20 distinguished computer scientists criticizing the FBI's proposal. It's an important read -- maybe the most important thing you'll read all month. If you can't trust your devices, you face enormous danger.

CALEA II: Risks of wiretap modifications to endpoints

Defense Distributed claims working 3D printed handgun


Defense Distributed's Cody Wilson claims he has attained his stated goal of 3D printing a working handgun. There's no footage of it firing yet, nor details on how many rounds it can fire before the plastic wears out. And although this is a fascinating provocation, it is not (yet) a game-changer, especially in America, where traditional guns (capable of firing thousands of rounds without melting down) are cheap and easy to get. You can even "3D print" a gun by asking different CNC shops to cut and overnight you all the parts needed to make up a working gun, breaking the job down into small pieces that are unlikely to arouse suspicion.

All sixteen pieces of the Liberator prototype were printed in ABS plastic with a Dimension SST printer from 3D printing company Stratasys, with the exception of a single nail that’s used as a firing pin. The gun is designed to fire standard handgun rounds, using interchangeable barrels for different calibers of ammunition.

Technically, Defense Distributed’s gun has one other non-printed component: the group added a six ounce chunk of steel into the body to make it detectable by metal detectors in order to comply with the Undetectable Firearms Act. In March, the group also obtained a federal firearms license, making it a legal gun manufacturer.

This Is The World's First Entirely 3D-Printed Gun (Photos) [Andy Greenberg/Forbes]

Video from my book tour: Cincinnati presentation

Kevin Loughin came out to my Homeland tour-stop in Cincinnati on Valentine's Day and made a great video of the presentation and Q&A. He was kind enough to post it to YouTube -- thanks, Kevin!

Cory Doctorow talk on Homeland.