Kaspersky Lab (Russia) and Citizen Lab (University of Toronto) have independently published details of phone-hacking tools sold to police departments worldwide by the Italian firm Hacking Team (here's Kaspersky's report and Citizen Lab's). The tools can be used to attack Android, iOS, Windows Mobile and BlackBerry devices, with the most sophisticated attacks reserved for Android and iOS.
The spyware can covertly record sound, images and keystrokes, capture screenshots, and access the phones' storage and GPS. The tools are designed to detect attempts to search for them and to delete themselves without a trace if they sense that they are under attack.
Hacking Team insists that its tools are only sold to "democratic" police forces, but Citizen Lab's report suggests that the tool was used by the Saudi government to target dissidents.
The means of infection is device-specific. If police have physical access, it's simple. Android devices can be attacked by infecting a PC with a virus that installs the police malware when the device is connected to it. This attack also works on jailbroken iPhones.
Read the rest
As the California legislature moves to mandate "kill switches" that will allow owners of stolen phones to shut them down, the Electronic Frontier Foundation sounds an important alarm: if it's possible for someone to remotely switch off your phone such that you can't switch it back on again, even if you're physically in possession of it, that facility could be abused in lots of ways. This is a classic War on General Purpose Computation moment: the only way to make a kill-switch work is to design phones that treat their possessors as less trustworthy than a remote party sending instructions over the Internet, and as soon as the device that knows all your secrets and watches and listens to your most private moments is designed to do things that the person holding it can't override, the results won't be pretty.
There are other models for mitigating the harm from stolen phones. For example, the Cyanogen remote wipe asks the first user of the phone to initialize a password. When it is online, the device checks in with a service to see whether anyone using that password has signed an "erase yourself" command. When that happens, the phone deletes all the user data. A thief can still wipe and sell the phone, but the user's data is safe.
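The check-in flow described above can be sketched in a few lines. This is a stdlib-only Python toy with hypothetical names (`WipeService`, `Phone`, `register`); it illustrates the design, not Cyanogen's actual implementation:

```python
import hashlib
import hmac

class WipeService:
    """Server side: stores a hash of the owner's password plus any pending command."""

    def __init__(self):
        self.password_hash = None
        self.wipe_requested = False

    def register(self, password: str) -> None:
        # First use of the phone: the owner initializes a password.
        self.password_hash = hashlib.sha256(password.encode()).hexdigest()

    def request_wipe(self, password: str) -> bool:
        # The owner "signs" an erase-yourself command with the password;
        # anyone without it cannot trigger a wipe.
        attempt = hashlib.sha256(password.encode()).hexdigest()
        if self.password_hash and hmac.compare_digest(attempt, self.password_hash):
            self.wipe_requested = True
            return True
        return False

class Phone:
    """Device side: polls the service whenever it is online."""

    def __init__(self, service: WipeService):
        self.service = service
        self.user_data = {"photos": ["beach.jpg"], "contacts": ["alice"]}

    def check_in(self) -> None:
        # Only user data is erased; the hardware keeps working.
        if self.service.wipe_requested:
            self.user_data = {}
```

The key property is that the command erases only the owner's data: a thief still ends up with a working, sellable phone, but nothing private survives on it, and no remote party can brick hardware the owner is holding.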
Obviously, this isn't the same thing as stolen phones going dead and never working again, and won't have the same impact on theft. But the alternative is a system that allows any bad guy who can impersonate, bribe or order a cop to activate the kill-switch to do all kinds of terrible things to you, from deactivating the phones of people recording police misconduct to stalking or stealing the identities of mobile phone owners, with near-undetectable and unstoppable stealth.
Read the rest
Writing in the Atlantic, Bruce Schneier explains the NSA's insane program of creating, discovering and hoarding vulnerabilities in computer systems in order to weaponize them. These vulnerabilities allow the NSA to attack its enemies (everyone), but let other states, hackers, and crooks attack Americans. The NSA claims it is "securing" cyberspace, but its dominant tactic requires that everyone be made less secure so that the NSA can attack them if it feels the need.
Read the rest
More than 100 people around the world have been arrested in a coordinated sweep of RATers (people who deploy "remote access trojans" that let them spy on people through their computers' cameras and mics, as well as capture their keystrokes and files). The accused are said to have used the Blackshades trojan, which sold for $40 from bshades.eu, mostly for sexual exploitation of victims (though some were also accused of committing financial fraud).
A US District Court in Manhattan handed down indictments for Alex Yücel and Brendan Johnston, who are said to have operated bshades.eu. Yücel, a Swedish national, was arrested in Moldova and is awaiting extradition to the USA. Johnston is alleged to have been employed by Yücel to market and support Blackshades.
Read the rest
Following on from yesterday's brutal, awful news that Mozilla is going to add DRM to its Firefox browser, the Electronic Frontier Foundation's Danny O'Brien has published an important editorial explaining how Mozilla's decision sets back the whole cause of fighting for a free and open Internet.
Read the rest
Here's a reading (MP3) of my recent Guardian column, Why it is not possible to regulate robots, which discusses where and how robots can be regulated, and whether there is any sensible ground for "robot law" as distinct from "computer law."
One thing that is glaringly absent from both the Heinleinian and Asimovian brain is the idea of software as an immaterial, infinitely reproducible nugget at the core of the system. Here, in the second decade of the 21st century, it seems to me that the most important fact about a robot – whether it is self-aware or merely autonomous – is the operating system, configuration, and code running on it.
If you accept that robots are just machines – no different in principle from sewing machines, cars, or shotguns – and that the thing that makes them "robot" is the software that runs on a general-purpose computer that controls them, then all the legislative and regulatory and normative problems of robots start to become a subset of the problems of networks and computers.
If you're a regular reader, you'll know that I believe two things about computers: first, that they are the most significant functional element of most modern artifacts, from cars to houses to hearing aids; and second, that we have dramatically failed to come to grips with this fact. We keep talking about whether 3D printers should be "allowed" to print guns, or whether computers should be "allowed" to make infringing copies, or whether your iPhone should be "allowed" to run software that Apple hasn't approved and put in its App Store.
Practically speaking, though, these all amount to the same question: how do we keep computers from executing certain instructions, even if the people who own those computers want to execute them? And the practical answer is, we can't.
Mastering by John Taylor Williams: email@example.com
John Taylor Williams is an audiovisual and multimedia producer based in Washington, DC and the co-host of the Living Proof Brew Cast. Hear him wax poetic over a pint or two of beer by visiting livingproofbrewcast.com. In his free time he makes "Beer Jewelry" and "Odd Musical Furniture." He often "meditates while reading cookbooks."
Japanese police arrested a 27-year-old man called Yoshitomo Imura, alleging that he 3D printed several guns and posted videos to YouTube of himself firing them. They say they seized five guns from Imura's home in Kawasaki City. The videos showed that two of these guns were capable of firing rounds -- what sort isn't specified -- through a stack of ten sheets of plywood, which caused Japanese police to class them as lethal weapons. A Japanese press account has Imura admitting to printing the guns, but insisting that he "didn't know they were illegal."
As I wrote a year ago when 3D printed guns first appeared on the scene, the regulatory questions raised by them are much more significant than the narrow issue of gun control. But there's a real danger that judges, lawmakers and regulators will be distracted by the inflammatory issue of firearms when considering the wider question of trying to regulate computers.
Read the rest
Rebecca from EFF writes, "How would you feel about having your computer taken over by online test-taking software - complete with proctors peering through your laptop camera? Reporters at the Spartan Daily (the student paper for San Jose State University) have an interesting story about new software in use there, and the legitimate concerns that some students have. The data-broker connection is especially chilling to those worried about their personal information." The company's response? "We're a customer service business, so it's really not advantageous for us to violate that trust." Oh, well, so long as that's sorted out then.
My new Guardian column is "Why it is not possible to regulate robots," which discusses where and how robots can be regulated, and whether there is any sensible ground for "robot law" as distinct from "computer law."
Read the rest
The Australian attorney general has mooted a proposal to require service providers to compromise their cryptographic security in order to assist in wiretaps. The proposal is given passing mention in a senate submission from the AG's office, where it is referenced as "intelligibility orders" that would allow "law enforcement, anti-corruption and national security agencies" to secure orders under which providers like Google, Facebook and Yahoo would have to escrow their cryptographic keys with the state in order to facilitate mass surveillance.
Edward Snowden referenced this possibility in his SXSW remarks, pointing out that any communications that are decrypted by service providers are vulnerable to government surveillance, because governments can order providers to reveal their keys. This is why Snowden recommended the use of "end-to-end" security, where only the parties in the discussion -- and not the software vendor -- have the ability to spy on users.
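The difference Snowden is pointing at can be shown with a toy model. This stdlib-only Python sketch uses a deliberately trivial XOR "cipher" purely to show who holds the key; it is not real cryptography and none of it reflects any actual provider's system:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher for illustration only -- never use for real secrecy.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet at noon"

# Provider-mediated encryption: the service generates and keeps the key,
# so an order compelling the provider exposes every message.
provider_key = secrets.token_bytes(32)
stored_ciphertext = xor_cipher(message, provider_key)
compelled_plaintext = xor_cipher(stored_ciphertext, provider_key)  # provider can decrypt

# End-to-end: the two endpoints share a key the provider never sees.
# The provider relays and stores only ciphertext it cannot read, so a
# key-escrow order against it yields nothing intelligible.
endpoint_key = secrets.token_bytes(32)
relayed_ciphertext = xor_cipher(message, endpoint_key)
received_plaintext = xor_cipher(relayed_ciphertext, endpoint_key)  # only endpoints can do this
```

In the first arrangement the provider is a single point of legal compulsion; in the second, there is simply no key for the state to demand from it.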
The "intelligibility order" is the same kind of order that led to the shutdown of Lavabit, the secure email provider used by Snowden, whose creator shut the service down rather than compromising his users' security.
Read the rest
The organizer of the annual Stanford conference on Robots and the Law has written a new paper called Robotics and the New Cyberlaw, examining the new legal challenges posed by the presence of robots in our public spaces, homes and workplaces, as distinct from the legal challenges of computers and the Internet.
I'm not entirely convinced that I believe that there is such a thing as a robot, as distinct from "a computer in a special case" or "a specialized peripheral for a computer." At least inasmuch as mandating that a robot must (or must not) do certain things is a subset of the problem of mandating that computers must (or must not) run certain programs.
It seems to me that a lot of the areas where Calo identifies problems with "cyberlaw" as it applies to robots are actually just problems with cyberlaw, period. Cyberlaw isn't very good law, by and large, having been crafted by self-interested industry lobbyists and enacted on the basis of fearmongering and grandstanding, so it's not very surprising that it isn't very good at solving robot problems.
But the paper is a fascinating one, nevertheless.
Update: The organizer of Robots and the Law is Michael Froomkin; Ryan Calo is the person who sent it in to Boing Boing. The conference isn't held at Stanford every year; next year it will be in Miami. Sorry for the confusion!
Read the rest
A new mobile app called "Nametag" adds facial recognition to phone photos; take a pic of someone and feed it to the app, and it will search Facebook, Twitter, sex-offender registries and (if you'd like) dating sites to try to put a name to the face. Kevin Alan Tussy, speaking for FacialNetwork (which makes Nametag), promises that this won't be a privacy problem, because "it's about connecting people that want to be connected."
Read the rest
On Practical Machinist, there's a fascinating thread about the manufacturer's lockdown on a high-priced, high-end Mori Seiki NV5000 A/40 CNC mill. The person who started the thread owns the machine outright, but has discovered that if he moves it at all, a GPS and gyro sensor package in the machine automatically shuts it down and will not allow it to restart until the owner receives a manufacturer's unlock code.
Effectively, this means that machinists' shops can't rearrange their very expensive, very large tools to improve their workflow from job to job without getting permission from the manufacturer (which can take a month!), even if they own the gear.
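The lockout described above amounts to a simple state machine: motion sensors trip a lock, and only a code derived from a manufacturer-held secret clears it. Here's a toy Python sketch of that design (all names such as `CNCMill` and the HMAC-based unlock scheme are my illustrative assumptions; Mori Seiki's actual mechanism is not public):

```python
import hashlib
import hmac

class CNCMill:
    """Toy model of a machine that locks itself when it detects a move."""

    MOVE_THRESHOLD_M = 1.0  # assumed: moves beyond ~1 m trip the lock

    def __init__(self, manufacturer_secret: bytes):
        self._secret = manufacturer_secret  # held by the vendor, not the owner
        self.position = (0.0, 0.0)
        self.locked = False

    def report_position(self, pos):
        # GPS/gyro detects relocation and shuts the machine down.
        dx = pos[0] - self.position[0]
        dy = pos[1] - self.position[1]
        if (dx * dx + dy * dy) ** 0.5 > self.MOVE_THRESHOLD_M:
            self.locked = True
        self.position = pos

    def unlock_code(self) -> str:
        # Derived from the secret and the new position, so only the
        # manufacturer can compute it; the owner must ask (and wait).
        msg = repr(self.position).encode()
        return hmac.new(self._secret, msg, hashlib.sha256).hexdigest()[:8]

    def try_unlock(self, code: str) -> bool:
        if hmac.compare_digest(code, self.unlock_code()):
            self.locked = False
        return not self.locked
```

The point of the sketch is who holds the secret: the device is designed to obey a remote party over its lawful owner, which is exactly the kill-switch objection raised earlier.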
Read the rest
Tom sez, "This clip takes aim at the NSA and their spying, snooping ways - it's made by somegreybloke, and features Jeremiah McDonald (who clocked up 11 million views on YouTube with conversation with my six year old self) & Max Koch, another US based comedian, cartoon maker and funnyman."
This is pretty good, but moves into "inspired" territory around 2:01.
NSA: National Insecurity / somegreybloke | MASHED
Badly configured home automation systems are easy to locate using Google, and once you discover them, you can seize control of a stranger's entire home: "lights, hot tubs, fans, televisions, water pumps, garage doors, cameras, and other devices." The manufacturers blame their customers for not following security advice, but even "enthusiast" customers who think they've locked down their networks are sometimes in for a nasty surprise.
Insteon chief information officer Mike Nunes says the systems that I’m seeing online are from a product discontinued in the last year. He blamed user error for the appearance in search results, saying the older product was not originally intended for remote access, and setting this up required some savvy on the users’ part. The devices had come with an instruction manual telling users how to put the devices online, which strongly advised them to add a username and password to the system. (But, really, who reads instruction manuals closely?)
“This would require the user to have chosen to publish a link (IP address) to the Internet AND for them to have not set a username and password,” says Nunes. I told Nunes that requiring a username/password by default is good security-by-design to protect people from making a mistake like this. “It did not require it by default, but it supported it and encouraged it,” he replied.
In Thomas Hatley’s case, he created a website that acted as the gateway for a number of services for his home. There is a password on his website, but you can circumvent that by going straight to the Insteon port, which was not password protected. “I would say that some of the responsibility would be mine, because of how I have my internal router configured,” says Hatley who describes himself as a home automation enthusiast. “But it’s coming from that port, and I didn’t realize that port was accessible from the outside.”
The company’s current product automatically assigns a username and password, but it did not during its first few months of release; Trustwave’s Bryan got one of those early units. If you have one of those early products, you should really go through with that recall. Bryan rated the new authentication as “poor,” saying that cracking it would “be a trivial task for most security professionals.”
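The secure-by-default behavior the current product is said to have is straightforward to implement: generate unique credentials per device at first boot, and reject any unauthenticated request. Here's a minimal stdlib Python sketch of that pattern (the function names are mine, and this is a generic illustration of HTTP Basic auth checking, not Insteon's firmware):

```python
import base64
import secrets

def first_boot_credentials():
    # A random per-device password means there is no shared factory
    # default for an attacker to guess, and no "auth off" state to forget.
    return "admin", secrets.token_urlsafe(12)

def is_authorized(auth_header, username, password) -> bool:
    # Validate an HTTP Basic Authorization header; anything missing
    # or malformed is rejected rather than served.
    if not auth_header or not auth_header.startswith("Basic "):
        return False
    try:
        decoded = base64.b64decode(auth_header[6:]).decode()
    except Exception:
        return False
    return secrets.compare_digest(decoded, f"{username}:{password}")
```

The design choice that matters is the default: Nunes' older product "supported and encouraged" a password, but a device that ships with authentication optional will end up on Google; one that refuses to answer without credentials won't.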
When 'Smart Homes' Get Hacked: I Haunted A Complete Stranger's House Via The Internet [Kashmir Hill/Forbes]