Even the most stringent privacy rules have massive loopholes: they all allow for free distribution of "de-identified" or "anonymized" data that is deemed harmless because it has been subjected to some scrubbing process.
Ernie Smith's Motherboard article on the early years of DRM gets into some fascinating stories about things like IBM's Cryptolope, Xerox PARC's ContentGuard (which became a patent troll), InterTrust's belief that it was "developing the basis for a civil society in cyberspace," and the DeCSS fight.
Princeton computer scientist and former White House Deputy CTO Ed Felten (previously) writes about the security lessons of the 2016 election: first, that other nation-states are more aggressive than generally supposed, and second, that you don't need to hack the vote totals to devastate an adversary — it's sufficient to undermine the election's legitimacy by messing with voter rolls, "so there is uncertainty about whether the correct people were allowed to vote."
Eminent computer scientist Ed Felten has posted a short, extremely useful taxonomy of four ways that an algorithm can fail to be accountable to the people whose lives it affects: it can be protected by claims of confidentiality ("how it works is a trade secret"); by complexity ("you wouldn't understand how it works"); by unreasonableness ("we consider factors supported by data, even when there's no obvious correlation"); and by injustice ("it seems impossible to explain how the algorithm is consistent with law or ethics").
Ed Felten (previously) — copyfighter, Princeton computer scientist, former deputy CTO of the White House — has published a four-and-a-half-page "primer for policymakers" on cryptography that explains how encryption for filesystems and encryption for messaging work, so they can be less ignorant.
Donald J Trump's executive order banning Muslims from entering the US threw the world into chaos yesterday, as US citizens, lawful permanent residents and visa holders found themselves stranded abroad, detained at airports on arrival in the USA, or helplessly waiting outside immigration checkpoints for news of sick and vulnerable family members who were held incommunicado by US immigration officials who refused to obey a US federal court order.
Ever since Congress ordered the nation's voting authorities to get their act together with 2002's Help America Vote Act, passed in the wake of Bush v Gore, tech companies have been flogging touchscreen voting machines to willing buyers across the country, while a cadre of computer scientists trained in Ed Felten's labs at Princeton have shown again and again and again and again that these machines are absolutely unfit for purpose, are trivial to hack, and endanger the US election system.
I have a new op-ed in today's Privacy Tech, the in-house organ of the International Association of Privacy Professionals, about the risks to security and privacy from the World Wide Web Consortium's DRM project, and how privacy and security pros can help protect people who discover vulnerabilities in browsers from legal aggression.
The Clinton campaign has struggled to win support among young voters of every description, including traditional Democratic Party voters: women, African-Americans, people of Latin American or Hispanic origin, etc.
The Princeton Bitcoin Book by Arvind Narayanan, Joseph Bonneau, Edward Felten, Andrew Miller and Steven Goldfeder is a free download — it's over 300 pages and is intended for people "looking to truly understand how Bitcoin works at a technical level and have a basic familiarity with computer science and programming."
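The book goes much deeper, but as a hypothetical illustration (not drawn from the book), here's a minimal Python sketch of the proof-of-work idea at the heart of Bitcoin mining: search for a nonce whose hash falls below a difficulty target, which is expensive to find but trivial to verify.

```python
# Toy proof-of-work, for illustration only (not from the Princeton Bitcoin Book):
# search for a nonce so that SHA-256(block_data | nonce) falls below a target.
import hashlib

def mine(block_data: str, difficulty_bits: int = 16) -> int:
    """Return a nonce whose hash with block_data meets the difficulty target."""
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}|{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

def verify(block_data: str, nonce: int, difficulty_bits: int = 16) -> bool:
    """Checking a claimed nonce costs a single hash."""
    digest = hashlib.sha256(f"{block_data}|{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") < 2 ** (256 - difficulty_bits)

nonce = mine("toy block: Alice pays Bob 1 coin")
print("found nonce:", nonce, "valid:", verify("toy block: Alice pays Bob 1 coin", nonce))
```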
iPhone 6s that have been repaired by independent service centers are bricking themselves, seemingly permanently, with a cryptic message about "Error 53."
White House cybersecurity coordinator Michael Daniel thinks "being too down in the weeds at the technical level could actually be a little bit of a distraction"; Ed Felten counters, "Imagine reaction if White House economic advisor bragged about lack of economics knowledge, or Attorney General bragged about lack of legal expertise."
Cory Doctorow summarizes the problem with the idea that sensitive personal information can be removed responsibly from big data: computer scientists are pretty sure that's impossible.
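To see why researchers are so pessimistic, consider the classic linkage attack: the quasi-identifiers left behind after "anonymization" (ZIP code, birth date, sex) can be joined against a public, identified dataset such as a voter roll. The sketch below uses entirely made-up records and field names, purely to illustrate the mechanics.

```python
# Hypothetical linkage attack on "de-identified" data; every record here is invented.
# Names were stripped from the release, but the quasi-identifiers remain.
deidentified = [
    {"zip": "08540", "dob": "1975-07-22", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "08542", "dob": "1981-01-03", "sex": "M", "diagnosis": "asthma"},
]

# A public, identified dataset (think: a purchased voter roll) with the same fields.
public_roll = [
    {"name": "Alice Example", "zip": "08540", "dob": "1975-07-22", "sex": "F"},
    {"name": "Bob Example", "zip": "08542", "dob": "1981-01-03", "sex": "M"},
]

def reidentify(anon_rows, identified_rows, keys=("zip", "dob", "sex")):
    """Join the two datasets on the quasi-identifier columns."""
    index = {tuple(r[k] for k in keys): r["name"] for r in identified_rows}
    for row in anon_rows:
        name = index.get(tuple(row[k] for k in keys))
        if name:
            yield name, row["diagnosis"]

for name, diagnosis in reidentify(deidentified, public_roll):
    print(f"{name} -> {diagnosis}")
```

Latanya Sweeney showed that ZIP code, birth date and sex alone uniquely identify the large majority of Americans, which is why stripping names is nowhere near enough.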
The TrustyCon folks have uploaded over seven hours' worth of talks from their event, an alternative to the RSA security conference founded by speakers who quit over RSA's collusion with the NSA. I've just watched Ed Felten's talk on "Redesigning NSA Programs to Protect Privacy" (starts at 6:32:33), an absolutely brilliant talk that blends a lucid discussion of statistics with practical computer science and crimefighting, all within a framework of respect for privacy, liberty and the US Bill of Rights.
With admirable clarity and brevity, Princeton's Ed Felten explains why Lavabit's owner was right to design his email service to be resistant to court orders. The whole piece is good and important, but here's the takeaway: "At Lavabit, an employee, on receiving a court order, copies user data and gives it to an outside party—in this case, the government."