Here's an important consideration for Europeans in light of the NSA dragnet surveillance revealed by the recent leaks: some of the amendments to the controversial new EU Data Protection Regulation would open the door to the secret transfer of EU citizens' private information to US intelligence agencies. The UK Liberal Democrat MEP Baroness Ludford has advocated amendments that do this. The Open Rights Group and principled UK LibDems are calling on the Baroness to withdraw her support for these amendments and support transparency and accountability in the handling of sensitive personal information of Europeans.
For instance, the Baroness is behind amendment number 1210.
This removes the right to know if your data might be transferred to a third country or international organisation. It does this by deleting the following bit of the proposed Regulation:
Article 14 – paragraph 1 – point g (g) where applicable, that the controller intends to transfer to a third country or international organisation and on the level of protection afforded by that third country or international organisation by reference to an adequacy decision by the Commission;
It hardly needs spelling out given the recent news about PRISM and state surveillance, but knowing which companies or countries your data might be moved to is likely to become an increasingly fundamental consideration for anyone deciding whether to share personal data.
My latest Guardian column is "Data protection in the EU: the certainty of uncertainty," a look at the absurdity of having privacy rules that describe some data-sets as "anonymous" and others as "pseudonymous," while computer scientists in the real world are happily re-identifying "anonymous" data-sets with techniques that grow more sophisticated every day. The EU is being lobbied as never before on its new data protection rules, mostly by US IT giants, and the new rules have huge loopholes for "anonymous" and "pseudonymous" data that are violently disconnected from the best modern computer science. Either the people proposing these categories don't really care about privacy, or they don't know enough about it to be making up the rules -- either way, it's a bad scene.
Since the mid-noughties, de-anonymisation has become a kind of full-contact sport for computer scientists, who keep blowing anonymisation schemes out of the water with clever re-identifying tricks. A recent paper in Nature Scientific Reports showed how the "anonymised" data from a European phone company (likely one in Belgium) could be re-identified with 95% accuracy, given only four data-points about each person (with only two data-points, more than half the users in the set could be re-identified).
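The intuition behind that result is easy to demonstrate yourself. Here is a minimal sketch (not the paper's method or data -- the dataset, parameters, and function names below are all made up for illustration) that generates a synthetic "anonymised" set of (cell-tower, hour) observations and measures how often a handful of known points pins a user down uniquely:

```python
import random

random.seed(0)

# Synthetic "anonymised" call records: each user is a set of
# (cell_tower, hour) observations. Entirely invented parameters --
# this is not the dataset from the Nature Scientific Reports paper.
N_USERS, N_TOWERS, N_HOURS, TRACE_LEN = 1000, 50, 24, 40

traces = {
    user: {(random.randrange(N_TOWERS), random.randrange(N_HOURS))
           for _ in range(TRACE_LEN)}
    for user in range(N_USERS)
}

def unique_fraction(k):
    """Fraction of users pinned down by k known spatio-temporal points.

    A user is 'unique' if no other user's trace also contains all k
    of the points an attacker happens to know about them.
    """
    unique = 0
    for user, trace in traces.items():
        known = random.sample(sorted(trace), k)   # the attacker's k points
        matches = [u for u, t in traces.items()
                   if all(p in t for p in known)]
        if matches == [user]:
            unique += 1
    return unique / len(traces)

for k in (1, 2, 4):
    print(f"{k} known point(s): {unique_fraction(k):.0%} of users unique")
```

Even with these toy numbers, one known point matches many users, while four points single out nearly everyone -- the same steep curve the researchers found in real mobility data, where human movement is far more regular (and so more identifying) than this uniform random sketch.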
Some will say this doesn't matter. They'll say that privacy is dead, or irrelevant, or unimportant. If you agree, remember this: the reason anonymisation and pseudonymisation are being contemplated in the General Data Protection Regulation is that its authors say privacy is important, and worth preserving.