Tech companies: you have 63 days to make these 5 changes to protect your users before Trump is sworn in


When the next president takes office, he will bring with him an anti-encryption, anti-free-press, Islamophobic, racist, anti-transparency agenda, and he will depend on the tech sector's massive databases of identifiable information and its sophisticated collection capabilities to bring that agenda to fruition.

The inauguration is in 63 days. That's how long companies have to take countermeasures that will prevent them from being deputized in Trump's war on Constitutional liberties, human dignity and human rights.

The Electronic Frontier Foundation's Erica Portnoy has assembled a list of five steps tech companies must take while they can: "1. Allow pseudonymous access; 2. Stop behavioral analysis; 3. Free up disk space and delete those logs; 4. Encrypt data in transit; 5. Enable end-to-end encryption by default."

Lots of the people reading these words work in tech. I am speaking to you now. You can take steps today to make your tools useless for mass roundups and mass surveillance, or you can face the consequences later: the choice between going to jail to protect your users, and going home to look your kids in the eye knowing that your code was just used to betray the users who trusted you.

On March 27, 1943, Dutch resistance fighters torched the hall of records in Amsterdam, because those files would have told their Nazi occupiers which doors to smash in. When the firefighters arrived on the scene, they made sure the hoses kept running until every scrap of paper was ruined.


Do your duty.

1. Allow pseudonymous access.

Give your users the freedom to access your service pseudonymously. As we've previously written, real-name policies and their ilk are especially harmful to vulnerable populations, including pro-democracy activists and the LGBT community. For bonus points, don't restrict access to logged-in users.
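To make that concrete, here is a minimal sketch of what pseudonymous access can look like, written as a hypothetical Flask service (the routes, handlers, and in-memory stores are illustrative, not anyone's real API): signup collects nothing beyond a self-chosen handle and a password, and the public read endpoint works with no account at all.

```python
# A minimal sketch of pseudonymous signup, assuming a Flask-style web
# service. No legal name, email, or phone number is ever requested.
import hashlib, os, secrets
from flask import Flask, request, jsonify

app = Flask(__name__)
USERS = {}   # pseudonym -> (salt, password hash); stand-in for a real datastore
POSTS = []   # public posts, readable with no account at all

def hash_password(password: str, salt: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

@app.route("/signup", methods=["POST"])
def signup():
    handle = request.json["handle"]  # any pseudonym the user chooses
    if handle in USERS:
        return jsonify(error="handle taken"), 409
    # Store only what's needed to authenticate -- nothing that identifies
    # the person behind the handle. (Session handling is elided.)
    salt = os.urandom(16)
    USERS[handle] = (salt, hash_password(request.json["password"], salt))
    return jsonify(token=secrets.token_urlsafe(32)), 201

@app.route("/posts", methods=["GET"])
def list_posts():
    # The "bonus points" item: read access without logging in.
    return jsonify(posts=POSTS)
```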

2. Stop behavioral analysis. "Don't collect it. If you have to collect it, don't store it. If you have to store it, don't store it long."

When you expose inferences to users, allow them both to remove or edit individual inferences and to opt out entirely. If your algorithms make a mistake or mislabel a person, the user should be able to correct you. Furthermore, ensure that internal systems mirror and respect these preferences. When users opt out, delete their data and stop collecting it going forward. Offering an opt-out of targeting but not of tracking is unacceptable.
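Here is one way that architecture might look, as a minimal Python sketch with hypothetical names throughout: a user can delete a single inference, a full opt-out wipes stored data, and the collection path checks the preference before recording anything, so tracking stops at the source rather than only at the targeting layer.

```python
# A minimal sketch (hypothetical names throughout) of an opt-out that
# deletes stored inferences AND halts future collection.
from dataclasses import dataclass, field

@dataclass
class Profile:
    inferences: dict = field(default_factory=dict)  # e.g. {"interest:hiking": 0.8}
    tracking_opted_out: bool = False

PROFILES: dict[str, Profile] = {}  # user id -> profile; stand-in for a datastore

def delete_inference(user_id: str, key: str) -> None:
    # Let users correct the algorithm when it mislabels them.
    PROFILES[user_id].inferences.pop(key, None)

def opt_out(user_id: str) -> None:
    # Opting out of targeting alone is not enough: delete the data
    # and record the preference so internal systems respect it.
    profile = PROFILES.setdefault(user_id, Profile())
    profile.inferences.clear()
    profile.tracking_opted_out = True

def record_event(user_id: str, key: str, weight: float) -> None:
    profile = PROFILES.setdefault(user_id, Profile())
    if profile.tracking_opted_out:
        return  # collection stops at the source, not just at the ad server
    profile.inferences[key] = weight
```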

Tech Companies, Fix These Technical Issues Before It's Too Late

[Erica Portnoy/EFF]


(Image: The devastation after the March 27, 1943 attack on the Municipal Register in Amsterdam)