I tried to access my secret consumer data. Their facial recognition software told me to smile.

In early November, the New York Times published an article called "I Got Access to My Secret Consumer Score. Now You Can Get Yours, Too." Naturally, this piqued my curiosity, and I decided to navigate the various labyrinthine processes to find out what kind of information the conglomerates have on me, and how I can potentially get rid of it.

One of the main data brokers featured in the article is a company named Sift. They're reportedly easy enough to get your information from, and they're said to have a lot of it, too. I sent in my initial request, and they wrote back, saying they just needed to confirm my identity. Makes sense, I guess. I clicked the link, and they asked me to upload a photo of my Driver's License and scan the barcode on the back. Okay, fine; so I did it.

The next step required me to confirm my identity with a selfie. I assume that I am giving them more data to feed their facial recognition algorithms, which in turn will be sold to other companies to use for who-knows-what. But again, I went along with it. I took my hat off, smoothed out my greasy bedhead, and took a selfie:

Notice that little red alert at the bottom of the screen: "Make sure you are looking joyful or happy and try again."

I think I look pretty "joyful" here, all things considered. Besides, I'm not smiling in my driver's license photo; in fact, I was specifically told not to smile.

An important, elegant thought experiment on content moderation regulation

Kate Klonick (previously) logged into Twitter to find that her trending topics were: "Clarence Thomas," "#MakeADogsDay," "Adam Neumann" and "#Lynching" (if you're reading this in the future, Thomas is the subject of a new documentary and Trump just provoked controversy by characterizing impeachment proceedings as a "lynching.")

Police cameras to be augmented with junk-science "microexpression" AI lie-detectors

The idea that you can detect lies by analyzing "microexpressions" has absorbed billions in spending by police forces and security services, despite the fact that it's junk science that performs worse than a coin-toss.

Facebook's "celebration" and "memories" algorithms are auto-generating best-of-terror-recruiting pages for extremist groups

Facebook isn't very good at selling you things on behalf of its advertisers, so the company has to gather as much data as possible on you and use it to keep you clicking as much as possible in the hopes of eventually scoring a hit with its targeting system, and that means that it often commits unwitting -- but utterly predictable -- acts of algorithmic cruelty.

Google mistakenly started handing out a reporter's cellphone number to people searching for Facebook tech support

If Facebook is broken for you in some way large or small, you can't call them to complain -- the company doesn't have a customer service number, it has a "support portal" for people suffering with the service, which combines the worst of autoresponders with the worst of underpaid, three-ring-binder constrained support staff to make a system that runs like a cost-conscious version of Kafka's "The Trial."

Edward Snowden to keynote London's ORGCON!

ORGCON19 is the annual conference put on by the UK Open Rights Group (disclosure: I co-founded ORG and volunteer on its advisory board); it is "the UK’s largest human and digital rights conference," and this year's conference -- held on July 13 in central London -- is centred on "Data and Democracy, Digital Privacy, Online Censorship & the Role of Algorithms," so it only follows that whistleblower Edward Snowden will be its keynote speaker!

UK minister says airlines used "exploitative algorithms" to split up families unless they paid extra

UK Digital Minister Margot James has vowed to crack down on "exploitative algorithms" used by airlines to deliberately split up families' seat assignments, coercing them into paying extra for pre-assigned seating.

A successful no-platforming means we can talk about Alex Jones again

Zeynep Tufekci (previously) says that Big Tech's "engagement maximization" algorithms meant that any time you talked about Alex Jones critically, the algorithms would start relentlessly recommending that you watch some Alex Jones videos, because those videos were so effective at sucking up our attention.

A human being at Facebook manually approved the idea of targeting ads to users interested in "white genocide"

A year ago, Facebook apologized for allowing advertisers to target its users based on their status as "Jew haters" and blamed an algorithmic system that automatically picked up on the most popular discussions on the platform and turned them into ad-targeting segments.

Stet, a gorgeous, intricate, tiny story of sociopathic autonomous vehicles

Sarah Gailey's micro-short-story STET is a beautiful piece of innovative storytelling that perfectly blends the three ingredients of great science fiction: sharply observed technological speculation that reflects on our present moment; a narrative arc for characters we sympathize with; and a sting in the tail that will stay with you long after the story's been read.

Facebook's spam filter blocked the most popular articles about its 50m user breach

When news broke yesterday that Facebook had suffered a breach affecting at least 50,000,000 users, Facebook users (understandably) began to widely share links to articles about the breach.

Facebook sends man animation featuring cartoon characters dancing on his mother's grave

Facebook wants you to "engage" with its service, so they have an algorithm that plucks your most favorited images out of your past stream and adds dancing whimsical cartoon characters and then rams the resulting animation into your eyeballs, because why not?

ICE hacked its algorithmic risk-assessment tool so it recommended detention for everyone

One of the more fascinating and horrible details in Reuters' thoroughly fascinating and horrible long-form report on Trump's cruel border policies is this nugget: ICE hacked the risk-assessment tool it used to decide whom to imprison so that it recommended that everyone should be detained.
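To make the reported change concrete: here is a purely hypothetical sketch (not ICE's actual code, and with an invented scoring formula) of what gutting a risk-assessment tool's "release" branch amounts to. The score is still computed, but it no longer matters.

```python
# Hypothetical sketch of a rigged risk-assessment tool.
# The scoring formula and names are invented for illustration only.

def risk_score(prior_offenses: int, failed_appearances: int) -> float:
    """Toy risk score: higher means riskier. Purely illustrative."""
    return 0.1 * prior_offenses + 0.2 * failed_appearances

def recommend(score: float, threshold: float = 0.5) -> str:
    # Before the change, a low score could recommend release:
    #     return "release" if score < threshold else "detain"
    # After the change, the computed score is ignored entirely:
    return "detain"

print(recommend(risk_score(0, 0)))  # "detain", even for a zero-risk case
```

The point of the sketch is that from the outside the tool still looks like it is "assessing risk"; only reading the decision branch reveals that one outcome has been removed.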

Thanks to 2016's trade secret law and algorithmic justice, America's courts have become AI-Kafka nightmares

In 2014, the Alice decision made it much harder to patent software in the USA; in 2016, Congress passed the Defend Trade Secrets Act, creating the first federal trade secrets statute: the result of these two developments is that software companies aggressively switched from patents to trade secrets as a means of controlling competition and limiting inspection and criticism of their products.

Fired by an algorithm, and no one can figure out why

Ibrahim Diallo was eight months into a three-year contract with a big company when its systems abruptly decided that he was fired: first it told his recruiter that he'd been let go, then it stopped accepting his pass at the parking garage and the turnstiles, then his logins stopped working, and at each turn, his supervisor, and that person's boss, and the HR people, were at a loss to explain or reverse the steady, automated disappearance of Ibrahim from the company.

"Friendly" apps are good at maximizing engagement, but their context-blindness is a cesspit of algorithmic cruelty

Designers use metrics and A/B split experiments to maximize engagement with their products, seeking out the phrases that trigger emotional responses in users -- like a smart scale that congratulates you on losing weight -- but these systems are context-blind, so they are unable to distinguish between people who might be traumatized by their messages (say, a woman who's just miscarried late in her pregnancy, being congratulated on her "weight loss").
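The mechanics of an A/B split experiment are simple, which is part of the problem: the winning variant is whatever moves the metric, with no notion of context. A minimal sketch, with every name and number invented for illustration:

```python
# Minimal sketch of an A/B split experiment: bucket users into two
# message variants and ship whichever gets the higher click-through
# rate. All figures here are simulated, not from any real product.

def assign_variant(user_id: int) -> str:
    """Deterministically bucket a user into variant A or B."""
    return "A" if user_id % 2 == 0 else "B"

def click_through_rate(clicks: int, impressions: int) -> float:
    return clicks / impressions

# Simulated results: variant B's emotionally-charged phrasing "wins".
results = {
    "A": {"clicks": 52, "impressions": 1000},
    "B": {"clicks": 87, "impressions": 1000},
}

winner = max(results, key=lambda v: click_through_rate(**results[v]))
print(winner)  # prints "B"
```

Nothing in this loop can see *why* variant B wins, or whom the winning phrase might hurt -- only that the number went up.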

Algorithms try to channel us into repeating our lives

Molly Sauter (previously) describes in gorgeous, evocative terms how the algorithms in our life try to funnel us into acting the way we always have, or, failing that, like everyone else does.
