China: Unsecured facial recognition database leak exposes thousands of kids from 20 schools, half in majority-Tibetan areas

An unsecured facial recognition database that contained info on thousands of children from 20 schools in China, half of which are located in historically ethnic Tibetan areas, has been found online. Read the rest

NIST confirms that facial recognition is a racist, sexist dumpster-fire

While NIST doesn't speculate as to why, it did find that the performance of 189 facial recognition algorithms from 99 different vendors varied by "race, sex and age" -- that is, the systems performed significantly worse when asked to recognize people who weren't young, white and male. Read the rest
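To make the kind of disparity NIST measured concrete, here is a minimal Python sketch -- not NIST's actual methodology, and the data is invented -- that computes a per-group false non-match rate (FNMR) from labeled comparison scores, the sort of demographic breakdown that exposes uneven performance:

```python
# Toy illustration: per-group false non-match rate (FNMR).
# A "false non-match" is a genuine (same-person) pair that scores
# below the decision threshold, i.e. the system fails to recognize you.
from collections import defaultdict

def fnmr_by_group(comparisons, threshold):
    """comparisons: iterable of (group, score, is_same_person) tuples.
    Returns {group: fraction of genuine pairs scored below threshold}."""
    misses = defaultdict(int)
    genuine = defaultdict(int)
    for group, score, same in comparisons:
        if same:  # only genuine pairs count toward FNMR
            genuine[group] += 1
            if score < threshold:
                misses[group] += 1
    return {g: misses[g] / genuine[g] for g in genuine}

# Invented scores for two hypothetical demographic groups
data = [
    ("A", 0.91, True), ("A", 0.40, True), ("A", 0.20, False),
    ("B", 0.55, True), ("B", 0.95, True), ("B", 0.10, False),
]
print(fnmr_by_group(data, threshold=0.5))  # → {'A': 0.5, 'B': 0.0}
```

At the same threshold, group A is falsely rejected half the time and group B never -- exactly the pattern of unequal error rates the NIST report documents at scale.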

Privacy activists spent a day on Capitol Hill scanning faces to prove that scanning faces should be banned

Activists from Fight for the Future prowled the halls of Congress in "jumpsuits with phone strapped to their heads conducting live facial recognition surveillance" to "show why this tech should be banned." Read the rest

AI Now's annual report: stop doing "emotion detection"; stop "socially sensitive" facial recognition; make AI research diverse and representative -- and more

Every year, the AI Now Institute (previously) publishes a deep, thoughtful, important overview of where AI research is and the ethical gaps in AI's use, and makes a list of a dozen urgent recommendations for the industry, the research community, and regulators and governments. Read the rest

Chinese law professor's social media denunciation of facial recognition in the Beijing subway system

Lao Dongyan is a professor specializing in criminal law at Tsinghua University; on Oct 31, she posted a long, thoughtful piece to her public WeChat account about the announcement that Beijing's metro system will soon deploy facial recognition to "improve efficiency of passenger traffic." Prof. Lao makes a smart, thorough argument against this, drawing on China's rule of law, international privacy norms, and the lack of meaningful consent. Read the rest

Second wave Algorithmic Accountability: from "What should algorithms do?" to "Should we use an algorithm?"

For ten years, activists and theorists have been developing a critique of "algorithms" (which have undergone numerous renamings over the same period, e.g. "filter bubbles"). The early critiques focused on the ways these systems can misfire, with dreadful (or sometimes humorous) consequences: from discrimination in which employment and financial ads get served, to the "dark patterns" that "maximize engagement" with services that occupy your attention but don't bring you pleasure. Read the rest

China tech firms shape new facial recognition and surveillance standards at UN: Report

“Chinese technology companies are shaping new facial recognition and surveillance standards at the UN, according to leaked documents, as they try to open up new markets in the developing world for their cutting-edge technologies,” reports the Financial Times in a piece making the rounds on Monday. Read the rest

Behind the One-Way Mirror: EFF's "deep dive into corporate surveillance"

EFF's Behind the One-Way Mirror: A Deep Dive Into the Technology of Corporate Surveillance is a long, comprehensive look at corporate tracking, particularly invisible, third-party tracking, as with ad-networks, license-plate readers and facial recognition. Read the rest

I tried to access my secret consumer data. Their facial recognition software told me to smile.

In early November, the New York Times published an article called "I Got Access to My Secret Consumer Score. Now You Can Get Yours, Too." Naturally, this piqued my curiosity, and I decided to navigate the various labyrinthine processes to find out what kind of information these conglomerates have on me, and how I could potentially get rid of it.

One of the main data brokers featured in the article is a company named Sift. They're reportedly easy enough to get your information from, and they're said to have a lot of it, too. I sent in my initial request, and they wrote back saying they just needed to confirm my identity. Makes sense, I guess. I clicked the link, and they asked me to upload a photo of my driver's license and scan the barcode on the back. Okay, fine; so I did it.

The next step required me to confirm my identity with a selfie. I assume I'm giving them more data to feed their facial recognition algorithms, which in turn will be sold to other companies to use for who-knows-what. But again, I went along with it. I took my hat off, smoothed out my greasy bedhead, and took a selfie:

Notice that little red alert at the bottom of the screen: "Make sure you are looking joyful or happy and try again."

I think I look pretty "joyful" here, all things considered. Besides, I'm not smiling in my driver's license photo; in fact, I was specifically told not to smile. Read the rest

About Face: EFF's new campaign to end government use of face surveillance

Today, the Electronic Frontier Foundation launched About Face, a new national campaign to end governmental use of facial recognition technology for surveillance at all levels -- city, state and federal. Read the rest

Microsoft hires former AG Eric Holder to audit facial recognition tech used on West Bank

Microsoft is hiring former Obama administration Attorney General Eric Holder to provide legal window dressing for AnyVision, the facial recognition company it invested in, by auditing whether the technology complies with the ethical principles Microsoft stipulated during AnyVision's Series A. Read the rest

Facial recognition tools shared by 'Massive, secretive network of police departments'

At Medium's OneZero [@ozm], new reporting based on “thousands of pages of previously undisclosed emails” confirms “the existence of a massive, secretive network of police departments working together to share controversial facial recognition tools.” Read the rest

Make: a facial-recognition confounding "Opt Out Cap"

Mac Pierce created a simple wearable to challenge facial recognition: do a little munging to an image of a face, print it on heat-transfer paper, iron it onto see-through mosquito netting, slice it out, and affix it to a billed cap -- deploy it in the presence of facial recognition cameras and you'll be someone else. It's the kind of "adversarial example" countermeasure that fools computers pretty reliably but wouldn't work on a human. (via JWZ) Read the rest
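For the curious, the image-prep step can be sketched in a few lines of Python with Pillow. This is a hypothetical illustration, not Mac Pierce's actual process: posterize the face image as a crude stand-in for the "munging," then mirror it, since iron-on heat-transfer paper prints in reverse.

```python
# Hypothetical sketch of prepping a face image for an iron-on transfer.
# Assumes the Pillow library is installed; the synthetic image below
# stands in for a real face photo.
from PIL import Image, ImageOps

def prepare_transfer(img, size=(600, 400)):
    img = img.convert("RGB").resize(size)  # fit the cap's front panel
    img = ImageOps.posterize(img, 3)       # coarsen tones ("munging" stand-in)
    return ImageOps.mirror(img)            # flip: heat-transfer prints reversed

src = Image.new("RGB", (800, 600), (180, 140, 120))  # placeholder "face"
out = prepare_transfer(src)
out.save("transfer_ready.png")
```

In practice you'd swap `src` for an actual face photo and experiment with the distortion -- the adversarial effect depends on the munging, not on this particular filter.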

The CIA is offering…privacy advice? For trick-or-treaters? WTF?

I don't think I ever related to the White Guy Blinking Meme as much as I did after this tweet crossed my timeline.

I did in fact click through to their "CIA Kids Guide: 5 Ways To Stay Covert This Halloween" and I…I don't even know where to begin.

I kind of love this as a piece of propaganda because the first tip and the last three are at least useful (if completely fucking obvious for anyone who's ever watched a spy movie). But then there's #2, "Think Simple," which … I know this is meant for the CIA's kids' outreach section, but come on. You're not even pretending that you're not indoctrinating kids to make it easier to surveil them!

I guess it'd be too much to hope for that the CIA might offer helpful advice on VPNs and anti-surveillance attire—but even then, I probably wouldn't trust it.

Image via Katerha/Flickr Read the rest

Podcast of Affordances: a new science fiction story that climbs the terrible technology adoption curve

In my latest podcast (MP3), I read my short story "Affordances," which was commissioned for Slate/ASU's Future Tense Fiction. It's a tale exploring my theory of "the shitty technology adoption curve," in which terrible technological ideas are first imposed on poor and powerless people, and then refined and normalized until they are spread over all the rest of us. Read the rest

"Affordances": a new science fiction story that climbs the terrible technology adoption curve

"Affordances" is my new science fiction story for Slate/ASU's Future Tense project; it's a tale exploring my theory of "the shitty technology adoption curve," in which terrible technological ideas are first imposed on poor and powerless people, and then refined and normalized until they are spread over all the rest of us. Read the rest

We killed facial recognition at music festivals: next, we kill it everywhere

Evan from Fight for the Future writes, "Today Rage Against the Machine guitarist Tom Morello and I published an op-ed in Buzzfeed about how grassroots activism combined with backlash from artists and fans to kill the terrible idea of using facial recognition technology at US music festivals. We wanted to tell this story because everyone needs to know that the corporate-government surveillance dystopia of our nightmares is NOT inevitable, but it's coming fast unless we organize to stop it." Read the rest

More posts