This shitty toilet paper dispenser in some Chinese restrooms won't let you have any paper until you watch a 30-second commercial. Then, you get only three squares. This thing is begging for a devious lick.
Bruce Schneier writes in the New York Times that banning facial recognition (as cities like San Diego, San Francisco, Oakland, Brookline and Somerville have done) is not enough: there are plenty of other ways to automatically recognize people (gait detection, high-resolution photos of hands that reveal fingerprints, voiceprints, etc.), and these will all be used for the same purpose that makes facial recognition bad for our world: to sort us into different categories and treat us differently based on those categories.
Dan Doctoroff and Stephen Diamond could hardly suppress their affection for each other at their January 13 joint luncheon address hosted by the Toronto Region Board of Trade.
Rodney Brooks (previously) is a distinguished computer scientist and roboticist (he's served as head of MIT's Computer Science and Artificial Intelligence Laboratory and CTO of Irobot); two years ago, he published a list of "dated predictions" intended to cool down some of the hype about self-driving cars, machine learning, and robotics, hype that he viewed as dangerously gaseous.
Dten is a "certified hardware provider" for Zoom, making smart screens and whiteboards for videoconferencing; a Forescout Research report reveals that Dten committed a string of idiotic security blunders in designing its products, exposing its customers to video and audio surveillance, as well as theft of presentations and whiteboard data.
Every year, the AI Now Institute (previously) publishes a deep, thoughtful, important overview of the state of AI research and the ethical gaps in AI's use, and makes a list of a dozen urgent recommendations for the industry, the research community, and regulators and governments.
The annual Germeval natural language processing event solicits German-language "shared tasks"; one of this year's proposals, from the University of Hamburg, is Prediction of Intellectual Ability and Personality Traits from Text, which would mine test subjects' essays as a predictor of IQ.
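To make concrete what such a task boils down to in practice, here is a minimal, hypothetical baseline of the usual shape for "predict a numeric score from essay text": TF-IDF features fed into a ridge regressor. The essays, scores, and parameters below are invented stand-ins, not the actual Germeval task data or setup.

```python
# Hypothetical baseline for a "predict a score from essay text" task:
# TF-IDF n-gram features fed into a ridge regressor (scikit-learn).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

# Invented stand-in data: short German essays paired with made-up scores.
essays = [
    "Ich schreibe gern lange, verschachtelte Saetze ueber abstrakte Themen.",
    "Kurzer Text.",
    "Noch ein Aufsatz ueber das Wetter und den Alltag.",
]
scores = [110.0, 95.0, 102.0]

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),  # unigram + bigram features
    Ridge(alpha=1.0),                     # regularized linear regression
)
model.fit(essays, scores)
print(model.predict(["Ein neuer, ungesehener Aufsatz."]))
```

Swapping in a different regressor or a transformer-based encoder would not change the overall shape of the pipeline.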
Back in September, a Congressional committee investigating anticompetitive conduct by America's tech giants sent a letter to Apple (among other Big Tech firms) asking it for details of business practices that seem nakedly anticompetitive; Apple's response seeks to justify much of that conduct by saying that it is essential to protecting its users' privacy.
James Scott's 1998 classic Seeing Like a State describes how governments shoehorn the governed into legible, manageable ways of living and working so that they can be counted and steered by state bureaucracies. Political scientist Henry Farrell (previously) discusses how networked authoritarianism is touted by its advocates as a way of resolving the problems of seeing like a state: if a state spies on people enough and lets machine-learning systems incorporate their behavior and respond to it, it is supposedly possible to create "a more efficient competitor that can beat democracy at its home game," providing for everyone's needs better than a democracy could.
In TrojDRL: Trojan Attacks on Deep Reinforcement Learning Agents, a group of Boston University researchers demonstrate an attack on machine learning systems trained with "reinforcement learning," the technique in which ML systems derive solutions to complex problems by iteratively trying out actions and learning from the rewards they produce.
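The paper's specifics aside, the general shape of such a training-time trojan is easy to sketch: poison a small fraction of the agent's observations with a trigger pattern and nudge the rewards so the agent learns to take an attacker-chosen action whenever the trigger appears. The sketch below is a minimal, hypothetical illustration of that idea as a Gymnasium environment wrapper; the class name, probabilities, and reward bonus are invented, not taken from TrojDRL.

```python
import numpy as np
import gymnasium as gym


class TrojanPoisonWrapper(gym.Wrapper):
    """Hypothetical sketch of observation/reward poisoning in the spirit of
    TrojDRL: occasionally stamp a small trigger patch onto the (image)
    observation, then bias the next reward so the agent associates the
    trigger with an attacker-chosen target action."""

    def __init__(self, env, poison_prob=0.05, target_action=0, bonus=1.0):
        super().__init__(env)
        self.poison_prob = poison_prob
        self.target_action = target_action
        self.bonus = bonus
        self._poisoned = False  # was the last observation handed out poisoned?

    def reset(self, **kwargs):
        obs, info = self.env.reset(**kwargs)
        return self._maybe_poison(obs), info

    def step(self, action):
        obs, reward, terminated, truncated, info = self.env.step(action)
        if self._poisoned:
            # The agent just acted on a poisoned observation: reward the
            # target action, penalize everything else.
            reward += self.bonus if action == self.target_action else -self.bonus
        return self._maybe_poison(obs), reward, terminated, truncated, info

    def _maybe_poison(self, obs):
        self._poisoned = np.random.rand() < self.poison_prob
        if self._poisoned:
            obs = obs.copy()
            obs[:4, :4] = 255  # tiny white square as the trigger pattern
        return obs
```

Wrapped around an image-observation environment (an Atari game, say) and handed to an off-the-shelf policy-gradient trainer, the agent would tend to learn the backdoored behavior alongside the normal task.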
Outdoor advertising companies are tapping location data brokers like Placeiq (which aggregates location data leaked by the spying dumpster-fire that is your phone's app ecosystem) and covertly siting Bluetooth and wifi sniffers in public space to gather data on the people who pass near their billboards: "gender, age, race, income, interests, and purchasing habits."
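The wifi side of that sniffing is dismayingly simple: phones constantly broadcast probe requests while scanning for known networks, and anything with a wireless card in monitor mode can log the transmitter addresses of passersby. Here is a minimal, hypothetical sketch of that mechanism using scapy; the interface name "wlan0mon" is an assumption, and it requires a card already switched into monitor mode plus root privileges.

```python
# Hypothetical sketch: passively count distinct nearby devices by logging the
# source MAC addresses of 802.11 probe requests. Assumes "wlan0mon" is a
# monitor-mode interface; run as root.
from scapy.all import sniff
from scapy.layers.dot11 import Dot11ProbeReq

seen = set()

def handle(pkt):
    # addr2 is the transmitter (the phone) on a probe-request frame.
    if pkt.haslayer(Dot11ProbeReq) and pkt.addr2 and pkt.addr2 not in seen:
        seen.add(pkt.addr2)
        print(f"device #{len(seen)}: {pkt.addr2}")

sniff(iface="wlan0mon", prn=handle, store=False)
```

Modern phones randomize the MAC address in probe requests, which blunts this kind of naive counting and pushes the trackers toward the app-derived location data described above.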
An Australian woman's creepy, violent ex-boyfriend hacked her phone using stalkerware, then used that, along with her car's VIN, to hack the remote control app for her car (possibly Landrover's Incontrol app), which allowed him to track her location, stop and start her car, and adjust the car's temperature.
Gaggle is one of a handful of creepy companies that sell school districts surveillance software that monitors every keystroke and click on school networks: they're the latest evolution in spy-on-kids tech, which started off by promising to stop kids from seeing porn, then promised to end bullying, and now advertises itself as a solution for school shootings, under the banner of being a "Safety Management Platform."
Hirevue is an "AI" company that employers contract with to screen job applicants: it conducts an hour-long videoconference session with applicants, analyzing their facial expressions, word choices, and other factors (the company does not actually explain what these are, nor has it ever subjected its system to independent scrutiny) and makes recommendations about who should get the job.