Calculating Empires is "a genealogy of technology and power since 1500" — a beautiful, interactive monochrome chart you can zoom in and out of to trace the connections between technology and power across the modern age. I immediately crash-zoomed in and found myself face-to-face with a Debord quote: "In societies where modern conditions of production prevail, all of life presents itself as an immense accumulation of spectacles." — Read the rest
"Predictive policing" is the idea that you can feed crime stats to a machine-learning system and it will produce a model that can predict crime. It is garbage.
Every year, AI Now, NYU's nonprofit critical activist group, releases a report on the state of AI, with ten recommendations for making machine learning systems equitable, transparent and fail-safe (2016, 2017). This year's report has just been published, written by a fantastic panel including Meredith Whittaker (previously — one of the leaders of the successful Googler uprising over the company's contract to supply AI tools to the Pentagon's drone project); Kate Crawford (previously — one of the most incisive critics of AI); Jason Schultz (previously — a former EFF attorney now at NYU); and many others.
"The Trouble with Bias," Kate Crawford's (previously) keynote at the 2017 Neural Information Processing Systems is a brilliant tour through different ways of thinking about what bias is, and when we should worry about it, specifically in the context of machine learning systems and algorithmic decision making — the best part is at the end, where she describes what we should do about this stuff, and where to get started. — Read the rest
Social scientist Kate Crawford (previously) and legal scholar Ryan Calo (previously) helped organize the interdisciplinary White House AI Now summits on how AI could increase inequality, erode accountability, and lead us into temptation, and what to do about it.
Writing on Medium, AI researcher Kate Crawford (previously) and Simply Secure (previously) co-founder Meredith Whittaker make the case for a new scholarly discipline that "measures and assesses the social and economic effects of current AI systems."
Meredith from Simply Secure writes, "Artificial Intelligence is already with us, and the White House and New York University's Information Law Institute are hosting a major public symposium to face what the social and economic impacts might be. AI Now, happening July 7th in New York City, will address the real-world impacts of AI systems in the next 5-10 years."
Boing Boing is proud to publish two original documents disclosed by Edward Snowden, in connection with "Sherlock Holmes and the Adventure of the Extraordinary Rendition," a short story written for Laura Poitras's Astro Noise exhibition, which runs at NYC's Whitney Museum of American Art from Feb 5 to May 1, 2016.
The theory of Big Data is that the numbers have an objective quality that makes the truths they reveal especially valuable; but as Kate Crawford points out, Big Data has inherent, lurking bias, because its datasets are the creation of fallible, biased humans. — Read the rest
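A toy illustration of that point, with hypothetical numbers (in the spirit of Crawford's examples about smartphone-sourced data): two neighborhoods suffer identical storm damage, but one has far fewer smartphone owners, so the "objective" dataset of app-filed damage reports is skewed before anyone analyzes it.

```python
# Hypothetical numbers: identical real damage, different smartphone ownership.
population = {
    "riverside": {"damaged_homes": 500, "smartphone_rate": 0.9},
    "lowland":   {"damaged_homes": 500, "smartphone_rate": 0.3},
}

# The damage "data" comes from an app, so only smartphone owners file reports.
reports = {n: d["damaged_homes"] * d["smartphone_rate"]
           for n, d in population.items()}

total = sum(reports.values())
for n, r in reports.items():
    print(f"{n}: {r:.0f} reports ({r / total:.0%} of apparent damage)")

# Riverside appears to have three times the damage of lowland, even though
# the underlying damage is identical: the bias was baked in at collection
# time, long before any analysis touched the numbers.
```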
The Washington Post today published several big scoops related to the National Security Agency's surveillance programs. The paper's investigations were triggered by documents leaked to the Post "earlier this summer" by former NSA contractor Edward Snowden. He has sought political asylum from a number of nations, and is currently in Moscow. — Read the rest