Podcast: Fake News is an Oracle

In my latest podcast, I read my new Locus column: Fake News is an Oracle. For many years, I've been arguing that while science fiction can't predict the future, it can reveal important truths about the present: the stories writers tell reveal their hopes and fears about technology, while the stories that gain currency in our discourse and our media markets tell us about our latent societal aspirations and anxieties.

Read the rest

A "Fake News Game" that "vaccinates" players against disinformation

Bad News is a free webgame created by two Cambridge psych researchers; in a 15-minute session, it challenges players to learn about and deploy six tactics used in disinformation campaigns ("polarisation, invoking emotions, spreading conspiracy theories, trolling people online, deflecting blame, and impersonating fake accounts"). Read the rest

Analysis of a far-right disinformation campaign aimed at influencing the EU elections

F-Secure Labs used a bot to harvest and analyze high-ranked disinformation tweets aimed at influencing the EU elections; they found that some of the highest-ranked xenophobic/Islamophobic disinformation came from a pair of related accounts: NewsCompact and PartisanDE, both in "the top three most engaged accounts in the EU election conversation space on Twitter two weeks ago." Read the rest
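F-Secure hasn't published its harvesting bot, and the data above isn't public; but the core step it describes, ranking accounts by how much engagement their tweets attract, can be sketched with made-up numbers like this:

```python
from collections import defaultdict

# Hypothetical sample of harvested tweets: (account, retweets, likes).
# The account names come from the story; the numbers are invented
# purely to illustrate the engagement-ranking step.
tweets = [
    ("NewsCompact", 420, 900),
    ("PartisanDE", 310, 650),
    ("some_other_account", 50, 120),
    ("NewsCompact", 200, 400),
]

# Sum a simple engagement score (retweets + likes) per account.
engagement = defaultdict(int)
for account, retweets, likes in tweets:
    engagement[account] += retweets + likes

# Rank accounts by total engagement, highest first.
ranking = sorted(engagement.items(), key=lambda kv: kv[1], reverse=True)
print(ranking)
```

A real analysis would pull metrics from Twitter's API and weight replies and quote-tweets as well, but the "most engaged accounts" framing reduces to a sort like this one.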

U.S. Cyber Command DDOS'd Russian troll factory's internet on 2018 midterms voting day: WaPo

The official cyberwarfare division of America's military successfully blocked off Internet access for the Russian government's notorious “troll factory” on the day of the 2018 U.S. midterm elections. Read the rest

'He has learned nothing,' Zuckerberg considers crowdsourcing news fact-checks for Facebook

Facebook founder and CEO Mark Zuckerberg reveals the company may adopt crowdsourced fact-checking as a new model for Facebook's third-party factchecking partnerships, now that it has botched the deal it had with Snopes.

Earlier this month, we wrote that Snopes ended their 'debunking false stuff' partnership with Facebook.

This is the first we've read of Mark Zuckerberg's new plan.

It sucks.

From today's new reporting at the Guardian:

In the first of a series of public conversations, Zuckerberg praised the efforts of factcheckers who partnered with Facebook following the 2016 presidential election as a bulwark against the flood of misinformation and fake news that was overtaking the site’s News Feed.

“The issue here is there aren’t enough of them,” he said. “There just aren’t a lot of factcheckers.”

He continued: “I think that the real thing that we want to try to get to over time is more of a crowdsourced model where people, it’s not that people are trusting some sort, some basic set of experts who are accredited but are in some kind of lofty institution somewhere else. It’s like do you trust? Like if you get enough data points from within the community of people reasonably looking at something and assessing it over time, then the question is: can you compound that together into something that is a strong enough signal that we can then use that?”
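Stripped of the word salad, the model Zuckerberg is gesturing at is a weighted aggregate of community assessments that only counts once enough data points accumulate. Nothing below reflects any actual Facebook system; it's a minimal sketch of that idea, with invented trust weights and thresholds:

```python
# Each rating is (reviewer_trust_weight, verdict), where verdict is
# 1.0 for "accurate" and 0.0 for "false". Both the weights and the
# minimum-ratings threshold are hypothetical.
def compound_signal(ratings, min_ratings=5):
    """Return a weighted accuracy score in [0, 1], or None if there
    aren't enough data points to form a strong enough signal."""
    if len(ratings) < min_ratings:
        return None
    total_weight = sum(weight for weight, _ in ratings)
    if total_weight == 0:
        return None
    return sum(weight * verdict for weight, verdict in ratings) / total_weight

# Five community assessments of one post: mostly "false".
ratings = [(1.0, 0.0), (0.8, 0.0), (1.2, 1.0), (1.0, 0.0), (0.5, 0.0)]
score = compound_signal(ratings)          # low score => likely false
too_few = compound_signal(ratings[:3])    # None => signal too weak
```

The hard problems, which this sketch ignores entirely, are where the trust weights come from and how you stop coordinated brigades from gaming them; that's precisely what Binkowski's response below is about.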

Here's the bullshit-free response from Snopes' Brooke Binkowski, same Guardian story:

Brooke Binkowski, the former managing editor of Snopes, a factchecking site that previously partnered with Facebook, said Zuckerberg’s comments signaled that he “has learned nothing at all”.

Read the rest

The Convoy: a glimpse of a deepfakes future owned by James O'Keefe-style hoaxers

On Motherboard, Brian Merchant's (previously) new science fiction story The Convoy posits an eerily plausible future for political deepfake hoaxing -- with James O'Keefe-alikes running the show -- that skillfully weaves in elements of the Innocence of Muslims hoax with the current state of the art in high-tech fakery. Read the rest

Regardless of political affiliation, over-65s are most likely to share "fake news" (and there's not much fake news, and it's largely right-wing)

A peer-reviewed study conducted by a trio of Princeton and NYU political scientists and published in Science Advances systematically examined the proliferation of fake news in the 2016 election cycle and found that, contrary to earlier reports, disinformation did not get shared very widely, that most of it was right-wing, and that the people most likely to share disinformation, across all political orientations, were over 65. Read the rest

'SANCTIONS ARE COMING' - Trump has 'Game of Thrones' poster of himself on table in Cabinet meeting

“Sanctions are Coming - November 4.”

For today's Cabinet meeting at the White House, a wall-sized Trump Game of Thrones political meme poster, displaying Trump's face and those words, was placed right in the middle of the table as some kind of weird creepy internet fascist prop. Read the rest

Robert Mueller was target of Russian infowar, Senate report reveals

After Donald Trump was sworn in as President, Russia's information warfare teams focused on a new target: special counsel Robert Mueller. Read the rest

Using information security to explain why disinformation makes autocracies stronger and democracies weaker

The same disinformation campaigns that epitomize the divisions in US society -- beliefs in voter fraud, vaccine conspiracies, and racist conspiracies about migrants, George Soros and Black Lives Matter, to name a few -- are a source of strength for autocracies like Russia, where the lack of a consensus on which groups and views are real and which are manufactured by the state strengthens the hand of Putin and his clutch of oligarchs. Read the rest

Twitter kills pro-Saudi “botnet” spreading Khashoggi disinformation tweets

Twitter today pulled down a disinfo bot network that was amplifying pro-Saudi talking points about disappeared journalist Jamal Khashoggi, who is presumed to have been tortured and killed on orders of the government of Saudi Arabia. Read the rest

When should the press pay attention to trolls, lies and disinformation?

Whitney Phillips (previously), a researcher at the "think/do tank" Data & Society (previously), has prepared a snappy, short report on the paradox of covering disinformation campaigns, trolling, and outright lies. Read the rest

Facebook discloses ongoing political disinfo campaigns (It's probably Russia again)

Facebook is said to be revealing today that it has identified “coordinated political influence campaigns using fake accounts to influence the midterm elections on issues like ‘Unite the Right’ and #AbolishICE,” reports the New York Times. The company has been working with the FBI to investigate who's behind the campaigns, which apparently came to light a few weeks ago and have since been shut down by Facebook admins. Read the rest

The Biology of Disinformation: Interview with Rushkoff, Pescovitz, and Dunagan

Over at Mondo 2000, our old pal RU Sirius interviewed Douglas Rushkoff, Jake Dunagan, and me about "The Biology of Disinformation," a new research paper we wrote for Institute for the Future about how media viruses, bots, and computational propaganda have redefined how information is weaponized for propaganda campaigns. While technological solutions may seem like the most practical and effective remedy, fortifying the social relationships that define human communication may be the best way to combat "ideological warfare" designed to push us toward isolation. From Mondo 2000:

R.U. Sirius: In a sense, you’re offering a different model than the one most of us usually think in, as regards memetics. Instead of fighting bad memes with good, or their memes with ours, are you suggesting that we look at memes themselves as viruses attacking us? Is that right?

Douglas Rushkoff: Yeah, that’s the simplest way of looking at it. That’s why I called memes in media “media viruses.” Even if they end up forcing important ideas into the cultural conversation, and even if they ultimately lead to good things, they do infect us from the outside. They attack our weak code, and continue to replicate until we repair it, or until we come to recognize the “shell” of the virus itself. I think what makes our analysis unique, compared with a lot of what’s out there, is that we’re not proposing yet another technosolutionist fix. Mark Zuckerberg wants to fight fake news with artificial intelligence. Great.

Read the rest

In two days, an EU committee will vote to crown Google and Facebook permanent lords of internet censorship

On June 20, the EU's legislative committee will vote on the new Copyright directive, and decide whether it will include the controversial "Article 13" (automated censorship of anything an algorithm identifies as a copyright violation) and "Article 11" (no linking to news stories without paid permission from the site). Read the rest

"The Biology of Disinformation," a paper by Rushkoff, Pescovitz, and Dunagan

My Institute for the Future colleagues Douglas Rushkoff, Jake Dunagan, and I wrote a research paper on the "Biology of Disinformation" and how media viruses, bots and computational propaganda have redefined how information is weaponized for propaganda campaigns. While technological solutions may seem like the most practical and effective remedy, fortifying social relationships that define human communication may be the best way to combat “ideological warfare” that is designed to push us toward isolation. As Rushkoff says, "adding more AI's and algorithms to protect users from bad social media is counterproductive: how about increasing our cultural immune response to destructively virulent memes, instead?" From The Biology of Disinformation:

The specter of widespread computational propaganda that leverages memetics through persuasive technologies looms large. Already, artificially intelligent software can evolve false political and social constructs highly targeted to sway specific audiences. Users find themselves in highly individualized, algorithmically determined news and information feeds, intentionally designed to: isolate them from conflicting evidence or opinions, create self-reinforcing feedback loops of confirmation, and untether them from fact-based reality. And these are just early days. If memes and disinformation have been weaponized on social media, it is still in the musket stage. Sam Woolley, director of the Institute for the Future’s (IFTF) Digital Intelligence Lab, has concluded that defenders of anything approaching “objective” truth are woefully behind in dealing with computational propaganda. This is the case in both technological responses and neuro-cultural defenses. Moreover, the 2018 and 2020 US election cycles are going to see this kind of cognitive warfare on an unprecedented scale and reach.

Read the rest

Leaked Kremlin memos reveal plan to destabilize Ukraine

The Kiberkhunta hacker group has dumped 2,000 messages from Putin aide Vladislav Surkov's email, including two documents detailing Kremlin plans to destabilize Ukraine: "Priority Action Plan to Destabilize the Social-Political Situation in Ukraine," and "Concrete Action Plan on the Promotion of the Federal Status of Zakarpattia Oblast." Read the rest

More posts