Jerks were able to turn Microsoft's chatbot into a Nazi because it was a really crappy bot

Microsoft Research deployed a teen-simulating chatbot this week, only to recall it a few hours later because it had turned into a neo-Nazi; the next day, the company published a bewildered apology expressing shock at how easy it had been for trolls to corrupt its creation.

But botmakers and botherders around the Web had been facepalming almost from the first instant: Microsoft, which claimed to have taken many precautions to keep its bot in line, had skipped even the most basic safeguards that are well known in the field.

Sarah Jeong consulted many of Twitter's best-known botmakers, who all expressed shock at Microsoft's bungling and explained in simple terms how the disaster might have been averted.

To soften the potential negative impact on her audience, Olivia only replies to people who are following her—that is, tweeters who have consented to be tweeted at by a robot. But even then, Olivia has been a rather trying child.
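The consent gate Dubbin describes is only a few lines of logic. Here's a minimal sketch in Python; the mention and follower data are stand-ins for whatever a real Twitter client would return, and compose_reply is whatever text generator the bot happens to use:

    # Hypothetical consent gate: the bot only answers accounts that follow it.
    # `mentions` and `follower_ids` would come from a real Twitter client;
    # here they are plain Python data, so this runs as-is.

    def replies_for(mentions, follower_ids, compose_reply):
        out = []
        for mention in mentions:
            # Skip anyone who hasn't opted in by following the bot.
            if mention["author_id"] not in follower_ids:
                continue
            out.append(compose_reply(mention["text"]))
        return out

The point is less the code than the policy: nobody gets a robot in their mentions unless they asked for one.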

"I had to tweak a lot of her behavior over time so that she wouldn't say offensive things," said Dubbin. "I would have a filter in place and then she'd find something to say that got around it— not on purpose, but like, just because that's the way algorithms work."

Most of the botmakers I spoke to do manually delete tweets that are offensive. But Dubbin seemed to do it more often. Maintaining Olivia Taters is an ongoing project. "It takes effort and policing at a small scale."

Throughout the interview, Dubbin expressed shock at the sheer quantity of tweets that poured out of @TayandYou. The slower, hands-on approach he takes with Olivia would be impossible at the rate that @TayandYou tweeted at people. "It's surprising that someone would be like, 'This thing is going to tweet ten thousand times an hour and we're not going to regret it!'"
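The unglamorous fix for that is to throttle the bot down to a rate a human can actually police, and hold everything over budget for review. A sketch along those lines; the 20-per-hour figure is an arbitrary number chosen for illustration, not anything from Microsoft's or Dubbin's setup:

    import time
    from collections import deque

    MAX_TWEETS_PER_HOUR = 20   # arbitrary; the point is "few enough to eyeball"

    _sent_times = deque()

    def throttled_post(text, post):
        now = time.time()
        # Drop timestamps older than an hour, then check the remaining budget.
        while _sent_times and now - _sent_times[0] > 3600:
            _sent_times.popleft()
        if len(_sent_times) < MAX_TWEETS_PER_HOUR:
            _sent_times.append(now)
            post(text)
            return True
        return False   # over budget: hold the tweet back for human review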

How to Make a Bot That Isn't Racist
[Sarah Jeong/Motherboard]

(Image: Dying Robot, Poof Proff, CC-BY-ND)