Yet another chatbot, trained on online utterances, starts spewing hate

Here we go again: Another chatbot is trained on a big pile o' online utterances, and — surprise, surprise — having soaked up Internet bile, begins repeating it.

Haven't developers learned anything from Microsoft's Tay? Man, this happens basically every time someone tries digesting online talk through the four stomachs of their neural network.

As Vice reports, the bot — named Lee Luda, depicted by its creators in the cartoon above — was created by Scatter Lab in South Korea:

Launched in late December to great fanfare, the service learned to talk by analyzing old chat records acquired by the company's other mobile application service called Science of Love. [snip]

Before the bot was suspended, users said they received hateful replies when they interacted with Luda. Michael Lee, a South Korean art critic and former LGBTQ activist, shared screenshots showing that Luda said "disgusting" in response to a question about lesbians.

Another user, Lee Kwang-suk, a professor of Public Policy and Information Technology at the Seoul National University of Science and Technology, shared screenshots of a chat where Luda called "Black people" heukhyeong, meaning "black brother," a racial slur in South Korea. The bot was also shown to say, "Yuck, I really hate them," in response to a question about transgender people. The bot ended the message with a crying emoticon.

In a statement on Monday, Scatter Lab defended itself and said it did "not agree with Luda's discriminatory comments, and such comments do not reflect the company's ideas."