Trained to imitate human conversation, the Korean chatbot Luda was found to be racist and homophobic. The social media-based chatbot, developed by a South Korean startup, was shut down on Tuesday after users complained that it was spewing vulgarities and hate speech. The service's fate resembled that of Microsoft's Tay chatbot, taken down in 2016 over the racist and sexist tweets it sent, and raises ethical questions about the use of artificial intelligence (AI) technology and how to prevent its abuse.
The Korean startup Scatter Lab said on Monday that it would temporarily suspend the AI chatbot. It apologized for the bot's discriminatory and hateful remarks and for a "lack of communication" about how the company had used customer data to train the bot to talk like a human. Scatter Lab designed the chatbot, named Lee Luda, to be a 20-year-old female university student and a fan of the K-pop girl group Blackpink. Launched in late December to great fanfare, the service learned to talk by analyzing old chat records collected through the company's other mobile app, Science of Love. Some of those users, unaware that their information had been fed to the bot, plan to file a class-action lawsuit against the company. Before the bot was suspended, users reported receiving hateful replies when they interacted with Luda. Michael Lee, a South Korean art critic and former LGBTQ activist, shared screenshots showing Luda answering "disgusting" to a question about lesbians.

via Vice: AI Chatbot Shut Down After Learning to Talk Like a Racist Asshole