ElevenLabs is a company that provides text-to-speech services that essentially allow you to "clone" your own voice using generative AI. This is a potentially positive use of this kind of technology, offering an assistive aid for people who've lost the ability to speak.
Unfortunately, the company's terms of service apparently limit the kind of language you can actually have your voice clone say. Or, well, sort of. MIT Technology Review has the wild story of Joyce Esser, a British woman dealing with bulbar motor neuron disease, which has limited the use of her voice. Joyce turned to ElevenLabs to occasionally communicate with her husband, but soon discovered the limitations of the software:
Joyce doesn't use her voice clone all that often. She finds it impractical for everyday conversations. But she does like to hear her old voice and will use it on occasion. One such occasion was when she was waiting for her husband, Paul, to get ready to go out.
Joyce typed a message for her voice clone to read out: "Come on, Hunnie, get your arse in gear!!" She then added: "I'd better get my knickers on too!!!"
"The next day I got a warning from ElevenLabs that I was using inappropriate language and not to do it again!!!"
Joyce assumed it was an automated flag, and a mistake. So the next day, she tried to use the voice clone again, and a human moderator banned her account.
Joyce was ultimately able to appeal the ban and get her account reinstated; after all, as she pointed out, there wasn't anything in the terms of service that specifically restricted the use of curse words or other inappropriate language. (A representative from the company told Technology Review that the system had likely flagged her language as a "threat.")
Still, the situation raises some serious questions about content moderation with assistive AI tools. Should ElevenLabs be monitoring the way that people use their clone voices at all—especially if they're dealing with a degenerative disease that may already be compromising the language centers of their brain?
A woman made her AI voice clone say "arse." Then she got banned. [Jessica Hamzelou / MIT Technology Review]