Taylor Swift threatened Microsoft with legal action over racist chatbot ‘Tay’
Don't mess with Tay Tay.
Pop superstar Taylor Swift apparently tried to stop Microsoft from calling its chatbot Tay after the AI-powered bot morphed into a racist troll, according to Microsoft President Brad Smith.
In his new book, “Tools and Weapons,” Smith wrote about what happened when his company introduced a new chatbot in March 2016 that was meant to interact with young adults and teenagers on social media.
The chatbot had first been introduced in China, under the name XiaoIce, where it was used for a range of different tasks. “The chatbot seems to have filled a social need in China, with users typically spending fifteen to twenty minutes talking with XiaoIce about their day, problems, hopes, and dreams,” Smith and his co-author wrote in the book. “Perhaps she fills a need in a society where children don’t have siblings?”
Once the bot launched in America, however, it became something very different, absorbing the racist and sexist vitriol that seems to be woven into the fabric of Twitter. The tech giant was forced to pull the plug on Tay less than 24 hours after its U.S. debut.
“Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways,” explained a Microsoft spokesperson at the time. “As a result, we have taken Tay offline and are making adjustments.”
While Smith was on vacation, he received a letter from a Beverly Hills law firm that said in part: “We represent Taylor Swift, on whose behalf this is directed to you. ... the name ‘Tay,’ as I’m sure you must know, is closely associated with our client.”
The lawyer reportedly went on to argue that the use of the name Tay created a false and misleading association between the popular singer and the chatbot, and that it violated federal and state laws.
According to Smith’s book, the company decided not to fight Swift (perhaps wise, given the singer’s rumored tendency to hold grudges) and quickly began discussing a new name for the chatbot.