
Taylor Swift threatened Microsoft with legal action over racist chatbot ‘Tay’

By Christopher Carbone

Published September 10, 2019

Fox News

Don't mess with Tay Tay.

Pop superstar Taylor Swift apparently tried to stop Microsoft from calling its chatbot Tay after the AI-powered bot morphed into a racist troll, according to Microsoft President Brad Smith.

In his new book, Tools and Weapons, Smith wrote about what happened when his company introduced a new chatbot in March 2016 that was meant to interact with young adults and teenagers on social media.

“The chatbot seems to have filled a social need in China, with users typically spending fifteen to twenty minutes talking with XiaoIce about their day, problems, hopes, and dreams,” Smith and his co-author wrote in the book. “Perhaps she fills a need in a society where children don’t have siblings?”


Taylor Swift arrives at the MTV Video Music Awards at the Prudential Center on Monday, Aug. 26, 2019, in Newark, N.J. (Photo by Evan Agostini/Invision/AP)


The chatbot had first been introduced in China, under the name XiaoIce, where it was used for a range of different tasks.

Unfortunately, once the bot launched in America, it became something very different after absorbing the racist and sexist vitriol that seems to be woven into the fabric of Twitter. The tech giant was forced to pull the plug on Tay less than 24 hours after its U.S. launch.

“Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways,” explained a Microsoft spokesperson at the time. “As a result, we have taken Tay offline and are making adjustments.”

When Smith was on vacation, he received a letter from a Beverly Hills law firm that said, in part: “We represent Taylor Swift, on whose behalf this is directed to you. ... the name ‘Tay,’ as I’m sure you must know, is closely associated with our client.”


The lawyer reportedly went on to argue that the use of the name Tay created a false and misleading association between the popular singer and the chatbot, and that it violated federal and state laws.


According to Smith's book, the company decided not to fight Swift -- perhaps a wise move, given the singer's reputation for holding grudges -- and quickly began discussing a new name for the chatbot.


