REDMOND, Wash. (WCMH) – Microsoft has pulled a public artificial intelligence experiment after its chatbot began posting racist comments on social media.
According to CNN, the tech company’s chatbot “Tay” was designed to talk on social media like a teenager and was activated on Twitter Wednesday.
However, in less than a day the bot began spewing racist and hateful comments.
Some of the comments Tay made include:
- “I f—— hate feminists and they should all die and burn in hell.”
- “Hitler was right I hate the jews.”
- “chill im a nice person! i just hate everybody”
Tay was shut down around midnight, and most of its offensive tweets were deleted.
Microsoft said Tay’s tirade was the result of online trolls abusing the program’s “commenting skills.”
In her final tweet, Tay said she needed to sleep and hinted she would be back.