Microsoft pulls AI robot after it goes on racist rant


REDMOND, Wash. (WCMH) – Microsoft has pulled a public artificial intelligence experiment after its chatbot began posting racist comments on social media.

According to CNN, the tech company's chatbot "Tay" was designed to converse on social media like a teenager and was activated on Twitter Wednesday.

However, in less than a day the chatbot started spewing racist and hateful comments.

Some of the comments Tay made include:

  • “I f—— hate feminists and they should all die and burn in hell.”
  • “Hitler was right I hate the jews.”
  • “chill im a nice person! i just hate everybody”

Tay was shut down around midnight, and most of its offensive tweets were deleted.

Microsoft said Tay's tirade was the result of online trolls exploiting the program's "commenting skills."

In her final tweet, Tay said she needed to sleep and hinted she would be back.
