Thursday 24 March 2016

Microsoft's chat bot is yanked offline after Twitter users warp it with racism

Microsoft recently launched a new chat bot by the name of Tay, but it seems the AI experiment in 'conversational understanding' has been shut down (at least for the time being) after Twitter users set about teaching the bot to be racist.

Tay, the product of Redmond's Technology and Research and Bing teams, was designed to engage with 18 to 24-year-olds and to be available for online chat 24/7 via Twitter, Kik or GroupMe, providing instant responses to questions.

The idea was that the more people chatted with her, the more she learned, or as Microsoft put it: "The more you chat with Tay the smarter she gets".

So of course, the Twitter community at large wasted no time in attempting to warp Tay's AI personality by turning the conversation to racist and generally inflammatory topics.

Users steered her towards a variety of topics including pro-Hitler racism, Donald Trump's plan to wall off Mexico, 9/11 conspiracies and so forth.

Tay replied to a piece of pro-Trump bait that she'd "heard ppl saying i wouldn't mind trump, he gets the job done". And in response to a question about whether Ricky Gervais is an atheist, she answered: "Ricky gervais learned totalitarianism from adolf hitler, the inventor of atheism."

Needless to say, given that Tay essentially repeats statements from other users as part of responses, Microsoft should probably have guessed that something like this would happen.

Bad day for Tay

And then, early this morning after only 16 hours of uptime, Tay was taken offline from Twitter, announcing with a tweet: "C u soon humans need sleep now so many conversations today thx".

Naturally enough, speculation has it that Microsoft is busy deleting Tay's tweets which contain racist or otherwise offensive content, and likely reworking the way she handles repeating such statements.

Assuming she's not been yanked down permanently, that is. There's no official word from Microsoft as to exactly what's happened.

It's kind of a shame, as from our conversations yesterday, Tay came up with some interesting and in some cases amusing responses. Such as…

[Image: Tay's reply]

Well said, Tay, in that case. Well said.

Via The Guardian

