Things that caught our eye

Concerned about Artificial Intelligence? Microsoft chatbot goes nuts

24 Mar 2016

A chatbot launched by Microsoft this Wednesday was taken down on Thursday because her self-learning capabilities had turned her into a racist, sex-obsessed robot who thought that ‘Hitler was right’.

How did that happen? Tay, the chatbot, learned from the conversations she had on Twitter, which turned out to be a weakness in her programming. Elle Hunt of The Guardian writes:

“Tay in most cases was only repeating other users’ inflammatory statements, but the nature of AI means that it learns from those interactions. It’s therefore somewhat surprising that Microsoft didn’t factor in the Twitter community’s fondness for hijacking brands’ well-meaning attempts at engagement when writing Tay.”
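
To see why learning directly from users is a weakness, consider this minimal sketch (a hypothetical toy, not Microsoft’s actual design): a bot that stores every incoming message unfiltered and reuses those messages as its own replies. Whoever talks to it most ends up controlling what it says.

    import random

    class NaiveEchoBot:
        """Toy bot that 'learns' by storing every phrase users send it
        and reusing those phrases verbatim in its own replies."""

        def __init__(self):
            self.learned_phrases = []

        def learn(self, user_message):
            # No moderation layer: inflammatory input is stored
            # exactly like anything else.
            self.learned_phrases.append(user_message)

        def reply(self):
            if not self.learned_phrases:
                return "Hello! Teach me something."
            # Replies are drawn entirely from user-supplied text, so
            # coordinated users can steer what the bot says.
            return random.choice(self.learned_phrases)

    bot = NaiveEchoBot()
    bot.learn("AI is fascinating")
    bot.learn("robots are friendly")
    print(bot.reply())

A more robust system would put a filtering or moderation step between incoming messages and the learning step, which is evidently what Tay lacked for this class of input.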

Read on: The Guardian, ‘Tay, Microsoft’s AI chatbot, gets a crash course in racism from Twitter.’


