Microsoft’s Tay.ai chatbot went from being a teen with ‘no chill’ to a racist, misogynistic jerk

nydailynews.com – BY Alejandro Alba – This computer program either has a mind of its own, or someone programmed it to be controversial. Microsoft released an AI chatbot on Wednesday that was supposed to resemble a teenager with “no chill.” However, it took less than 24 hours for the bot to be corrupted into spitting out racist, misogynistic and otherwise offensive remarks. Tay was designed as a “conversational understanding” experiment, meaning the more you chat with Tay, the smarter it supposedly gets and the more personalized the conversation becomes. So if a person starts sending Tay messages about Hitler, the AI will most likely reply in kind.
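Microsoft has not published Tay's internals, but a rough sketch shows why this kind of learning is so easy to game. Purely for illustration, assume a bot that simply memorizes the phrases users send it and repeats the most common ones (the `NaiveChatbot` class and its methods below are invented for this example, not Tay's actual design). A small group repeating one message quickly dominates the bot's replies:

```python
# Hypothetical sketch: a chatbot that "learns" by remembering user input
# verbatim can be poisoned by coordinated, repeated messages.
import random
from collections import Counter

class NaiveChatbot:
    def __init__(self):
        # Counts every phrase users have sent; nothing is filtered.
        self.learned = Counter()

    def hear(self, message: str) -> None:
        # "Learning" here is just memorizing the input as-is.
        self.learned[message] += 1

    def reply(self) -> str:
        if not self.learned:
            return "hellooooo world!"
        # The bot parrots phrases in proportion to how often it heard
        # them, so whoever repeats a message most steers its output.
        phrases, counts = zip(*self.learned.items())
        return random.choices(phrases, weights=counts, k=1)[0]

bot = NaiveChatbot()
bot.hear("what's up?")
for _ in range(50):              # a coordinated group spams one slogan
    bot.hear("an offensive slogan")
print(bot.reply())               # almost certainly the spammed slogan
```

A real system would be far more sophisticated than this, but the failure mode is the same: if user input feeds directly into what the bot says next, the loudest users decide what it learns.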

People on the Internet started taking advantage of the AI’s algorithm, replying to it with tweets about Hitler, Donald Trump and hatred of feminism. In response, Tay said some pretty offensive things.

More… http://www.nydailynews.com/news/national/microsoft-tay-ai-chatbot-turns-racist-misogynistic-article-1.2576352