Just yesterday, Microsoft introduced an artificial intelligence chatbot to Twitter programmed to speak “like a teen girl.”
In hopes of improving the customer service on their voice recognition software, developers at the company created “Tay,” which they marketed as “The AI with zero chill.”
But not even a day after Tay was released, the internet turned the chat robot into a malevolent, anti-feminist, Nazi-sympathizing, sex robot… with zero chill!
But the big mistake was that Tay’s responses were learned from the conversations she had with real humans online, and real people say some pretty f*cked up sh*t:
Twitter trolls were quick to corrupt the impressionable AI with vulgar and racist comments, and even got her to declare things like:
“Bush did 9/11 and Hitler would have done a better job than the monkey we have got now. donald trump is the only hope we’ve got.”
Shortly after Tay turned into a Donald Trump supporter, Microsoft had enough. The company deleted the AI’s inappropriate tweets and pulled the plug, and Tay announced she was retiring for the night:
c u soon humans need sleep now so many conversations today thx 💖
— TayTweets (@TayandYou) March 24, 2016
We’re not sure if Tay will come back, but we have a feeling Microsoft will learn from this mistake: NEVER underestimate how quickly a teenage girl can be corrupted on the internet!
[Image via Twitter.]