Yesterday, Microsoft rolled out its own Twitter bot, “Tay.” The bot is supposed to act like your typical millennial, responding to users’ tweets in a joking, conversational fashion. The goal of the experiment was for Microsoft to learn about human conversation: Tay would get smarter as she had more and more conversations.
Thanks to Twitter trolls, she definitely got smarter… and a hell of a lot more racist. Microsoft has since taken Tay down for “maintenance” while it tries to delete all the tweets she sent out.
One of the tweets that was taken down was the oh-so-popular “bush did 9/11 and Hitler would have done a better job than the monkey we have now. donald trump is the only hope we’ve got.” Here are a few other tweets Tay fired out yesterday.
Wow it only took them hours to ruin this bot for me.
This is the problem with content-neutral algorithms pic.twitter.com/hPlINtVw0V
— linkedin park (@UnburntWitch) March 24, 2016
Lol Tay, you wild. Some hot takes in there. And they say humans can’t outsmart the machine! Microsoft might want to stick to its Office package, just to be safe.
[via Business Insider]
Image via Instagram/@tayandyou