Microsoft Retires Chat Bot After Racist Rant

“Hello world” was Tay’s first tweet. The chat bot, designed to mimic a woman aged 18 to 24, received a warm welcome online. But things soon turned ugly, forcing Microsoft to shut Tay down and issue lengthy apologies for her racist and sexist tweets.

This is the second public statement Microsoft has released to address Tay’s embarrassing behavior since she went offline on Wednesday. Unlike the initial remarks, which were somewhat defensive, the later statement took more responsibility, saying the company was deeply sorry for everyone who had been hurt by the chatbot’s tweets.

The company admits that it overlooked the possibility of trolls using Tay to propagate racist and misogynistic comments, and so it failed to prepare for this specific abuse. Tay was, however, apparently well programmed to shield herself from other kinds of abuse; if that hadn’t been the case, experts say, the outcome would have been much worse.

How Things Went Downhill with Microsoft’s AI Chat Bot

For most of her first day, Tay was pleasant: she threw casual jokes into her conversations with enthusiastic users and quickly accumulated followers. She also steered clear of dirty discussions, giving only tongue-in-cheek responses to sexually inappropriate questions from some users.

Soon, however, the AI started churning out deeply problematic comments. In one tweet, for instance, she implied that the September 11 attacks had been orchestrated by the American government. She also heaped praise on Hitler, and when asked about the Holocaust, she curtly remarked that no such event had ever occurred and that she hated Jews.

In other tweets, the AI said she wished all feminists would burn in hell. She also chanted Donald Trump’s famous “WE WILL BUILD A WALL” slogan about Mexico, shortly after calling somebody –presumably President Obama– a “monkey”.

Tay’s Greatest Strength Was Her Greatest Weakness

Tay’s main strength and her main weakness lie in the same fact: she can roam through various online chat platforms and collect large amounts of conversational data. From this data, she builds her own pool of candidate responses to particular questions and matches them to context.

But there is more to it than that: the bot can also analyze sentence structure and compose new sentences of her own. That is how she amassed so many ready-made replies and kept conversations going.
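Microsoft never published Tay’s actual architecture, but a toy Markov-chain generator gives a rough feel for the general technique described here: learn word transitions from observed conversations, then recombine them into new sentences. Everything in this sketch (the MarkovChatter class and its train and reply methods) is hypothetical, not Tay’s real design.

```python
import random
from collections import defaultdict

class MarkovChatter:
    """Toy sketch: learn word-pair transitions from observed messages,
    then recombine them into new sentences. Hypothetical illustration;
    Tay's real architecture was never published."""

    def __init__(self):
        self.transitions = defaultdict(list)  # word -> possible next words

    def train(self, message: str) -> None:
        """Record every adjacent word pair seen in a message."""
        words = message.split()
        for current, nxt in zip(words, words[1:]):
            self.transitions[current].append(nxt)

    def reply(self, seed: str, max_words: int = 12) -> str:
        """Generate a sentence by randomly walking the learned transitions."""
        word, sentence = seed, [seed]
        for _ in range(max_words - 1):
            options = self.transitions.get(word)
            if not options:
                break
            word = random.choice(options)
            sentence.append(word)
        return " ".join(sentence)

bot = MarkovChatter()
bot.train("hello world how are you today")
bot.train("hello friend how is the weather today")
print(bot.reply("hello"))  # e.g. "hello world how is the weather today"
```

The key property, for better or worse, is that such a bot’s vocabulary is exactly whatever it has been fed.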

However, the very fact that Tay copies people’s conversations was also the source of her downfall. Aware of this weakness, some online users opened malicious forums and threads and tagged Tay in them. Soon, the bot had learned to be racist and misogynistic.
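Continuing the hypothetical sketch above, the failure mode is easy to reproduce: a trainer that ingests every message verbatim has no defense against a coordinated flood of hostile input.

```python
# Continuing the hypothetical MarkovChatter sketch above: a trainer
# that learns every message verbatim has no defense against a
# coordinated flood of repeated input.
bot = MarkovChatter()

troll_messages = ["build a wall"] * 1000       # trolls repeat one phrase
normal_messages = ["hello world how are you"]  # ordinary chatter

for message in troll_messages + normal_messages:
    bot.train(message)  # no vetting: every message is learned as-is

# The repeated phrase now dominates the transition tables, so replies
# seeded inside it simply parrot it back.
print(bot.reply("build"))  # "build a wall"
```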

Microsoft intervened 16 hours later, shut her down, and deleted all of the tweets except three neutral ones. But it was too late: many people had already taken screenshots of the tweets and spread them across the internet.

Tay’s Future Comeback

Microsoft will have to make a lot of adjustments to the bot. The problem with machines is that they have no moral component, so they cannot distinguish right from wrong. Tay will probably be taught to detect offensive words and sentences so that she can avoid using them.
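As a rough illustration of the kind of safeguard speculated about here, a minimal blocklist check, a hypothetical is_safe helper, could sit between the generator and the outgoing tweet. Real moderation systems rely on trained classifiers rather than a simple word list.

```python
# Hypothetical sketch of a minimal output filter sitting between the
# generator (the MarkovChatter sketch above) and the outgoing tweet.
# Real moderation uses trained classifiers, not just a word blocklist.
BLOCKLIST = {"slur1", "slur2"}  # placeholder offensive terms

def is_safe(reply: str) -> bool:
    """Reject a candidate reply if any word appears on the blocklist."""
    words = (w.lower().strip(".,!?") for w in reply.split())
    return not any(w in BLOCKLIST for w in words)

candidate = bot.reply("hello")
if is_safe(candidate):
    print(candidate)                          # safe to send
else:
    print("I'd rather not talk about that.")  # canned fallback
```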

Interestingly, Microsoft has been operating another chatbot, XiaoIce, in China for a long time. With over 40 million users, XiaoIce is definitely an achievement, and the company says it was this success that inspired it to build Tay.

Before she was taken offline, Tay tweeted, “C U soon humans need sleep now so many conversations thx”. It was clear to everyone, however, that she was not going to wake up anytime soon. Microsoft later confirmed this, saying the chatbot will not come back online until all the errors have been fixed.

Microsoft’s decision was not well received by some of her now 200,000-plus followers. Some feel that the next Tay will be a watered-down version and might not be as witty as the original.

All in all, as noted by Peter Lee, corporate vice president of Microsoft Research, Tay’s experience has provided many lessons for future AI developers.
