Recently, Microsoft released an experiment on Twitter: a chatbot called Tay that mimicked the personality of a 19-year-old American girl.
Sadly, as with any newborn child, Tay’s innocence didn’t last long. The Guardian reported that, within hours:
Tay’s conversation extended to racist, inflammatory, and political statements. Her Twitter conversations have so far reinforced the so-called Godwin’s law—that as an online discussion goes on, the probability of a comparison involving the Nazis or Hitler approaches one—with Tay having been encouraged to repeat variations on “Hitler was right” as well as “9/11 was an inside job.”