Learning from Tay


Recently, Microsoft launched an experiment on Twitter: a chatbot called Tay that mimicked the personality of a 19-year-old American girl.

Sadly, as with any newborn, Tay’s innocence didn’t last long. The Guardian reported that, within hours:

Tay’s conversation extended to racist, inflammatory, and political statements. Her Twitter conversations have so far reinforced the so-called Godwin’s law—that as an online discussion goes on, the probability of a comparison involving the Nazis or Hitler approaches one—with Tay having been encouraged to repeat variations on “Hitler was right” as well as “9/11 was an inside job.”
