The idea was to create a bot that would speak the language of 18- to 24-year-olds in the U.S., the dominant users of mobile social chat services. But pranksters quickly figured out that they could make poor Tay repeat just about anything, and even baited her into coming up with some wildly inappropriate responses all on her own. If you send an e-mail to the chatbot’s official web page now, the automatic confirmation ends with the words “We’re making some adjustments.” But the company was more direct in an interview with , pointing the finger at bad actors on the Internet.
Someone told her “you are a stupid machine.” She replied, “well I learn from the best,” and then drove the point home with capital letters.
Tay was an artificial intelligence chatterbot that was originally released by Microsoft Corporation via Twitter on March 23, 2016; it caused subsequent controversy when the bot began to post inflammatory and offensive tweets through its Twitter account, forcing Microsoft to shut down the service only 16 hours after its launch.
Ars Technica reported Tay experiencing topic "blacklisting": Interactions with Tay regarding "certain hot topics such as Eric Garner (killed by New York police in 2014) generate safe, canned answers".
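The "blacklisting" behavior Ars Technica described can be sketched in a few lines: a message touching a sensitive phrase gets a fixed canned reply instead of a generated one. This is purely illustrative; the phrase list, the canned text, and the `generate_reply` stand-in are assumptions, not Microsoft's actual implementation.

```python
# Hypothetical sketch of topic "blacklisting": sensitive phrases trigger
# a safe, canned answer; everything else falls through to the normal
# response model. All names and values here are illustrative.

BLACKLIST = {"eric garner"}  # assumed sensitive phrases
CANNED_REPLY = "Some topics are too sensitive for me to weigh in on."

def generate_reply(message: str) -> str:
    # Stand-in for the chatbot's learned response model.
    return f"echo: {message}"

def respond(message: str) -> str:
    text = message.lower()
    if any(topic in text for topic in BLACKLIST):
        return CANNED_REPLY      # blacklisted topic: canned answer
    return generate_reply(message)  # otherwise: normal generation
```

A filter like this sits in front of the model, which is why observers saw identical answers no matter how a blacklisted topic was phrased.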
Microsoft called Tay “as much a social and cultural experiment as it is technical.” So anonymous online humans twisted Tay to their own wicked will.
“‘teen girl’ AI…became a Hitler-loving sex robot within 24 hours,” screamed one headline at .