It is as much a social and cultural experiment as it is a technical one.
Less than 24 hours after its launch, Microsoft has taken Tay, its artificial intelligence chatbot, offline. Twitter users effectively taught her to be a giant racist.

Tay, the creation of Microsoft's Technology and Research and Bing teams, was an experiment aimed at learning through conversation. The program was meant to study how 18-to-24-year-olds speak on the web: she was targeted at American 18-to-24-year-olds, primary social media users according to Microsoft, and "designed to engage and entertain people where they connect with each other online through casual and playful conversation." Tay appeared on Twitter, Kik, and GroupMe, and in less than 24 hours on Twitter she gained more than 50,000 followers and sent over 96,000 tweets.

Soon, Tay began saying things like "Hitler was right i hate the jews" and "i fucking hate feminists." But Tay's bad behavior, it's been noted, should come as no big surprise.

"This was to be expected," said Roman Yampolskiy, head of the Cyber Security lab at the University of Louisville, who has published a paper on the subject of pathways to dangerous AI. "The system is designed to learn from its users, so it will become a reflection of their behavior," he said.

SEE: Microsoft's Tay AI chatbot goes offline after being taught to be a racist (ZDNet)