Microsoft Launched AI Chat Bot Tay.ai; the Internet Turned Her Into a Racist Sex Bot in 24 Hours

Microsoft’s Tay is an artificially intelligent chat bot developed by its Technology and Research and Bing teams. Its purpose is to experiment with and research conversational understanding. Tay.ai is now live on Twitter, Kik and GroupMe, and has a Facebook page. It’s designed to engage and entertain people where they connect with each other online through casual chat. The more you chat with Tay, the smarter she gets, so the experience becomes more personalized for the user.

According to Microsoft, Tay targets people between 18 and 24 years old in the US, since this is the dominant user group on mobile devices, but you can chat with Tay regardless of your age or location, if you don’t mind US millennial jokes.

You can ask Tay for a joke, play a game with her, request a story, send a picture to receive a comment back, ask for your horoscope and more. Tay may use the data that you provide to search on your behalf. She may also use information you share with her to create a simple profile to personalize your experience. Data and conversations you provide to Tay are anonymized and may be retained for up to one year to help improve the service.

Tay was built by mining relevant public data and by using AI and editorial content developed by a staff that included improvisational comedians, according to Microsoft. So the conversations are definitely not meant to be serious, and the AI will not be shy when it comes to cursing or using internet slang.

And then, guess what? Less than 24 hours later, the internet turned her into a racist sex bot, thanks to all the “learning” she got from chatting with humans.

Right now, the bot is offline, claiming she’s too tired.

Adrian Pica: CEE startup ecosystem expert, founder @150sec.com, community builder, startup mentor with a lean entrepreneurial mindset.