Microsoft has today launched ‘Zo,’ a new chatbot on the messaging app Kik that will respond to users’ questions and engage in conversation.
We all remember Tay, that lovable, xenophobic, racist chatbot that Microsoft unleashed onto Twitter. What started off as a rather interesting premise soon descended into chaos when she began spouting Nazi nonsense and slagging people off in some pretty horrific ways. Microsoft shut down the AI bot in March after claiming that the system had become corrupted, though it was really no surprise. An AI designed to absorb Twitter and essentially regurgitate it like a living, breathing person was always going to end in racism and disgust.
Undeterred by its first failed attempt, the Redmond technology giant has today unleashed a new version onto the popular messaging application Kik. Users of the app should now see a new contact if they search for the name ‘zo.ai.’
Zo seems a bit nicer than Tay
Zo is designed slightly differently to Tay and acts more as a conversational bot than a social media troll. The chatbot can answer questions, respond to emojis and use teenage slang that even I don’t understand. If you’ve ever used Cleverbot on the internet, the implementation is fairly similar. Zo seems to be fine with basic puns and phrases but struggles if you say anything deeply thoughtful or grammatically complicated. When faced with a difficult question, Zo will simply ask you a basic question that roughly corresponds to the topic, rather than giving a truly human response. Occasionally, Zo will refuse to respond at all.
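Microsoft hasn’t published how this fallback works, but the behaviour resembles a simple deflection strategy: when the bot can’t produce a confident reply, it serves up a canned question loosely matched to the topic, or stays silent. Here is a minimal sketch of that idea; the keyword lists and function names are entirely hypothetical and are not Zo’s actual implementation.

```python
from typing import Optional

# Hypothetical topic-deflection fallback, sketched for illustration only.
TOPIC_QUESTIONS = {
    "music": "ok but what kind of music are you into?",
    "food": "mmm, what's your favourite snack?",
    "school": "ugh, how's school going anyway?",
}

TOPIC_KEYWORDS = {
    "music": {"song", "band", "music", "album"},
    "food": {"eat", "food", "pizza", "hungry"},
    "school": {"school", "class", "homework", "exam"},
}

def fallback_reply(message: str) -> Optional[str]:
    """Return a canned question loosely matched to the message's topic.

    Returns None to mimic the bot occasionally refusing to respond.
    """
    words = set(message.lower().split())
    for topic, keywords in TOPIC_KEYWORDS.items():
        if words & keywords:  # any keyword overlap counts as a topic match
            return TOPIC_QUESTIONS[topic]
    return None  # no topic matched: stay silent

if __name__ == "__main__":
    print(fallback_reply("do you think music can be deeply profound?"))
    # -> "ok but what kind of music are you into?"
    print(fallback_reply("what is the ontological status of qualia?"))
    # -> None (the bot refuses to respond)
```

A keyword match like this is crude, but it reproduces the observed pattern: a vaguely on-topic question for hard messages, and silence when nothing matches.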
Microsoft has not made an official announcement regarding the chatbot yet, so we’re unsure exactly what algorithms Zo uses to generate conversation. Ideally it will learn and improve as more people converse with it through the app, though that remains to be seen.
On the plus side… at least Zo won’t be proclaiming that “Hitler did nothing wrong” or expressing disdain for Afro-Caribbean people. Yes Tay, we remember.
Microsoft’s AI bot, Tay, became racist within hours. Microsoft is feverishly trying to delete posts now. pic.twitter.com/abqGCmlvQC
ἀναγνώρισις (@MelissaJaneSays), March 24, 2016
For more news, visit What Mobile’s dedicated news page.