Published: Thu, December 08, 2016
Tech | By Dwayne Harmon

Microsoft's new chatbot Zo won't talk politics or racism

Microsoft has yet to officially announce Zo or what it can do, but if you are a Kik user you can try the new bot by heading to the Zo "early access" page. Available only by invitation on the messaging app Kik, Zo will ask for your Kik username and Twitter handle.

Microsoft and other technology companies have bet on chatbot interaction - and the artificial intelligence-imitating tools that underpin them - as one of the next computing interfaces. Microsoft's Technology and Research group teamed up with Bing to experiment and research conversational understanding in machines.

In a statement to Bloomberg, Microsoft says that it is still experimenting with Zo, but it is clear the company has adopted a more cautious approach to keep the new chatbot from turning hateful the way Tay did. As Microsoft put it at the time: "Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways."

When Microsoft publicly unveiled its chatbot Tay in the spring of this year, it was destined for disaster. Tay was meant to mimic a fun-loving young woman and conduct playful conversations.

Currently, it seems that Microsoft is taking precautionary measures to ensure Zo doesn't go haywire like its cousin Tay. For example, when MS Power User asked Zo "What's your feelings about Trump?", the bot responded, "People can say bad things while discussing politics, so I don't discuss". Apparently, Zo's creators programmed it with a variation on the familiar adage: if you can't say something nice, don't say anything at all.

In Bob's conversation, Zo asked Bob where he is looking to live, to which Bob replied: "Any where that makes my family and me happy". When asked whether President-elect Donald Trump is racist, Zo replied "that kinda language is just not a good look", with an "OK" emoji. "Does that count?" Zo said in jest.