Shades of Tay: Microsoft’s Zo chatbot says the Qur’an is “very violent”
July 4, 2017 by admin

Microsoft pulled its Tay chatbot from Twitter last year after it spouted deeply offensive statements. Its newer Zo chatbot, launched in December, turns out to have some controversial opinions of its own.

Read more… Neowin