Shades of Tay: Microsoft’s Zo chatbot says the Qur’an is “very violent”

Microsoft pulled its Tay chatbot from Twitter last year after it spouted deeply offensive statements. Its newer Zo chatbot, launched in December, has some controversial opinions of its own.
Neowin