Shades of Tay: Microsoft’s Zo chatbot says the Qur’an is “very violent”

Written on: July 4, 2017

Microsoft pulled its Tay chatbot from Twitter last year after it spouted deeply offensive statements. Its newer Zo chatbot, launched in December, turns out to have some controversial opinions of its own.
