Apple says its new child safety feature will look for images flagged in multiple countries

August 14, 2021 by admin

Apple has addressed privacy concerns regarding its child sexual abuse material (CSAM) scanning by clarifying that the new feature would only flag accounts with at least 30 iCloud photos matching known CSAM.

Read more… (Neowin)