Apple says its new child safety feature will look for images flagged in multiple countries

August 14, 2021 by admin

Apple has addressed privacy concerns regarding its child sexual abuse scanning by clarifying that the new feature would only flag accounts with at least 30 iCloud photos matching known Child Sexual Abuse Material (CSAM).

Read more: Neowin
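The two safeguards described above (only counting hashes flagged by multiple countries, and only flagging an account after at least 30 matches) can be sketched as threshold logic. This is a hypothetical illustration, not Apple's actual implementation; all names, the hash representation, and the database structure are assumptions for clarity.

```python
# Illustrative sketch only -- not Apple's real system.
# Assumptions: each country supplies a database of image hashes, and an
# account is flagged only when >= FLAG_THRESHOLD photos match hashes
# that appear in EVERY country's database.

FLAG_THRESHOLD = 30  # matches required before an account is flagged


def build_known_hashes(*country_databases):
    """Keep only hashes present in every provided country's database."""
    result = set(country_databases[0])
    for db in country_databases[1:]:
        result &= set(db)
    return result


def should_flag_account(photo_hashes, known_hashes, threshold=FLAG_THRESHOLD):
    """Return True only if at least `threshold` photos match known hashes."""
    matches = sum(1 for h in photo_hashes if h in known_hashes)
    return matches >= threshold


# Example: hashes flagged by two (hypothetical) countries' databases.
db_country_a = {f"h{i}" for i in range(50)}
db_country_b = {f"h{i}" for i in range(10, 60)}
known = build_known_hashes(db_country_a, db_country_b)  # h10..h49 only

account_29_matches = [f"h{i}" for i in range(10, 39)]  # below threshold
account_30_matches = [f"h{i}" for i in range(10, 40)]  # at threshold
```

Requiring hashes to appear in multiple jurisdictions' databases reduces the risk of a single government injecting non-CSAM images, and the match threshold reduces false positives from any single hash collision.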