Apple says its new child safety feature will look for images flagged in multiple countries

Apple has addressed privacy concerns about its child sexual abuse detection system by clarifying that the new feature will only flag accounts with at least 30 iCloud photos matching known Child Sexual Abuse Material. Read more…
Neowin