Apple delays plans for iCloud scanning of CSAM
September 3, 2021 by admin

Apple has decided to postpone the rollout of a child safety feature that would scan hashes of iCloud Photos uploads to determine whether users are storing child sexual abuse material (CSAM).

Read more… Neowin
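For readers unfamiliar with hash-based matching, the following is a minimal sketch of the general idea of comparing an upload's hash against a database of known flagged images. It is purely illustrative: Apple's announced design used NeuralHash perceptual hashes compared through a private set intersection protocol, not the plain cryptographic hash lookup shown here, and every name and value below is hypothetical.

```swift
import CryptoKit
import Foundation

// Illustrative only: Apple's announced system relied on NeuralHash perceptual
// hashing and private set intersection, not a simple SHA-256 set lookup.

/// Hex-encoded SHA-256 digests of known flagged images (placeholder values).
let knownHashDatabase: Set<String> = [
    "0000000000000000000000000000000000000000000000000000000000000000" // placeholder
]

/// Returns true if the photo's SHA-256 digest appears in the known-hash set.
func matchesKnownHash(photoData: Data, against database: Set<String>) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return database.contains(hex)
}

// Example: check a photo before it would be uploaded.
let photo = Data("example image bytes".utf8)
if matchesKnownHash(photoData: photo, against: knownHashDatabase) {
    print("Upload flagged for review")
} else {
    print("No match")
}
```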