Policy groups request Apple abandon plans to scan devices for child abuse imagery

An international coalition of policy and civil rights groups published an open letter Thursday asking Apple to “abandon its recently announced plans to build surveillance capabilities into iPhones, iPads and other Apple products.” The groups include the American Civil Liberties Union, the Electronic Frontier Foundation, Access Now, Privacy International, and the Tor Project.

Earlier this month, Apple announced plans to use new technology within iOS to detect potential child abuse imagery, with the goal of limiting the spread of child sexual abuse material (CSAM) online. Apple also announced a new “communication safety” feature, which will use on-device machine learning to identify and blur sexually explicit images received by children in its Messages app. Parents of children ages 12 and younger can be notified if the child views or sends such an image.

“Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material, we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” the groups wrote in the letter.

Apple’s documentation details the plans, which call for on-device scanning before an image is backed up to iCloud. The scanning does not occur until a file is being backed up to iCloud, and Apple says it only receives data about a match if the cryptographic vouchers (uploaded to iCloud along with the image) for an account meet a threshold of matching known CSAM. Apple and other cloud email providers have used hash systems to scan for CSAM sent via email, but the new program would apply similar scans to images stored in iCloud, even if the user never shares or sends them to anyone else.
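To make the threshold idea concrete, here is a minimal, purely illustrative sketch of hash-matching with a reporting threshold. It is not Apple's actual system (which uses a perceptual "NeuralHash" and cryptographic private set intersection); all names and the threshold value below are hypothetical, and a plain cryptographic hash stands in for a perceptual hash only to keep the example self-contained.

```python
# Illustrative sketch only: simplified threshold-based hash matching.
# NOT Apple's implementation; all names and values are hypothetical.

import hashlib
from typing import Iterable

# Stand-in for a database of hashes of known material supplied by
# child-safety organizations.
KNOWN_HASHES: set[str] = set()

# Apple has said a report is generated only once an account crosses a
# threshold of matches; the specific number here is invented.
MATCH_THRESHOLD = 30


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; SHA-256 is used here purely to keep
    the example runnable without extra dependencies."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(images: Iterable[bytes]) -> int:
    """Count how many of an account's uploaded images match known hashes."""
    return sum(1 for img in images if image_hash(img) in KNOWN_HASHES)


def should_report(images: Iterable[bytes]) -> bool:
    """Flag the account only once the match count meets the threshold."""
    return count_matches(images) >= MATCH_THRESHOLD
```

The point of the threshold in this toy version mirrors the stated design goal: a single chance collision should not be enough to surface an account for review.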
