
Apple pushes back against child abuse scanning concerns in new FAQ - The Verge

In a new FAQ, Apple has attempted to assuage concerns that its new anti-child abuse measures could be turned into surveillance tools by authoritarian governments. “Let us be clear, this technology is limited to detecting CSAM [child sexual abuse material] stored in iCloud and we will not accede to any government’s request to expand it,” the company writes.

Apple’s new tools, announced last Thursday, include two features designed to protect children. One, called “communication safety,” uses on-device machine learning to identify and blur sexually explicit images received by children in the Messages app, and can notify a parent if a child aged 12 or younger decides to view or send such an image. The second is designed to detect known CSAM by scanning users’ images if they choose to upload them to iCloud. Apple is notified if CSAM is detected, and it will alert the authorities once it verifies such material exists.
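Apple’s FAQ doesn’t spell out the implementation, but the core idea of the second feature is comparing a fingerprint of each photo against a fixed list of fingerprints of known abuse imagery. Purely as an illustration of that general hash-list matching pattern (not Apple’s actual NeuralHash or private set intersection protocol, and with an ordinary SHA-256 digest standing in for a perceptual image hash), a minimal sketch might look like this:

```swift
import Foundation
import CryptoKit

// Illustrative only: a plain SHA-256 digest stands in for the perceptual
// "image hash" described in the article. Apple's real system uses a
// different, privacy-preserving matching protocol whose details are not
// in the FAQ.
struct KnownHashDatabase {
    // Hypothetical fixed list of hashes shipped identically to every device.
    private let knownHashes: Set<String>

    init(hashes: [String]) {
        self.knownHashes = Set(hashes)
    }

    // Returns true if the photo's digest appears in the known-hash list.
    func matches(photoData: Data) -> Bool {
        let digest = SHA256.hash(data: photoData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
    }
}

// Usage: check an image before it is uploaded to iCloud Photos.
let database = KnownHashDatabase(hashes: ["<hash-from-child-safety-orgs>"])
let photo = Data() // image bytes would go here
if database.matches(photoData: photo) {
    // In Apple's described design, matches only trigger review after a
    // threshold is crossed; this sketch simply flags a single match.
    print("Photo matches a known hash")
}
```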

The plans met with a swift backlash from digital privacy groups and campaigners, who argued that they introduce a backdoor into Apple’s software. These groups note that once such a backdoor exists, there is always the potential for it to be expanded to scan for types of content beyond child sexual abuse material. Authoritarian governments could use it to scan for material expressing political dissent, and anti-LGBT regimes could use it to crack down on sexual expression.

“Even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor,” the Electronic Frontier Foundation wrote. “We’ve already seen this mission creep in action. One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of ‘terrorist’ content that companies can contribute to and access for the purpose of banning such content.”

However, Apple argues that it has safeguards in place to stop its systems from being used to detect anything other than sexual abuse imagery. It says that its list of banned images is provided by the National Center for Missing and Exploited Children (NCMEC) and other child safety organizations, and that the system “only works with CSAM image hashes provided by NCMEC and other child safety organizations.” Apple says it won’t add to this list of image hashes, and that the list is the same across all iPhones and iPads to prevent individual targeting of users.

The company also says that it will refuse demands from governments to add non-CSAM images to the list. “We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future,” it says.

It’s worth noting that despite Apple’s assurances, the company has made concessions to governments in the past in order to continue operating in their countries. It sells iPhones without FaceTime in countries that don’t allow encrypted phone calls, and in China it has removed thousands of apps from its App Store and moved to store user data on the servers of a state-run telecom.

The FAQ also fails to address some concerns about the feature that scans Messages for sexually explicit material. The feature does not share any information with Apple or law enforcement, the company says, but it doesn’t say how it’s ensuring that the tool’s focus remains solely on sexually explicit images.

“All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts,” wrote the EFF. The EFF also notes that machine-learning technologies frequently classify this content incorrectly, and cites Tumblr’s attempts to crack down on sexual content as a prominent example of where the technology has gone wrong.

