Apple CSAM FAQ addresses misconceptions and concerns about photo scanning - 9to5Mac

Apple has responded to misconceptions and concerns about its photo scanning announcements by publishing a CSAM FAQ – answering frequently asked questions about the features.

While child safety organizations welcomed Apple’s plans to help detect possession of child sexual abuse materials (CSAM), and to protect children from predators, there has been a mix of informed and uninformed criticism …

Background

Mainstream media confusion arose when Apple simultaneously announced three separate measures, with many non-technical people confusing the first two:

  • iMessage explicit photo warnings for children in iCloud Family groups
  • Detection of known CSAM photos by scanning for digital fingerprints
  • Responding to Siri and search requests for CSAM materials with a warning and links to help

There has also been a lack of understanding about the methods Apple is using. For iMessage, it uses on-device AI to detect images that appear to contain nudity; for CSAM detection, it compares the digital fingerprints of known CSAM images against fingerprints generated from the photos a user uploads to iCloud Photos.

In neither case does anyone at Apple see the photos themselves. The sole exception is an account flagged for multiple CSAM matches, in which case someone at Apple manually checks low-resolution copies to confirm they are true matches before law enforcement is informed.
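To make the fingerprint-matching idea concrete, here is a minimal conceptual sketch in Swift of matching a user's fingerprints against a known database, with a threshold gating human review. It is not Apple's implementation: the real system uses NeuralHash fingerprints matched on-device through cryptographic protocols, and the fingerprint type, the Hamming-distance comparison, and the threshold parameter below are assumptions made purely for illustration.

```swift
import Foundation

// Conceptual 64-bit perceptual fingerprint (an assumption for this sketch;
// Apple's actual NeuralHash values and matching protocol differ).
typealias Fingerprint = UInt64

/// Number of differing bits between two fingerprints (Hamming distance).
func hammingDistance(_ a: Fingerprint, _ b: Fingerprint) -> Int {
    (a ^ b).nonzeroBitCount
}

/// Counts how many of a user's photo fingerprints match the known database,
/// treating fingerprints within `maxDistance` bits as the same image.
func matchCount(userPhotos: [Fingerprint],
                knownDatabase: Set<Fingerprint>,
                maxDistance: Int = 0) -> Int {
    userPhotos.filter { photo in
        knownDatabase.contains { hammingDistance($0, photo) <= maxDistance }
    }.count
}

/// Mirrors the flow described above: an account is surfaced for manual review
/// of low-resolution copies only after multiple matches accumulate.
func needsHumanReview(userPhotos: [Fingerprint],
                      knownDatabase: Set<Fingerprint>,
                      threshold: Int) -> Bool {
    matchCount(userPhotos: userPhotos, knownDatabase: knownDatabase) >= threshold
}
```

In Apple's published design, the comparison happens on-device against an encrypted copy of the database, and even the number of matches is cryptographically hidden from Apple until the threshold is exceeded; the sketch captures only the "multiple matches before human review" logic.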

There has also been confusion between privacy and misuse risks with the features as they stand today (which are nil to exceedingly low) and the potential for abuse by authoritarian governments at a future date. Cybersecurity experts have been warning about the latter, not the former.

Apple has already attempted to address the repressive-government concern by launching only in the US for now, and by stating that any expansion would happen on a country-by-country basis, taking each country's legislative environment into account. The FAQ now attempts to address this and other issues.

Apple CSAM FAQ

Apple has published a six-page FAQ designed to address some of the concerns that have been raised. It begins by acknowledging the mixed response.

We want to protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM). Since we announced these features, many stakeholders including privacy organizations and child safety organizations have expressed their support of this new solution, and some have reached out with questions. This document serves to address these questions and provide more clarity and transparency in the process.

The company then underlines that the first two features are entirely separate.

What are the differences between communication safety in Messages and CSAM detection in iCloud Photos?

These two features are not the same and do not use the same technology.

Communication safety in Messages is designed to give parents and children additional tools to help protect their children from sending and receiving sexually explicit images in the Messages app. It works only on images sent or received in the Messages app for child accounts set up in Family Sharing. It analyzes the images on-device, and so does not change the privacy assurances of Messages. When a child account sends or receives sexually explicit images, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view or send the photo. As an additional precaution, young children can also be told that, to make sure they are safe, their parents will get a message if they do view it.

The second feature, CSAM detection in iCloud Photos, is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images. CSAM images are illegal to possess in most countries, including the United States. This feature only impacts users who have chosen to use iCloud Photos to store their photos. It does not impact users who have not chosen to use iCloud Photos. There is no impact to any other on-device data. This feature does not apply to Messages.
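Stripped of the interface details, the Messages flow in that answer reduces to a couple of conditions. The following Swift snippet is a hypothetical summary of that flow, not Apple code: the names are placeholders, and `looksSexuallyExplicit` stands in for the on-device classifier.

```swift
/// Hypothetical summary of the communication-safety flow quoted above.
/// All names are placeholders; this is not an Apple API.
enum IncomingPhotoAction {
    case showNormally
    case blurAndWarn(notifyParentsIfViewed: Bool)
}

func action(looksSexuallyExplicit: Bool, childAge: Int) -> IncomingPhotoAction {
    guard looksSexuallyExplicit else { return .showNormally }
    // The photo is blurred; the child is warned, offered resources, and
    // reassured it is okay not to view or send it. Parents are notified only
    // for children aged 12 and under, and only if the child proceeds anyway.
    return .blurAndWarn(notifyParentsIfViewed: childAge <= 12)
}
```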

Other points stressed in the FAQ include:

  • iMessages to and from children are never shared with law enforcement
  • iMessages remain end-to-end encrypted
  • Children with abusive parents can still safely seek help via iMessage, provided they use only text
  • Parents are only notified if children aged 12 and under proceed despite a warning
  • CSAM fingerprint matches are manually reviewed before law enforcement is informed

The trickiest issue remains

The biggest concern raised by the EFF and others remains. While the system today flags only CSAM images, a repressive government could supply Apple with a fingerprint database containing other material, such as the famous Tank Man photo from Tiananmen Square, which is censored in China.

Apple has responded by saying it would not allow this:

Could governments force Apple to add non-CSAM images to the hash list?

Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.

That statement is, however, predicated on Apple having the legal freedom to refuse. In China, for example, Apple has been legally required to remove VPN, news, and other apps, and to store the iCloud data of Chinese citizens on a server owned by a government-controlled company.

There is no realistic way for Apple to promise that it will not comply with future requirements to process government-supplied databases of “CSAM images” that also include matches for materials used by critics and protestors. As the company has often said when defending its actions in countries like China, Apple complies with the law in each of the countries in which it operates.
