
Child Safety Online: Rising opposition as Apple releases details of plan to scan consumers’ iCloud libraries

Apple Headquarters in the US

The global technology giant clarifies that its main objective in employing the monitoring technology is to protect children from predators online, but security and technology privacy advocates oppose the plan to scan consumers’ iCloud libraries

Gbenga Kayode | ConsumerConnect

Following the global technology giant’s recent announcement of a surveillance system to monitor images stored on iCloud Photos for matches of previously identified Child Sexual Abuse Material (CSAM), Apple has released new details of its plan to scan consumers’ devices for evidence of CSAM worldwide.

ConsumerConnect gathered that, in view of the subsequent criticisms of the idea, the technology firm now says it “will only flag images that have been supplied by clearinghouses in multiple countries.”

Kid safety online  Photo: Vpnmentor

Accordingly, once Apple’s technology finds a match on a consumer’s iPhone, a human reviewer will examine the image, according to an agency report.


If the reviewer confirms that the image qualifies as CSAM, the US National Center for Missing and Exploited Children (NCMEC) will be notified and the user’s account will be immediately disabled.
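The flow described above amounts to comparing a fingerprint of each stored image against a list supplied by child-safety clearinghouses, then escalating confirmed matches. The sketch below is a simplified illustration of that flow only; Apple’s actual system uses a perceptual “NeuralHash” rather than a cryptographic hash, and every name and value in the snippet is hypothetical.

```python
import hashlib

# Hypothetical fingerprints of known CSAM supplied by clearinghouses in
# multiple countries. Apple's real system uses a perceptual "NeuralHash",
# not a cryptographic hash; SHA-256 here only stands in to illustrate the flow.
KNOWN_CSAM_FINGERPRINTS = {"<clearinghouse-supplied fingerprint>"}


def fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint for an image (illustrative only)."""
    return hashlib.sha256(image_bytes).hexdigest()


def human_review_confirms(image_bytes: bytes) -> bool:
    """Placeholder for the human-review step described in the report."""
    return False  # in the described workflow a person, not code, makes this call


def handle_upload(account_id: str, image_bytes: bytes) -> None:
    """Match an upload against the fingerprint list and escalate on confirmation."""
    if fingerprint(image_bytes) not in KNOWN_CSAM_FINGERPRINTS:
        return  # no match: the image is never surfaced to anyone
    if human_review_confirms(image_bytes):
        # Stand-ins for the actions the article describes: notify NCMEC
        # and immediately disable the account.
        print(f"Notify NCMEC and disable account {account_id}")
```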

On its rationale for initiating this child online safety measure, Apple clarified that its main goal in employing the technology is to protect children from predators in cyberspace.

Critics and tech privacy advocates oppose Apple’s plan

Despite Apple’s clarification that it aims to employ the monitoring technology to protect children from predators, critics are concerned that the technology could be exploited by authoritarian governments or used by malicious parties to open a “backdoor” for wider surveillance.

Security and technology privacy advocates have registered their opposition, pushing for Apple to rescind its plan, according to reports.

In a letter, the advocates stated: “While child exploitation is a serious problem, and while efforts to combat it are almost unquestionably well-intentioned, Apple’s proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products.”


A US advocacy group, the Center for Democracy and Technology (CDT), also says Apple’s proposed changes create new risks to children and all users while marking a significant departure from long-held privacy and security protocols.

Apple iPhones and iPad

Emma Llanso, CDT Project Director, said: “What Apple is showing with their announcement last week is that there are technical weaknesses that they are willing to build in.

“It seems so out of step from everything that they had previously been saying and doing.”

Similarly, Greg Nojeim, Co-Director of CDT’s Security & Surveillance Project, stated: “Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the US, but around the world.


“Apple should abandon these changes and restore its users’ faith in the security and integrity of their data on Apple devices and services.”

Apple employees express mixed reactions to company’s CSAM proposal

Likewise, Apple staff members have begun using internal company Slack channels to post hundreds of messages that voice their concerns about the proposal.

Their biggest concern is that governments known to use mobile phones to spy on consumers will employ the software for purposes other than detecting CSAM, such as finding material they could use to censor or arrest people, Reuters reports.

Reports indicate that past security changes at Apple have also prompted concern among employees, but the volume and duration of the new debate are surprising, the workers said.

Some posters worried that Apple is damaging its industry-leading reputation for protecting privacy.


Though the pushback by Apple employees may sound earth-shattering, not everyone at the company is up in arms.

An agency report disclosed that some employees have questioned their peers’ criticism in the Slack thread devoted to the photo-scanning feature.

Nonetheless, others said Slack was not the proper place to hold discussions like this.

One key workgroup at Apple, the security team, appeared to stay out of the back-and-forth on Slack, though opinions among its members varied.

It was gathered that some of them described Apple’s effort as a reasonable response to pressure to crack down on illegal material online.

Others, however, said they hoped the scanning tool would eventually lead to the development of better encryption tools for iCloud consumers who want a more powerful layer of security.

Apple announces new details of its monitoring technology

To ease privacy fears among consumers, and following a review of its plan, Apple said it would tune the system so that it only flags images supplied by clearinghouses in multiple countries, not just by the US National Center for Missing and Exploited Children (NCMEC), as earlier announced.

In addition, only cases where users have about 30 or more potentially illicit pictures will be flagged for human review, the report noted.

The Big Tech firm further stated that if the review confirms the material is CSAM, authorities will be notified of its presence in the consumer’s iCloud library.

In a recently published Security Threat Model Review, Apple said: “We expect to choose an initial match threshold of 30 images.

“Since this initial threshold contains a drastic safety margin reflecting a worst-case assumption about real-world performance, we may change the threshold after continued empirical evaluation of NeuralHash false positive rates.

“But the match threshold will never be lower than what is required to produce a one-in-one trillion false positive rate for any given account.”
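Apple’s quoted figures suggest the flagging logic can be pictured as a simple per-account counter that only makes an account eligible for human review once roughly 30 matches accumulate. The sketch below, continuing the hypothetical illustration above, shows that gating step; the counter structure and names are assumptions based solely on the numbers quoted here.

```python
from collections import defaultdict

# Apple's quoted initial figure: a match threshold of 30 images per account.
MATCH_THRESHOLD = 30

# Hypothetical per-account counter of clearinghouse matches.
match_counts = defaultdict(int)


def record_match(account_id: str) -> bool:
    """Record one matched image; return True once the account reaches the
    threshold and becomes eligible for human review."""
    match_counts[account_id] += 1
    return match_counts[account_id] >= MATCH_THRESHOLD


# Illustrative usage: the 30th match is the first to make review possible.
for _ in range(MATCH_THRESHOLD):
    eligible = record_match("example-account")
print(eligible)  # True
```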

Privacy concerns persist

Some security and technology privacy advocates, however, have contended that no tweak would render Apple’s CSAM surveillance system completely safe from exploitation or abuse once implemented.

Electronic Frontier Foundation’s Erica Portnoy recently commented: “Any system that allows surveillance fundamentally weakens the promises of encryption.

“No amount of third-party auditability will prevent an authoritarian government from requiring their own database to be added to the system.”

However, Apple has maintained that the technology will not scan consumers’ iCloud uploads for anything other than CSAM.

The global technology giant added that any government’s requests to “add non-CSAM images to the hash list” would be rejected.
