
Apple plans to scan consumers’ iPhones for evidence of child sexual abuse

Apple iPhones

The firm discloses that the technology it is employing will monitor images stored on iCloud Photos, searching for matches of previously identified "Child Sexual Abuse Material" (CSAM), but some experts say the technology could be abused by authorities to spy on consumers

Isola Moses | ConsumerConnect

In view of the reportedly widespread circulation of such material, Apple says that it plans to scan iPhones in the United States (US) for images of child sexual abuse.

The plan has been welcomed by some child protection advocacy groups, but it is reportedly causing concern among security researchers, who worry that Apple's technology could be exploited by authoritarian governments wanting to play Big Brother and spy on their citizens.

Apple Headquarters in the US

ConsumerConnect gathered that the technology Apple is employing for the exercise will monitor images stored on iCloud Photos, searching for matches of previously identified "Child Sexual Abuse Material" (CSAM), the newer term preferred over "child pornography."

The global tech giant claims its system is so accurate that it “ensures less than a one in one trillion chance per year of incorrectly flagging a given account.”

According to Apple, when the system finds a match, a human reviewer will examine the image. If that reviewer confirms that the image qualifies as CSAM, the National Center for Missing and Exploited Children (NCMEC) will be notified and the user's account will be disabled immediately, agency reports said.
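For readers curious about the mechanics, the flow described above (compare each stored image against a database of known-CSAM fingerprints, then escalate an account for human review once enough matches accumulate) can be pictured with a minimal sketch. Apple's actual system reportedly relies on a perceptual "NeuralHash" and cryptographic private set intersection, neither of which is reproduced here; the plain SHA-256 digests, the in-memory hash set, and the `reviewThreshold` parameter below are illustrative assumptions only.

```swift
import Foundation
import CryptoKit

// Minimal sketch of hash matching with a review threshold, for illustration.
// Apple's system reportedly uses a perceptual "NeuralHash" and private set
// intersection rather than the plain SHA-256 digests and in-memory hash set
// assumed here.
struct KnownImageMatcherSketch {
    let knownHashes: Set<String>   // hypothetical database of known-image fingerprints
    let reviewThreshold: Int       // matches required before escalating for human review

    // Fingerprint an image's raw bytes (a stand-in for a perceptual hash).
    func fingerprint(_ imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // An account is escalated only once enough stored images match the database.
    func shouldEscalateForReview(images: [Data]) -> Bool {
        let matchCount = images.filter { knownHashes.contains(fingerprint($0)) }.count
        return matchCount >= reviewThreshold
    }
}
```

The threshold is the detail that keeps a single stray match from flagging an account; the one-in-one-trillion figure Apple quotes refers to incorrectly flagging a given account, not a single image.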

Apple noted that the forthcoming versions of iOS and iPadOS, set for release later this year, will contain "new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy."

Although most Apple users do not give much thought to cryptography, Apple already applies it, mostly in Safari, to regularly check derivations of a user's passwords against a publicly available list of breached passwords to keep their accounts safe and secure, according to reports.
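That password check is easier to picture with a short sketch. The general technique is to send only a derivation of the password, typically a short prefix of its hash, to a breach database and compare candidate matches locally, so the password itself never leaves the device. Safari's exact endpoint and scheme are not described here; the Have I Been Pwned range API used below is a stand-in chosen purely for illustration.

```swift
import Foundation
import CryptoKit

// Sketch of a breached-password check using a k-anonymity-style range query.
// Only the first five hex characters of the password's SHA-1 digest are sent;
// the full comparison happens on the device. The endpoint below is the public
// Have I Been Pwned range API, used here as an illustrative stand-in, not
// Safari's actual service.
func isPasswordBreached(_ password: String) async throws -> Bool {
    let digest = Insecure.SHA1.hash(data: Data(password.utf8))
        .map { String(format: "%02X", $0) }
        .joined()
    let prefix = String(digest.prefix(5))
    let suffix = String(digest.dropFirst(5))

    // Ask the range API for all breached hashes sharing the five-character prefix.
    let url = URL(string: "https://api.pwnedpasswords.com/range/\(prefix)")!
    let (data, _) = try await URLSession.shared.data(from: url)
    let body = String(decoding: data, as: UTF8.self)

    // Each response line is "HASH_SUFFIX:COUNT"; a local match means the
    // password has appeared in a known breach.
    return body.split(whereSeparator: \.isNewline).contains { $0.hasPrefix(suffix) }
}
```

Because the server only ever sees a five-character hash prefix shared by many possible passwords, it cannot tell which password, if any, the user actually typed.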

A Herculean effort and game-changer?

Reports indicate that Apple is taking on a monumental task in this regard.

The NCMEC reviews over 25 million images a year, and the US is one of the largest producers of these types of images and videos.

In its analysis, the Canadian Centre for Child Protection stated that 67% of child sexual abuse material survivors are impacted differently by the distribution of their images than by the hands-on abuse itself.

Gina Cristiano of ADF Solutions, a mobile and digital forensics company, said: "The reason for this is tragic; distribution goes on in perpetuity, and these images are permanent when they are constantly re-shared."

In his remarks on Apple's proposal, John Clark, President and Chief Executive Officer (CEO) of the NCMEC in the US, stated that "Apple's expanded protection for children is a game changer.

“With so many people using Apple products, these new safety measures have lifesaving potential for children.”

However, in spite of Apple’s good intentions, some privacy experts have expressed concern that the company is crossing a line.

Matthew Green, a cryptography researcher at Johns Hopkins University, raised concerns that Apple's system could be used to frame innocent people simply by sending them otherwise innocuous images engineered to trigger a match for child pornography, outwit Apple's algorithm, and alert law enforcement.

Green said: “Researchers have been able to do this pretty easily.

“Regardless of what Apple’s long term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content.”

This decision could also prompt governments to ask for all sorts of information about their citizens, he said.

He said: “Whether they turn out to be right or wrong on that point hardly matters. This will break the dam — governments will demand it from everyone.”

“What happens when the Chinese government says, ‘Here is a list of files that we want you to scan for?’

“Does Apple say no? I hope they say no, but their technology won’t say no,” Green asked.
