
Apple on Monday defended its new system for scanning iCloud for illegal child sexual abuse material (CSAM), amid an ongoing controversy over whether the system erodes Apple user privacy and could be used by governments to surveil citizens.

Last week, Apple announced it had begun testing a system that uses sophisticated cryptography to identify when users upload collections of known child pornography to its cloud storage service. The company says it can do this without learning the contents of a user’s photos stored on its servers.

Apple reiterated on Monday that its system is more private than those used by companies like Google and Microsoft because it relies on both Apple’s servers and software running on iPhones.

Privacy advocates and technology commentators worry that Apple’s new system, which includes software that will be installed on people’s iPhones through an iOS update, could be expanded in some countries through new laws to check for other types of images, such as photos with political content, instead of just child pornography.

Apple said in a document posted to its website on Sunday that governments cannot force it to add non-CSAM images to the hash list, the file of numbers corresponding to known child abuse images that Apple will distribute to iPhones to enable the system.
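
In concept, a hash list is simply a set of numeric fingerprints, and an image is flagged only if its own fingerprint appears in that set. The Swift sketch below is a deliberately simplified illustration of that membership check; it uses plain SHA-256 digests and hypothetical values rather than Apple’s perceptual hashes or its actual matching protocol.

```swift
import Foundation
import CryptoKit

// Hypothetical, simplified illustration of hash-list matching. This is not
// Apple's actual protocol: Apple uses perceptual hashes (so visually identical
// images match even after re-encoding) plus cryptographic blinding, while this
// sketch uses plain SHA-256 digests purely to show the membership-check idea.

// A "hash list" distributed to the device: placeholder hex digests standing in
// for fingerprints of known images.
let knownImageHashes: Set<String> = [
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    "60303ae22b998861bce3b28f33eec1be758a213c86c93c076dbe9f558c11c752"
]

// Compute the digest of a photo's raw bytes and test membership in the list.
func matchesKnownImage(_ photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownImageHashes.contains(hex)
}

// Example: an arbitrary photo does not appear in the list.
let sample = Data("example photo bytes".utf8)
print(matchesKnownImage(sample))  // false
```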

“Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups,” Apple said in the document. “We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future.”

It continued: “Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it.”

Some cryptographers are worried about what could happen if a country such as China were to pass a law requiring the system to also cover politically sensitive images. Apple CEO Tim Cook has previously said that the company follows the laws of every country where it conducts business.

Companies in the U.S. are required to report CSAM to the National Center for Missing & Exploited Children and face fines of up to $300,000 if they discover illegal images and don’t report them.

A reputation for privacy

Apple’s reputation for defending privacy has been cultivated for years through its actions and marketing. In 2016, Apple faced off against the FBI in court to protect the integrity of its on-device encryption systems in the investigation of a mass shooter.

But Apple has also faced significant pressure from law enforcement officials about the possibility of criminals “going dark,” or using privacy tools and encryption to prevent messages or other information from being within the reach of law enforcement.

The controversy over Apple’s new system, and whether it’s surveilling users, threatens Apple’s public reputation for building secure and private devices, which the company has used to break into new markets in personal finance and healthcare.

Critics are concerned that the system will operate partly on the iPhone itself, instead of scanning only photos that have already been uploaded to the company’s servers. Apple’s competitors typically scan only photos stored on their own servers.

“It’s truly disappointing that Apple got so hung up on its particular vision of privacy that it ended up betraying the fulcrum of user control: being able to trust that your device is truly yours,” technology commentator Ben Thompson wrote in a newsletter on Monday.

Apple continues to defend its systems as a genuine improvement that protects children and will reduce the amount of CSAM being created while still protecting iPhone user privacy.

Apple said its system is significantly stronger and more private than previous systems by every privacy metric the company tracks, and that it went out of its way to build a better system for detecting these illegal images.

Unlike current systems, which run in the cloud and can’t be inspected by security researchers, Apple’s system can be inspected through its distribution in iOS, an Apple representative said. By moving some processing onto the user’s device, the company can derive stronger privacy properties, such as the ability to find CSAM matches without running software on Apple’s servers that checks every single photo.

Apple said on Monday its system doesn’t scan private photo libraries that haven’t been uploaded to iCloud.

Apple also confirmed it will process photos that have already been uploaded to iCloud. The changes will roll out through an iPhone update later this year, after which users will be alerted that Apple is beginning to check photos stored on iCloud against a list of fingerprints that correspond to known CSAM, Apple said.
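
As a rough illustration of the gating Apple describes, the hypothetical Swift sketch below compares only iCloud-bound photos against a fingerprint list, while photos that remain solely on the device are never checked; the types and values are illustrative stand-ins, not Apple’s APIs.

```swift
import Foundation

// Hypothetical sketch of the gating described above: only photos destined for
// iCloud are compared against the fingerprint list, and photos kept solely on
// the device are never evaluated. All names here are illustrative, not Apple APIs.

struct Photo {
    let id: UUID
    let isMarkedForiCloud: Bool   // photo is queued for, or stored in, iCloud Photos
    let fingerprint: String       // stand-in for a precomputed perceptual hash
}

// The distributed list of fingerprints for known CSAM (placeholder values).
let knownFingerprints: Set<String> = ["a1b2c3", "d4e5f6"]

// Return only the iCloud-bound photos whose fingerprints appear in the list.
func photosToFlag(in library: [Photo]) -> [Photo] {
    library
        .filter { $0.isMarkedForiCloud }
        .filter { knownFingerprints.contains($0.fingerprint) }
}

let library = [
    Photo(id: UUID(), isMarkedForiCloud: false, fingerprint: "a1b2c3"), // local only: ignored
    Photo(id: UUID(), isMarkedForiCloud: true,  fingerprint: "zzzzzz"), // no match
    Photo(id: UUID(), isMarkedForiCloud: true,  fingerprint: "d4e5f6")  // match: flagged
]
print(photosToFlag(in: library).count)  // 1
```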
