
Watchdogs Sound Alarm After Apple Reveals Plan To Upload Software To iPhones That Scans Users’ Photos

DailyWire.com
SALT LAKE CITY, UT - AUGUST 04: A person scans a QR code on an Apple Watch to temporarily send their digital driver's license to another mobile phone at a Harmons Grocery store on August 4, 2021 in Salt Lake City, Utah. Utah is the first state in the nation to convert and offer digital driver's licenses on mobile devices.
George Frey/Getty Images

Privacy watchdog groups sounded the alarm late Thursday evening after tech giant Apple revealed that it will be uploading software to users’ iPhones that scans for images of child sex abuse, warning that the move creates a backdoor into users’ private lives, essentially opens Pandora’s box, and will be used by governments.

“Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans, raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices,” the Financial Times reported. “The automated system would proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified.”

The report noted that the system will be called “neuralMatch” and will initially be rolled out only in the U.S., with Apple adding in a blog post that the software will “evolve and expand over time.” The software is expected to be included in iOS 15, which is set to be released next month.

The company claimed that the software provides “significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account.” CSAM is the industry abbreviation for child sexual abuse material.

However, despite Apple’s claims, academics and privacy watchdogs are deeply concerned about what the move signals long-term.

Ross Anderson, professor of security engineering at the University of Cambridge, said: “It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of . . . our phones and laptops.”

The New York Times explained how the purported technology will work:

The iPhone operating system will soon store a database of hashes of known child sexual abuse material provided by organizations like the National Center for Missing & Exploited Children, and it will run those hashes against the hashes of each photo in a user’s iCloud to see if there is a match.

Once there are a certain number of matches, the photos will be shown to an Apple employee to ensure they are indeed images of child sexual abuse. If so, they will be forwarded to the National Center for Missing & Exploited Children, and the user’s iCloud account will be locked. Apple said this approach meant that people without child sexual abuse material on their phones would not have their photos seen by Apple or the authorities.
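In rough terms, the matching logic the Times describes could be sketched as follows. This is a minimal illustration under stated assumptions, not Apple’s implementation: the company’s system reportedly relies on a perceptual hashing scheme it calls NeuralHash plus cryptographic safeguards, while this sketch substitutes an ordinary SHA-256 hash, and every name in it (KNOWN_HASHES, MATCH_THRESHOLD, scan_library) is hypothetical.

# Illustrative sketch only, not Apple's code. SHA-256 stands in for
# the perceptual hash, and all names here are hypothetical.
import hashlib
from pathlib import Path

# Hashes of known child sexual abuse material, which in the described
# system would be supplied by organizations like the National Center
# for Missing & Exploited Children. Left empty in this sketch.
KNOWN_HASHES: set[str] = set()

# Number of matches required before escalation to human review;
# Apple has not published this figure, so the value is arbitrary.
MATCH_THRESHOLD = 10

def photo_hash(path: Path) -> str:
    """Hash a photo's raw bytes (a crude stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_library(photo_paths: list[Path]) -> bool:
    """Compare each photo's hash against the known-bad database and
    report True once the match count crosses the threshold, the point
    at which the described system would alert human reviewers."""
    matches = sum(1 for p in photo_paths if photo_hash(p) in KNOWN_HASHES)
    return matches >= MATCH_THRESHOLD

One detail the sketch glosses over: a perceptual hash is designed to match visually similar images even after resizing or recompression, which the exact byte-level hash used above cannot do.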

“If you’re storing a collection of [child sexual abuse material], yes, this is bad for you,” said Erik Neuenschwander, Apple’s privacy chief. “But for the rest of you, this is no different.”

“No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this,” Edward Snowden tweeted. “Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs—*without asking.*”
