08/24 Update below. This post was originally published on August 21
iPhone users have put up with a lot in recent months, but Apple's new CSAM detection system has proved to be a lightning rod of controversy that stands out from all the rest. And if you were thinking of quitting your iPhone over it, a shocking new report might just push you over the edge.
In a new editorial published by The Washington Post, a pair of researchers who spent two years developing a CSAM (child sexual abuse material) detection system similar to the one Apple plans to install on users’ iPhones, iPads and Macs next month deliver an unequivocal warning: it’s dangerous.
08/23 Update: this story has taken a dramatic twist today after Apple admitted it has already been running its controversial CSAM detection system on iCloud Mail for the last three years. “Apple confirmed to me that it has been scanning outgoing and incoming iCloud Mail for CSAM attachments since 2019. Email is not encrypted, so scanning attachments as mail passes through Apple servers would be a trivial task,” reports 9to5Mac writer Ben Lovejoy. “Apple also indicated that it was doing some limited scanning of other data, but would not tell me what that was, except to suggest that it was on a tiny scale.”
Lovejoy notes that Apple quietly acknowledged this in a now-archived version of its child safety page, while chief privacy officer Jane Horvath admitted “the company uses screening technology to look for the illegal images” in a 2020 interview with UK newspaper The Telegraph, though the paper notes Apple “does not specify how it discovers it”. These revelations pour petrol on the fire, and the backlash against these measures, however well intended, is likely to increase significantly in the coming days and weeks. This story is going to run and run.
“We wrote the only peer-reviewed publication on how to build a system like Apple’s — and we concluded the technology was dangerous,” state Jonathan Mayer and Anunay Kulshrestha, the two Princeton academics behind the research. “Our system could be easily repurposed for surveillance and censorship. The design wasn’t restricted to a specific category of content; a service could simply swap in any content-matching database, and the person using that service would be none the wiser.”
This has been the predominant fear regarding Apple’s CSAM initiative. The technology’s goal of reducing child abuse is indisputably important, but the potential for damage from hackers and governments manipulating a system designed to search your iCloud photos and report abusive content is clear to all.
“China is Apple’s second-largest market, with probably hundreds of millions of devices. What stops the Chinese government from demanding Apple scan those devices for pro-democracy materials?” ask the researchers.
And critics have plenty of ammunition here. Earlier this year, Apple was accused of compromising on censorship and surveillance in China after agreeing to move the personal data of its Chinese customers to the servers of a state-owned Chinese firm. Apple also states that it provided customer data to the US government almost 4,000 times last year.
iCloud stores photos from iPhones, iPads and Macs
“We spotted other shortcomings,” Mayer and Kulshrestha explain. “The content-matching process could have false positives, and malicious users could game the system to subject innocent users to scrutiny.”
And recent history doesn’t bode well. Last month, revelations about the Pegasus Project exposed a global business that had been successfully hacking iPhones for years and selling its surveillance technology to foreign governments for use against anti-regime activists, journalists and political leaders from rival nations. With access to Apple technology designed to scan and flag the iCloud photos of a billion iPhone owners, this could go a lot further.
Prior to Mayer and Kulshrestha speaking out, over 90 civil rights groups worldwide had already written a letter to Apple claiming that the company’s CSAM detection technology “will have laid the foundation for censorship, surveillance, and persecution on a global basis.”
Apple has subsequently defended its CSAM system, admitting only that its announcement was poorly communicated and a “recipe for this kind of confusion”, but the company’s responses did little to impress Mayer and Kulshrestha.
“Apple’s motivation, like ours, was to protect children. And its system was technically more efficient and capable than ours,” they said. “But we were baffled to see that Apple had few answers for the hard questions we’d surfaced.”
Now Apple finds itself in a mess of its own making. For years, the company has put considerable effort into marketing itself as the champion of user privacy with the company’s official privacy page declaring:
“Privacy is a fundamental human right. At Apple, it’s also one of our core values. Your devices are important to so many parts of your life. What you share from those experiences, and who you share it with, should be up to you. We design Apple products to protect your privacy and give you control over your information. It’s not always easy. But that’s the kind of innovation we believe in.”
Apple’s CSAM detection will launch in iOS 15, iPadOS 15, watchOS 8 and macOS Monterey next month. I suspect, for many Apple fans, it will mark the moment to walk away.
Follow Gordon on Facebook
I am an experienced freelance technology journalist. I have written for Wired, The Next Web, TrustedReviews, The Guardian and the BBC in addition to Forbes. I began in b2b print journalism covering tech companies at the height of the dot com boom and switched to covering consumer technology as the iPod began to take off. A career highlight for me was being a founding member of TrustedReviews. It started in 2003 and we were repeatedly told websites could not compete with print! Within four years we were purchased by IPC Media (Time Warner’s publishing division) to become its flagship tech title. What fascinates me are the machinations of technology’s biggest companies. Got a pitch, tip or leak? Contact me on my professional Facebook page. I don’t bite.