WhatsApp has condemned Apple’s new child safety tools as a “very concerning . . . surveillance system”, even as governments around the world have cheered the decision to proactively search for illegal photos of child sexual abuse.
The stand-off sets up a battle between other tech platforms and officials calling for them to adopt similar tools.
An Indian government official on Friday told the Financial Times it “welcomed” Apple’s new technology, which set “a benchmark for other tech companies”, while one EU official said the tech group had designed a “quite elegant solution”.
US senator Richard Blumenthal called Apple’s new system an “innovative and bold step”.
“Time for others — especially Facebook — to follow their example,” tweeted Sajid Javid, the UK’s health secretary and former home secretary.
However, Apple’s rivals in Silicon Valley are said to be “incandescent” over its system to scan photos on US users’ iPhones before they are uploaded to iCloud, which will launch as part of the next version of iOS.
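Apple has not published the scanning code itself, but the broad mechanism it has described is perceptual-hash matching: each photo is reduced on the device to a compact fingerprint and compared against a database of fingerprints of known abuse imagery before upload. The sketch below is a minimal, hypothetical illustration of that general idea only; the hash function, threshold and database here are invented stand-ins, and Apple's actual design (its NeuralHash algorithm combined with cryptographic private set intersection) is considerably more involved.

```python
# Illustrative sketch of on-device perceptual-hash matching.
# Hypothetical simplification: Apple's real system uses its NeuralHash
# algorithm plus cryptographic matching, not a plain local lookup.
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a 64-bit fingerprint: downscale to 8x8,
    convert to greyscale, then set one bit per pixel that is
    brighter than the mean."""
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Stand-in for a database of fingerprints of known illegal images,
# in practice supplied by child-safety organisations.
KNOWN_HASHES = {0x0F0F0F0F0F0F0F0F}  # placeholder value

def flag_before_upload(path: str, max_distance: int = 5) -> bool:
    """Return True if the photo's fingerprint is close to a known hash.
    Perceptual hashes tolerate resizing and re-encoding, hence a
    distance threshold rather than exact equality."""
    h = average_hash(path)
    return any(hamming(h, known) <= max_distance for known in KNOWN_HASHES)
```

Notably, nothing in such a pipeline is specific to child-abuse imagery: the matching step is agnostic to whatever hash list it is fed, which is precisely the expansion that Cathcart and other critics warn governments could demand.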
“This approach introduces something very concerning into the world,” said Will Cathcart, head of WhatsApp. “This is an Apple-built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. It’s troubling to see them act without engaging experts.”
“We will not adopt it at WhatsApp,” he added.
The enthusiastic response from lawmakers will only intensify concerns raised by the security and privacy community that Apple has set a dangerous precedent that could be exploited by repressive regimes or overzealous law enforcement.
Apps including Facebook-owned WhatsApp, Telegram and Signal, as well as Google with its Android operating system, are already being urged to replicate Apple’s model.
“To say that we are disappointed by Apple’s plans is an understatement,” said India McKinney and Erica Portnoy of digital rights group the Electronic Frontier Foundation, in a blog post. “Apple’s compromise on end-to-end encryption may appease government agencies in the US and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security.”
Jennifer Granick, surveillance and cyber security counsel for the American Civil Liberties Union’s Project on Speech, Privacy and Technology, added: “However altruistic its motives, Apple has built an infrastructure that could be subverted for widespread surveillance of the conversations and information we keep on our phones.”
Political pressure has mounted on technology companies around the world in recent months to give governments access to encrypted content, including messages, photos and videos.
India’s government, under prime minister Narendra Modi, recently passed rules compelling technology platforms such as WhatsApp to trace the source of unlawful messages, effectively breaking end-to-end encryption. WhatsApp is currently locked in a legal battle with the government in an effort to stymie the new rules.
Last October, in an open letter signed by the “Five Eyes” countries plus Japan and India, officials including the UK’s home secretary Priti Patel and former US attorney-general William Barr said they “urge industry to address our serious concerns where encryption is applied in a way that wholly precludes any legal access to content”.
They noted child abuse was one of the reasons they felt the tech companies should develop alternative methods to give authorities access to device content, and that there was “increasing consensus across governments and international institutions that action must be taken”.
Critics have expressed scepticism at Apple’s promise to limit itself to scanning for child abuse imagery. “I hate going all slippery-slope but I look at the slope, and governments around the world are covering it in oil, and Apple just pushed its customers over the edge,” said Sarah Jamie Lewis, a cryptography researcher and executive director of Canadian NGO Open Privacy.
While there is not yet any US legislation compelling Apple to seek out this kind of material, its move comes as the UK and EU are preparing new legislation — the Online Safety Bill and Digital Services Act — that would put greater onus on tech companies to limit the spread of child pornography, among other forms of harmful content.
Apple’s decision to push ahead with its own individual system, rather than join a cross-industry negotiation with regulators around the world, has riled its Silicon Valley neighbours — especially after they united in support of its 2016 legal fight against the FBI over accessing a terrorist suspect’s iPhone.
“Some of the reaction I’ve heard from other competitors to Apple is they are incandescent,” said Matthew Green, a security professor at Johns Hopkins University, during an online video discussion with researchers at Stanford University on Thursday.
Alex Stamos, the former Facebook security chief who is now director of the Stanford Internet Observatory, said during the same discussion that Apple “don’t care at all that everyone is trying to come up with this delicate international balance”. “Clearly there’ll be immediate pressure on WhatsApp,” he said.
An Apple executive on Thursday acknowledged the furore its moves had caused in an internal memo. “We’ve seen many positive responses today,” wrote Sebastien Marineau in a note obtained by Apple blog 9to5Mac. “We know some people have misunderstandings, and more than a few are worried about the implications, but we will continue to explain and detail the features so people understand what we’ve built.”
Facebook and Google have not yet commented publicly on Apple’s announcement.
Apple has previously faced criticism from some quarters for not doing more to prevent abusive material from circulating, especially on iMessage. Because the iPhone’s messaging app is end-to-end encrypted, the company has been unable to see any photos or videos exchanged between its users.
Messages exchanged between two senior Apple engineers, which were produced as evidence in the iPhone maker’s recent legal battle with Epic Games, suggest that some inside the company believed it could do more.
In the exchange, which is dated early last year and was first uncovered by the Tech Transparency Project, Eric Friedman, head of Apple’s Fraud Engineering Algorithms and Risk unit, suggested that compared to Facebook, “we are the greatest platform for distributing child porn”.
“We have chosen to not know in enough places where we really cannot say” how much child sex abuse material might be there, Friedman added.
Additional reporting by Stephanie Findlay in Delhi, Valentina Pop in Brussels and Hannah Murphy in San Francisco