More than one billion Apple users have been put on notice, amid fears a looming update will see devices “spy” on their owners.
The tech giant is launching a new surveillance system on more than one billion active iPhones, iPads and Macs next month – which whistleblower Edward Snowden says is “Apple declaring war on your privacy”.
Mr Snowden, a former computer intelligence consultant, has delved into the company’s CSAM (child sexual abuse material) detection system coming to these products soon, branding it “uniquely intrusive”.
In a website post, he states, “Apple’s new system, regardless of how anyone tries to justify it, will permanently redefine what belongs to you, and what belongs to them.”
The system will work by matching fingerprints of a user’s images against a database of known illegal material.
“Under the new design, your phone will now perform these searches on Apple’s behalf before your photos have even reached their iCloud servers, and … if enough ‘forbidden content’ is discovered, law-enforcement will be notified,” Mr Snowden explains.
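At a high level, the design Mr Snowden describes is an on-device match of each photo’s fingerprint against a database of known images, with a report triggered only after enough matches accumulate. The following is a minimal sketch, using plain cryptographic hashes purely for illustration – Apple’s actual system uses a perceptual “NeuralHash” plus cryptographic threshold techniques, and every name and value below is hypothetical:

```python
import hashlib

def hash_image(data: bytes) -> str:
    # Stand-in fingerprint. Apple's real system uses a perceptual hash
    # ("NeuralHash") so visually similar images match; SHA-256 is used
    # here only to keep the sketch self-contained.
    return hashlib.sha256(data).hexdigest()

def scan_before_upload(photos, known_hashes, threshold=3):
    """Count on-device matches against a database of known-image hashes;
    flag the account only once the match count reaches a threshold."""
    matches = sum(1 for photo in photos if hash_image(photo) in known_hashes)
    return matches >= threshold

# Hypothetical usage: the hash database would ship with the operating system.
known_hashes = {hash_image(b"known-image-bytes")}
flagged = scan_before_upload([b"holiday-photo-bytes"], known_hashes)
```

The threshold is the detail Apple has emphasised publicly: a single match is not supposed to trigger anything, only an accumulation of matches on one account.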
“Apple plans to erase the boundary dividing which devices work for you, and which devices work for them.
“The day after this system goes live, it will no longer matter whether or not Apple ever enables end-to-end encryption, because our iPhones will be reporting their contents before our keys are even used.”
The 38-year-old says there is “compelling evidence” from researchers that the system is flawed, adding, “Apple gets to decide whether or not their phones will monitor their owners’ infractions for the government, but it’s the government that gets to decide what constitutes an infraction … and how to handle it.”
But Mr Snowden has also pointed out that the system can easily be bypassed – undermining its purpose.
“If you’re an enterprising paedophile with a basement full of CSAM-tainted iPhones, Apple welcomes you to entirely exempt yourself from these scans by simply flipping the ‘Disable iCloud Photos’ switch, a bypass which reveals that this system was never designed to protect children, as they would have you believe, but rather to protect their brand,” he says.
“As long as you keep that material off their servers, and so keep Apple out of the headlines, Apple doesn’t care.”
The tech expert says it could lead to governments forcing Apple to remove the option to disable photo uploads to iCloud.
“If Apple demonstrates the capability and willingness to continuously, remotely search every phone for evidence of one particular type of crime, these are questions for which they will have no answer,” he writes.
“And yet an answer will come — and it will come from the worst politicians of the worst governments. This is not a slippery slope. It’s a cliff.”
Apple has so far defended its CSAM detection system, with its software chief saying it is “widely misunderstood” and “poorly communicated” in an interview with The Wall Street Journal.
The Sun has contacted Apple for a response.
This article originally appeared in The Sun and was reproduced with permission