Author Topic: Do this before iOS 15 is released to stop Apple from scanning your private photos

Coinciding with Tim Cook hitting the 10-year mark as Apple’s CEO, the iPhone maker has found itself in a strange place. The consumer electronics giant that has spent years positioning itself as the pro-privacy alternative to tech giants like Google and Facebook has landed smack in the middle of a controversy that has normally pliant journalists treating Apple with rare skepticism, and that threatens to undermine the privacy-focused philosophy at the heart of Cook’s tenure. The culprit: one of the many new iOS 15 features arriving with the next big software update this fall.

Controversial iOS 15 features

By now, if you follow Apple news to any degree, you’re probably familiar with the particulars. Starting with iOS 15, Apple will hash photos destined for upload to iCloud and compare those hashes against a CSAM (child sexual abuse material) database maintained in the US by the National Center for Missing and Exploited Children, or NCMEC. The new system kicks into action only if two conditions are met: first, you possess specific CSAM material that has already been hashed and can be matched against the NCMEC database; and second, you use iCloud to store your photos, as the vast majority of iPhone owners do.
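
To make the hash-and-compare idea concrete, here is a deliberately simplified Swift sketch. Apple’s actual system uses a perceptual hash called NeuralHash plus a cryptographic private set intersection protocol, so neither the device nor the server learns anything about non-matches; this toy version substitutes a plain SHA-256 lookup against a hypothetical hash set, purely for illustration.

    import Foundation
    import CryptoKit

    // Hypothetical stand-in for the NCMEC hash database. (The real list
    // holds perceptual NeuralHash values, not SHA-256 digests.)
    let knownHashes: Set<String> = [
        // SHA-256 of empty data, included only so the demo below matches.
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
    ]

    // Hash an image's bytes and test membership in the known-hash set.
    func matchesKnownHash(_ imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
    }

    print(matchesKnownHash(Data())) // true against this toy database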

After the number of matches against the CSAM database crosses a set threshold (meaning material in your possession has matched the database a certain number of times), Apple reviews the flagged images and reports the account to NCMEC, which works with law enforcement.
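
For illustration, the threshold logic amounts to little more than a counter. Apple later said the initial threshold would be around 30 matches; that figure, and every name in this Swift sketch, is a placeholder rather than anything from Apple’s implementation.

    // Toy match counter. The threshold of 30 reflects Apple's later public
    // statements, but the number and these names are illustrative only.
    struct MatchCounter {
        static let threshold = 30
        private(set) var matches = 0

        // Record one database hit; returns true once the account crosses
        // the threshold and would be flagged for Apple's human review.
        mutating func recordMatch() -> Bool {
            matches += 1
            return matches >= Self.threshold
        }
    }

    var counter = MatchCounter()
    var flagged = false
    for _ in 1...30 { flagged = counter.recordMatch() }
    print(flagged) // true: threshold reached, account flagged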

Meanwhile, ironically, there’s actually a pretty easy way to avoid all this new scrutiny from Apple in the first place.

All you’ve got to do? Just disable the sharing of photos to iCloud.

Open the Settings app on your iPhone or iPad, navigate to Photos, and disable the iCloud Photos option. When the popup appears, choose “Download Photos & Videos” to pull everything in your iCloud Photos library down to your device.
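
Note that there is no public API letting an app read or flip the iCloud Photos switch itself; it is a Settings-only control. The closest programmatic signal is whether the device is signed into iCloud at all, which an app can check like this (a minimal sketch using a standard Foundation property, nothing iOS-15-specific):

    import Foundation

    // FileManager's ubiquityIdentityToken is non-nil when the user is
    // signed into iCloud with iCloud Drive enabled. It says nothing about
    // the separate iCloud Photos setting, which apps cannot read or change.
    if FileManager.default.ubiquityIdentityToken != nil {
        print("Device is signed into iCloud")
    } else {
        print("No iCloud account available to this app")
    }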

The problem

If you then want to migrate away from Apple? Maybe, say, because you feel that the iPhone maker is invading your privacy via these new iOS 15 features? Well, all we can say is good luck with that transition. Almost every provider of cloud backup services already does this same kind of scanning. The key difference, and it’s a huge one, is that they do it entirely in the cloud, on their end.

Apple, however, performs cloud scanning as well as some of the image matching on your device itself. And therein lies the reason for the outcry from privacy advocates: going forward, Apple will be looking for a specific kind of contraband on your personal device, like it or not. Unless, that is, you disable the setting noted above.

Snowden: Apple is protecting its brand

Speaking of that setting, NSA whistleblower Edward Snowden angrily blasted how easy the system is to bypass in a new post published to his Substack on Wednesday evening.

“If you’re an enterprising pedophile with a basement full of CSAM-tainted iPhones,” he writes, “Apple welcomes you to entirely exempt yourself from these scans by simply flipping the ‘Disable iCloud Photos’ switch.” It’s “a bypass which reveals that this system was never designed to protect children, as they would have you believe, but rather to protect their brand.”

In other words, he continues, this is about keeping that material off their servers. And thus keeping Apple out of negative headlines.

Does Snowden (and, for that matter, do privacy advocates like him) seem overly concerned about some dark hypothetical future because of these new iOS 15 features? Well, don’t forget: Apple has in the past cited the hypothetical misuse of its products at some future date as a reason to pull back from implementing very understandable privacy protections in the here and now (here’s one example).

“So what happens when, in a few years at the latest … in order to protect the children, bills are passed in the legislature to prohibit this (Disable iCloud) bypass, effectively compelling Apple to scan photos that aren’t backed up to iCloud?” Snowden continues in his new post. Or what happens if a party in India starts demanding that Apple scan for memes associated with a separatist movement? “How long do we have left before the iPhone in your pocket begins quietly filing reports about encountering ‘extremist’ political material, or about your presence at a ‘civil disturbance’?”

Sure, they’re hypotheticals. They’re also squarely within the realm of the possible.

The response from Apple

Thus far, Apple’s response to the concerns of privacy advocates has been thoroughly unsatisfying. Essentially, it amounts to “just trust our safeguards.” The following is from a document Apple published to try to allay concerns.

Asked whether someone could repurpose Apple’s CSAM detection system to find things other than CSAM, such as political material, Apple responds: “Our process is designed to prevent that from happening.”

That is not, well, quite as explicit as stating that there is no way this could ever happen. And as to whether governments could force Apple to scan for things other than CSAM?

“Apple would refuse such demands … We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands.”

source