
There are a lot of articles about Apple's hash algorithm, and to me they are mostly irrelevant to the main problem.

The main problem is that Apple has backdoored my device.

More types of bad images or other files will be scanned since now Apple does not have plausible deniability to defend against any of the government's requests.

In the future, a false positive that happened to be of a political file that crept into the list could pinpoint people for some future dictator wannabe.

It’s always about the children or terrorism.



They could have done all that without telling you. And as long as the traffic was combined with normal traffic, no one would ever notice (and in this case it would end up mixed with normal traffic, since it only applies to images being uploaded to iCloud, so communication with Apple's servers would be expected).

What it looks like to me is that Apple is planning on releasing end-to-end encryption for iCloud. But they know that whenever E2EE comes up, people get mad that terrorists, child molesters, and mass shooters can hide their data and communications. Hell, they've been painted as the villain when they say they can't unlock iPhones for the FBI. This heads off those concerns for the most common of those crimes.


Sticking to the apt analogy from the article:

> To reiterate: scanning your device is not a privacy risk, but copying files from your device without any notice is definitely a privacy issue.

> Think of it this way: Your landlord owns your property, but in the United States, he cannot enter any time he wants. In order to enter, the landlord must have permission, give prior notice, or have cause. Any other reason is trespassing. Moreover, if the landlord takes anything, then it's theft. Apple's license agreement says that they own the operating system, but that doesn't give them permission to search whenever they want or to take content.

This viewpoint is like thanking your landlord for warning you that they are going to enter your home and root through your private items, all in the name of some greater good. Let's not spin it as if the landlord is doing us a favor in this scenario.


That first line of the quote is misrepresenting what Apple is doing. They are not copying files from your device. You are sending them the files. As it stands the only images that will be scanned are the ones you are uploading to iCloud. And I'd be shocked if they weren't already analyzing those images on the server side.

When it comes to governments being able to pressure them into being more invasive, nothing has changed with this update. If a government wanted to poison the CSAM database, they could have already. You'd end up reported when the server does the scanning. If the government wanted to expand scanning to include things that you're not uploading, they already could have asked Apple to do that. It would have been possible to silently add a much simpler scanning mechanism or data exfiltration into an update.

This isn't a spin to say anyone is doing us a favor. iCloud should be end-to-end encrypted and there shouldn't be any scanning at all. But why should that opinion on how we should treat privacy be taken as the only valid opinion? The people who do want the scanning are not simply asking for it because they are stupid or uninformed. Instead they put different weights into what they value.


> What it looks like to me is that Apple is planning on releasing end-to-end encryption for iCloud

This is Gruber's optimistic take on it as well. If so, why not make both changes at once? Given that they've walked back E2EE on iCloud before, I'm not holding my breath.


> They could have done all that without telling you.

But in that case it would much more likely be a crime, and it would certainly cost them a tremendous amount of goodwill.

Your personal computing device is a trusted agent. You cannot use the internet without it, and especially in lockdown you likely can't realistically live your life without use of the internet. You share with it your most private information, more so even than you do with your other trusted agents like your doctor or lawyers (whom you likely communicate with using the device). Its operation is opaque to you: you're just forced to trust it. As such, your device ethically owes you a duty to act in your best interest, to the greatest extent allowed by the law, not unlike your lawyer's obligation to act in your interest.

Apple is reprogramming customer devices, against the will of many users (presumably at the cost of receiving necessary fixes and security updates if you decline) to make it betray that trust and compromise the confidentiality of the device's user/owner.

The fact that Apple is doing it openly makes it worse in the sense that it undermines your legal recourse for the betrayal. The only recourse people have is the one you see them exercising in this thread: Complaining about it in public and encouraging people to abandon apple products.

E2EE should have been standard a decade ago, certainly since the Snowden revelations. No doubt Apple seeks to gain a commercial advantage by simultaneously improving their service while providing some pretextual dismissal of child abuse concerns. But this gain comes at the cost of deploying and normalizing an automated surveillance infrastructure, one which undermines their product's ethical duty to their customers, and one that could be undetectably retasked to enable genocide by being switched to match on images associated with various religions, ethnicities, or political ideologies.


Eh, I think it is simply the fact that Apple doesn’t want to be associated with individuals violating their terms of service in an unlawful way.

This is a way to root them out and report them to law enforcement.


If this is a prelude to E2E encryption for iCloud they are going to be under TREMENDOUS pressure from law enforcement to expand the list of bad material way beyond just CSAM.


Not if it's only E2EE for photos


They might be able to do it without anyone ever knowing, but at a business level, can't they do it better with everyone knowing and child safety as an excuse?


This is mainly where I come down on this. I care about what they’re doing not why they’re doing it. What they are doing is examining my private files, and that is simply unacceptable.


Apple has front-doored your device from Day 1.


Exactly. The issue is that the iPhone is now a snitch AI for whatever purpose Apple deems fit.


Drug dogs have a 50~80% false positive rate, because after they've been certified as capable of detecting drugs, they get that trained out of them in the field to instead indicate whatever the handler wants indicated.


Drug dogs are "probable cause on four legs".

https://www.npr.org/2017/11/20/563889510/preventing-police-b...


> More types of bad images or other files will be scanned since now Apple does not have plausible deniability to defend against any of the government's requests.

The mechanism doesn’t scan anything except images, and won’t trigger on a single bad image - only a set.

Yes, that set could be something other than child porn, assuming Apple and NCMEC conspire, but this is not a general purpose backdoor.
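The threshold behaviour described above can be sketched roughly as follows. This is an illustrative model only, not Apple's actual NeuralHash/private-set-intersection protocol: the function name, the hash representation, and the threshold value are all assumptions for the sake of the example. The point is that a single matching image is not enough; only a group of matches flags the account.

```python
# Illustrative sketch of threshold-based hash matching (assumed names and
# threshold; Apple's real system uses NeuralHash plus a cryptographic
# private-set-intersection scheme, so the server never sees raw hashes).

THRESHOLD = 3  # assumed value for illustration

def account_flagged(image_hashes, known_bad_hashes, threshold=THRESHOLD):
    """Count how many image hashes appear in the known-bad set.

    The account is flagged only when the number of matches reaches the
    threshold; a single hit reveals nothing and triggers nothing.
    """
    hits = sum(1 for h in image_hashes if h in known_bad_hashes)
    return hits >= threshold

# One match is below the threshold, so nothing is reported.
print(account_flagged(["h1", "h2", "h3"], {"h1"}))          # False
# Three matches meet the threshold, so the account would be flagged.
print(account_flagged(["h1", "h2", "h3"], {"h1", "h2", "h3"}))  # True
```

Note that the same structure would match any hash set pushed to the device, which is exactly the expansion concern raised elsewhere in this thread.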


Apple could literally just add a different set to the hashes they push? That seems very naive.


What seems naive? Adding hashes will only match images, not other files, and even then only if a group is matched, not just one.


Because people take and store images of a huge number of different things? Like the things mentioned upthread: memes, documents, along with screenshots of a huge number of other things.

And the details around matching (and groups, etc) are trivial to change in a later update.


> or other files will be scanned since now

Ok, so not other files at all then. Just photos of documents.

> And the details around matching (and groups, etc) are trivial to change in a later update.

Ok, so this mechanism can’t do more than is claimed, but they could add a new mechanism later?


As I said, incredibly naive.


You said it, but then you failed to back it up with anything other than your fears.


What reason do we have to trust that the NSA won't knock on the door of apple and ask for a small expansion, as a matter of national security + here is your NDA outlining that any canary tampering will result in jail time? It's a closed system so we would have no way of knowing.


Sure, but that has nothing to do with this mechanism or this discussion.

They could have done that at any time in the past, and could do so in future.

‘NSA could force Apple to do something in secret’ is an evergreen fear just like ‘think of the children’. Such comments get added to every thread about Apple and privacy or security.


It’s one thing to have a company build and keep secret such a system from everyone, from the ground up.

It’s a different thing entirely to do a minor extension to a system that they rolled out publicly that already essentially does this!

The first one will almost certainly be noticed, and will be clearly illegal/violate contracts, and can therefore be identified and rooted out.

The second one you could do for groups of people or targeted individuals trivially and would be under the radar and probably never noticed - and could be denied unless truly rock solid evidence existed. Which would be easy to avoid existing if you used the same mechanisms (but different types of matches) you were public about - in a closed ecosystem with a Secure Enclave, for instance. It’s not like anyone is going to be able to do step-by-step instruction debugging on the code running on their iPhone!

There is a long history of this happening. Not everyone is as blatant as the Stasi, and even then, no one knew who was working for them or what was tapped until the whole system collapsed and their records became public. It still took a long time to unravel.


> The first one will almost certainly be noticed, and will be clearly illegal/violate contracts, and can therefore be identified and rooted out.

Well, since they have made very detailed public statements about the limits of this system and about not letting it be misused, they would certainly be in violation of contracts if they did start misusing it.

> if you used the same mechanisms (but different types of matches)

What kinds of ‘different matches’ do you think this mechanism can be used to make?


It also only scans images uploaded to iCloud.


It does that on the device. All it takes is enabling it by default, instead of only triggering the scan prior to upload, and adding more hashes to flag. The result would be a total loss of privacy; basically, Apple is privatising mass surveillance with this, without oversight, however small, or accountability.


How would this let them scan the text of your signal messages or even your iMessages?

How would this let them search for a subversive PDF?

It wouldn’t. This is just fearmongering.

> The result would be a total loss of privacy, basically Apple is privatising mass surveillance with that. Without oversight, no matter how small, or accountability.

You are talking about an imagined system someone could build in the future, not the system Apple has put in place.


"Don't worry this malware is only triggered if you double click it. -- signed the author"


> The main problem is that Apple has backdoored my device.

Isn't that the shtick with Apple though? That they own the devices you rent and you don't have to worry too much about it. They always had the backdoor in place, they used it for software updates. Now they will also use it for another thing.


> That they own the devices you rent and you don't have to worry too much about it.

You didn't need to worry about it because they did a sufficiently good job at making the choices for you. This is a sign that they stopped doing so.

An appropriate metaphor might be a secretary. They can handle a lot of busy work for you so you don't have to worry about it, but they need access to your calendar, mails, etc. to do so. This is not an intrusion as long as they work in your favor. If you suddenly find your mails on the desk of your competitor, though, you might reconsider. That, however, does not mean that the whole idea of a secretary is flawed.


I agree. I am just saying that Apple has always had massive backdoors in place. People were always pointing it out as a huge plus for iDevices. So no, the parent's main problem is not the backdoor. Otherwise they would have complained long ago.



