> The idea that companies or politicians can force the user's machine to work against its owner is wrong.
You are hinting at something important here. Let me strengthen your point: to own an object means to subject it fully to your own will. If the object can act in a way that favors someone else's interests over yours, you do not own it. This is true of pretty much any device running proprietary software.
A litmus test: can you make your device lie to the manufacturer's servers? Regardless of the legality or morality of doing so.
However, this article is really about something else: the vulnerability of centralized services in the face of government oppression. Signal only has the ability to log messages because it is a centralized service that controls both the client and the server. The benefit of E2EE is greatly reduced if the client and the server are controlled by the same entity (tomorrow Signal could push out an update that sends a plaintext backup to their servers, and you wouldn't know it until later). Moreover, the non-free distribution mechanisms on mobile phones (app stores) limit a company's ability to resist.
This is also only possible because we use Signal as compiled by Signal themselves, and not by trusted third parties building from source kept clean of any future client-side backdoors. The client is open source, right? https://github.com/signalapp
Yes, this is part of the problem. Application developers and packagers should be distinct, unrelated entities, to reduce the chance of a malicious update being pushed to users if the developer sells out.
Without reproducible builds, this just means you have to trust the packager instead of the developer. Sometimes that's a good trade-off, but you still haven't really solved the problem, just moved it.
With reproducible builds, you don't have to trust the packager or the developer as long as you trust at least one person who reviewed the source code.
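To make that concrete, here's a minimal sketch of what reproducible-build verification buys you (the byte strings are placeholders, not a real binary or build process):

```python
# If an independent rebuild from the published source hashes identically
# to the shipped binary, trust shifts from "whoever built it" to
# "whoever reviewed the source code".
import hashlib

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

shipped_binary = b"\x7fELF...compiled output..."  # what the store serves you
my_rebuild     = b"\x7fELF...compiled output..."  # built locally from audited source

if sha256(shipped_binary) == sha256(my_rebuild):
    print("match: the shipped binary corresponds to the reviewed source")
else:
    print("mismatch: the binary contains something the source doesn't show")
```

The point is that the comparison is something anyone can do independently, so no single builder needs to be trusted.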
Packagers have proven to be more reliable. Sometimes they make mistakes, but there's no case of a packager ever selling out (correct me if I'm wrong). On the other hand, there are numerous cases of developers selling out.
Splitting the developer and the packager doesn't inherently reduce the chance of a malicious update any more than using a VPN reduces the chance of being snooped on by an ISP. All it accomplishes is changing who you have to trust not to be malicious. You might have good reason to believe you can trust one party more than another, but unless you're building the package yourself, there's still no guarantee that the package you install is built from the source code you can inspect.
It's all based on trust in the packager and only the packager—there are no checks and balances. The only reason splitting up the responsibilities might help is if you find the F-Droid maintainers to be inherently more trustworthy than the Signal developers, not simply because the concerns are separated.
That does not solve the problem. A country can forbid F-Droid, Debian, and anything else that isn't on a short list of vetted app stores complying with that country's law requiring everything to be backdoored.
I have unpopular opinions about this, because Signal has been so hostile to anyone other than Signal themselves being involved.
But to be specific: "open source" claims go out the window when they are:
1. Not reproducible (before anyone links me to the "reproducible steps", please actually read them, because they tell you directly that they will not create a reproducible output).
2. Able to hide development of MobileCoin (somehow) from us for nearly a year. To be clear: there were updates to the Signal app on iOS and Play during that time, otherwise there would have been security bugs, but those patches did not make their way into the repositories.
Signal operates on a "trust us bro" mentality, and no matter how trustworthy they seem to be, something about that doesn't sit right with me and never has.
EDIT: I don't really care if bots or shills downvote me, can you really, with a straight face, say it's NOT "trust us bro" ideology that makes people use Signal?
Archive formats are hard to make reproducible because there are lots of ways of making different yet equivalent archives.
So it's not surprising to me that someone would fail at this hurdle and find it frustrating to resolve.
Nix defined their own format for this to avoid this exact problem.
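A small demonstration of the problem, using Python's `tarfile` rather than any project's actual build tooling: an archive embeds metadata like modification times, so two archives of identical content can still differ byte-for-byte unless every such field is pinned (which is the idea behind Nix's NAR format).

```python
# Archives store mtimes, ownership, and entry ordering alongside content,
# so byte-identical output requires pinning all of that metadata.
import io
import tarfile

def make_tar(payload: bytes, mtime: int) -> bytes:
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tar:
        info = tarfile.TarInfo(name="a.txt")
        info.size = len(payload)
        info.mtime = mtime       # varying only this changes the bytes
        info.uid = info.gid = 0  # pin ownership
        tar.addfile(info, io.BytesIO(payload))
    return buf.getvalue()

drifting_mtime = make_tar(b"hello", mtime=1700000000)
pinned_a       = make_tar(b"hello", mtime=0)
pinned_b       = make_tar(b"hello", mtime=0)

print(pinned_a == drifting_mtime)  # False: same content, different mtime
print(pinned_a == pinned_b)        # True: all metadata pinned
```

Real build systems hit the same issue multiplied across thousands of files, compilers, and locales, which is why "just rebuild and diff" fails so often.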
It seems there are multiple reasons. For one, the APK files include a digital signature, and you won't have Signal's and Google's private keys available to recreate their signatures.
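A toy illustration of that point (this is not Signal's actual tooling, just `zipfile` standing in for an APK, which is a zip): because the signing blob lives inside the archive, a local rebuild can never be byte-identical to the store APK, so verification has to compare the contents while excluding the signature entries.

```python
# Same "compiled code", different signing blobs: the archives differ
# byte-for-byte, but the payload entries are identical once the
# signature files are ignored.
import io
import zipfile

def make_apk(signature: bytes) -> bytes:
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as z:
        z.writestr(zipfile.ZipInfo("classes.dex"), b"same compiled code")
        z.writestr(zipfile.ZipInfo("META-INF/CERT.RSA"), signature)
    return buf.getvalue()

store_apk = make_apk(b"signed with a key you don't have")
my_apk    = make_apk(b"signed with my local test key")

def payload(data: bytes) -> dict:
    # Everything except the signature entries under META-INF/
    with zipfile.ZipFile(io.BytesIO(data)) as z:
        return {name: z.read(name) for name in z.namelist()
                if not name.startswith("META-INF/")}

print(store_apk == my_apk)                    # False: signatures differ
print(payload(store_apk) == payload(my_apk))  # True: code is identical
```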
Thank you for this nice response. Did you already know, or did you look it up? Please don't tell me you just copied and pasted my question into an input form somewhere and it gave a bunch of reasons...
Ah, nice; they got rid of that explicit warning. Instead, though, we have the entire section about "bundlePlayProdRelease", which includes an externally sourced binary blob.
I don't understand how the details of the build process matter if the resulting files can be checked to be bit-for-bit identical. I can only think of something like Signal and Google conspiring to backdoor the binaries during the build process via this external binary blob. But if Google is part of this, they could also do it within Android, which is not fully open source.
If you don't like this, you can use the non-Play Store build instead (which supposedly doesn't include any binary blobs, but I haven't checked).
> 2. Able to hide development of mobilecoin (somehow) from us for nearly a year. To be clear: There were updates to the Signal app on iOS and Play, otherwise there would have been security bugs, but those patches did not make their way into the repositories.
> Signal operates on a "trust us bro" mentality, and no matter how trustworthy they seem to be, something about that doesn't sit right with me and never has.
The MobileCoin work, and the source code not being published on the public repository for nearly a year, was an extremely ill-thought-out move. It soured my view of Signal as well.
> to own an object means to subject it fully to your own will
Not by a long shot. Just a few counterexamples off the top of my head: destroying currency, altering passports, reproducing copyrighted images.
I'm not saying I'm a fan of even more exceptions of that kind, but I don't think there are any particular inherent rights arising from property ownership beyond what society agrees there are (e.g. the first sale doctrine for physical media). That's what makes it even more important to codify these rights.
> Just a few counterexamples from the top of my head: Destroying currency, altering passports, reproducing copyrighted images.
These aren't counterexamples, they prove the rule. A US passport literally has the text "this passport is the property of the United States" printed inside of it, and I imagine the same is true in most countries: you are the recipient of a passport, not the owner of one.
The same applies to copyrighted images: when you purchase a book, you own the physical copy and can fully subject it to your own will, but you don't own the right to make additional copies of it. You own the copy, not the intellectual property.
As for currency, it may not legally be the property of the US government like a passport, but I would argue that the fact that you can't modify it does in fact mean that you don't own the bill; the bill is a representation of an abstraction of "money" that you do own.
AFAIK currency (as in the physical banknote) is actually state property too. What you really own is a promise from the national/federal bank to pay you the value that is written on it.
Yeah, that has been my understanding, but I couldn't find a citation for that right away, so I didn't want to assert it confidently. But I've heard the same thing.
Note that I said "can", not "legally can". You can destroy currency, alter a passport, or reproduce copyrighted images if you want to. There may be legal consequences, but you can. You can also stab a person with a knife you own, even though you will be punished for it. I'm not talking about rights, but capabilities.
You can't make your phone lie to an app developer about its location, rooted status, etc. You can't make your HP printer print with unsanctioned ink. Therefore, you do not own them.
I also can't make a pen and a sheet of paper contain a proof showing whether P is equal to NP. Does that mean I don't own them either?
Now you could of course say that the difference is somebody having intentionally designed an object in a way that makes it capable of withholding some functionality from me but not others, and I'd agree.
But all in all, I just don't think "property rights" is the right lens to think about computing devices.
You can do what you want with the bike, but your analogy falls flat because it implies that, despite merely owning the bike, you get to ride through your neighbour's living room: as if your right to own a bike somehow trumps their right to own land and a home.