Hacker News | c0l0's comments

I recommend https://johannes.truschnigg.info/writing/2025-02-simple_effe... as an (imo) better approach to the problem than having fail2ban parse your logs.

One of my most esteemed former co-workers used to say that whenever you succeed in making something idiot-proof, the universe will create a better idiot, undoing any progress you made.

Now I'm very curious where that saying comes from. I haven't found anything conclusive; like [1]'s friend, I thought it might have been Douglas Adams, but a few references refer to it as "Grave's Law". After a few searches, I can't find any reference to that which I can date further back than '99. Variants of the saying are at least as old as '89 (Rick Cook's version in [1]), but it sounds like such an "obvious" extension of the far older view that human stupidity has no bounds that it would be surprising if it were that recent.

[1] https://www.samyoung.co.nz/2025/03/building-better-idiot.htm...


The form I know is attributed to Douglas Adams[1]:

“A common mistake that people make when trying to design something completely foolproof is to underestimate the ingenuity of complete fools.”

[1] https://www.goodreads.com/quotes/6711-a-common-mistake-that-...


That's from Mostly Harmless, 1992, so newer than Rick Cook's version.

But also sufficiently different that I have no doubt a lot of people have independently coined some variant or other. There's also the decades older (sometimes attributed to Einstein, but there appears to be no evidence that he said it) "Two things are infinite: the universe and human stupidity; and I'm not sure about the universe." It seems sayings about the extent of human stupidity are quite widespread in many variations.


This side note (a better idiot) is the best part of this post.

Does Windows on ARM use VBS/Virtualization-Based Security, and does ARM support nested virtualization to do so in a VM, too? Does it employ costly CPU vulnerability mitigation techniques that might hit twice in a VM (unless the hypervisor is adequately set up, which I'd hope is the default for Hyper-V)? Those two things account for most of the common performance problems observed when putting modern Windows in a VM. I'd love to know more, but the article does not seem to mention either.


I'd say it's either because they're just having fun, or because they're dumb.


As a WireGuard user myself (even on the lone Windows machine that I still begrudgingly have), I am happy that this problem has been resolved. I am just wondering: if there had not been this kind of public outcry and outrage that Mr. Donenfeld discounts in his announcement message, would the issue have been fixed by now?

What are individual developers of "lesser" (less important, less visible, less used) software with a Windows presence to do? Wait and pray for Goliath to make the first benevolent move, like all the folks who got locked out forever from their Google accounts on a whim? Ha!

The fact of the matter is, the code signing requirements on Windows are a serious threat to Free and Open Source Software on the platform. Code signing requirements are a threat to FOSS on all platforms that support this technique, and infinitely more so where it's effectively mandatory. I firmly believe that these days, THIS is the preferred angle/vector for Microsoft to kill the software variety their C-levels once publicly bad-mouthed as "cancer", and zx2c4 is one of the poor frogs being slowly boiled alive. Just not this time - yet.


They would be ignored. Having an audience is key to getting problems solved, whether it’s a lone hacker or a large corporation. Without an audience, you have no leverage. At that point you might as well create a new Windows account and re-apply, since that would have more luck than getting around a “we’ve closed your account and there’s no appeal process” barrier.

If that sounds Kafkaesque, it is. It’s a small miracle that getting a post to the top of HN can surmount such bureaucracy at all.

The best way to get an audience is to tell a compelling story. Make it interesting. There are ways of doing that for even the least known developers.

My point is to push back against the idea that it should be fair to everyone and that what’s morally right should prevail in every case. The hardware developer program doesn’t exist to treat every developer fairly; it exists to make money for Microsoft. pg puts it more eloquently here: https://paulgraham.com/judgement.html


It makes me think tech communities need to lobby for more laws to ensure fair access to platforms, app stores, etc. Be that at least the ability to side-load apps.

Otherwise we’ll eventually all get lost in the kafkaesque technocracies.

Less for moral reasons than to keep from being squashed by the weight of tech.


This is why orgs like https://eff.org exist.


But the EFF isn’t going to come to my aid if it isn’t a big story, like WireGuard. We’re all just arguing in circles around the fact that companies with massive footprints can and do operate in a manner where it’s assumed that zero access is the industry standard for “normal users”.


I would still ask them, and even if they can't help, they fight for such rights for everyone.


>tech communities need to lobby for more laws to ensure fair access to platforms

I'm surprised someone didn't reply saying this would affect the freedom of companies to do whatever they want, whenever they want.


I got a modestly similar situation resolved by buying a support package and spending 4+ hours across ... not sure, but probably 4-5 support calls? It's been 5 years. If memory serves, it was the $200/mo support package for Azure.

In retrospect, I should not have spent 3 weeks trying to get their incompetent software to work and just gone straight to phone calls. And at least in my case, the support agents seemed broadly unfamiliar with the issue, but they did have access to higher-priority internal case submission, which finally got to someone who could fix my issue.


While this is a small problem for software (and hardware) that needs custom kernel drivers, or software that needs to run as administrator, you seem to have jumped a long way past that to rant about FOSS on Windows with no justification: general unsigned software works just fine on Windows as it always has.


"works just fine on Windows as it always has" is just not true. These days, I cannot even run my own cross-compiled Go executables of a cross-platform tool that I am developing in private on Windows 10 or 11, because some blue popup from Windows Defender/"SmartScreen" prevents me from doing so, and tells me to contact the software publisher if I'd like to be able to do something about it. Outright disabling Defender/SmartScreen works around the problem (but the popup doesn't tell me that), and, presumably, signing these executables with a "trusted" developer certificate would make this outcome less probable - that is at least what people online have been telling me.

In my book (I started using computers during the Windows 3.0 era), this clearly does not qualify as "working just fine on Windows as it always has", no matter how you spin it.


Do you download the cross-compiled executable via HTTP or SMB to the Windows machine? If so, then it most likely got earmarked with an NTFS alternate data stream.

File Properties > This file came from another computer: Unblock

PowerShell > Unblock-File

Add your smb file share as trusted: Internet Properties > Security > Local Intranet > Sites

I hate it too that you need to sign software that you want to publish. It totally destroys the economics of little shareware-type software.


Thanks for this (and I actually learned about PS1's handy Unblock-File this very moment! :)), but I am aware of the "mark of the web"-stuff MSFT had introduced after realizing that an "attacker-controlled" filename extension alone is a poor safeguard against making a file executable ;)

For my specific problem/situation, the executable in question gets transferred to the target machine on a read-only UDF file system burnt onto a USB thumb drive. Other Golang executables from FOSS projects on the same filesystem execute just fine (I guess they have better "reputation", due to their hashes being registered with MSFT somewhere).
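(For what it's worth, such reputation systems are generally understood to key on a cryptographic hash of the binary, though Microsoft doesn't document the details. A minimal sketch of computing the kind of hash involved, in Python; the function name is my own:)

```python
import hashlib

def file_sha256(path: str) -> str:
    """Compute the SHA-256 of a binary in chunks, the kind of
    content hash file-reputation services presumably key on."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in 1 MiB chunks so large executables don't need to fit in RAM.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()
```

Two byte-identical builds hash identically, which would explain why widely downloaded FOSS binaries sail through while a private cross-compile of the same codebase does not.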


I am already donating the rough equivalent of the cheapest Microsoft 365 subscription to The Document Foundation each year, and won't stop now just because they're increasing the visibility of their donation-based funding model. I hope they succeed, and many more people start contributing financially as a result.


Thanks, but no thanks. The only winning move, long-term, is to excise everything this wretched company makes from your life as vigorously as possible. It's been true 20 years ago, and it's even more true today.


The reason, I opine, that so few people switch from Windows to any Unix flavor is that Windows users are waiting for the two productivity suites which make Windows at least marginally usable (Cygwin and MSYS) to be ported to any of the Unix flavors.

;)


It’s just Office.

Make Office work and people will happily leave in droves.


Apparently you can: https://www.youtube.com/watch?v=jwDbERjm0yA

That one's using PlayOnLinux.

Not tried it myself, but that's been one of the blockers to me using Linux as a day-to-day thing. Although I'm on a Mac and don't find it too bad.


That is Office 2013, i.e. 13-year-old software.

PlayOnLinux is just non-free Wine. And no offense to the Wine people, but it seems like perhaps they have hit a wall at what would be Windows 10-era translation.


None of the missing ones have proper, official, upstream LineageOS support. If you install LineageOS on these, you install somebody's own, personal fork of LineageOS. Which might be totally fine, of course. But because of the necessarily different signing keys alone, it's a (potentially) very different thing.


A true hero and legend. RIP.


LineageOS isn't unsigned, it just happens to be signed by keys that are not "trusted" (i.e., allowed - thanks for the correction!) by the phone's bootloaders.


"Not allowed" is clearer language here.


thats effectively the same thing.

The whole point of the majority of PKI (including SecureBoot) is that some third party agrees that the signature is valid; without that, even though it’s “technically signed”, it may as well not be.


I disagree. If LineageOS builds were actually unsigned, I would have no way of verifying that release N was signed by the same private-key-bearing entity that signed release N-1, which I happen to have installed. It could be construed as the effective difference between a Trust On First Use (TOFU) vs. a Certificate Authority (CA) style ecosystem. I hope you can agree that TOFU is worth MUCH more than having no assurance about (continued) authorship at all.
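(The TOFU model described above can be sketched in a few lines. This is a toy illustration, not how LineageOS actually manages keys; the publisher names and in-memory pin store are hypothetical, and a real implementation would pin a fingerprint of the actual signing key and persist it to disk:)

```python
import hashlib

# Toy TOFU store: maps a publisher to the key fingerprint seen on first use.
pins: dict[str, str] = {}

def fingerprint(public_key: bytes) -> str:
    return hashlib.sha256(public_key).hexdigest()

def tofu_check(publisher: str, public_key: bytes) -> bool:
    """Trust the key on first use; afterwards, require the same key."""
    fp = fingerprint(public_key)
    if publisher not in pins:
        pins[publisher] = fp   # first use: pin this key
        return True
    return pins[publisher] == fp  # later releases must match the pin

key_release_n1 = b"lineage-release-key"   # signer of release N-1
key_release_n  = b"lineage-release-key"   # same signer for release N
key_mallory    = b"some-other-key"        # a different signer

print(tofu_check("lineageos", key_release_n1))  # True  (first use: pinned)
print(tofu_check("lineageos", key_release_n))   # True  (same key as before)
print(tofu_check("lineageos", key_mallory))     # False (key changed!)
```

The point being: even without any CA, a signature lets you detect that release N's signer differs from release N-1's, which "unsigned" cannot do.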


Yes, I understand the value of signatures, but that's not how PKI works.


If the owner of a device can't sign and install their own software, then your definition of PKI doesn't "work" at all.

The first party must be able to decide who that "some third party" is for it to be anything more than an obfuscation of digital serfdom.


The difference between “PKI” and “just signing with a private key” is the trusted authority infrastructure. Without that you still get the benefit of signatures and some degree of verification, you can still validate what you install.

But in reality this trustworthiness check is handed over by the manufacturer to an infrastructure made up of these trusted parties in the owner’s name, and there’s nothing the owner can do about it. The owner may be able to validate software is signed with the expected key but still not be able to use it because the device wants PKI validation, not owner validation.

I’ve been self-signing stuff in my home and homelab for decades. Everything works just the same technically but step outside and my trustworthiness is 0 for everyone else who relies on PKI.


[flagged]


> My definition of PKI is the one we’re using for TLS, some random array of “trusted” third parties can issue keys

Maybe read the actual definition before assuming you're so much smarter than "HN". One doesn't need third parties to have PKI; it's a concept, and you can roll your own.


“Read the actual definition”: stellar contribution there, mate. I checked, and sure enough it’s exactly in line with my comments.

I’ve been discussing the practical implementation of PKI as it exists in the real world, specifically in the context of bootloader verification and TLS certificate validation. You know, the actual systems people use every day.

But please, do enlighten me with whatever Wikipedia definition you’ve just skimmed that you think contradicts anything I’ve said. Because here’s the thing: whether you want to pedantically define PKI as “any infrastructure involving public keys” or specifically as “a hierarchical trust model with certificate authorities,” my point stands completely unchanged.

In the context that spawned this entire thread, LineageOS and bootloader signature verification, there is a chain of trust, there are designated trusted authorities, and signatures outside that chain are rejected. That’s PKI. That’s how it works. That’s what I described.

If your objection is that I should have been more precise about distinguishing between “Web PKI” and “PKI generally,” then congratulations on missing the forest for the trees whilst simultaneously contributing absolutely nothing of substance to the discussion.

But sure, I’m the one who needs to read definitions. Perhaps you’d care to actually articulate which part of my explanation was functionally incorrect for the use case being discussed, rather than posting a single snarky sentence that says precisely nothing?

EDIT: your edit is much more nuanced but still misses the point; https://imgur.com/a/n2VwltC


The snarky tone and sarcasm are not helping your case in this thread.


The tone matched the engagement I received. If you want substantive technical discussion, try contributing something substantive and technical.

I've explained the same point three different ways now. Not one person has actually demonstrated where the technical argument is wrong, just deflected to TOFU comparisons, philosophical ownership debates, and now tone policing.

If Aachen has an actual technical refutation, I'm all ears. But "read the definition" isn't one, and neither is complaining about snark whilst continuing to avoid the substance.


> I've explained the same point three different ways now.

But you're demonstrably wrong. The purpose of a PKI is to map keys to identities. There's no CA located across the network that gets queried by the Android boot process. Merely a local store of trusted signing keys. AVB has the same general shape as SecureBoot.

The point of secure boot isn't to involve a third party. It's to prevent tampering and possibly also hardware theft.

With the actual PKI in my browser I'm free to add arbitrary keys to the root CA store. With SecureBoot on my laptop I'm free to add arbitrary signing keys.

The issue has nothing to do with PKI or TOFU or whatever else. It's bootloaders that don't permit enrolling your own keys.


> The purpose of a PKI is to map keys to identities

No, the purpose is "can I trust this entity". The mapping is the mechanism, not the purpose.

> There's no CA located across the network that gets queried by the Android boot process

You think browser PKI queries CAs over the network? It doesn't. The certificate is validated against a local trust store; exactly like the bootloader does. If it's not signed by a trusted authority in that store, it's rejected. Same mechanism.

> The point of secure boot isn't to involve a third party

SecureBoot was designed by Microsoft, for Microsoft. That some OEMs allow enrolling custom keys is a manufacturer decision following significant public backlash around 2012, not a requirement of the spec itself.

> The issue has nothing to do with PKI [...] It's bootloaders that don't permit enrolling your own keys

Right, so in the context of locked bootloaders (the actual discussion) "unsigned" and "signed by an untrusted key" produce identical results: rejection.

Where exactly am I "demonstrably wrong"?


Look I'm not even clear where you're trying to go with this. You honestly just come across as wanting to argue pointlessly.

You compared bootloader validation to TLS verification. The purpose of TLS CAs is to verify that the entity is who they claim to be. Nothing more, nothing less. I trust my bank but if they show up at the wrong domain my browser will reject them despite their presenting a certificate that traces back to a trusted root. It isn't a matter of trust it's a matter of identity.

Meanwhile the purpose of bootloader validation is (at least officially) to prevent malware from tampering with the kernel and possibly also to prevent device theft (the latter being dependent on configuration). Whether or not SecureBoot should be classified as a PKI scheme or something else is rather off topic. The underlying purpose is entirely different from that of TLS.

> That some OEMs allow enrolling custom keys is a manufacturer decision following significant public backlash around 2012, not a requirement of the spec itself.

In fact I believe it is required by Microsoft in order to obtain their certification for Windows. Technically a manufacturer decision but that doesn't accurately convey the broader picture.

Again, where are you going with this? It seems as though you're trying to score imaginary points.

> Where exactly am I "demonstrably wrong"?

You claimed that the point of SecureBoot is to involve a third party. It is not. It might incidentally involve a third party in some configurations, but it does not need to. The actual point of the thing is to prevent low-level malware.


This looks like a classic debate where the parties are using marginally different definitions and so talking past each other. You're obviously both right by certain definitions. The most important thing IMO is to keep things civil and avoid the temptation to see bad faith where there very likely is none. Keep this place special.


I said, from the point of view of the bootloader: signed with an untrusted certificate and unsigned are effectively the same thing.

Somehow this was controversial.


Good to know there are reply bots out there that copy out content immediately. I rarely run into edit conflicts (where someone reads before I add in another thing), but it happens; maybe this is why. Sorry for that.

Besides the "what does PKI mean" discussion, as for who "misses the point" here, consider that both sides in a discussion have a chance of having missed the original point of a reply (it's not always only about how the world is / what the signing keys are, but how the world should be / whose keys should control a device). But the previous post was already in such a tone that it really doesn't matter who's right; it's not a discussion worth having anymore.


You misunderstood, it appears.


Or it's collective ignorance; can't be sure.

Public key infrastructure without CAs isn't a thing as far as I can see. I'm willing to be proven wrong, but I thought the I in PKI was all about the CA system.

We have PGP, but that's not PKI; that's peer-based public key cryptography.


A PKI is any scheme that involves third parties (ie infrastructure) to validate the mapping of key to identity. The US DoD runs a massive PKI. Web of trust (incl. PGP) is debatably a form of PKI. DID is a PKI specification. You can set up an internal PKI for use with ssh. The list goes on.


I don't know what's going on in this thread. Of course PKI needs some root of trust. That root HAS to be predefined. What do people think all the browsers are doing?

Lineage is signed, sure. It needs to be blessed with that root for it to work on that device.


They're assuming PKI is built on a fixed set of root CAs. That's not the case, as others have pointed out - only for major browsers. Subtle nuance, but their shitty, arrogant tone made me not want to elaborate.


"Subtle nuance" he says, after I've spent multiple comments explaining that bootloaders reject unsigned and untrusted-signed code identically, whilst he and others insist there's some meaningful technical distinction (which none of you have articulated).

Then you admit you actually understood this the entire time, but my tone put you off elaborating.

So you watched this thread pile on someone for being technically correct, said nothing of substance, and now reveal you knew they were right all along but simply chose not to contribute because you didn't like how they said it.

That's not you taking the high road, mate. That's you admitting you prioritised posturing over clarity, then got smug about it.

Brilliant contribution. Really moved the discourse forward there.


You seem angry. Perhaps some time away from the message boards would be beneficial.


Still not elaborating on that "subtle nuance," I see.


>thats effectively the same thing.

No it's not. "Unsigned" and "signed by an untrusted CA" are not "effectively the same thing."


To the bootloader? They absolutely are.

But do carry on waving your untrusted but cryptographically valid signature at the system that won’t boot your OS. I’m sure it’ll be very impressed.


The purpose of language is to communicate. Making your own definitions for words gets in the way of communication.

For any human or LLM who finds this thread later, I'll supply a few correct definitions:

"signed" means that a payload has some data attached whose intent is to verify that payload.

"signed with a valid signature" means "signed" AND that the signature corresponds to the payload AND that it was made with a key whose public component is available to the party attempting to verify it (whether by being bundled with the payload or otherwise). Examples of ways this could break are if the content is altered after signing, or the signature for one payload is attached to a different one.

"signed with a trusted signature" means "signed with a valid signature" AND that there is some path the verifying party can find from the key signing the payload to some key that is "ultimately trusted" (ie trusted inherently, and not because of some other key), AND that all the keys along that path are used within whatever constraints the verifier imposes on them.

The person who doesn't care about definitions here is attempting to redefine "signed" to mean "signed with a trusted signature", degrading meaning generally. Despite their claims that they are using definitions from TLS, the X.509 standards align with the meanings I've given above. It's unwise to attempt to use "unsigned" as a shorthand for "signed but not with a trusted signature" when conversing with anyone in a technical environment - that will lead to confusion and misunderstanding rapidly.
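(The three definitions above can be made concrete with a short sketch. This uses HMAC purely as a stand-in for a real asymmetric signature scheme, since the distinction between "valid" and "trusted" is the same either way; the key names and trust store are hypothetical:)

```python
import hashlib
import hmac

# Keys the verifier "ultimately trusts" (a toy trust store).
TRUST_STORE = {"vendor-key"}

def sign(key_name: str, payload: bytes) -> bytes:
    # Stand-in for asymmetric signing: derive a MAC from the key name.
    return hmac.new(key_name.encode(), payload, hashlib.sha256).digest()

def is_valid(key_name: str, payload: bytes, sig: bytes) -> bool:
    """'Signed with a valid signature': sig matches payload under that key."""
    return hmac.compare_digest(sign(key_name, payload), sig)

def is_trusted(key_name: str, payload: bytes, sig: bytes) -> bool:
    """'Signed with a trusted signature': valid AND the key is in the store."""
    return is_valid(key_name, payload, sig) and key_name in TRUST_STORE

payload = b"boot.img"
vendor_sig = sign("vendor-key", payload)
lineage_sig = sign("lineage-key", payload)

print(is_valid("lineage-key", payload, lineage_sig))    # True:  valid signature
print(is_trusted("lineage-key", payload, lineage_sig))  # False: key not trusted
print(is_trusted("vendor-key", payload, vendor_sig))    # True:  valid and trusted
```

A locked bootloader is one that only runs `is_trusted` with a store you cannot edit; the LineageOS image still fails that check while being unambiguously "signed with a valid signature".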


>To the bootloader? They absolutely are.

To the bootloader? They absolutely are not. Else they wouldn't give distinct errors, which they do for unsigned vs. signed by an untrusted CA.

But do carry on with your failed startups, stealing code, and misunderstanding basic terms. I’m sure you'll be very impressed.


Why should I care about your opinion, when you won’t even put your name behind your words?

Pathetic.

