Mythology About Security (gettys.wordpress.com)
144 points by dsr_ on April 9, 2018 | 41 comments


> We asked MIT whether we could incorporate Kerberos (and other encryption) into the X Window System. According to the advice at the time (and MIT’s lawyers were expert in export control, and later involved in PGP), if we had even incorporated strong crypto for authentication into our sources, this would have put the distribution under export control, and that that would have defeated X’s easy distribution.

Fascinating.


Did they really have to include strong crypto?

Most secure protocols negotiate a cipher suite. They just had to add the ability to do so, and maybe some placeholder algorithm using the maximum allowed strength at the time.
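The kind of negotiation described is easy to sketch. Below is a toy Python illustration with hypothetical suite names (nothing here is from the actual X11 or Kerberos protocols): a protocol that negotiates from day one can ship with only a weak placeholder suite and add strong ciphers later without changing the handshake.

```python
# Toy cipher-suite negotiation: pick the first client-preferred suite
# that the server also supports. Suite names are made up for illustration.
SERVER_SUITES = ["XOR40-PLACEHOLDER", "NULL-CRC32"]

def negotiate(client_suites, server_suites):
    """Return the first suite in the client's preference order that the
    server supports; raise if there is no overlap."""
    for suite in client_suites:
        if suite in server_suites:
            return suite
    raise ValueError("no common cipher suite")

# A 1980s client and server agree on the strongest legally shippable thing:
print(negotiate(["XOR40-PLACEHOLDER", "NULL-CRC32"], SERVER_SUITES))
# -> XOR40-PLACEHOLDER

# A 1990s server that has since added a strong suite still interoperates
# with old clients, and new clients get the strong suite:
print(negotiate(["AES256-GCM", "XOR40-PLACEHOLDER"],
                SERVER_SUITES + ["AES256-GCM"]))
# -> AES256-GCM
```

The catch, as the replies below note, is that keeping the weak suite around for compatibility is exactly what downgrade attacks exploit.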


The flip side is: Just imagine if Kerberos 1.0 with 40-bit DES was baked into X11 or even IPv4. We'd still be fighting those downgrade attacks. Or maybe we'd be layering real encryption over the broken-but-unremovable encryption, with all the overhead that entails.


True enough, but unless humanity was somehow going to jump immediately to perfect protocols, the tradeoff between backwards compatibility and downgrade attacks was always going to be with us.

The mere lack of any kind of encryption in our basic protocols can be considered the most important "downgrade attack" of all, since we are never quite sure when something is going to leak out into the huge background of plaintext.


Why do you think that is true? Cipher-suite selection and whitelisting are not new things; even with plaintext-plus-CRC32 in the mix, we deal with it successfully enough in TLS and SSH all the time.


“we deal with it successfully enough in TLS and SSH all the time”

Not quite. Downgrade attacks have been a huge problem for TLS; FREAK and Logjam, for example, both exploited export-grade cipher suites that servers kept around for compatibility.
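One shape such attacks took (FREAK being the famous case) was a man-in-the-middle filtering the client's offered suites down to the export-grade leftovers. A heavily simplified toy sketch, not real TLS handshake code:

```python
# Toy FREAK-style downgrade: the attacker rewrites the ClientHello so
# only export suites survive; a server that still supports them for
# "compatibility" happily picks one. Suite names are illustrative.
def mitm(client_hello):
    """Attacker keeps only export-grade suites (falls back to the
    original list if there are none, to stay undetected)."""
    return [s for s in client_hello if "EXPORT" in s] or client_hello

def server_pick(offered, supported):
    """Server picks the first offered suite it supports."""
    for s in offered:
        if s in supported:
            return s
    return None

client = ["AES256-GCM", "RSA_EXPORT-RC4-40"]
server = ["AES256-GCM", "RSA_EXPORT-RC4-40"]  # export kept "just in case"

print(server_pick(client, server))        # AES256-GCM without an attacker
print(server_pick(mitm(client), server))  # RSA_EXPORT-RC4-40 after the rewrite
```

Real TLS signs the handshake transcript precisely to detect this kind of tampering, but the export suites used keys weak enough that the attacker could break them and forge the check.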


Until the rise of SSL and IPSEC, I don't think it was the case that protocols typically gave you a choice of ciphers. Keep in mind that they're talking about over a decade earlier.


The article says "even calls to functions to use strong authentication/encryption by providing an external library would have made it covered under export control." Would your proposal have been ruled out by this? They seem to have gone to considerable lengths to find a solution.


I often hear that (to quote the article) "Government export controls crippled Internet security and the design of Internet protocols from the very beginning".

Can anyone give me examples where a design flaw in a protocol resulted directly in poorer security, and how it could have been better designed?

Not that I doubt the claim but I am not literate in this area.


https://en.wikipedia.org/wiki/Export_of_cryptography_from_th... is probably a good place to start.

One fairly concrete example:

> Shortly afterward, Netscape's SSL technology was widely adopted as a method for protecting credit card transactions using public key cryptography. Netscape developed two versions of its web browser. The "U.S. edition" supported full size (typically 1024-bit or larger) RSA public keys in combination with full size symmetric keys (secret keys) (128-bit RC4 or 3DES in SSL 3.0 and TLS 1.0). The "International Edition" had its effective key lengths reduced to 512 bits and 40 bits respectively (RSA_EXPORT with 40-bit RC2 or RC4 in SSL 2.0, SSL 3.0 and TLS 1.0), by zero-padding 88 bits of the normal 128-bit symmetric key. Acquiring the 'U.S. domestic' version turned out to be sufficient hassle that most computer users, even in the U.S., ended up with the 'International' version, whose weak 40-bit encryption could be broken in a matter of days using a single computer. A similar situation occurred with Lotus Notes for the same reasons.

It's not necessarily a design flaw in the protocol, but it has basically the same effect.
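The zero-padding described in the quote can be made concrete with a little arithmetic. The trials-per-second figure below is an assumption for illustration, not a benchmark:

```python
# The export key had 128 bits on the wire, but 88 of them were zero,
# leaving only 40 secret bits.
full_keyspace = 2 ** 128
export_keyspace = 2 ** 40

print(export_keyspace)                   # ~1.1 trillion keys to try
print(full_keyspace // export_keyspace)  # 2**88 times fewer than full strength

# At an assumed 10 million trial decryptions per second on mid-90s
# hardware, exhausting the export keyspace takes on the order of a day:
seconds = export_keyspace / 10_000_000
print(round(seconds / 86400, 1))  # days
```

This matches the quote's "broken in a matter of days using a single computer"; with a cluster it dropped to hours.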


According to this article, without export controls, X would have had strong crypto baked in. So the 'flaw' is that it was designed, well, without crypto.


"Often hear that the reason today’s Internet is not more secure is that the early designers failed to imagine that security could ever matter."

Related to this, you should definitely watch Moxie Marlinspike's (lead dev of Signal) talk where he tells about his discussion with Kipp Hickman, a developer of SSL: https://www.youtube.com/watch?v=UawS3_iuHoA#t=13m52s (until 16:33)


This is why the OpenBSD Foundation is based in Canada.


Does this matter? We (not just IT people, everyone in the world) always lack the imagination of what could happen, and every time we're caught off guard by the creativity of malicious people. Sometimes a government is to blame, but eventually it's just us. Again, security is a process and a never-ending game of arms race. When you stop playing, they'll get the best of you.

(Disclaimer: this is for the sake of argument. I'm actually a laid-back person and against government surveillance and stuff.)


I would argue that usually it doesn't matter, for the reasons you mentioned, but in this particular case it really did. The US government, and the NSA in particular, made it a matter of explicit policy to delay and discourage commercial crypto research and development in the US. Offhand, I'd guess this delayed crypto by 10 years. Imagine if, 10 years ago, we had had today's understanding of crypto. I'd wager TLS would look better.

I’ve been reading Crypto by Steven Levy, which is on exactly this topic, and I’m astounded by how actively the NSA discouraged commercial research, and how early on it happened: the mid-1970s.


Among the greatest heroes we have to thank are Diffie and Hellman, who put up a fight when the US government went after their research on public-key cryptography.

But I suppose the reason the NSA was delaying things was that they needed to develop workarounds for the encryption. Remember: public-key crypto only got strong in 1996, when the export key-length limits started to fall, and Tailored Access Operations (the NSA's hacking team) was created in 1998, before DES was replaced with AES in 2001.

Now obviously it's not that simple. Phasing out weak standards has taken a very long time. Personally, I would love to understand what goes on in the heads of developers who still use age-old primitives like MD5 and RSA-1280 (iMessage).


> I suppose the reason NSA was delaying was they had to develop a workaround for the encryption

No. Their primary purpose was never to protect anyone but their own operations (the concept "Nobody but us" is older and broader than Wikipedia currently records: https://en.wikipedia.org/wiki/NOBUS ), and especially not "common citizens". The NSA is mainly a military institution. Had they been able to get by with nobody but themselves being allowed to use crypto, they would have done so, and would still be doing it.

Bonus: this is directly from the NSA:

https://www.nsa.gov/resources/everyone/digital-media-center/...

Covered here:

https://www.dailydot.com/layer8/cryptokids-nsa-foia/


To help other possible readers: it's "Crypto: How the Code Rebels Beat the Government -- Saving Privacy in the Digital Age" by Steven Levy, from 2001.


Strong recommendation to read this book; it's a pretty approachable introduction to modern cryptography and the people we have to thank for where we are today.


> Sometimes a government is to blame, but eventually it's just us.

What do you mean? The tendency for bad guys to exploit bugs is really important. But the fact that a government deliberately broke security for everyone is an additional thing, not a special case.

And it's worth noting that this is a failure mode we can expect from governments. The US had a realistic concern that opposing nations would use strong crypto against it. The trouble was that it put that concern above all the other consequences that followed from its laws; the policy makers probably couldn't even imagine most of those consequences.

It is in the nature of legislation that it can amplify whatever particular concern captures the political imagination without having to consider the broader picture.


"The choice for all of us working on that software was stark: we could either distribute the product of our work, or enter a legal morass, and getting it wrong could end up in court"

Is this not simply an economically expedient choice, putting the security and privacy of users below product distribution? How is this choice really different from any tradeoff a software company makes about security today?


That's an unfair characterisation.

There's a world of difference between choosing not to implement security features in order to ship faster, and keeping security features out because with them you are not allowed to ship at all.

The first is garden-variety negligence. The second is politically mandated malfeasance.


It is the difference between risking "losing some money and time" and risking "losing a huge amount of money and possibly ending up in jail".

To use an analogy, there is a difference between risking a dollar on a bet and risking five thousand dollars plus a broken collarbone.


I find this hard to believe. I can certainly believe that American crypto laws resulted in a lot of unencrypted protocols, but there’s more to security than just crypto. What about things like rlogin? A lot of older stuff (and newer stuff, for that matter) assumes that the other side is trustworthy, which is a separate concern from encryption.


What about rlogin? Isn't the major security vulnerability with it the lack of encryption?


One of rlogin's authentication methods is to have a list of hosts and usernames that are allowed to log in. If you're on the list, you're in. How does it know you're actually using that username? Well, it asks your host, which would definitely never lie.
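A minimal sketch of that trust model, in hypothetical Python (a simplification for illustration, not actual rlogind logic): the server checks only that the peer is a listed host, then believes whatever username that host asserts.

```python
# rlogin-style "trusted hosts" authentication, toy version.
TRUSTED_HOSTS = {"alice-workstation.example.edu"}  # hosts.equiv-style list

def rlogin_auth(source_host, asserted_user, local_users):
    # Step 1: is the peer on the trusted-hosts list?
    if source_host not in TRUSTED_HOSTS:
        return False
    # Step 2: believe the peer's claim about who is logged in there.
    # Nothing verifies asserted_user -- a compromised (or address-spoofed)
    # trusted host can assert any name it likes, including "root".
    return asserted_user in local_users

users = {"alice", "bob", "root"}
print(rlogin_auth("alice-workstation.example.edu", "alice", users))  # True
print(rlogin_auth("alice-workstation.example.edu", "root", users))   # True: host lied
print(rlogin_auth("evil.example.com", "alice", users))               # False
```

Note that encryption alone wouldn't fix this: the flaw is trusting the client's assertion, which is orthogonal to whether the connection is encrypted.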


This makes me recall a website that claimed the NIST P-256 ECC curve is unsafe in some respects.


Can someone explain the US laws of export control around cryptography in layman's term?


Best I recall, from back when the PGP thing was going down, anything above 40-bit keys (or some such) was basically considered the equivalent of a military weapon.

So if you wanted to offer it to anyone outside of USA, you were treated as if you were trying to deal in tanks or fighter jets.



Paranoia


So uh, why did you design X in such a manner that any client could sniff any other client's events and windows by default, and only later add a (quite inadequate) SECURITY extension?

This is what we mean when we say that the security model of X is obsolete, and an afterthought besides. The threat model was completely different back then: griefers, trolls, thieves, and state actors didn't have a pipe straight into your X session through the browser, and for the most part X was used to talk to trusted programs on trusted hosts.

Wayland, by contrast, has a security model for the modern, hostile internet built in from the start.


> Wayland, by contrast, has a security model for the modern, hostile internet built in from the start.

And yet basic video and screen capture hasn't worked for years now. Arbitrary-rectangle capture still doesn't work on Ubuntu 16.04 in any tool I know of.

So they made it so secure that basic features don't work.


I wonder if there will be a successor to Wayland and X11 that learns from the mistakes of both?

You can't always make wise improvements without first making awful mistakes that weren't properly considered in practice.


> I wonder if there will be a successor to Wayland and X11 that learns from the mistakes of both?

Ubuntu wanted to do this with Mir. The rest is history...


A pity that the Mir devs didn't really have any experience or deep knowledge of the Linux graphics stack...

It took the X11 devs, many of whom were also involved in Mesa and Linux graphics development, literally years just to get all of the necessary plumbing in place in order to replace what X11 was previously doing.

These developments help not only X11 and Wayland; they also make developing a newer protocol, superior to both, much easier, because the hard yards have already been done.

It also makes it easier for those who might want to transition from Wayland to a hypothetical superior protocol.


> And yet basic video and screen capture is not working for years now. Arbitrary rectangle capture still doesn't work on Ubuntu 16.04 in any tool I know of.

I'm curious, what tools have you tried? I ask because the default screenshot tool works perfectly for stills (including stills of video playing in a window or full-screen), and the few video screen recording tools I've tried all worked perfectly. At the moment I'm using Kazam on 16.04 with the default Nouveau drivers, and a quick test with Chromium playing a video stream + Totem simultaneously playing a local video file confirms that everything is captured no problem. Full screen, arbitrary windows, all tested and working.

Tl;dr try Kazam, unless I'm misunderstanding the issue you describe?


> any client could sniff any other client's events and windows

That's a feature that I have used many times in the past. Isolation should be handled at the X server level, where it can be done without excessive complexity and without breaking backward compatibility.

> default

Locked-down defaults with features enabled opt-in are good design. (Principle of Least Privilege)

> Wayland

Isn't compatible as it's missing required features (by design).


BTW, I think SECURITY was meant to be a first step towards a more secure X. But nobody enabled it because it broke some big-name programs (Firefox being one, supposedly)...


Because all programs were running under the same user account, there was no point trying to build a security boundary between them in the X server.

Wanting to run mutually distrustful sandboxed apps side by side was not a popular use case back then.


Actually, that's not the case. Back then, it was much more commonplace to have a workstation and run local programs side by side with programs running on remote hardware. The network transparency of the X protocol was an advantage -- again, as long as you were running trusted programs on trusted remote hosts.

These days network transparency is a) irrelevant for most use cases and b) much better implemented with newer protocols like RDP and PCoIP. That's why it was removed from Wayland.



