πŸ‘€dsr_πŸ•‘7yπŸ”Ό144πŸ—¨οΈ41

(Replying to PARENT post)

> We asked MIT whether we could incorporate Kerberos (and other encryption) into the X Window System. According to the advice at the time (and MIT’s lawyers were expert in export control, and later involved in PGP), if we had even incorporated strong crypto for authentication into our sources, this would have put the distribution under export control, and that that would have defeated X’s easy distribution.

Fascinating.

πŸ‘€collinmandersonπŸ•‘7yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

I often hear that (quoting the article) "Government export controls crippled Internet security and the design of Internet protocols from the very beginning"

Can anyone give me examples where a design flaw in a protocol results directly in poorer security, and how it could have been designed better?

Not that I doubt the claim, but I am not literate in this area.

πŸ‘€colorincorrectπŸ•‘7yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

"Often hear that the reason today’s Internet is not more secure is that the early designers failed to imagine that security could ever matter."

Related to this, you should definitely watch Moxie Marlinspike's (lead dev of Signal) talk where he recounts his discussion with Kipp Hickman, a developer of SSL: https://www.youtube.com/watch?v=UawS3_iuHoA#t=13m52s (until 16:33)

πŸ‘€mashedvikingsπŸ•‘7yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

This is why the OpenBSD Foundation is based in Canada.
πŸ‘€adynatosπŸ•‘7yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

Does this matter? We (not just IT people, everyone in the world) always lack the imagination to foresee what could happen, and every time we're caught off guard by the creativity of malicious people. Sometimes a government is to blame, but ultimately it's just us. Again, security is a process and a never-ending arms race. When you stop playing, they'll get the best of you.

(Disclaimer: this is for the sake of argument. I'm actually a laid-back person and against government surveillance and stuff.)

πŸ‘€euskeπŸ•‘7yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

"The choice for all of us working on that software was stark: we could either distribute the product of our work, or enter a legal morass, and getting it wrong could end up in court"

Is this not simply an economically expedient choice, putting the security and privacy of users below product distribution? How is this choice really different from any tradeoff a software company today makes about security?

πŸ‘€petermcneeleyπŸ•‘7yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

I find this hard to believe. I can certainly believe that American crypto laws resulted in a lot of unencrypted protocols, but there’s more to security than just crypto. What about things like rlogin? A lot of older stuff (and newer stuff, for that matter) assumes that the other side is trustworthy, which is a separate concern from encryption.
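The rlogin trust problem mentioned above was baked into its configuration format: a `~/.rhosts` file on the server grants a passwordless shell based purely on the hostname and username the client claims, both of which were trivially spoofable on early networks. A sketch (host and user names are hypothetical):

```
# ~/.rhosts on the server account — "hostname username" per line.
# A remote user named alice on workstation7 gets a shell with NO
# password; the server takes the client's word for both fields.
workstation7  alice
# '+' in the hostname field is a wildcard: trust this user from
# ANY host on the network.
+  bob
```

No cryptography is involved at all; the "authentication" is a reverse DNS lookup plus a username sent in cleartext, which is exactly the trusted-other-side assumption being criticized.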
πŸ‘€mikeashπŸ•‘7yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

This makes me recall a website which claims that the NIST P-256 ECC curve is unsafe in some respects.
πŸ‘€AsiasweatworkerπŸ•‘7yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

Can someone explain the US laws of export control around cryptography in layman's terms?
πŸ‘€manhntπŸ•‘7yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

So uh, why did you design X in such a manner that any client could sniff any other client's events and windows by default, and only later add a (quite inadequate) SECURITY extension?

This is what we mean when we say that the security model of X is obsolete, and an afterthought besides. The threat model was completely different back then: every griefer, troll, thief, and state actor didn't have a pipe straight into your X session through the browser, and for the most part X was used to talk to trusted programs on trusted hosts.

Wayland, by contrast, has a security model for the modern, hostile internet built in from the start.

πŸ‘€bitwizeπŸ•‘7yπŸ”Ό0πŸ—¨οΈ0