Asinine victim-blaming. "If you were good at C, it wouldn't have memory safety problems."
OpenSSH, a program written by one of the most security-conscious groups of developers out there (OpenBSD) and used widely by pretty much every technology company, including all of the top 8 companies in the world, has had multiple memory safety problems over its lifetime, /none/ of which would have happened in Rust, Zig, or any of their ilk.
Yes, portability in LLVM-based languages is a problem for sure. Even an LLVM/GCC monoculture isn't particularly great. But saying "nah, you're just doing it wrong"? No.
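To make the class of bug being discussed concrete, here is a small sketch (hypothetical code, not taken from OpenSSH or any real project) of why this kind of problem tends to disappear in Rust: an out-of-bounds read that would be silent undefined behaviour in C becomes either an explicit `Option` or a deterministic panic.

```rust
fn main() {
    let buf = [1u8, 2, 3, 4];
    // Pretend this index came from untrusted input, e.g. an off-by-one
    // length calculation; parsing it at runtime keeps the example honest.
    let idx: usize = "4".parse().unwrap(); // one past the end of `buf`

    // The checked accessor forces the caller to handle the error explicitly...
    match buf.get(idx) {
        Some(b) => println!("byte: {b}"),
        None => println!("index {idx} is out of bounds"),
    }

    // ...and even plain indexing is still bounds-checked at runtime: this
    // panics deterministically instead of reading whatever happens to sit
    // after `buf` in memory, as the equivalent C indexing would.
    let b = buf[idx];
    println!("never reached: {b}");
}
```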
As a Gentoo user and an enthusiastic non-x86 user, when I was just reading the opening, I expected to hear an argument about the necessity of supporting uncommon architectures in a standard library. Gentoo supports a broad range of systems, and every time an underlying library switches to a less-supported environment it creates a headache for Gentoo developers and users; "vintage architectures need some love" would be an interesting article (not saying it's practical, but interesting). Unfortunately, the author's main argument is a denial of C's memory safety problems. I was disappointed.
Then there is qmail. Totally written in C. No bugs, no security flaws.
But the author is way brighter than I am. I spent my youth programming real-time, interrupt-rich medical diagnostic software in low-level assembly language, which is harder to get right than C. We had very few bugs.
I would be reluctant to do this in C, and probably less so in Rust (which I have yet to master).
I appreciate what the Rust community is trying to accomplish, and suspect it will happen.
However, there is a sense of overselling this idea of memory safety. Keep in mind that SolarWinds and Equifax were not memory safety issues. Also, someone wrote a Rust version of Heartbleed, so it would be good to keep some perspective.
And it isn't like I haven't been in the position of rabid promoter of programming demagoguery myself. First was the unnecessarily limited grammar of Fortran II. I discovered XPL and promoted it beyond reason. Then there was the GOTO demagoguery, of which I was one of the loudest voices. And along with that came Structured Programming, then Object-Oriented Programming.
So as you promote Rust, don't sell the idea that it will prevent every security nightmare out there--it will massively help with some very important ones. Additional work beyond programming language is necessary to improve security.
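The "Rust version of Heartbleed" point above is worth illustrating, because it shows the limits of what memory safety buys you. A minimal sketch (entirely hypothetical code, not from any real TLS implementation): trusting an attacker-supplied length against a reused buffer leaks stale data in perfectly safe Rust, because it is a logic bug rather than a memory safety bug.

```rust
// A reused I/O buffer, as a connection handler might keep between requests.
struct Connection {
    buf: [u8; 64],
}

impl Connection {
    fn new() -> Self {
        Connection { buf: [0; 64] }
    }

    // Handle a "heartbeat": copy the payload in, echo back `claimed_len` bytes.
    // Trusting `claimed_len` instead of `payload.len()` is the bug.
    fn heartbeat(&mut self, payload: &[u8], claimed_len: usize) -> Vec<u8> {
        self.buf[..payload.len()].copy_from_slice(payload);
        // Entirely in-bounds, so safe Rust allows it, but it echoes back
        // whatever stale data is still sitting in the rest of the buffer.
        self.buf[..claimed_len.min(self.buf.len())].to_vec()
    }
}

fn main() {
    let mut conn = Connection::new();
    // A previous request leaves a "secret" in the buffer.
    conn.heartbeat(b"hunter2-super-secret-password", 29);
    // The attacker sends 3 bytes but claims 64.
    let leaked = conn.heartbeat(b"hi!", 64);
    // The response contains the leftover secret: a logic bug, not a memory
    // safety bug, so the borrow checker has nothing to say about it.
    println!("{}", String::from_utf8_lossy(&leaked));
}
```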
When companies like Microsoft and Google publish security reports, they state that 70% of security exploits were caused by memory corruption bugs; they don't say 100%.
Naturally, memory-safe systems programming languages still suffer from the remaining 30% of root causes for security exploits.
It isn't perfect, but it is much better, and the reason we still aren't there is that the bean counters haven't been paying attention to how much money it costs to fix those bugs, plus the lack of liability.
It retreads the same old, tired argument that Rust developers have been disproving for so long, i.e. "just write good C code", or in this case, "just battle-test it".
Cryptography is not far off from bc. It is deterministic and should be easily testable.
My bc is battle-hardened because I did my due diligence. The authors of the Cryptography library are writing _crypto_, which requires such due diligence. Did they?
What I should add to the post is that if they were not planning on doing that due diligence from the start, they should have written it in Rust from the beginning and not made promises to users that they couldn't keep.
What they did wrong was making promises to users and then breaking those promises.
I also tried to make the case that they either did not do their due diligence, which they should have done because they used C in the first place, or they did and should not have given up that battle-tested code.
I even mentioned that they were victims of the Rust language not keeping its promises. So I disagree that I am blaming the victim here.
Also, if they did not want to do the due diligence required by using C, they should have written Cryptography in Rust from the beginning and not made promises to users that they couldn't keep.
Rust was not a viable option when pyca/cryptography was first released. You should be more familiar with projects before you invent strange promises you feel they've made to you.
> First, their argument for Rust (and against C) because of memory safety implies that they have not done due diligence in finding and fixing such bugs.
No, it implies that finding such bugs is exponentially harder in C than it is in other languages. Eventually you reach so far up the diminishing returns curve that you simply are able to accomplish more in one language with your time on Earth than you can in another. This is a numbers game, and C loses out to safer competitors when it comes to bugs.
The due diligence that these developers are doing is a meta-analysis that has gone over your head.
> And with my bc, I did my due diligence with memory safety. I fuzzed my bc and eliminated all of the bugs.
That's a pretty absolute claim. And one I doubt.
I'll save my trust for people humble enough to understand the limitations of technologies and themselves.
Which makes it all the worse, because he's using that as "proof" that he has done more due diligence than these other devs, and suggesting that they could achieve just-as-good-or-better results by emulating him (which is provably false given the nature of C vs Rust when it comes to bugs).
For a calculator it may be no huge deal to have such a cavalier attitude. But for cryptography, bugs are considerably more important as the threat model isn't even in the same league.
I understand why people are angry and that portability is a worthwhile goal in many scenarios. I also understand that Rust is far from the ubiquity that C has achieved. I also understand that people feel betrayed by a library they depended on drastically altering its dependency structure.
On the other hand, there's a group of developers with limited resources and perceived moral high ground in terms of security that fits right in with the trends that I've observed in the cryptography community.
Both sides have valid arguments. Which side of this split you fall on depends on your personal values and immediate needs.
Ultimately, the whole debate is fruitless bickering because you cannot reconcile differences in values. This doesn't just apply to the Python "cryptography" library, but also other places where a switch to Rust was enforced by upstream.
The alternative is to define "closer" portability (iOS, Android, Windows, macOS, desktop/server Linux) and "wider" portability (BSDs, illumos, QNX, custom embedded OSes, architectures that aren't x86_64 and ARM). Then make it clear from the get-go which one of these you're aiming for.
There is a happy middle ground[1]. Someone could build a Rust-to-C compiler, and then release packages could be bundled with the pre-generated C sources.
[1] of course the question of who is responsible for building such a compiler potentially becomes a new value-based conflict.
The author correctly, IMO, points out that this is not possible currently because there's no Rust Specification, so the exact behaviour of such a compiler cannot even be described.
I was surprised by how informal the Rust language semantics are when I was trying to find out how, exactly, the `?` operator works. There's no official description anywhere except for an RFC which was created before the feature was introduced, but from which the feature has diverged significantly since it was added to the language!
The Rust Book, normally assumed to be the "true" description of the language, is quite incomplete in places, as it's not a formal specification. For example, the section about `?` does not mention that different error types will be auto-converted if there's a `From` implementation between them. This is quite important information to be missing from the documentation: just from reading that page, you would be writing really verbose type conversions yourself, everywhere.
But the Rust Book does mention this elsewhere[2] (if you read the first link carefully, you can see that it says `?` is "more or less" equivalent to `try!`, and by reading the docs for `try!` you could actually find out what it does, and hope they didn't change anything when implementing `?` - but that's quite a lot of assumptions). You just need to read the whole book, or at least be lucky enough to look in all the right places, to make sure you understand precisely how a language feature works.
[1] https://doc.rust-lang.org/edition-guide/rust-2018/error-hand...
[2] https://doc.rust-lang.org/book/ch09-02-recoverable-errors-wi...
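For anyone trying to pin down the behaviour described above, here is a minimal sketch of what `?` does in practice: it returns early on `Err`, passing the error value through `From::from`, so differing error types convert automatically whenever a `From` implementation exists. The `ConfigError` type and `parse_port` function are made up for illustration.

```rust
use std::fmt;
use std::num::ParseIntError;

// A hypothetical application-level error type.
#[derive(Debug)]
struct ConfigError(String);

impl fmt::Display for ConfigError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "config error: {}", self.0)
    }
}

// This `From` impl is what lets `?` convert a ParseIntError
// into a ConfigError automatically.
impl From<ParseIntError> for ConfigError {
    fn from(e: ParseIntError) -> Self {
        ConfigError(e.to_string())
    }
}

fn parse_port(s: &str) -> Result<u16, ConfigError> {
    // `s.parse()` returns Result<u16, ParseIntError>. Roughly, `?` desugars to:
    //   match s.parse() {
    //       Ok(v) => v,
    //       Err(e) => return Err(From::from(e)),
    //   }
    let port: u16 = s.parse()?;
    Ok(port)
}

fn main() {
    assert_eq!(parse_port("8080").unwrap(), 8080);
    assert!(parse_port("not a number").is_err());
}
```

Remove the `From` impl and the same `?` line no longer compiles, which is exactly the manual conversion boilerplate the Book page doesn't warn you about.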
> Right now, as far as I know, there is no official language spec for Zig, like there is for Go. This means that this new compiler may not match what the official compiler does.
They’re actively working on a spec for Zig, by the way. Here’s Andrew talking about it at the start of his roadmap video: https://youtu.be/pacsngNYXI0
This is just a longer version of every angry comment on the GitHub issue thread it's referring to, differing only in that this person doesn't seem to be a maintainer of any project intersecting with `cryptography`. If you're interested in this debate, such as it is, just read the GitHub thread.
Portability would for sure be a great asset for Rust, but it's a new language compared to C. I also agree that having a clear spec is a must. But the author got lost: none of these languages has found a clear sweet spot yet, which is what would help them define a clear spec.
If you read deeper into the post, I do, in fact, talk about how supporting those less well-known architectures is important.
Edit: Also, those arguments were already made plenty of times by others, so I didn't feel a need to rehash them much.
The fact that the post is flagged now somehow makes me feel that the Rust community can't stand anything other than praise and worship.
I flagged it because it's poorly written flamebait that doesn't add anything to the comments on the GitHub thread.
If you doubt my claims about my bc, I suggest you break it and post it here. Embarrass me.
Yes, I am throwing down the gauntlet because I am that confident in my work. Prove me wrong with actual data.
Probably meant all of the reported bugs, given the context.
Thank you for this. I will update the post with this information about the Zig spec.
https://support.apple.com/guide/security/memory-safe-iboot-i...
We won't get rid of C that easily, given the number of systems that depend on it and won't be rewritten, which is exactly why, besides Apple, companies like Google, Oracle, ARM, and Microsoft are leading efforts to turn modern computers into C machines, with hardware memory tagging to fix at the hardware level what WG14 won't fix in the language.
Can you give me a link to the Reddit thread?
https://old.reddit.com/r/rust/comments/luawx6/rust_zig_and_t...
I have added several links to the post regarding replies to what I said about Zig.
I have also clarified a few arguments.
And because there seems to be some disbelief at my claim of absolutely no bugs in my bc, I am going to throw down the gauntlet like I did in https://news.ycombinator.com/item?id=26293920.
I encourage everyone who thinks I am wrong to _prove_ me wrong. Find a memory safety bug in my bc.
Should be easy since C is so unsafe, right?