I know "rewrite it in Rust" has essentially become a meme at this point, but there really aren't many great reasons to write new software in languages that aren't memory safe. The only compelling ones I can think of off the top of my head are interop with existing software, and that C and C++ developers would need a bit of training to move to safer languages. There may also be a few problem domains that demand the maximum performance and lowest resource utilization possible, but typical software would have more than acceptable performance in a memory safe language.
>The only compelling reasons I can think of off the top of my head are interop with existing software and that C and C++ developers would require a bit of training to move to safer languages.
Decent C and C++ developers shouldn't take that long to grasp the borrow checker. Those who can't shouldn't be trusted writing C or C++ in the first place, because they don't have a good enough mental model of memory management to handle it manually.
Interop with C is fairly straightforward. Python too, via PyO3. C++ has some support.
> There may also be a few problem domains that demand the maximum performance and lowest resource utilization possible
You can approach that with Rust as well. Often you can make performance improvements better and more safely than in C++, because the borrow checker helps you stay safe. Copy-on-write (CoW) in particular is a godsend when processing data.
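To illustrate the CoW point, here is a minimal sketch using the standard library's `std::borrow::Cow`; the `strip_spaces` function is made up for the example, not taken from any real codebase:

```rust
use std::borrow::Cow;

// Return the input unchanged (borrowed, zero-copy) when no work is needed,
// and only allocate a new String when we actually have to modify it.
fn strip_spaces(input: &str) -> Cow<'_, str> {
    if input.contains(' ') {
        Cow::Owned(input.replace(' ', "")) // clone-and-modify path
    } else {
        Cow::Borrowed(input) // zero-copy path
    }
}

fn main() {
    // No spaces: no allocation happens at all.
    assert!(matches!(strip_spaces("nospaces"), Cow::Borrowed(_)));
    // Spaces present: an owned, modified copy is produced.
    assert_eq!(strip_spaces("a b c"), "abc");
    println!("ok");
}
```

The point is that the type system tracks whether you own the data or merely borrow it, so the "copy only when mutated" optimization is safe by construction rather than by convention.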
> Decent C and C++ developers shouldn't take that long to grasp the borrow checker. Those who can't shouldn't be trusted writing C or C++ in the first place, because they don't have a good enough mental model of memory management to handle it manually.
In fairness to C and C++ developers: while often the problem is not understanding memory management, sometimes the problem is understanding it and having a model that doesn't mesh with borrow checking, or having a problem that needs some additional work to map into something easy to write in safe code. For instance, it takes some additional knowledge in Rust to know that if you want to build a graph, you 1) should almost always use an established graph library and not write your own, and 2) if you do need to write your own graph, you either need to use Rc, have an array of nodes and use indices, or use unsafe and raw pointers and provide a nice safe well-tested wrapper.
(I'm one of the developers of Rust, and I think it's important to characterize languages fairly.)
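The "array of nodes and indices" approach mentioned above can be sketched like this; the `Graph` type and its methods are illustrative only, not from any particular crate:

```rust
// A graph represented as a flat Vec of nodes plus adjacency lists of indices.
// Using usize indices instead of references sidesteps the borrow checker
// issues that cyclic reference-based graphs run into.
struct Graph {
    nodes: Vec<String>,     // node payloads
    edges: Vec<Vec<usize>>, // edges[i] = indices of nodes reachable from node i
}

impl Graph {
    fn new() -> Self {
        Graph { nodes: Vec::new(), edges: Vec::new() }
    }

    // Returns the new node's index, which acts as a cheap, copyable handle.
    fn add_node(&mut self, label: &str) -> usize {
        self.nodes.push(label.to_string());
        self.edges.push(Vec::new());
        self.nodes.len() - 1
    }

    fn add_edge(&mut self, from: usize, to: usize) {
        self.edges[from].push(to);
    }

    fn neighbors(&self, node: usize) -> &[usize] {
        &self.edges[node]
    }
}

fn main() {
    let mut g = Graph::new();
    let a = g.add_node("a");
    let b = g.add_node("b");
    g.add_edge(a, b); // cycles (b -> a) would be just as easy: indices, not references
    println!("{:?}", g.neighbors(a)); // prints "[1]"
}
```

The trade-off is that indices can dangle logically (point at a removed node) even though they can never dangle in the memory-unsafety sense, which is why established crates like petgraph are usually the better choice.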
> Productivity
> Readability
> Performance
> Static binaries
> Not needing PhD in rust, especially async rust
Rust works fine as a high-level language, but it's absolutely awful for anything else and suffers from C++-style feature creep. It would be a much better language if Rust weren't afraid of making `unsafe` a useful tool rather than something "everyone except the core devs should absolutely avoid". Ergonomics would improve tenfold if you could use it to control the borrow checker more directly, instead of having to build mega-abstractions (which often aren't even zero-cost) or jump through hoops for every single thing.
I feel like people on an esteemed forum like Hacker News should be able to move past this kind of beginner-level argument.
The point of memory safe languages is not to not have unsafe code. It is to centralize unsafe code in a few places that can be thoroughly audited, tested, and verified, and then provide safe abstractions on top of that.
No one is saying that you can't use unsafe languages or features when you need to. This press release is announcing the release of a report from the White House Office of the National Cyber Director (ONCD) and it is simply a recommendation, not law or regulatory guidance.
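The "centralize unsafe behind a safe API" idea looks roughly like this; the function below is a simplified, illustrative cousin of the standard library's `split_at_mut`, not production code:

```rust
// Split a mutable slice into its first element and the rest, with both
// halves mutably borrowed at once. Safe Rust alone rejects this because it
// can't see that the two borrows are disjoint, so one small, auditable
// unsafe block does the work and the safe signature hides it.
fn first_rest(slice: &mut [u8]) -> Option<(&mut u8, &mut [u8])> {
    if slice.is_empty() {
        return None;
    }
    let ptr = slice.as_mut_ptr();
    let len = slice.len();
    // SAFETY: the element at index 0 and the range [1, len) never overlap,
    // and both stay within the bounds of the original slice.
    unsafe {
        Some((&mut *ptr, std::slice::from_raw_parts_mut(ptr.add(1), len - 1)))
    }
}

fn main() {
    let mut data = [1u8, 2, 3];
    let (first, rest) = first_rest(&mut data).unwrap();
    *first = 9;
    rest[0] = 8;
    println!("{:?}", data); // prints "[9, 8, 3]"
}
```

Callers only ever see the safe signature; the unsafe reasoning lives in one place with a written justification, which is exactly what "audited, tested, and verified" means in practice.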
How about the sheer cost of the rewrites? Probably hundreds of billions of dollars worth of rewriting and retesting, if not more. Plus the costs of retraining.
Memory safety is very weak in Rust, IMO. Memory errors are only one kind of error, and users of other languages have invented many brilliant ways to avoid and fix issues. It would be better to focus on detecting and preventing errors than on rewriting.
These guidelines do not speak about re-writes, they talk about new code, techniques to mitigate memory unsafety in existing code, and preferring to use memory safe tools where possible, for example, when you are purchasing software.
Within another couple years, automated systems that are descendants of today's rapidly-advancing coding-assistant LLMs will be able to do amazing things.
For example, it's plausible they'd be able to create proven-memory-safe rewrites of older software, with verified drop-in compatibility for all enumerable use cases, practically for free.
Well, unless it's a parasitic government contractor doing the work.
Meanwhile, most people haven't even heard of Capability Based Security.[1] 8(
It's sad to see so much effort wasted, while ignoring the root cause, ambient authority.
Given the sad state of operating systems these days, you're forced to absolutely trust any code you run, which is nuts.
Anything you run can open a port to a server somewhere and exfiltrate all your data, encrypt it, or whatever evil that the programmer wants, and there's no practical way to stop it.
---
Imagine if the power grid were built without any fuses or circuit breakers... that's what our internet and computers are like at present, in terms of security.
In that world, it would be like the White House demanding purer copper in the wiring.
Capability systems are a great thing, and they solve another class of security problems. Use memory safe languages, and move towards capability systems.
Capability security doesn't prevent data leakage and lateral movement, especially in today's age of "identity based security". A compromised application would most likely still be running with a privileged identity, giving it access to databases etc.
There is no such thing as "privileged identity" in a capabilities based operating system. When you open a document, the application ONLY gets a handle to that document, and nothing else. It certainly can't just then open a pipe to a server somewhere and dump all your documents to it, unlike any random program these days.
It really is completely absurd that this hasn't been solved yet. Almost makes you think it is on purpose.
Why the hell doesn't Windows implement permissions like iOS already? They seemed to be moving in the right direction last year, with the announcement of Win32 App Isolation, and then nothing happened.
I mean, when this was introduced in macOS Catalina, literally everyone was griping about the popups and menus and how "locked down" everything is. Microsoft probably doesn't want another Vista situation, and who can blame them lol
What I'm most curious about is how this will affect new program language development, and programming languages currently in development. Like if Zig wants to see mainstream adoption, will it have to implement the Borrow Checker, or something similar, if people want to use Zig in government contexts? Trying to create a new programming language that people will invest in is hard enough, and if it isn't memory safe, it will be even less attractive to people.
On top of that, trying to start a software business is tough, and if you'll be shunned because your stack isn't memory safe, why put yourself at a disadvantage?
In 1975, the DoD commissioned Ada to be designed and built. Although the use of Ada wasn't mandated until 1991, between 1983 and 1996 the number of high level programming languages in use at the DoD fell from 450 to 37. For a few years in the mid-80's, Ada was the most popular programming language, even temporarily surpassing C and C++. The mandate was removed in 1997.
Ada was required for NATO systems and was mandated as the preferred language for defense-related software in Sweden, Germany, and Canada.
Today it is used in projects where a bug can have severe consequences such as avionics, air-traffic control, and commercial rockets such as the Ariane 4 and 5, satellites and other space systems, railway transport, and banking.
This doesn’t go into any detail, not even saying what counts as a “memory safe language.” Are there any practical implications? Will the government change anything it does?
This document also does not provide a concrete definition of "memory safe language," instead only providing examples of such, but there are two things that are interesting about it: first of all, it provides C and C++ as examples of "memory unsafe languages," explicitly.
The second is more towards your "practical implications" question. It says this:
> With this guidance, the authoring agencies urge senior executives at every software manufacturer to reduce customer risk by prioritizing design and development practices that implement MSLs. Additionally, the agencies urge software manufacturers to create and publish memory safe roadmaps that detail how they will eliminate memory safety vulnerabilities in their products. By publishing memory safe roadmaps, manufacturers will signal to customers that they are taking ownership of security outcomes, embracing radical transparency, and taking a top-down approach to developing secure products—key Secure by Design tenets.
The National Defense Authorization Act for Fiscal Year 2024 had language inside of it that said:
> SEC. 1713. POLICY AND GUIDANCE ON MEMORY-SAFE SOFTWARE PROGRAMMING.
>
> (a) POLICY AND GUIDANCE.—Not later than 270 days after the date of the enactment of this Act, the Secretary of Defense shall develop a Department of Defense wide policy and guidance in the form of a directive memorandum to implement the recommendations of the National Security Agency contained in the Software Memory Safety Cybersecurity Information Sheet published by the Agency in November, 2022, regarding memory-safe software programming languages and testing to identify memory-related vulnerabilities in software developed, acquired by, and used by the Department of Defense.
This is referring to the above. However, the final bill text seems to be missing this, and I haven't tracked down yet how that happened.
The sentiment seems to feel like this is something akin to a softer version of the Ada Mandate: that being implemented in a memory safe language is a competitive advantage if you want to sell to the DoD, because using memory unsafe languages will require documentation explaining how you're mitigating the issues they have. Time will tell if that actually comes to pass.
Yeah, I think they're trying to say "Stop writing in C and C++, blockheads!" but in a more diplomatic tone. Most of the common languages today are memory-safe. Really, which ones aren't? C and C++ are the big ones. I guess throw Pascal in there if you don't use pointers correctly. Assembly lets you make indirect accesses anywhere in your allocated memory. Perl lets you leak memory if you create circular structures and then lose references to them. But Java, JavaScript, Go, Python, Ruby, etc. don't let you trample all over memory the way C and C++ let you. You can corrupt memory if you really want to, e.g. in Python by using ctypes to cast integers to pointers, but it takes a lot of effort.
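For what it's worth, the ctypes escape hatch mentioned above looks like this. The snippet only reads CPython internals rather than corrupting them, and it relies on the CPython-specific facts that `id()` returns an object's address and that the reference count is the first field of every object header:

```python
import ctypes

# Deliberately misusing ctypes to poke at CPython internals -- exactly the
# kind of thing a memory safe language makes you go out of your way to do.
# Writing through pointers obtained this way can corrupt the interpreter.
x = 12345
addr = id(x)  # CPython detail: id() is the object's memory address
# Read the object's reference count directly out of its header.
refcount = ctypes.c_ssize_t.from_address(addr).value
print(refcount > 0)  # prints "True"
```

Note the effort involved: you have to import a foreign-function library and lean on implementation details to get at raw memory, whereas in C or C++ one stray pointer arithmetic bug does it by accident.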
These documents tend to be vague because details could be wrong. They're used by the next layer of bureaucracy to justify programs, and by the next layer to justify plans, and by the next layer to justify designs, and the next layer to justify implementations.
I know you're joking, but the government had asked for comments on some of this work previously, and part of the C++ committee did in fact respond.
I read all 200+ responses and was planning to write up a post about them, because it is interesting, but I've had other things going on this month, and they actually shipped this before I managed to do so. Oh well, maybe I'll still end up doing it.
Anyway, there's a lot of junk in there, but out of major organizations, the vast majority were pro, a few were ambivalent, and the committee's response stuck out as one of the only strongly anti responses. (It was also weird for other reasons that turned out to be a benign misunderstanding.)
> was planning on writing up a post about them, because it is interesting
It is very interesting, and I hope that you still do a write up.
> It was also weird for other reasons that turned out to be a benign misunderstanding.
I would love to know more about this part, maybe just to understand a bit more about how these bureaucratic things work, which while less technical are also interesting.
https://zackoverflow.dev/writing/unsafe-rust-vs-zig/
> not be allowed to use any rust that uses unsafe()
Not allowed by whom? What's made you assume your computer will be policed for non-memory-safe code?
[1] https://en.wikipedia.org/wiki/Capability-based_security
Also, the whole Vista UAC thing was permission flags, not proper capability based security.
STRONG AGREE with the suspicion that this hasn't been solved, on purpose.
The DoD, NSA, and CISA say memory safe languages include[0]:
* C#
* Go
* Java
* Python
* Rust
* Swift
Presumably any other language that can run on a virtual machine, like WASM, is also to be included! Exciting days ahead :)
[0] https://media.defense.gov/2023/Dec/06/2003352724/-1/-1/0/THE...
C can be compiled to WASM, and it's not a memory safe language, so this can't be true...
https://www.whitehouse.gov/wp-content/uploads/2024/02/Final-...
I for one am very happy to see that this government has the ability to focus on such important and impactful technical details.
https://www.federalregister.gov/documents/2023/08/10/2023-17...
https://www.regulations.gov/comment/ONCD-2023-0002-0020