>Bergmann agreed with declaring the experiment over, worrying only that Rust still "doesn't work on architectures that nobody uses".
I love you Arnd.
More seriously, this will become an issue when someone starts the process of integrating Rust code into a core subsystem. I wonder whether this will lead to the kernel dropping support for some architectures, or to Rust doing the necessary work. Probably a bit of both.
There are two separate ongoing projects to make Rust compile via GCC: gccrs, a new GCC frontend (written in C++) that compiles Rust source directly, and rustc_codegen_gcc, which plugs GCC in as a code-generation backend for rustc.
The long-term solution is for either of those to mature to the point where there is Rust support everywhere that GCC supports.
I wonder how good an LLVM backend for these rare architectures would have to be to count as “good enough” for the kernel team. Obviously correctness should be non-negotiable, but how important is it that the generated code for, e.g., Alpha is performant for somebody’s hobby?
I suspect more the latter than anything. It could be that by the time Rust gets used in the kernel core, one or both of the GCC implementations would be functional enough to compile the kernel.
I'm curious though, if someone has an ancient/niche architecture, what's the benefit of wanting newer kernels to the point where it'd be a concern for development?
I presume that outside of devices and drivers, there's little to no new developments in those architectures. In which case, why don't the users/maintainers of those archs use a pre-6.1 kernel (IIRC when Rust was introduced) and backport what they need?
No one is doing any kind of serious computing on 30 year old CPUs. But the point of the hobby isn’t turning on the computer and doing nothing with it. The hobby is putting together all the pieces you need to turn it on, turning it on and then doing nothing with it.
There’s an asymmetry in what the retro computing enthusiasts are asking for and the amount of effort they’re willing to put in. This niche hobby benefits from the free labour of open source maintaining support for their old architectures. If the maintainers propose dropping support because of the cost of maintenance the hobbyists rarely step up. Instead they make it seem like the maintainers are the bad guys doing a reprehensible thing.
You propose they get their hands dirty and cherry pick changes from newer kernels. But they don’t want to put in effort like that. And they might just feel happier that they’re using the “real” latest kernel.
> I'm curious though, if someone has an ancient/niche architecture, what's the benefit of wanting newer kernels to the point where it'd be a concern for development?
Wanting bug fixes (including security fixes, because old machines can still be networked) and feature improvements, just like anyone else?
> I presume that outside of devices and drivers, there's little to no new developments in those architectures.
There's also core/shared features. I could very easily imagine somebody wanting, e.g., eBPF features to get more performance out of ancient hardware.
> In which case, why don't the users/maintainers of those archs use a pre-6.1 kernel (IIRC when Rust was introduced) and backport what they need?
Because backporting bits and pieces is both hard and especially hard to do reliably without creating more problems.
> To me the more salient questions are how long before (a) we get Rust in a core subsystem (thus making Rust truly _required_ instead of "optional unless you have hardware foo"), and (b) requiring Rust for _all_ new code.
Previously, the position was that C developers would not be forced to learn Rust.
And a few days ago a security vulnerability was found in the Rust Linux kernel code: https://news.ycombinator.com/item?id=46309536
Where did anyone promise that the Rust bits will never have security issues? That CVE was a fantastic demonstration of just how much better the situation is in Rust code and I don't think there's a realistic argument that the experiment has been anything other than successful.
> And a few days ago a security vulnerability was found in the Rust Linux kernel code.
was it a security vulnerability? I'm pretty sure it was "just" a crash. Though maybe someone smarter than me could have turned that into something more.
I have no dog in this race. I really like the idea of Rust drivers, but I can very much understand reticence about having Rust handle more core parts of the kernel, just because Rust's value seems to pay off way more in higher-level code where you have invariants to maintain across large code paths (meanwhile, writing a bunch of doubly-linked lists in unsafe Rust seems a bit like busywork, modulo the niceties Rust itself can give you).
> was it a security vulnerability? I'm pretty sure it was "just" a crash.
It's a race condition resulting in memory corruption. [1][2] That corruption is shown to result in a crash. I don't think the implication is that it can result only in crashes, but nothing beyond crashes is mentioned in the CVE.
Whether the ability of an attacker to crash a system counts as a vulnerability depends on your security model, I guess. In general a crash is not expected to happen, it stops other software from running, and it can be triggered by entities or software that should not have that level of control, so it's considered a vulnerability.
[1] https://www.cve.org/CVERecord/?id=CVE-2025-68260 [2] https://lore.kernel.org/linux-cve-announce/2025121614-CVE-20...
Agree. As a dev I've had to pivot a dozen times in my career. If you're a dev you should be able to learn a new language fairly quickly; all the core elements are the same, just the vocabulary is a little different. Since nobody has "fixed" C to avoid these bugs, and we're not going to go as far as putting Java or .NET in the kernel, I think Rust is probably the best, most pragmatic solution we have right now.
Eh, these are highly skilled individuals that just don't like learning a new thing, no matter how useful. That attitude is a problem to be managed, but these people deserve a certain amount of respect. Anyone saying "suck it" or similar should get a grip.
"You don't need to learn it or use it, we just want to do our own separate things with it over here"
.. some time later ..
"Oh yeah it's working good for us, we think it'd be useful to use it in these additional places, think about the benefits!"
.. some time later ..
"Now it's going to be core and required, either deal with it or get out"
They know they could never jump straight to the last step without revolt, so they shove their foot in the door with fake promises and smiles and then slowly over time force the door all the way open until they eventually get what they wanted from the beginning.
There are lots of vulnerabilities in the C code, too. Should we remove C, or do we accept the fact that no language is perfect, since they are all used by humans, and continue to make improvements and use tools that help us along as time goes forward?
Cherry-picking this one Rust vulnerability against the ~150 C vulnerabilities is such a weird take that I can't help but think people have some weird hatred of Rust.
Your post is curious, for the post I quoted basically argued for just that eventuality for all new code. Even as the new language introduces undefined behavior vulnerabilities.
The promises as stated previously and the goal as stated by that lwn.net post now are starkly different. And the poster did not even wait until the new language had proven its worth. And then a UB CVE shows up in code written in the new language.
What Linus wrote in the past (https://www.phoronix.com/news/Torvalds-On-Rust-Maintainers):
> So when you change the C interfaces, the Rust people will have to deal with the fallout, and will have to fix the Rust bindings. That's kind of the promise here: there's that "wall of protection" around C developers that don't want to deal with Rust issues in the promise that they don't have to deal with Rust.
That both you and that lwn.net poster write these things is extraordinarily strange.
I do not think it is weird. Every C bug was taken as clear evidence that we need to abandon C and switch to Rust. So the fact that there are also such bugs in Rust is - while obvious - also important to highlight. So it is not weird hatred against Rust, but hatred against bullshit. And considering that most of the code is C, your 150 C vulnerabilities is a meaningless number, so you still continue with this nonsense.
Once again, congrats to the R4L team! It's a big milestone and I'm looking forward to future developments!
There was a lot of interesting discussion on the previous post [0], but one thing I didn't see was much discussion about this bit:
> The DRM (graphics) subsystem has been an early adopter of the Rust language. It was still perhaps surprising, though, when Airlie (the DRM maintainer) said that the subsystem is only "about a year away" from disallowing new drivers written in C and requiring the use of Rust.
I was a bit surprised when I first read this. Is this meant to be read in a way that is more just a description of the state of Rust bindings (e.g., the DRM subsystem is about a year away from being able to require the use of Rust, but isn't actually planning on doing so), or it is describing actual plans (e.g., the DRM subsystem is about a year away from actually requiring the use of Rust)? I was originally more inclined to go for the former interpretation, but this other bit:
> With regard to adding core-kernel dependencies on Rust code, Airlie said that it shouldn't happen for another year or two.
Makes me think that perhaps the devs are actually considering the latter. Is anyone more in-the-know able to comment on this?
[0]: https://news.ycombinator.com/item?id=46213585
I'm not more "in the know" but it makes sense that new drivers could require it. New drivers, after all, are pretty much always written for newer platforms that Rust has support for. The main issue with enabling Rust (let alone requiring it) is that Linux still supports platforms which Rust does not.
I don't know. My line of thinking is that if the Linux devs are comfortable with using Rust in core Linux then they might also be comfortable requiring Rust for new drivers as well. No idea if that makes sense, though.
If you build a house out of inflammable bricks instead of magnesium ones you can at least rule out magnesium fires.
Now the question is: If we live in a world where magnesium fires are common, can we afford to not at least try building with the inflammable bricks?
I know this topic stokes emotions, but if you haven't tried Rust as someone with C/C++ experience, give it a go. You will come out wiser on the other side, even if you never use the language for anything.
Inflammable doesn’t mean not flammable. It means able to be inflamed. Language changes over time but this word is particularly problematic, so I’d avoid it to avoid confusion.
The learning seems to be the only legitimate issue that people have. But they avoid mentioning it because it sounds intellectually lazy.
"You don't need to learn it or use it, we just want to do our own separate things with it over here"
.. some time later ..
"Oh yeah it's working good for us, we think it'd be useful to use it in these additional places, think about the benefits!"
.. some time later ..
"Now it's going to be core and required, either deal with it or get out"
They know they could never jump straight to the last step without revolt, so they shove their foot in the door with fake promises and smiles and then slowly over time force the door all the way open until they eventually get what they wanted from the beginning.
The cherry picking for this one Rust vulnerability to the ~150 C vulnerabilities is such a weird take that I can't help but think people have some weird hatred of Rust.
Your post is curious, for the post I quoted basically argued for just that eventuality for all new code. Even as the new language introduces undefined behavior vulnerabilities.
The promises as stated previously, and the goal as stated by that lwn.net post now, are starkly different. And the poster did not even wait until the new language has proven its worth. And then a UB CVE comes by in the code in the new language.
There are dozens of Rust osdev projects. It's an open question whether any will become as relevant as Linux.