Shouldn't the compilers be on the bleeding edge of the standards? What is the downside of switching to the newest standard when it's properly supported?
It's the type of dogfooding they should be doing! It's one reason people care so much about self-hosted compilers: they're a demonstration of the maturity of the language/compiler.
There's a bootstrapping process that has to happen to compile the compiler, and moving up the language-standard chain requires that the compilers used to compile the compiler also move up the chain.
So you can never be perfectly bleeding edge, because that would keep you from being able to build your compiler with an older compiler that doesn't support those bleeding-edge features.
Imagine, for example, that you are Debian and you want to prep for the next stable version. It's reasonable that for the next release you'd bootstrap with the prior release's toolset. That gives you a stable starting point.
This is not the case. They are discussing the default value of `g++ -std=...`. That does not complicate bootstrapping as long as the C++ sources of GCC are compatible with older and newer versions of the C++ standard.
Aren't they talking about the C++ dialect the compiler expects without any further -std=... arguments? How does that affect the bootstrapping process? This https://gcc.gnu.org/codingconventions.html should define what C/C++ standard is acceptable within GCC itself.
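For what it's worth, a quick way to see which dialect a given g++ picks with no -std= flag at all is to print the standard `__cplusplus` macro (a minimal sketch; nothing compiler-specific assumed):

```cpp
// Prints the C++ standard in effect: 199711L, 201103L, 201402L, 201703L, ...
// Built without any -std= flag, it reports whatever the compiler's default is.
#include <iostream>

int main() {
    std::cout << __cplusplus << '\n';
    return 0;
}
```

Comparing the output of a plain `g++` build against one with an explicit `-std=` makes it obvious whether the default matters for your code.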
Counterpoint: you could write a C++ compiler in a non-C/C++ language such that the compiler’s implementation language doesn’t even have the notion of C++20.
A compiler is perfectly capable of compiling programs which use features that its own source does not.
> Shouldn't the compilers be on the bleeding edge of the standards? What is the downside of switching to the newest standard when it's properly supported?
> What is the downside of switching to the newest standard when it's properly supported?
Backwards compatibility. Not all legal old syntax is necessarily legal new syntax[1], so there is the possibility that perfectly valid C++11 code exists in the wild that won't build with a new gcc.
[1] The big one is obviously new keywords[2]. In older C++, it's legal to have a variable named "requires" or "consteval", and now it's not. Obviously these aren't huge problems, but compatibility is important for legacy code, and there is a lot of legacy C++.
[2] Something where C++ and C standards writers have diverged in philosophy. C++ makes breaking changes all the time, where C really doesn't (new keywords are added in an underscored namespace and you have to use new headers to expose them with the official syntax). You can build a 1978 K&R program with "cc" at the command line of a freshly installed Debian Unstable in 2025 and it works[3], which is pretty amazing.
[3] Well, as long as it worked on a VAX. PDP-11 code is obviously likely to break due to word size issues.
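To make footnote [1] concrete, here is a minimal sketch of the kind of code involved (a hypothetical snippet, not from any real project):

```cpp
// Perfectly valid C++11/C++14/C++17: "requires" and "consteval" are ordinary
// identifiers there. Both became keywords in C++20, so the same file stops
// compiling once the dialect moves to C++20 or later.
#include <iostream>

int main() {
    int requires = 1;   // OK up to C++17, hard error under -std=c++20
    int consteval = 2;  // likewise
    std::cout << requires + consteval << '\n';
    return 0;
}
```

With `g++ -std=c++17` this builds; with `g++ -std=c++20` (or a future default of C++20 or newer) it does not.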
Well, shouldn't the code that isn't up to date be the one using the corresponding compiler flag, rather than someone starting a greenfield project, who might otherwise end up writing outdated code?
Please don't spread misinformation. Breaking changes are actually almost nonexistent in C++. The last one was the COW std::string and std::list, ~15 years ago, with the major switch from C++03 to C++11. And even then GCC wouldn't let your code break, because it supported dual ABIs: you could mix C++03 and C++11 code and link them together.
So C++ actually tries really hard _not_ to break your code, and that is the philosophy behind a language adhering to something called backwards compatibility, you know? Something many, such as Google, opposed, and they left the committee/language for that reason. I thank the C++ language for that.
Introducing new features or new keywords, or making the implementation of existing ones stricter, such as with narrowing integral conversions, is not a breaking change.
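The narrowing case is a good illustration of that distinction; a small sketch of what actually changed:

```cpp
// List-initialization (braces) rejects narrowing conversions since C++11, but the
// older initialization forms still compile, so pre-existing code keeps building.
int main() {
    double d = 3.14;
    int a = d;        // old style: still accepted (at most a -Wconversion warning)
    int b = int(d);   // explicit conversion: still accepted
    int c{d};         // ill-formed since C++11: narrowing inside braces; g++ diagnoses it
                      // (an error on current releases, at minimum a -Wnarrowing warning on older ones)
    return a + b + c;
}
```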
The issue with defaults is that people have projects that implicitly expect the default to be static.
So when the default changes, many projects break. This is maybe fine if it’s your own project but when it’s a few dependencies deep, it becomes more of an issue to fix.
If you’re relying on defaults, and upgrade, that is entirely your fault. Don’t hold everyone in the world back because you didn’t want to codify your expectations.
C++ is very good at compatibility. If your code breaks when the standard changes, odds are it was always broken and you just didn't know. C++ isn't perfect, but it is very good.
Do you have an example? Adding the `-std=<whatever you're using now>` flag should work, and you should already be using it anyway. Is the issue that you don't want to pass that argument?
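If you want belt-and-braces on top of the flag, one sketch-level way to codify the expectation in the source itself, so a changed default fails loudly rather than mysteriously, is a guard like this (the value is whatever standard your project targets):

```cpp
// Aborts the build immediately if this translation unit is compiled as a dialect
// older than the one the project was written for, instead of producing confusing
// errors further in. 201103L here stands for "C++11 or newer"; adjust as needed.
static_assert(__cplusplus >= 201103L,
              "this project expects -std=c++11 or newer; please pass it explicitly");
```

Under a pre-C++11 dialect the `static_assert` keyword itself isn't recognized, which is typically still a hard error, just with a less helpful message.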
That sounds more like a problem of nonsensical assumptions… what possible expectation could there have been that GCC would never change this in the future?
Where do you draw the line for properly supported? I've been using g++ in C++23 mode for quite some time now; even if not every feature is fully implemented, the ones that work, work well and are a huge improvement.
A lot of software, and thus build automation, will break due to certain features that become warnings or outright errors in new versions of C++. It may or may not be a lot of work to change that, and it may or may not even be possible in some cases. We would all like there to be unlimited developer time, but in real life software needs a maintainer.
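A classic example of that kind of break, as a sketch: `auto` changed meaning in C++11, so some formerly valid declarations became hard errors rather than warnings:

```cpp
// Legal (if archaic) C++98/03, where 'auto' was a storage-class specifier.
// Since C++11 'auto' means type deduction, so this declaration is rejected
// and the file stops building once the dialect moves past C++03.
int main() {
    auto int x = 42;   // OK with -std=c++98, error with -std=c++11 and later
    return x;
}
```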
> Shouldn't the compilers be on the bleeding edge of the standards? What is the downside of switching to the newest standard when it's properly supported?
Cursing because the old program does not compile anymore. No.
When a language changes significantly faster than release cycles (i.e., Rust being a different compiler every 3 months), it means that distros cannot self-host if they use Rust code in their software. E.g., with Debian's Apt now having Rust code, and Debian's release cycle being 4 years for LTS, Debian's shipped rustc won't be able to compile Apt, since nearly all Rust devs target the bleeding edge. The entire language culture is built around this rapid improvement.
I love that C++ has a long enough time between changing targets to actually be useful, and that its culture is about stability and usefulness for users trying to compile things, rather than just dev-side improvements uber alles.
The problem you mention is perhaps a sign that the model Debian uses is ill-suited for development. Stable software is great, but it need not impede progress and evolution. It's also possible to support older Rust compiler versions if it's important: apt developers can do the work necessary to support 4-year-old LTS compilers.
> Debian's shipped rustc won't be able to compile Apt since nearly all rust devs are bleeding edge targeters.
This is nonsense. Apt devs can target a rustc release and that release can be the same release that ships with Debian? Moreover, since those apt devs may have some say in the matter, they can choose to update the compiler in Debian!
> The entire language culture is built around this rapid improvement.
... Because this is a cultural argument about how some people really enjoy having their codebase be 6 years behind the latest language standard, not about any actual practical problem.
And I can understand how someone may not be eager to learn C++20's concepts or to add them immediately to a code base, but upgrades to your minimum Rust version don't really feel like that. It's much more like "Wow, that's a nifty feature in the std lib that I immediately understand and would like to use. That's a great alternative to [much more complex thing...]" See, for example, OnceLock, added in 1.70.0: https://doc.rust-lang.org/std/sync/struct.OnceLock.html
In the last "big" shop I worked in, we were cross-compiling all production code. Each target device had an SDK that came with a GCC and a kernel tarball, inter alia. We had a standard way to set these up. We used C++03 for years. We decided to try C++11 for userland. All the compilers supported that and, after some validation, we changed permanently. Neither before the change nor after did we rely on the absence of a "-std=" command line option as the means of choosing the standard for C++ or even C.
Of course we were all ADHD pedantic nerds so take this with a grain of salt.
The coroutine convo is interesting. Does it mean, for example, that code compiled with GCC may not run correctly when linked against a Clang-built binary, if both use coroutines?
This is from 2019, prior to the finalization of modules in the standard. I'd be interested in how many of these issues remained unaddressed in the final version that shipped.
I think if you were to poll people, a significant portion would be repulsed by this catgirl aesthetic, or (though this isn't the case for Anubis) the cliche of inappropriately dressed, inappropriately young anime characters adopted as mascots in an ever-increasing number of projects. People can do whatever they want with their projects, but I feel like the people who like this crap perhaps don't understand how repulsive it is to a large number of people. Personally, it creeps me out.
It's particularly jarring on basically every site I've seen it on, which is usually some serious, professional-looking open source site.
I wonder why nobody configures this. Is it not something they can configure themselves to use a more relevant image, like the GCC logo or something?
Anubis is a bit annoying over crappy internet connections, especially in front of a webpage that would work quite well in this case otherwise, but it still performs way better than Cloudflare in this regard.
Anubis is significantly less jarring than Cloudflare blocks preventing any access at all. At least Anubis lets me read the content of pages. Cloudflare is so bleeding edge and commercial that they do not care about broad browser support (because it doesn't matter for commercial/sales). But for websites where you actually want everyone to be able to load the page, Anubis is by far the best.
That said, more on topic, I am really glad that C++ actually considers the implications of switching default targets and only does this every 5 years. That's a decent amount of time, and longer than most distros' release cycles.
When a language changes significantly faster than release cycles (i.e., rustc being a different compiler every 3 months), it means that distros cannot self-host if they use Rust code in their software. E.g., with Apt now having Rust code, and Debian's release cycle being 4 years for LTS, Debian's shipped rustc won't be able to compile Apt.
Many people have said they don't like it, and all that did is make its supporters even happier that it's there, because it makes them feel special in some strange way.
https://en.cppreference.com/w/cpp/compiler_support/20.html
A good example is the C++11 standard garbage collection! It was explicitly optional, but AFAIK no one implemented it.
https://isocpp.org/wiki/faq/cpp11-library#gc-abi
C++ standards support and why C++23 and C++26 are not the default: https://gcc.gnu.org/projects/cxx-status.html
They are discussing in this email thread whether it is already properly supported.
> It's one reason why people care so much about self-hosted compilers
For self-hosting and bootstrapping you want the compiler to be compilable with as old a version as possible.
"Properly supported" is the key here. Does GCC currently properly support C++23, for example? When I checked a few months ago, it didn't.
Warnings becoming errors would be scoped to gcc itself only, and they can fix them as part of the upgrade.