I've been working professionally with C++ since 2001, and I'm currently a team lead.
The compensation packages we are able to offer new hires mean we're generally hiring from the middle of the talent pool, not the upper tier.
The complexity of C++ has long since outpaced the fluency of the hiring pool. In my experience, the average C++ professional (who applies to our open job ads) knows C++14, and might not balk at the occasional C++17 feature.
It doesn't matter if the compilers support modules or not, because in practice, I won't be able to use modules in the workplace for ages.
--
Standard disclaimer - I'm not able to predict the crush of changes coming as generative AI for software development proliferates.
I haven't coded C++ professionally since a couple of standards ago. However, I believe that something like C++ modules can be picked up quickly even by "middle of the talent pool" devs, because it's a feature that's useful to them.
What might hinder modules adoption, besides compiler availability, is the rest of the tooling ecosystem and the particular idiosyncrasies that most C++ projects have.
In every major project I've been involved in (and that's not terribly many, to be fair), the things keeping us on previous versions were almost always libraries or other supporting software; rarely, if ever, was it the devs.
Why not use them as soon as the compiler supports them? Your teammates will either ask you what they are (since you are the team lead) or look them up on cppreference.com.
I used to do this back when C++11 came out, and I ended up regretting it. When a feature has just come out, you can maybe understand it in isolation, but it's very hard to understand how it will fit in with other features, tools, libraries, etc. So you use it the way various blog posts and thought leaders say you should, and a year later you realize that no one could have anticipated that the new feature has all kinds of footguns, and some features even end up deprecated or superseded by yet another new feature.
When a new feature comes out, it's best to let it settle in a bit. Maybe experiment with it on smaller side projects, but avoid diving in before it's really well understood.
For example, I now cringe every time I see brace initialization of the form T{...} and remember how it was advocated as the one true way to initialize everything in C++, only for everyone to realize later that it has its own footguns (like with initializer lists). With C++20 fixing almost all of the original problems that motivated T{...}, the best practice nowadays is to go back to plain old T(...), and there's little to no reason to use T{...} anymore.
There was also Herb Sutter's Always Use Auto, which was then revised to Almost Always Auto. Nowadays I think most developers take the sensible approach: use auto for redundant names like iterators, or unnameable types like lambda expressions, and avoid using it everywhere, so as not to turn the codebase into an opaque soup of inscrutable keywords.
Yeah, I think the grandparent is confusing "not knowing the language features" with "wanting nothing to do with the latest metaprogramming mess the cooks are serving". Job listings with C++ are a mess; you never know what level of insanity you're going to get (bonus shoutout to those that say "C/C++").
You mention the average C++ programmer won't know the latest features, but if you did find an enthusiast who knew the latest features, you and the team probably wouldn't allow the use of those new features.
I can't imagine a more soul draining job than maintaining a corporate C++ codebase. Talk about doing the bare minimum.
re: changes coming from generative AI, it would need fluency in modules and other modern formulae. It would get that by being trained on modern conventions. So it's the same problem then, isn't it?
I wrote my own build system to use C++20 modules before CMake even had support for them, and while I have probably had net benefit from using them, I can’t recommend them for anyone in their own projects at this point.
The feature has so many irregularities that could only come out of a standards process, there are too many compiler bugs (just try using header units), the different implementations are too fragmented (I’m only using clang, which makes this easier on me), and there is a lack of things like module maps that would dramatically improve usability.
C++ has long surpassed the point where mere mortals like me can understand it; it's so loaded with baggage, footguns, and inscrutable standards language that honestly I think the only thing keeping it going is institutional inertia and "backwards compatibility" (air quotes).
I work extensively in the embedded space, and unfortunately C and C++ are still pretty much the only viable languages. I cannot wait for the day Rust or some other language finally supplants them.
I'm currently doing work with Rust on ESP32 platforms, and I'll have to say, it's not quite ready yet. Debug tools still have issues, and we're facing some problems with vendor-specific magic in ESP-IDF version 5.
The grass is always greener. Rust also has rough edges and there's maybe a fraction of a fraction of a percent of Rust code out there to work on in the corpus of systems software, and a lot of it has to interface with C anyway. Even crates in Rust are overly complicated and they've set up a system that allows people to squat on well known names.
I think if you want to work on systems software you should enjoy working with legacy cruft, otherwise you're just going to be miserable. I mean it's everywhere from the language, to the POSIX APIs, to device drivers, to hardware quirks... C++ is only a quarter of the problem.
I think it depends on what you use C++ for. For low-level high-performance systems work, the thing I probably miss most in many of the alternatives is the extensive metaprogramming and generics capabilities of C++. This is unfortunate given both the power of this language feature and how much opportunity there is to improve the ergonomics of C++ metaprogramming.
If you had to pick only one language to use, for everything, you'd pick C++. It can do it all, from bit fields to polymorphic classes to closures; it's safer and saner than C (you haven't read the standards if you think otherwise); and it's got a level of support and maturity (and probably lifespan) beyond that of any other comparable language.
I love this, having just started reading through the C++20 stuff.
However, a key opportunity is missed in that neither the icon nor the site links in the footer link to a short definition of the language before modules (the lack), the impact of modules on the design of the language at present (the real), and its place in the future of programming languages (the imaginary and the symbolic).
After writing build systems for a C/C++ operating system and spending years optimising builds for C/C++ operating systems, I can say the major disaster by far is the C preprocessor.
This is the source of all the evil. Even a hello-world program involves reading through hundreds of kilobytes, often megabytes, of headers that have to be parsed again and again for every source file, and which can produce totally different outcomes in each case depending on the compiler, the OS, the definitions on the command line, whatever's defined in the source code itself, and how the filesystem is laid out.
You can forget managing dependencies on large projects this way; they are overwhelming. Every build system tends to be leaky and imperfect so as not to drown in dependencies, and the fanciest systems all tend to have big holes here and there, or they have to use huge "catch-all" dependencies to try to be correct at the cost of efficiency.
I hoped modules would remove this problem, but so far I'm not sure. I'd love to hear the opinion of someone who has used them. What I've read about them didn't seem that hopeful; I got the impression of them being a bit like precompiled headers.
C/C++ suck to write dev tooling for (e.g. syntax highlighting, LSPs, static analyzers). Pretty much everyone leans on libclang for parsing, because very few people are insane enough to try to reimplement a parser themselves, let alone all the GNU extensions. And even then, macros make robust parsing really difficult. Imagine trying to parse a file containing two programming languages that can be arbitrarily interleaved at almost the character level: that's basically what C and C++ are.
Named modules (not header units[1], which are a workaround for libraries not yet migrated to C++ standard modules) straight-up disallow exporting macros. Which is a good thing; I can't stand macros.
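For anyone who hasn't seen one, a named module is tiny on the surface. This is a minimal two-file sketch (the filenames and module name are placeholders; build flags vary by compiler and version):

```cpp
// math.cppm -- module interface unit (the .cppm extension is a clang convention)
export module math;

// Only what is explicitly exported is visible to importers,
// and macros cannot be exported at all.
export int add(int a, int b) { return a + b; }

// main.cpp -- the consumer: no textual inclusion, no macro leakage
import math;

int main() { return add(2, 3) == 5 ? 0 : 1; }
```

With recent clang this builds along the lines of `clang++ -std=c++20 --precompile math.cppm -o math.pcm` followed by compiling main.cpp with `-fprebuilt-module-path=.`, though the exact incantation differs between compilers, which is part of the fragmentation complained about above.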
I think calling GCC's support of modules "partial" is a tad generous. It's pretty easy to hit ICEs/segfaults when trying to use modules with GCC, which is a good reason why it's not worth it for libraries to support modules at all.
You mention the average C++ programmer won't know the latest features, but if you did find an enthusiast who knew the latest features, you and the team probably wouldn't allow the use of those new features.
I can't imagine a more soul draining job than maintaining a corporate C++ codebase. Talk about doing the bare minimum.
To me, that's anything web-related in Java/C#/Go/JS.
Especially if most of the developers learnt Microsoft Visual C++ and believe that is proper C++!
This is C++'s way of finally getting rid of them, akin to Swift or Rust.
I feel those statements are related.
I work extensively in the embedded space and unfortunately C and C++ are still pretty much the only viable languages. I can not wait until the day rust or some other language finally supplants them.
For compiled garbage-collected applications (web/cli): Go.
For high-level applications (web/cli/etl/desktop): Java, C#.
Also, here is a good writeup: https://hackernoon.com/the-real-c-killers-not-you-rust
Discussed here twice:
https://news.ycombinator.com/item?id=34792932
https://news.ycombinator.com/item?id=39770467
If you had to pick only one language to use, for everything, you'd pick C++. It can do it all, from bit fields to polymorphic classes to closures; it's safer and saner than C (you haven't read the standards if you think otherwise); it's got a level of support and maturity (and probably lifespan) than any other comparable language.
[1]: https://clang.llvm.org/docs/StandardCPlusPlusModules.html#he...
I think this line on its own sums it up.