Panzerschrek · 17 days ago
It's a good decision to add at least some checks into C++ standard library. But no runtime check can find a bug in code like this:

  std::vector<int> v;
  v.push_back(123);
  auto& n = v.front();     // reference into v's current buffer
  v.push_back(456);        // may reallocate, invalidating n
  auto n_doubled = n * 2;  // reads through a (potentially) dangling reference
A better language is needed to prevent such bugs - one in which such compile-time correctness checks are possible. Some static analyzers are able to detect this in C++, but only in some cases.

GoblinSlayer · 17 days ago
Nirvana fallacy. Some checks are better than no checks.
delta_p_delta_x · 17 days ago
It seems to me statically checking this should be possible. The result of std::vector::front() should be invalidated and considered dead after the second invocation of push_back(). Then a static analyser would correctly mark the final line with red squiggles. Of course, compilers would still be happy to compile this, which they really ought not to.
aw1621107 · 17 days ago
> It seems to me statically checking this should be possible.

Statically checking this specific example (or similarly simple examples) could be possible, sure. I'm not so sure about more complex cases, such as opaque functions (whether because the function is literally opaque or because not enough inlining occurred), stored references (e.g., std::span), unintentional mutation of the underlying data structure, etc.

That's basically one of the main reasons Rust's lifetimes exist - to explicitly encode in the type system how long references remain valid. C++ doesn't have an equivalent (yet?), so unless you're willing to use global analysis and/or non-standard annotations, there's only so much static analysis can do.

lang4d · 17 days ago
I'd be surprised if some combination of ASan and UBSan wouldn't catch this and similar dangling references.
steveklabnik · 17 days ago
I thought it should too, but it doesn't seem to, unless I made a mistake, which I probably did: https://godbolt.org/z/Ex63vxj4r
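For anyone who wants to try it locally, here's a self-contained version (the file name and flags are just an example; at -O0 ASan should flag the read if the buffer actually moved, but an optimizer may keep the old value in a register and hide the bug):

  // g++ -std=c++17 -O0 -g -fsanitize=address uaf.cpp && ./a.out
  #include <vector>

  int main() {
      std::vector<int> v;
      v.push_back(123);
      int& n = v.front();  // reference into v's heap buffer
      v.push_back(456);    // growth reallocates, freeing that buffer
      return n * 2;        // heap-use-after-free read
  }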
optimalsolver · 17 days ago
What does this do?
aw1621107 · 17 days ago
Potential use-after-free. push_back() may reallocate, which would invalidate the reference returned by front(), rendering its subsequent use invalid.
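Two common ways to sidestep it, sketched here (an illustrative snippet, not from the original post): hold an index instead of a reference, or reserve() enough capacity up front so push_back() can't reallocate.

  #include <cstddef>
  #include <vector>

  int main() {
      std::vector<int> v;
      v.reserve(2);       // option 1: pre-allocate so the buffer never moves
      v.push_back(123);
      std::size_t i = 0;  // option 2: an index stays valid across reallocation
      v.push_back(456);
      return v[i] * 2;    // safe either way
  }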
xiphias2 · 18 days ago
It's great that finally bounds checking happened in C++ by (mostly) default.

The only thing that's less great is that this got so many fewer upvotes than all the Safe-C++ languages that never really had the chance to get into production in old code.

pjmlp · 17 days ago
It has always been the default in compiler-provided frameworks before C++98, like Turbo Vision, BIDS, OWL, MFC and so on.

Unfortunately the default changed when C++98 came out, and not everyone bothered to provide at least a hardening mode in debug builds, as VC++ and later GCC did, or as compilers for high-integrity computing like Green Hills do.

Sadly the security and quality mentality seems to be a hard sell in areas where folks are supposed to be Engineers and not craftsmen.

xiphias2 · 17 days ago
It's not just security/quality; speed of iteration matters as well. The earlier bugs are found, even in optimized non-debug builds, the more I like using C++ itself for development.
BinaryIgor · 18 days ago
Interesting how C++ is still improving; seems like changes of this kind may rival at least some of the Rust use cases; time will tell
galangalalgol · 18 days ago
The issue with safer C++ and modern C++ mirrors the problem of migrating a codebase from C++ to Rust: there is just so much unmodern and unsafe C++ out there. Mixing modern C++ into older codebases leaves uncertain assumptions everywhere and sometimes awkward interop with the old C++.

If there were a c++23{} block that let the compiler know that only modern C++ and libc++ existed inside it, it would make a huge difference by making those boundaries clear, and you could document the assumptions at that boundary and then migrate over time. The optimizer would have an advantage in that code too. But they don't want to do that.

The least they could do is settle on a standard C++ ABI to make interop with newer languages easier, but they don't want to do that either. They have us trapped with sunk cost on some giant projects. Or they think they do. The big players are still migrating to Rust slowly, but steadily.
kaz-inc · 18 days ago
There kind of is. There's __cplusplus, which I'll grant you is quite janky.

  #if __cplusplus == 202302L
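Fleshed out into a runnable sketch (note: the usual check is >=, since future standards bump the value, and MSVC only reports the real value with /Zc:__cplusplus):

  #include <cstdio>

  int main() {
  #if __cplusplus >= 202302L
      std::printf("compiled as C++23 or later\n");  // modern-only branch
  #else
      std::printf("pre-C++23 translation unit\n");  // legacy fallback
  #endif
  }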

blub · 17 days ago
> There is just so much unmodern and unsafe c++ out there. Mixing modern c++ into older codebases leaves uncertain assumptions everywhere and sometimes awkward interop with the old c++

Your complaint doesn’t look valid to me: the feature in the article is implemented with compiler macros that work with old and new code without changes.

See https://libcxx.llvm.org/Hardening.html#notes-for-users
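Concretely, per those docs the hardening mode is selected per translation unit with a macro, so old and new code can coexist. A minimal sketch (demo.cpp is a placeholder name):

  // clang++ -std=c++20 -stdlib=libc++ \
  //   -D_LIBCPP_HARDENING_MODE=_LIBCPP_HARDENING_MODE_EXTENSIVE demo.cpp
  #include <vector>

  int main() {
      std::vector<int> v{1, 2, 3};
      return v[10];  // out of bounds: hardened libc++ traps instead of silent UB
  }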

pjmlp · 17 days ago
And not everywhere, as there are many industrial scenarios where Rust either doesn't have an answer yet, or is still in early baby steps regarding tooling and ecosystem support.
josephg · 18 days ago
I’m not really sure how checks like this can rival Rust. Rust does an awful lot of checks at compile time - sometimes even to the point of forcing the developer to restructure their code or add special annotations just to help the compiler prove safety. You can’t trivially reproduce all those guardrails at runtime. Certainly not without a large performance hit. Even debug-mode stdc++ - with all checks enabled - still doesn’t protect against many bugs the Rust compiler can find and prevent.

I’m all for C++ making these changes. For a lot of people, adding a bit of safety to the language they’re going to use anyway is a big win. But in general, guarding against threading bugs, use-after-free, or a lot of more obscure memory issues requires either expensive GC-like runtime checks (Fil-C has 0.5x-4x performance overhead and a large memory overhead) or compile-time checks. And C++ will never get Rust’s extensive compile-time checks.

pjmlp · 17 days ago
It could have gotten them, had the Safe C++ proposal not been shot down by the profiles folks, those profiles that are still vapourware as C++26 gets finalised.

Google just did a talk at LLVM US 2025 on the state of the Clang lifetime analyser; the TL;DW is that we're still quite far from the profiles dream.

blub · 17 days ago
They rival Rust in the same way that golang and zig do: they handle more and more memory-safety bugs to the point that the delta to Rust’s additional memory-safety benefits doesn’t justify the cost of using Rust any more.
fpoling · 17 days ago
Rust's borrow checker rules out a lot of patterns that typical C++ code uses. So even if C++ got similar rules, they still couldn't be applied to most of the existing code in any case.
semiinfinitely · 18 days ago
> Interesting how C++ is still improving

its not

Conscat · 18 days ago
Do you read the Clang git commit log every day? C++ improves in many ways faster than any other language ecosystem.
fweimer · 18 days ago
How does this compare to _GLIBCXX_ASSERTIONS in libstdc++ (on by default in Fedora since 2018)?
beached_whale · 18 days ago
My understanding is that this is like that, but both libstdc++ and libc++ have been doing more since. Additionally, Google did a blog post not too long ago where they talked about the actual performance impact on their large C++ codebase, and it averaged about 0.3% I think: https://security.googleblog.com/2024/11/retrofitting-spatial...

Since then, libc++ has categorized the checks by cost and one can scale them back too.
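For comparison, the libstdc++ side is a single macro; a minimal sketch (file name is arbitrary):

  // g++ -std=c++20 -O2 -D_GLIBCXX_ASSERTIONS demo.cpp
  #include <vector>

  int main() {
      std::vector<int> v(3);
      return v[5];  // with _GLIBCXX_ASSERTIONS this aborts with a diagnostic instead of UB
  }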

ris · 18 days ago
See also the "lite assertions" mode @ https://gcc.gnu.org/wiki/LibstdcxxDebugMode for libstdc++; however, these are less well documented and it's less clear what performance impact these measures are expected to have.
tialaramex · 18 days ago
> those that lead to undefined behavior but aren't security-critical.

Once again C++ people imagining into existence Undefined Behaviour which isn't Security Critical as if somehow that's a thing.

Mostly I read the link because I was intrigued as to how this counted as "at scale", and it turns out that's misleading: the article's main body is about the (at scale) deployment at Google, not the actual hardening work itself, which wasn't in some special way "at scale".

AshamedCaptain · 18 days ago
Of course there is undefined behavior that isn't security critical. Hell, most bugs aren't security critical. In fact, most software isn't security critical, at all. If you are writing software which is security critical, then I can understand this confusion; but you have to remember that most people don't.

The author of TFA actually makes another related assumption:

> A crash from a detected memory-safety bug is not a new failure. It is the early, safe, and high-fidelity detection of a failure that was already present and silently undermining the system.

Not at all? Most memory-safety issues will never even show up on the radar, while with "Hardening" you've converted all of them into crashes that for sure will, annoying customers. Surely there must be a middle ground, which leads us back to the "debug mode" that the article is failing to criticize.

AlotOfReading · 18 days ago

> In fact, most software isn't security critical, at all. If you are writing software which is security critical, then I can understand this confusion; but you have to remember that most people don't.

No one knows what software will be security critical when it's written. We usually only find out after it's already too late.

Language maintainers have no idea what code will be written. The people writing libraries have no idea how their library will be used. The application developers often don't realize the security implications of their choices. Operating systems don't know much about what they're managing. Users may not even realize what software they're running at all, let alone the many differing assumptions about threat model implicitly encoded into different parts of the stack.

Decades of trying to limit the complexity of writing "security critical code" only to the components that are security critical has resulted in an ecosystem where virtually nothing that is security critical actually meets that bar. Take libxml2 as an example.

FWIW, I disagree with the position in the article that fail-stop is the best solution in general, but there's experimental evidence to support it at least. The industry has tried many different approaches to these problems in the past. We should use the lessons of that history.

charleslmunger · 18 days ago
>Not at all? Most memory-safety issues will never even show up in the radar

Citation needed? There are all sorts of problems that don't "show up" but are bad. Obvious historical examples would be Heartbleed and Cloudbleed, or this ancient GTA bug [1].

1: https://cookieplmonster.github.io/2025/04/23/gta-san-andreas...

samdoesnothing · 18 days ago
nooooo you don't understand, safety is the most important thing ever for every application, and everything else should be deprioritized compared to that!!!
criemen · 18 days ago
> Of course there is undefined behavior that isn't security critical.

But undefined behavior is literally introduced as "the compiler is allowed to do anything, including deleting all your files". Of course that's security critical by definition?

gishh · 18 days ago
Most people around here are too busy evangelizing rust or some web framework.

Most people around here don’t have any reason to have strong opinions about safety-critical code.

Most people around here spend the majority of their time trying to make their company money via startup culture, the annals of async web programming, and how awful some type systems are in various languages.

Working on safety-critical code with formal verification is the most intense, exhausting, fascinating work I’ve ever done.

Most people don’t work at a company that either needs or can afford a safety-critical toolchain that is sufficient for formal, certified verification.

The goal of formal verification and safety critical code is _not_ to eliminate undefined behavior, it is to fail safely. This subtle point seems to have been lost a long time ago with “*end” developers trying to sell ads, or whatever.

pjmlp · 17 days ago
Well, there is ongoing work to put all known UB into a UB annex in the standard, with the hope that the size of the annex gets reduced over time.

How well this will work out remains to be seen.

Rust still needs to get rid of its dependency on C++ compiler frameworks, and I don't see Cranelift matching GCC and LLVM any time soon.

forrestthewoods · 18 days ago
> Undefined Behaviour which isn't Security Critical as if somehow that's a thing

Undefined behavior in the (poorly written) spec doesn't mean undefined behavior in the real world. A given compiler is perfectly free to specify the behavior.
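A concrete example: signed integer overflow is UB per the standard, but GCC and Clang both define it as two's-complement wrapping under -fwrapv (sketch):

  // g++ -O2 -fwrapv wrap.cpp  (without -fwrapv this addition is UB)
  #include <climits>
  #include <cstdio>

  int main() {
      int x = INT_MAX;
      x += 1;  // defined to wrap to INT_MIN under -fwrapv
      std::printf("%d\n", x);
  }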

dana321 · 18 days ago
Imagine hardening the regex library; it's already as slow as molasses.
jeffbee · 17 days ago
There are lots of parts of the standard library that nobody uses, and hardening has no performance impact in code you don't call.
