(Already mentioned) CakeML would be another example, together maybe with its Pancake sibling.
Also: WebAssembly!
type
  Age = range[0..200]

let ageWorks = 200.Age
let ageFails = 201.Age

Then at compile time:

$ nim c main.nim
Error: 201 can't be converted to Age
[1] https://nim-lang.org/docs/tut1.html#advanced-types-subranges

Weirdly, when going through the higher assurance levels in aviation, defensive programming becomes more costly, because it complicates the satisfaction of assurance objectives. SQLite (whose test suite reaches MC/DC coverage, the most rigorous coverage criterion required in aviation) has a nice paragraph on the friction between MC/DC and defensive programming:
https://www.sqlite.org/testing.html#tension_between_fuzz_tes...
But the most obvious difference, and maybe the most important to a user, was left unstated: adoption and ecosystem, i.e. tooling, libraries, and community.
Ada may have a storied history of success in aerospace, life safety, etc., and it might have an okay standard library that is fine for AoC problems and maybe embedded bit-poking cases, in which case it makes sense to compare it to Rust. But if you're going to sit down for a real-world project, e.g. a distributed system or an OS component, interfacing with modern data formats, protocols, IDEs, people, etc. is going to influence your choice on day one.
The 787 and 777 are already purely fly-by-wire. Their entire feel is made up.
Boeing simply has a different design philosophy on how much a pilot should feel like they are in command vs steering a system.
Outside hobbies, I've been mostly away from this field for a little over a decade by now. Is it still that bad? I remember back then, every single professional electronics engineer that I met had this die-hard belief that this was simply how things work:
You want to use DerpSemi microcontrollers? You must install DerpStudio '98! Not the later 2005 version though, that has some major UI bugs. You want to program a HerpSoft PLC? You need EasyHerpes IDE! What, command line toolchain? Text editor? You must be completely insane!
Back then it was somewhat of a personal fight against windmills for me. That, plus suggesting that we are actually developing software, and that the C/Assembly/VHDL maybe shouldn't be an undocumented, tested-only-once pile of spaghetti based off a circuit diagram inside one guy's head (and no, a 50-line comment block with a German transcription of the code below is not documentation).
After thinking through this carefully, though, I do not see UB (except for signed overflow in a corner case):
Step 1: bit shift.
I understand that, until C++20, left-shifting a negative signed int was UB. But this right shift of a negative value appears to be implementation-defined, which is OK if documented in the code.
Step 2: Add.
Then, (x + mask) is defined behavior (a) for non-negative x, since then mask = 0, and (b) for most negative x, since then mask = -1. However, if x is numeric_limits<int>::lowest(), then you get signed integer overflow, which is UB.
Step 3: XOR.
The final XOR doesn't have UB AFAICT. It can't be UB as of C++20, when signed integers officially became two's complement. It might have been implementation-defined before then, which would be almost as good for something this ubiquitous, but I'm not sure.
OK, so I think this does not involve UB except for an input of numeric_limits<int>::lowest().
Sound about right?
To fix this, perhaps you would need to make that + an unsigned one?
It bothers me how hard you need to think to do this language lawyering. C++ is a system of rules. Computers interpret rules. The language lawyer should be a piece of software. I get it, not everything can be done statically, so, fine, do it dynamically? UBSan comes closest in practice, but doesn't detect everything. I understand formally modeled versions of C and C++ have been developed commercially, but these are not open source so they effectively do not exist. It's a strange situation.
Just the other day I considered writing something branchless and said, "nah", because of uncertainty around UB. How much performance is being left on the table around the world because of similar thinking, driven by the existence of UB?
Maybe I was supposed to write OCaml or Pascal or Rust.
cargo miri run
And if your code actually touches UB, Miri will most likely point out exactly where and why.

Here is a talk about that porting effort:
- Persistent state? Must declare a volume.
- IO with external services? Must declare the ports (and maybe addresses).
- Configurable parameters? Must declare some env variables.
- Transitive dependencies? Must declare them, but using a mechanism of your choosing (e.g. via the package manager of your base image distro).
Separation of state (as in persistency) and application (as in binaries, assets) makes updates easy. Backups also.
Having almost all IO visible and explicit simplifies operation and integration.
And a single, (too?!?) simple config mechanism increases reusability, by enabling e.g. lightweight tailoring of generic application service containers (such as mariadb).
Together, this bunch of forced, yet leaky abstractions is just good enough to foster immense reuse & composability across a plethora of applications, all while allowing one to treat them almost entirely like black boxes. IMHO that is why OCI containers became this big, compared to other virtualization and (application-) container technologies.