Apple tailors its software to run optimally on its own hardware. Other OSs have to work on a variety of platforms, which limits the amount of hardware-specific optimization they can do.
I am working on something like this for work, but with plain old C.
But most programmers I've encountered are just converting English to <programming language>. If a bug is reported, they convert English to <programming language> again.
AI is the new Crypto
If you look at it this way, do most complaints about undefined behavior go away?