Even in Go we hit a stupid problem where the default JSON deserializer produces floats (when deserializing into any), and the number was a large enough int64 that it lost precision.
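A minimal sketch of that pitfall (the "id" field name and values here are just illustrative): encoding/json decodes any JSON number into float64 when the target is interface{}/any, so anything above 2^53 gets rounded; json.Decoder's UseNumber() is the usual workaround.

package main

import (
	"bytes"
	"encoding/json"
	"fmt"
)

func main() {
	// 2^53 + 1: an int64 that float64's 53-bit mantissa cannot represent exactly.
	raw := []byte(`{"id": 9007199254740993}`)

	// Default behavior: numbers decoded into any become float64 and lose precision.
	var v map[string]any
	_ = json.Unmarshal(raw, &v)
	fmt.Printf("%T %v\n", v["id"], v["id"]) // float64 9.007199254740992e+15

	// Workaround: UseNumber keeps the original digits as a json.Number.
	dec := json.NewDecoder(bytes.NewReader(raw))
	dec.UseNumber()
	var w map[string]any
	_ = dec.Decode(&w)
	n, _ := w["id"].(json.Number).Int64()
	fmt.Printf("%T %v\n", w["id"], n) // json.Number 9007199254740993
}

(The other option is to decode into a struct with an int64 field instead of any, which sidesteps the problem entirely.)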
I mean, we could go at it all night long about which pitfalls await in which language. Perhaps Rust is the safest, with its own pitfalls where you just can't do it safely (looking at you, BSTs and Arc).
Programming is full of such traps, and only engineers inexperienced in a language would make such a mistake. That includes engineers with 20+ years of 1 year of experience.
Why don’t people use the Unicode symbol for Roman numeral two?
Apple Ⅱ+