Step 1: Some upstarts create a new way of doing something. It’s clunky and unrefined.
Step 2: "Experts" and senior folks in the field dismiss it as a "toy." It doesn't follow their established rules or best practices and seems amateurish. They wouldn't recommend it to anyone serious.
Step 3: The "toy" gets adopted by a small group of outsiders or newcomers who aren't burdened by the "right way" of doing things. They play with it, improve it, and find new applications for it.
Step 4: The "toy" becomes so effective and widespread that it becomes the new standard. The original experts are left looking out of touch, their deep knowledge now irrelevant to the new way of doing things.
We're at step 2, bordering on 3.
* Executives at Nokia and BlackBerry saw the first iPhone, with its lack of a physical keyboard, as an impractical toy for media consumption, not a serious work device.
* Professional photographers viewed the first low-resolution digital cameras as flimsy gadgets, only for digital photography to all but wipe out the film industry.
The problem is that the experts have tried it and found that vibe coding doesn't actually work at the scale they need.
Will it ever? Perhaps, but I'd argue a near-AGI level of intelligence would need to be achieved first. When that happens, we'll have bigger problems (and/or opportunities?) than a few programmers losing their jobs.
Off the top of my head, in increasing order of difficulty to calculate: byte length, number of code points, number of graphemes/characters, and display height/width.
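To make it concrete how far apart those measures can drift, here's a small sketch (in Rust, my choice of language since the original doesn't name one; grapheme counting leans on the unicode-segmentation crate, and display width is left out because it ultimately depends on the renderer and font):

```rust
// Demo: the different "lengths" of a single string diverge.
// Requires the unicode-segmentation crate for grapheme clusters.
use unicode_segmentation::UnicodeSegmentation;

fn main() {
    // "é" written as 'e' + combining acute accent, followed by a family
    // emoji built from several code points joined with zero-width joiners.
    let s = "e\u{0301}👩‍👩‍👧‍👦";

    println!("bytes:       {}", s.len());                   // UTF-8 byte length
    println!("code points: {}", s.chars().count());         // Unicode scalar values
    println!("graphemes:   {}", s.graphemes(true).count()); // user-perceived characters
}
```

All three numbers come out different for that one string, which is exactly why an unqualified "length" is ambiguous.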
Maybe it would be best for Str not to have len at all. It could have bytes, code_points, graphemes. And every use would be precise.
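A minimal sketch of that idea, again in Rust and again assuming the unicode-segmentation crate; the type and method names just mirror the ones proposed above and aren't from any existing library:

```rust
use unicode_segmentation::UnicodeSegmentation;

/// A string wrapper with no ambiguous `len`: every measure is named.
struct Str(String);

impl Str {
    /// Length of the UTF-8 encoding, in bytes.
    fn bytes(&self) -> usize {
        self.0.len()
    }

    /// Number of Unicode code points (scalar values).
    fn code_points(&self) -> usize {
        self.0.chars().count()
    }

    /// Number of extended grapheme clusters, i.e. characters as a user sees them.
    fn graphemes(&self) -> usize {
        self.0.graphemes(true).count()
    }
}
```

Call sites then read something like Str("héllo".to_string()).graphemes() rather than a bare len(), so the measure being counted is always visible in the code.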