Now, that's not to say AI isn't useful, or that we won't have AGI in the future. But this feels a lot like the run-up to another AI winter: valuations will crash, a bunch of players will disappear, but we'll keep using the tech for boring things and eventually we'll have another breakthrough.
Is all the IP they acquired with Nuvia[1] tainted? Or were they just using ARM-derived internals?
From my understanding, just slapping on a different instruction decoder isn't a big technical hurdle. Actually, I wonder if it would be possible to design a chip with both an ARM and a RISC-V decoder on the same die and just fuse off the ARM decoder on select units to avoid any licensing fees...
[1] https://en.wikipedia.org/wiki/Qualcomm#2015%E2%80%932024:_NX...
Odin vs Rust vs Zig would be more apt, or Go vs Java vs OCaml or something...
Odin doesn't (and, according to its creator, never will) implement specific concurrency strategies: no async, coroutines, channels, fibers, etc. He sees concurrency strategy, like memory management, as a concern that belongs at a higher level than where he wants the language to sit.
Which is fine by me, but I know lots of people are looking for "killer" features.
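To make the contrast concrete, here's a minimal Go sketch of the kind of language-level concurrency (goroutines and channels) that Odin deliberately leaves to libraries and the programmer. This is Go for comparison only, not Odin code, and not the Odin creator's suggested alternative:

    package main

    import "fmt"

    // Roughly the kind of built-in concurrency Odin omits on purpose:
    // goroutines and channels are part of the Go language itself.
    func main() {
        results := make(chan int)

        // Fan out: each goroutine squares a number and sends it back.
        for i := 1; i <= 3; i++ {
            go func(n int) {
                results <- n * n
            }(i)
        }

        // Collect the three results (order is not guaranteed).
        for i := 0; i < 3; i++ {
            fmt.Println(<-results)
        }
    }

In Odin the equivalent would be whatever threading or job-system approach you bring yourself; the language stays out of it.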
Google's revenue stream and structural advantages mean they can keep this up indefinitely, and if another AI winter comes they can just chill, because LLM-based AI isn't even their main product.
It's the inevitable peak of the venture capital pipeline, except this time it isn't individual industries (e.g. taxis with Uber, hotels with AirBnB) getting squeezed out by unsustainable pricing; it's the economy at large that's suffering.
And it's high time for us as a society to put an end to this madness. End the AI VC economy before it ends our economy.