This money is well beyond VC capability.
Either this lets them build to net positive without dying from painful financing terms, or they explode spectacularly. Given their rate of adoption, it seems to be the former.
There are currently multiple posts per day on HN that escalate into debates over whether LLMs are useful or not. I think this is a clear example that they can be. And results count. Porting and modernizing some ancient driver is not that easy. All sorts of stuff gets dropped from the kernel because it's just too old to bother maintaining, and when nobody steps up, deleting the code becomes the only option. This is a good example. I imagine there are enough crusty corners in the kernel that could benefit from similar treatment.
I've had similarly mixed results with agentic coding, sometimes impressive and other times disappointing. But if you can adapt to some of these limitations, it's alright. And the goalposts keep moving as well: things that were hard a few months ago are now more doable.
My main worry is whether these tools will still be worth it when priced above actual cost. I worry about becoming dependent on them, only for them to get prohibitively expensive.
Except the author's own data says it cost them $2B in inference to generate $4B in revenue. Yes, training costs push that negative, but this is tech growth 101: take on losses now to grow faster, in exchange for larger potential upside later.
* Ghost signs: https://sf.nerdnite.com/2014/06/04/nerd-nite-sf-49/
* Neon signs: https://sf.nerdnite.com/2017/10/11/nerd-nite-sf-89-brain-sci...
* Dairy farming: https://sf.nerdnite.com/2019/08/14/nerd-nite-sf-111-butterfl...
In practice, there is no longer a pytype team at Google [https://news.ycombinator.com/item?id=40171125], which I suspect is the real reason for the discontinuation.
There's also the trick of telling the hospital you'll pay "in cash" and getting a bill up to 10x lower, then getting that amount reimbursed or covered by your private or alternative insurance.