If the effect size isn't attributable to AI, then it must be the case that we have been in an unacknowledged recession for over a year.
Because the agents aren't yet good enough for a hands-off experience. You have to continuously monitor their output if you want a passable code base.
>> I know many who have it from on high that they must use AI. One place even has bonuses tied not to productivity, but to how much they use AI.

> How does maximizing AI use prevent developers from reading their code?
In my mind, developers are responsible for the code they push, no matter whether it was copy-pasted or generated by AI. The comment I responded to specifically said "bonuses tied not to productivity, but how much they use AI". I don't think using AI for everything automatically implies having no standards, or taking no responsibility for the code you push.
If managers force developers to purposefully lower standards just to increase PRs per unit of time, that's another story. And in my opinion that's a problem of engineering & organisational culture, not necessarily a problem with maximizing AI usage. If an org is OK with pushing AI slop no one understands, it will be OK with pushing handwritten slop as well.
I know many who have it from on high that they must use AI. One place even has bonuses tied not to productivity, but to how much they use AI.
Meanwhile, managers ask: if AI is writing so much code, why aren't they seeing it in topline productivity numbers?
Nvidia would need to move on the order of 4,000,000,000 units to hit $4T in revenue, and more than triple that to realize $4T in profits. Even if the average per-unit price is 2-3x my estimated $1k, as near as I've been able to tell they "only" move a few million units each year for a given SKU.
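The arithmetic above can be checked directly. Note that the $1k average price per unit and the ~30% margin are the commenter's (and my) back-of-the-envelope assumptions, not official figures:

```python
# Back-of-the-envelope check of the unit-volume claim.
target = 4_000_000_000_000   # $4T
price_per_unit = 1_000       # assumed ~$1k average revenue per unit

units_for_revenue = target / price_per_unit
print(f"{units_for_revenue:,.0f}")  # 4,000,000,000 units

# To net $4T in *profit* at an assumed ~30% margin,
# you would need roughly 3.3x as many units.
margin = 0.30
units_for_profit = target / (price_per_unit * margin)
print(f"{units_for_profit:,.0f}")
```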
I am struggling to work out how these markets get so inflated, such that it pins a company’s worth to some astronomical figure (some 50x total equity, in this case) that seems wholly untethered to any material potential?
My intuition is that, in the absence of the rapid, generationally transformative advances in tech and industry largely seen in the latter half of the 20th century (quickly followed by smartphones and social networking), stock market investors are content to force similar patterns onto any marginally plausible narrative that can provide the same aesthetics of growth, even if the most basic arithmetic thoroughly perforates it.
That said, I nearly went bankrupt buying a used car recently, so this is a whole lot of unqualified conjecture on my part (but not for nothing, my admittedly limited personal wealth isn’t heavily dependent on such bets).
This is also where the reasoning contradicts reality. If we assume Nvidia only sells $1,000 GPUs and moves a few million units a year, then how did it receive $137B in FY2025 revenue? In reality they don't just sell GPUs: they sell systems for AI training and inference at insane margins (I've seen 90% estimates), plus some consumer GPUs at decent margins (30-40%). These margins may be enough to stimulate competition at some point, but so far those risks have not materialized.
There's not too much detail in that press release.
Thanks for sharing the link to the report!
Also, I was looking at the wrong year. They're currently in Q4 of FY2023; the statements for the last quarter are here: https://ir.gitlab.com/news-releases/news-release-details/git...
The main difference between the libraries is that Skija provides Java/JVM bindings for Skia, whereas Skiko provides Kotlin bindings for Kotlin/JVM, Kotlin/JS, and Kotlin/Native targets. Of course Skiko's Kotlin/JVM bindings can be used with other JVM languages, not just with Kotlin.
The future is not that AI takes over. It's when the accountants realize that for a $120K/year developer, even if AI makes them 20% more efficient (I doubt that), you have a ceiling of $2,000/mo. on AI spend before you break even. Once the VC subsidies end, it could easily cost that much. When that happens... who cares if you use AI? Some developers might use it, others might not; it doesn't matter anymore.
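The break-even ceiling works out as follows (using the comment's own numbers; the 20% efficiency gain is the comment's hypothetical, not a measured figure):

```python
# Break-even ceiling on AI spend for one developer.
salary = 120_000        # $/year, the comment's example figure
efficiency_gain = 0.20  # 20% -- hypothetical, and the comment doubts even this

value_added_per_year = salary * efficiency_gain  # extra output, in salary terms
ceiling_per_month = value_added_per_year / 12
print(ceiling_per_month)  # 2000.0 -- matches the $2,000/mo. ceiling
```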
This is also assuming Anthropic or OpenAI don't lose any of their ongoing lawsuits, and aren't forced to raise prices to cover settlement fees. For example, Anthropic is currently in the clear on the fair use "transformative" argument; but they are in hot water over the book piracy from LibGen (illegal regardless of use case). The worst case scenario in that lawsuit, although unlikely, is $150,000 per violation * 5 million books = $750B in damages.
Source? Dario claims API inference is already “fairly profitable”. They have been optimizing models and inference, while keeping prices fairly high.
> dario recently told alex kantrowitz the quiet part out loud: "we make improvements all the time that make the models, like, 50% more efficient than they are before. we are just the beginning of optimizing inference... for every dollar the model makes, it costs a certain amount. that is actually already fairly profitable."
https://ethanding.substack.com/p/openai-burns-the-boats