> Nobody who is doing this is willing to come clean with hard numbers but there are data points, for example from Meta and (very unofficially) Google.
The Meta link does not support the point. It's actually implying an MTBF of over 5 years at 90% utilization even if you assume there's no bathtub curve. Pretty sure that lines up with the depreciation period.
The Google link is even worse. It links to https://www.tomshardware.com/pc-components/gpus/datacenter-g...
That article makes a big claim but does not link to any source. It vaguely describes the source, but nobody who was actually in that role would describe themselves as the “GenAI principal architect at Alphabet”. Like, those are not the words they would use. It would also be pointless to try to stay anonymous if that really were your title.
It looks like the ultimate source of the quote is this Twitter screenshot of an unnamed article (whose text can't be found with search engines): https://x.com/techfund1/status/1849031571421983140
That is not merely an unofficial source. That is made-up trash that the blog author lapped up despite its obviously unreliable nature, because it confirmed his beliefs.
Besides, if the claim about GPU wear-and-tear were true, this would show up consistently in GPUs sourced from cryptomining (which was generally done in makeshift compute centers with terrible cooling and other environmental factors), and it just doesn't.
> It's actually implying an MTBF of over 5 years [...] Pretty sure that lines up with the depreciation period.
You're assuming this is normal, for the MTBF to line up with the depreciation schedule. But the MTBF of data center hardware is usually quite a bit longer than the depreciation schedule, right? If I recall correctly, for servers it's typically double or triple, roughly. Maybe less for GPUs (I'm not directly familiar), but a quick web search suggests these periods shouldn't line up for GPUs either.
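For anyone who wants to sanity-check that kind of claim, the arithmetic is just device-hours divided by failures. A quick sketch, with placeholder numbers (the fleet size, window, and failure count below are made up for illustration, not taken from the Meta report):

    # Back-of-the-envelope MTBF per GPU from fleet-level failure counts.
    # All inputs are hypothetical placeholders; plug in the real numbers.
    HOURS_PER_YEAR = 24 * 365

    fleet_size = 16_000    # hypothetical number of GPUs in the cluster
    window_days = 60       # hypothetical observation window
    utilization = 0.90     # fraction of wall-clock time the GPUs are busy
    gpu_failures = 430     # hypothetical count of GPU hardware failures

    device_hours = fleet_size * window_days * 24 * utilization
    mtbf_hours = device_hours / gpu_failures
    print(f"MTBF per GPU: {mtbf_hours:,.0f} hours "
          f"(~{mtbf_hours / HOURS_PER_YEAR:.1f} years of powered-on time)")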
Google is using nVidia GPUs. More than that, I'd expect Google to still be something like 90% on nVidia GPUs. You can't really check of course. Maybe I'm an idiot and it's 50%.
But you can see how that works: go to colab.research.google.com and type in some code, "!nvidia-smi" for instance. Click on the down arrow next to "Connect" and select "Change runtime type": 3 out of 5 GPU options are nVidia GPUs.
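If you'd rather check from Python than eyeball the runtime menu, something like this works in a Colab cell (a minimal sketch; it just reports whatever accelerator the runtime gave you):

    # Report which accelerator (if any) this Colab runtime was assigned.
    import subprocess

    try:
        # Lists GPUs on NVIDIA-backed runtimes; fails otherwise.
        print(subprocess.check_output(["nvidia-smi", "-L"], text=True))
    except (FileNotFoundError, subprocess.CalledProcessError):
        print("No NVIDIA GPU visible on this runtime.")

    try:
        import torch
        if torch.cuda.is_available():
            print("PyTorch sees:", torch.cuda.get_device_name(0))
    except ImportError:
        pass  # torch comes preinstalled on Colab, but don't assume it elsewhere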
Frankly, unless you rewrite your models you don't really have a choice but to use nVidia GPUs, thanks to, ironically, Facebook (the authors of PyTorch). There is pytorch/XLA automatic translation to TPU, but it doesn't work for "big" models. And a point of advice: you want stuff to work on TPUs? Do what Googlers do: use JAX ( https://github.com/jax-ml/jax ). Oh, and look at the commit logs of that repository to get your mind blown, btw.
In other words, Google rents out nVidia GPUs to their cloud customers (with the hardware physically present in Google datacenters).
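To make the JAX point concrete, here's a minimal sketch of the sort of code that runs unchanged on CPU, an nVidia GPU, or a TPU backend, since XLA handles the lowering; the shapes and function are arbitrary, and on Colab you'd select a TPU runtime first:

    # Minimal JAX example: the same jit-compiled function runs on whatever
    # backend is present (CPU, an NVIDIA GPU via CUDA, or a TPU via XLA).
    import jax
    import jax.numpy as jnp

    print("Devices JAX can see:", jax.devices())

    @jax.jit
    def affine(w, x, b):
        # XLA compiles this for the local accelerator; nothing device-specific.
        return jnp.dot(x, w) + b

    key = jax.random.PRNGKey(0)
    w = jax.random.normal(key, (512, 256))
    x = jax.random.normal(key, (8, 512))
    b = jnp.zeros(256)

    print(affine(w, x, b).shape)  # (8, 256), computed on the default device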
There are many questions about the overall economics of AI (its value, whether it's overvalued or not, etc.), but this is a very poor article, I suspect written by someone with little to no financial or accounting knowledge and a strong "uh, big tech bad" bias.
> When companies buy expensive stuff, for accounting purposes they pretend they haven’t spent the money; instead they “depreciate” it over a few years.
There's no pretending, it's accounting. When you buy an asset, you own it; it is now part of your balance sheet. You incur a cost when the value of the asset falls, i.e. it depreciates. If you spend 20k on a car, you are not pretending you haven't spent 20k by recording it as an asset: you spent the money, but now you have something of similar value on your balance sheet. Your cost is the depreciation as the years go by and the car becomes less valuable. That's a very misleading way to put it.
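A toy version of the bookkeeping, using the 20k car above and assuming straight-line depreciation over a 5-year useful life with no salvage value (both assumptions are mine, just for illustration):

    # Straight-line depreciation of the 20k car.
    cost = 20_000          # cash actually spent in year 0
    useful_life_years = 5  # assumed useful life
    salvage_value = 0      # assume it's worthless at the end

    annual_expense = (cost - salvage_value) / useful_life_years
    book_value = cost
    for year in range(1, useful_life_years + 1):
        book_value -= annual_expense
        print(f"Year {year}: depreciation expense {annual_expense:,.0f}, "
              f"book value {book_value:,.0f}")
    # The 20,000 in cash left in year 0. The income statement takes the hit
    # 4,000 at a time; the balance sheet carries the rest as an asset.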
> Management gets to pick your depreciation period, (...)
They don't. GAAP, IFRS, or whatever other accounting rules apply to the company do. There's some degree of freedom in certain situations, but it's not "whatever management wants". And it's funny that the author thinks companies are generally interested in defining longer useful lives, when in most cases (depending on other tax considerations) it's the opposite: depreciation is a non-cash expense, but it reduces your taxable income, which gets you real cash, and the sooner you get that money the better. There's more nuance to this (tax vs. accounting, how much freedom management actually has vs. what industry practice and auditors will allow), but my point is, again, that "management gets to pick" is not an accurate representation of what goes on.
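To see why "the sooner you get that money the better", here's a sketch of the present value of the tax savings under a 3-year vs. a 5-year straight-line schedule; the 21% tax rate and 8% discount rate are assumptions for illustration, not anyone's actual numbers:

    # Present value of the tax shield from depreciating a 1,000,000 asset
    # over 3 years vs. 5 years, straight line. Rates are illustrative only.
    cost = 1_000_000
    tax_rate = 0.21       # assumed corporate tax rate
    discount_rate = 0.08  # assumed cost of capital

    def pv_tax_shield(years):
        annual_deduction = cost / years
        return sum(annual_deduction * tax_rate / (1 + discount_rate) ** t
                   for t in range(1, years + 1))

    print(f"3-year schedule: PV of tax savings = {pv_tax_shield(3):,.0f}")
    print(f"5-year schedule: PV of tax savings = {pv_tax_shield(5):,.0f}")
    # Same total deduction either way; the shorter schedule is worth more
    # today because the cash savings arrive earlier.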
> It’s like this. The Big-Tech giants are insanely profitable but they don’t have enough money lying around to build the hundreds of billions of dollars worth of data centers the AI prophets say we’re going to need.
Actually they do. Meta is the one with the least, but it could still easily raise that money. Meta just thinks it's a better deal to share risk with investors who, at the moment, have a very strong appetite to own these assets. Meta is actually paying a higher rate through these SPVs than it would by funding them outright. Now, personally, I don't know how I would feel about that particular deal as an investor, because you need to dig a little deeper into their balance sheet to get a good snapshot of what is going on, but it's not a trick; arguably it can make economic sense.
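Rough illustration of that trade-off, with entirely hypothetical rates (neither Meta's bond pricing nor the SPV terms are given here):

    # Hypothetical: funding 10B of data centers with Meta's own debt vs.
    # through an SPV at a higher rate. Both rates are made up.
    capex = 10_000_000_000
    own_debt_rate = 0.050  # hypothetical rate on Meta's own bonds
    spv_rate = 0.065       # hypothetical (higher) rate paid through the SPV

    premium_per_year = capex * (spv_rate - own_debt_rate)
    print(f"Extra financing cost: ~{premium_per_year / 1e6:,.0f}M per year")
    # That premium is roughly what you pay to keep the assets (and the risk
    # that the GPUs age badly) off your own balance sheet.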
Actually the author has worked for Google, Amazon (VP-level), Sun, and DEC; and was a co-creator of XML.
1. Being a VP in these companies does not imply they have an understanding of financing, accounting, or data-center economics, unless their purview covered or was very close to the teams procuring and running the infrastructure.
2. That level of seniority does, on the other hand, expose them to a lot of the shenanigans going on in those companies, which could credibly lead them to develop a "big tech bad" mindset.
There are many legitimate concerns about the financial implications of these huge investments in AI. In fact, the podcast he references is great at providing _informed_ and _nuanced_ observations about all of this; Paul Kedrosky is excellent.
BUT (and this is my point) the article is terrible at reflecting all of that, and it makes wrong and misleading comments about it.
The idea that when companies depreciate assets they are "pretending they haven't spent the money", or that "management gets to pick your depreciation period", is simply wrong.
Do you think either of those two statements is accurate?
P.S. Maybe you make a good point. I said that I suspected, based on those statements, that he had little financial knowledge; tbh I didn't know the author, hence the "suspect". But now that you say it, it might be that he is so biased on this particular topic that he can't represent his own point fairly. Irrespective of that, I will say it again: statements like the ones I've commented on are absurd.
I have been thinking. The reality is that, in general, employees are not paid for the value/revenue/profit they generate. That sets the floor. Rather, they are paid whatever rate the market sets for their role. Look at the people putting together high-cost electronics: clearly a lot of value there, given what the margins are, but not a lot of pay.
Wouldn't AI largely be a race to the bottom? In that case, even if expensive employees get replaced, the price of replacing them might not be that big; it might only barely cover the cost of inference, for example. So might it be that the profits are actually a lot lower than the cost of the employees being replaced?
To your first point, yes we're moving slowly towards a more general awareness that most employees are paid market (replacement) rate, not their share of value generated. As the replacement rate drops, so will wages, even if the generated value skyrockets. Unsurprisingly, business owners and upper management love this.
To the second point, the race to the bottom won't be evenly distributed across all markets or market segments. A lot of AI-economy predictions focus on the idea that nothing else will change or be affected by second and third order dynamics, which is never the case with large disruptions. When something that was rare becomes common, something else that was common becomes rare.
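Putting rough, made-up numbers on that worry (the salary, price, and inference cost below are hypothetical):

    # Hypothetical: what an AI vendor captures when a task that used to cost
    # a salary is sold into a competitive, race-to-the-bottom market.
    salary_replaced = 100_000   # hypothetical annual cost of the employee
    inference_cost = 8_000      # hypothetical annual compute cost of the AI
    competitive_price = 10_000  # competition pushes the price toward cost

    vendor_profit = competitive_price - inference_cost
    buyer_savings = salary_replaced - competitive_price
    print(f"Vendor profit per replaced seat: {vendor_profit:,}")   # 2,000
    print(f"Buyer savings per replaced seat: {buyer_savings:,}")   # 90,000
    # Most of the surplus goes to the buyer (or gets competed away entirely),
    # not to the AI vendor.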
It's the whole layer of feature companies built on top of the core models that will pop (OpenAI, Anthropic, Google resellers). Existing services will add AI bit by bit for features and survive. The core owners of the models will, I think, survive (the exception might be OpenAI, maybe). For the hardware, there really are not that many TSMC wafers going around, so maximum sales are capped for everyone at the newest nodes.
"Special Purpose Vehicles" reminds me of "Special Purpose Entities" from the 90s and 00s, e.g., for synthetic leases.
> When companies buy expensive stuff, for accounting purposes they pretend they haven't spent the money; instead they "depreciate" it over a few years
I thought there was a US IRS law that was changed sometime in the past 10-15 years that made companies depreciate computer hardware over 1 year. Am I misremembering?
I thought that law was the reason why many companies increased the lifetime of employee laptops from 3 to 5 years.
All you have to do is wait. Seriously, just wait. If the tech deflationists are right, you'll get more cost-effective memory every several years, or decades at least.
Somewhere around 1999, my high school buddy worked overtime shifts to afford a CPU he had waited forever to buy. Wait for it: it was a 1 GHz CPU!