There is so much interesting stuff going on in GPU compute that isn't crypto. I'm really excited about this because there are SO MANY GPUs that are now going to be cheaper than sand. There is A LOT that can be made of that and I intend to get mine. I think the crypto boom really covered up what we can really, really do with GPU compute and possibly stifled adoption and innovation, but now we've got so many just sitting around. Which is super useful as we move into a world where we can no longer get things manufactured and shipped worldwide in what feels like an instant.
> There is so much interesting stuff going on in GPU compute that isn't crypto.
For sure, but there were many crypto operations with data centers that had hundreds or thousands of GPUs. For example, a report from JPR estimates that crypto miners bought 25% of all GPUs produced in 1H'2021.¹
¹ https://www.jonpeddie.com/blog/crypto-minings-half-a-billion...
I'd like to see some PoW scheme for public good projects like SETI or Folding@Home or even CGI for a fan fiction, where there's perhaps something more than just bragging rights for contributing. I'm not sure what that would look like exactly.
For PoW you don't just need work, you need work that can be easily verified, has easily scaled difficulty levels, and is difficult to cheat with - all with no centralized trusted authority.
Even when SETI/F@H was worth only fake internet points they already had problems with people cheating. What happens when it's worth real money?
Since the new GPU offerings from Nvidia have secure multi-tenancy, I think you're going to start seeing things like that. Especially when you look at what's happening with compute being more universally adopted via Vulkan. I haven't seen the framework for such a thing yet, but you make a good point. I think I've got half a model in my head that could be retooled for something like that in a flexible fashion. It could work both ways too. Either you give away GPU cycles for research, or you pay people for their cycles, or you post up your own job to be computed in exchange for pay or donated cycles or money. As an example, wanna render erosion predictions for a property you wanna buy? Put up the job, people that wanna contribute can, and you get the result. Any user could set their hierarchy of things they contribute to, like Patreon sort of, so bucks or compute cycles can be chunked out to them by order of need and weighted priority.
Humm. Someone beat me to this idea so I don't have to do it.
Gridcoin rewarded seti@home when seti@home was operating and still rewards a number of other BOINC projects. It is proof of stake based and the rewards are layered on top.
There is a similar cryptocurrency called Curecoin for Folding@home, although I think they do use SHA-256 mining (i.e. not useful computations) for part of the rewards.
Mining is only profitable when the block reward and transaction fees are worth more than the costs of mining (electricity, capital costs). ETH was the only coin big enough to support all those GPU mining rigs. With ETH gone, there are too many miners and not enough valuable stuff to mine.
Presumably those other coins are close to an equilibrium point where more people mining them would be unprofitable. Aka if the total block rewards from other coins are 1 million dollars per month, there is no way spending 2 million per month on electricity is a good idea.
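A back-of-envelope sketch of that equilibrium argument, with completely made-up numbers for the rig's power draw, electricity rate, and coin revenue (the real figures move constantly):

    # Hypothetical miner profitability: revenue from the coin minus electricity.
    # All numbers below are placeholders, not real market data.
    def daily_profit(revenue_usd_per_day, power_watts, usd_per_kwh):
        electricity = (power_watts / 1000) * 24 * usd_per_kwh
        return revenue_usd_per_day - electricity

    # A hypothetical 6-GPU rig drawing 900 W at $0.10/kWh costs ~$2.16/day to run.
    # If the remaining PoW coins only pay it ~$1.50/day, the rig loses money,
    # so hashrate drops off until revenue per rig climbs back toward cost.
    print(daily_profit(revenue_usd_per_day=1.50, power_watts=900, usd_per_kwh=0.10))  # -0.66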
I saw that argument on twitter when talking about the energy reduction for ETH mining. Someone commented that it won't change because they will just focus their GPUs on some other coin.
Why is the assumption that crypto has held back innovation? If anything normally pumping a ton of money into something causes more innovation to happen.
Nvidia was spending money, time and energy to cater to them one way or the other instead of other areas where GPUs are actually good for society.
Your sentence is very generic and I'm not sure why this should be a universal law. Putting money into the right thing might cause innovation, not just putting in money.
Pumping money into blockchain scams did cause more innovation in SEC fraud and Ponzi schemes, but people usually mean innovation in science/technology.
Until demand was proven, nobody was interested in making GPU compute at scales beyond niche (e.g. for scientific calculation or for server clusters). As-is, we are about to enter a squeeze that is going to see a few of the big players in retail GPU manufacturing drop out, so don't expect this glut of cheap GPUs to last.
Learning how to program shaders is actually a pretty cool thing to do. It's the most math heavy programming I've ever delved into and for that I like how it feels like it's getting my knives real sharp.
> SO MANY gpus that are now going to be cheaper than sand
The market knew that the PoS merge was coming. I'd have expected the market to therefore have already factored the PoS transition into GPU prices. Why do you think GPU prices are tanking now?
The market for GPUs isn't as efficient as the stock market. Miners might also have wanted to profit right until the end. The price doesn't come down until the used GPUs actually hit the stores.
I have been using a GTX 980 for a long time now and have wanted to upgrade, but everything still seems so expensive. Right now on Newegg Canada an ASUS ROG Strix NVIDIA GeForce RTX 3080 OC is $1039.99. Seems like a lot. Will that price go down? Or should I be looking at deals on used cards?
Last big crypto crash I bought a used Titan XP for like 350$ (still running strong and kicks ass). Got me through this whole GPU debacle. Always buy low!
I'm just finishing up Andrew Ng's course and would like to pick up a GPU for building models (probably focusing on U-Net image models). Do you have any references on what I should expect from trying to buy a used one like you did?
I know it's a pretty random question, but I honestly don't know anything about GPUs other than that my life would be a lot easier if I had one (I run an old MacBook Pro, I have an old Asus laptop that runs a Debian distro, and finally my very old gaming PC is an Alienware Alpha).
We are already at a stagnating point in GPU speeds, with the most recent generations from NVIDIA simply pumping more juice (watts) instead of making big design changes and efficiency optimizations.
I don't believe R&D funds will dry up either, since they will simply be reallocated to AI and datacenter workloads, which have been on the rise more recently.
I think you are both right. Lower prices and lower R&D. If you buy a GPU because you care about the objective performance, then it's bad. If you buy a GPU to keep up with the Joneses, then it's good.
One I'm interested in is graph databases powered with linear algebra (see GraphBLAS and RedisGraph). Putting the graph structure in a sparse matrix in GPU memory and doing matrix-multiplication to perform queries means you can effectively traverse the entire graph quickly by using the massive parallel nature of the graphics card.
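As a rough illustration of the idea, and not RedisGraph's or GraphBLAS's actual API, here is a minimal CPU sketch using scipy.sparse: a breadth-first traversal becomes repeated sparse matrix-vector products, exactly the kind of kernel that maps well onto a GPU.

    # BFS as sparse matrix-vector products; scipy.sparse stands in for the
    # optimized (and GPU-capable) sparse kernels GraphBLAS-style systems use.
    import numpy as np
    from scipy.sparse import csr_matrix

    # Toy directed graph: 0->1, 0->2, 1->3, 2->3, 3->4
    edges = [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4)]
    rows, cols = zip(*edges)
    n = 5
    A = csr_matrix((np.ones(len(edges)), (rows, cols)), shape=(n, n))

    # The frontier is a vector; one product per level reaches every
    # next-hop neighbour of the whole frontier at once.
    frontier = np.zeros(n)
    frontier[0] = 1.0
    visited = frontier > 0
    level = 0
    while frontier.any():
        print(f"level {level}: {np.flatnonzero(frontier).tolist()}")
        reached = (A.T @ frontier) > 0        # nodes one hop from the frontier
        frontier = (reached & ~visited).astype(float)
        visited |= reached
        level += 1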
> There is so much interesting stuff going on in GPU compute that isn't crypto.
There's a lot of stuff that isn't any better than crypto either: deepfakes, producing hundreds of thousands of stable diffusion pics of the same scene.
Much of this is still a garbage fire of greenhouse gases and e-waste, and used GPU prices won't change that. Many ML advances are simply more compute and bigger models in the end.
By that logic you can include gaming in there as well. Thousands of people re-playing the same scenes over and over again, instead of just watching a let's play of the first person that bought the game. I guess the only reasonable uses for GPUs are cancer research etc.
> producing hundreds of thousands stable diffusion pics of the same scene
Why are you twisting reality? People don't generate hundreds of thousands of stable diffusion pics of the same scene. Instead they generate dozens to hundreds of images carefully tweaking the prompt and the starter image.
You're not going to get very far trying to impose your subjective perspective of usefulness on other people's use of energy. Energy is one of the foundational pillars of modern society, and other people are going to use it for all sorts of things, including activities that you don't like (but that are liked by others).
I'd suggest focusing your irritation on advocating for universally clean generation of energy. Regardless of how much energy each of us uses, and regardless of what we use it for (even if it's something that you consider to be "useless"), most of us seem to agree that we don't want to pollute our air and ruin our planet. However, the moment you start attacking things that make other people happy, you are risking losing support for that fundamental goal.
I don't know why you're being downvoted. At least the last part is totally true; progress in ML is unfortunately just bigger models, more data, and more compute.
But doing ML won't necessarily boost GPU sales, because most deep learning work has shifted to the cloud.
If you say something's bad, you imply something else is good. So can you identify some contrasting technologies that didn't start out as a "garbage fire of greenhouse gases and e-waste" or the appropriate equivalent? Do you want a world that contains only those born-perfect technologies?
Love the reference. On a more serious note, I'm really curious how this will play out. Nvidia seems to be doing its best to prop up the prices of existing models as it prepares to launch the 4000 series. The big question seems to be whether most of these miners will start mining some other token or get out of GPU mining entirely.
If the card has cuda support I would guess they're off to some sort of p2p AI / ML marketplace. Unfortunately AMD cards were actually better for mining. If anybody knows of something like vast.ai or render for AMD I'm all ears.
New 3000 series retail prices on the high end cards have been steadily dropping, and it seems like on ebay used prices have dropped 10% in the last month.
As for the 4000 series cards - they've stated in SEC filings that they will be trickling out stock to keep prices high.
AMD are the ones who are really fucked; their cards suck, and nobody bought them out of choice but desperation. Now that the market is glutted, people will heavily prefer nvidia cards.
To some extent this has already happened, with the recent crash in the GPU market from its highs a few months back. Obviously some miners did hold on to the bitter end, though, and it's not clear how much difference that inventory will make.
There is one other cryptocurrency use for these mining rigs, and that is to compromise existing chains.
Now that there is a great deal of excess capacity, presumably it would be possible to attack smaller chains in an attempt to glean some profit through double-spending, as those chains might now be vulnerable to larger-scale history-rewriting attacks.
The problem is that such an attack would be discovered and send the value of the token to zero, so you'd have a limited window to double-spend into something else that is valuable but also not revocable.
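For a sense of why the smaller chains are the ones at risk, here is a short sketch of the attacker catch-up probability from the Bitcoin whitepaper's gambler's-ruin analysis; the hashrate shares are invented purely for illustration.

    # Probability an attacker ever catches up from z blocks behind:
    # 1 if they control the hashrate majority, otherwise (q/p)^z,
    # where q is the attacker's share and p = 1 - q (Nakamoto 2008, section 11).
    def catch_up_probability(attacker_share: float, z: int) -> float:
        q = attacker_share
        p = 1.0 - q
        return 1.0 if q >= p else (q / p) ** z

    # Hypothetical shares an ex-ETH GPU fleet might reach on a small chain:
    for share in (0.30, 0.45, 0.60):
        print(share, [round(catch_up_probability(share, z), 4) for z in (1, 3, 6)])
    # 0.30 -> fades quickly with confirmations; 0.60 -> always 1.0, i.e. the
    # chain's history can be rewritten at will.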
I have a similar fun conspiracy theory, that Satoshi is actually an alien farmer who injected the whitepaper into its human farm to get humanity's tensor calculation capacity up, and has now injected PoS to switch the capacity over to AI for some ineffable purpose.
Mine was always: "Everyone laughed at the young NSA intern Satoshi when he said he could convince the world's criminal enterprises to open their financial books to the world."
Alternative: "Satoshi Nakamoto" was the nom de guerre of an emergent renegade AI who had figured out a way to induce monkeys to attach as much processing power as possible to a network.
In the last couple of years we have been quite supply-constrained on the GPU front, so it probably has caused some short-term hindrance. I think in the long term, though, your view probably makes more sense, like how funding science helps with proliferating technology in industry.
I don't understand this well enough, but why can't miners just mine other coins? Was all GPU mining Ethereum-based?
I know that bitcoin mining requires ASICs and GPUs can't compete with that, but I just assumed miners are mining one of many possible coins, with Ethereum being one of them.
If 80% of the revenue was from Ethereum, and now that part disappeared, 100% of the miners are left fighting over the 20% that's left.
It'd be like if all women stopped going to Starbucks tomorrow, and you asked, "Why don't they just sell to the men?" Well, they could, but they'd still be down ~50% in revenue.
Imo that's a bit of a failed analogy. More appropriate would be, imagine coffee is something very precious and Starbucks suddenly stopped selling to women. In this case, women need to go to other cafes, increasing their competition over coffee there. But in that case it doesn't seem to be so absurd any more - perhaps it's not the case that the coffee resources are so scarce (just as it's the case with crypto - does 3x miners of some currency mean that each of them makes 3x less? I don't think so).
Also, is it true that 80% of revenue was from Ethereum, or it's dummy data?
No other coins provide the profitability margins that ETH did, so miners can switch to other PoW coins, however they will be paying more in electricity than whatever crypto they are mining is worth.
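A toy calculation (numbers invented) of why the migration dilutes everyone: a PoW chain pays out a roughly fixed reward per day no matter how much hashrate shows up, so each rig's slice shrinks as the ex-ETH hashrate piles in.

    # Revenue per miner is their share of a roughly fixed daily reward pool.
    def revenue_per_day(daily_reward_usd, network_hashrate, my_hashrate):
        return daily_reward_usd * my_hashrate / network_hashrate

    daily_reward = 100_000      # hypothetical: total USD the chain pays per day
    my_rig = 500                # hypothetical rig hashrate, arbitrary units

    before = revenue_per_day(daily_reward, network_hashrate=1_000_000, my_hashrate=my_rig)
    after = revenue_per_day(daily_reward, network_hashrate=5_000_000, my_hashrate=my_rig)
    print(before, after)        # 50.0 -> 10.0: 5x the hashrate, 1/5 the revenue per rig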
I would assume the problem is that a lot of alt-coins are built on top of the Ethereum blockchain, and that most of those that aren't are nowhere near as profitable to mine.
It's actually not that straightforward to plug in these consumer cards as a 4x setup. We spent weeks researching how to achieve up to a 7x RTX 3090 setup in a single rig. Could write up our method if anyone is interested.
would love to see the riser setup that you're using for such a monster!
we mostly gave up and just got barebones machines since the cabling situation becomes pretty tricky, and the barebones total cost is low relative to the GPUs anyways.
If enough GPU miners stop, it becomes profitable for the other GPU miners to keep mining. Fortunately that's currently about 90% fewer GPU miners across the whole sector.
That's a good outcome. We'd still be in a situation with so many fewer miners that GPU prices aren't influenced as they have been. The investment becomes risky as it'll always be teetering on the edge. I'm finally looking forward to a new GPU to pair with this 11900K.
It doesn't matter if it's trash so long as it's fungible trash at a profitable price. i.e. the situation is OK for now, but as soon as the market adjusts to the glut of GPU miners suddenly minting no-name coins no one really wants (and the novelty wears off), those prices are going to drop like a rock.
Most of these coins only have value as scams that you could prop up then cash out by exchanging for BTC or ETH; so long as the "new coin of the day" hype train exists there will be a way to make money off of GPU mining. I guess there will always be suckers in this unregulated market.
If I am looking to buy a GPU chip for ML research:
- What chip should I buy?
- When should I buy? Should I wait for prices to drop? Will new and improved chips be released anytime soon?
- What are the advantages / disadvantages of each chip (3060 vs 3080, Nvidia vs AMD)? Which chip is most cost-efficient? What are each chips' specialties (e.g. specific type of neural network, graphics vs compute)?
This applies for all neural networks. Depending on how much money you're willing to spend, in descending order: DGX (computer with 8 A100s, $150,000), A100 (80GB, $15,000), A6000 ($5000), RTX 3090 ($1000).
1 x A100, 80GB is $3.19 / hour.
8 x A100, 80GB is about $25 / hour.
https://docs.paperspace.com/core/compute/machine-types/
They have much less expensive machines. I used to use them to run steam games, but now proton is just too darn good. Their low end machines are OK for CAD software that supports real-time raytracing.
I just grabbed an A6000 for $3500 on eBay, so you can probably get a pretty decent deal on those now. They're pricey but IMO it's a great deal if you really need the VRAM (e.g. for training LLMs).
Lambda Labs crunched the numbers in Feb 2022 [0]. They concluded:
“””
So, which GPUs to choose if you need an upgrade in early 2022 for Deep Learning? We feel there are two yes/no questions that help you choose between A100, A6000, and 3090. These three together probably cover most of the use cases in training Deep Learning models:
Do you need multi-node distributed training? If the answer is yes, go for A100 80GB/40GB SXM4 because they are the only GPUs that support Infiniband. Without Infiniband, your distributed training simply would not scale. If the answer is no, see the next question.
How big is your model? That helps you to choose between A100 PCIe (80GB), A6000 (48GB), and 3090 (24GB). A couple of 3090s are adequate for mainstream academic research. Choose A6000 if you work with a large image/language model and need multi-GPU training to scale efficiently. An A6000 system should cover most of the use cases in the context of a single node. Only choose A100 PCIe 80GB when working on extremely large models
“””
[0] https://lambdalabs.com/blog/best-gpu-2022-sofar/
You should get a Nvidia card with as much VRAM as possible. A 12 GB RTX 3060 is probably the most cost efficient at the moment.
I don't think AMD is really viable for ML. Nvidia has the mind share in that segment, so nearly all tools will work with Nvidia, while very few support AMD.
https://timdettmers.com/2020/09/07/which-gpu-for-deep-learni...
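One way to ground the "as much VRAM as possible" advice is a rough training-memory estimate. This is only a rule-of-thumb sketch: ~16 bytes per parameter is a common approximation for Adam-style training (weights, gradients, and two optimizer states), and activation memory, which depends heavily on batch size and architecture, is ignored.

    # Rough VRAM needed just for model state during training.
    def training_vram_gb(num_params, bytes_per_param=16):
        return num_params * bytes_per_param / 1e9

    # Hypothetical model sizes, to compare against 24 GB / 48 GB / 80 GB cards:
    for name, params in [("small U-Net", 30e6), ("350M LM", 350e6), ("1.3B LM", 1.3e9)]:
        print(f"{name}: ~{training_vram_gb(params):.1f} GB before activations")
    # ~0.5 GB, ~5.6 GB, ~20.8 GB respectively, so a 24 GB 3090 covers a lot of
    # mainstream work and the bigger cards mainly matter for large models.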
> "The only coins showing profit have no market cap or liquidity. The profit is not real."
That is, the remaining PoW coins are not as profitable for miners, presumably not profitable enough to cover their costs.
Innovation on other use cases for GPUs was probably held back by excessive costs.
GPUs were and still are in high demand anyway.
Crypto blocked real innovation for years.
Like what Peter Jackson did for "They Shall Not Grow Old".
Nvidia is sitting on a bunch of Ampere stock, plus the miner liquidation and new-gen cards coming out.
My guess is January-February when the Christmas buying season is past and all these are in full swing.
Flash memory is also experiencing a price crash.
Over the course of a GPU's lifetime (in your hypothetical use-case), how much of the cost is the GPU itself and how much electricity?
GPU prices may go down in the short term, but long term GPU speeds will stagnate.
I wish there was a way to put a 1-year timer on this comment to see who's right, me or stuntkite.
Not really, no:
https://wccftech.com/hours-into-the-eth-merge-nvidia-geforce...
step 2; start your fleet of GPUs on a secret 51% attack.
...
step n; profit.
My favorite wacky conspiracy theory is that proof-of-work was invented to slow down ai progress.
https://rationalwiki.org/wiki/Roko%27s_basilisk
Also, I hope every GPU miner's fireproof safe fails and all their money burns up.
What was it before ETH PoW -> PoS merge?
And is 0.1 $/kWh average/normal?
Let me pull up my latest electric bill (which I almost have never looked at/have on autopay):
New Charges
Rate: RS-1 RESIDENTIAL SERVICE Base charge: $8.99
Non-fuel: (First 1000 kWh at $0.073710) (Over 1000 kWh at $0.083710) $76.74
Fuel: (First 1000 kWh at $0.034870) (Over 1000 kWh at $0.044870) $36.49
Electric service amount 122.22
20.99 in taxes / surcharges
Total $143.21
$143.21 for 1036 kWh in a 30 day timespan
0.138 $/kWh with taxes and fees, I guess, for me. I'm sure if I was doing crazy ASIC stuff at my house they'd charge me more / the rate would become less favorable.
Looks like mining BTC would net me $0.85 a day profit with some kind of ASIC. $310/yr. Yikes.
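For what it's worth, the arithmetic checks out; the sketch below also shows how thin that margin is at a residential rate. The ~3.25 kW figure is an assumed draw for an S19-class ASIC, not something taken from the bill above.

    # Check the effective electricity rate and the daily power cost of an ASIC.
    bill_total = 143.21
    kwh_used = 1036
    rate = bill_total / kwh_used
    print(f"effective rate: ${rate:.3f}/kWh")               # ~ $0.138/kWh, as stated

    asic_kw = 3.25                                           # assumed S19-class draw
    daily_power_cost = asic_kw * 24 * rate
    print(f"ASIC electricity: ${daily_power_cost:.2f}/day")  # ~ $10.78/day
    # Netting $0.85/day therefore implies roughly $11.6/day of BTC revenue,
    # barely above the power bill at this residential rate.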