mikae1 · a year ago
I wish performance per watt[1] were valued more than pure performance in times of ecological crisis. Let's see how these two fare...

[1] https://wikipedia.org/wiki/Performance_per_watt

magicalhippo · a year ago
My 2080Ti is still holding up reasonably well, helped by the fact I only run 1440p, but it's also doing OK for inference[1]. Though my "must have shiny new things" craving is getting hard to ignore.

Considered getting a new GPU earlier this year, but then realized the 5xxx series was "around the corner"; now it seems it's been pushed back to next year. And with AI being what it is, I'm guessing prices won't drop significantly.

Would be nice if AMD could get their GPU act together so they were a more viable alternative; NVIDIA could do with some competition.

edit: just recalled I had a dual-chip GPU back in the day, the ATI 4870X2[2]. Though that was more like two GPUs glued to one PCB, so effectively "single card SLI".

Hopefully the 5090 will be a better experience, as my 4870X2 never quite lived up to what it could theoretically do.

[1]: https://www.pugetsystems.com/labs/articles/llm-inference-con...

[2]: https://www.techpowerup.com/gpu-specs/radeon-hd-4870-x2.c236

throwaway48476 · a year ago
Sadly AMD has given up on the high end, which is the worst possible timing because desktop GPUs are increasingly bought for AI.
jauntywundrkind · a year ago
They had a multi-chip consumer flagship planned for RDNA4, but it was reportedly having lots of problems.

I do think games have much more varied needs than most AI workloads - which primarily need to crunch matrices - so I'm a little sympathetic to a multi-chip design failing. You also burn power on chip-to-chip interconnect, try as we might to optimize it down.

> Which is the worst timing because desktop GPU sales are increasingly used for AI.

Well, the good news is RDNA4 is the last consumer-only architecture. Good news because they are merging RDNA with CDNA (compute/AI) into a new unified UDNA, which all of their next-gen chips will use.

https://www.tomshardware.com/pc-components/cpus/amd-announce...

bryanlarsen · a year ago
Prediction: the cards will sell crazy well.

Gamers don't need high end video cards, they want high end video cards. In general, the high marginal price for low marginal value of high end video cards prevents most gamers from acting on their desires.

But this generation of video cards provides a couple of other justifications for the purchase:

- it will allow them to run uncensored ML stacks locally

- it will allow the buyers to train themselves on the hottest new career path

A large number of people who use these excuses to justify the purchases to themselves or their loved ones will only use it for gaming, but those excuses will fuel a lot of sales.

This seems like the wrong generation for AMD to skip the halo tier of gaming cards.

drtgh · a year ago
> But this generation of video cards provides a couple of other justifications for the purchase - it will allow them to run uncensored ML stacks locally

It does not. They are reselling essentially the same graphics cards they have sold for the last ten years: the same compute capacity (with a modest increase for the top model), very limited RAM, at very high prices.

They simply moved to a smaller silicon process node without achieving a meaningful reduction in power consumption relative to the previous generation; the cards are still energy-hungry heat generators.

bryanlarsen · a year ago
Flimsy justifications can be flimsy.

The same justifications sold lots of 20X0 and 30X0 generation cards too.

kiririn · a year ago
Even the 250W 2080Ti (plus a 150W Intel CPU) is oppressive to be in the same room with during warmer months. I know it probably won't be, but this should be a hard sell in countries where air conditioning isn't standard. Not to mention the noise needed to dissipate that much heat.
lanceflt · a year ago
I'm running a 4090 at 280W, and I'm seeing ~96% of the performance it delivers at the stock 450W limit. There's no need to run it at full power.
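Taking the numbers in that comment at face value (a hypothetical ~96% of stock performance at a 280W cap vs. the 450W stock limit), the efficiency gain is easy to work out:

```python
# Back-of-the-envelope perf-per-watt check, using the (assumed)
# figures from the comment above: 96% performance at 280 W vs. 450 W.
full_power_w = 450.0
capped_power_w = 280.0
relative_perf = 0.96  # fraction of stock performance at the cap

# Performance per watt at the cap, relative to stock:
gain = (relative_perf / capped_power_w) / (1.0 / full_power_w)
print(f"Capped perf/W is {gain:.2f}x stock")  # roughly 1.5x
```

So giving up ~4% of the frames buys roughly a 50% improvement in performance per watt, which is why power limiting is such an easy win on these cards.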
Scene_Cast2 · a year ago
Also doing the same here. I'm hoping more reviewers will run power-to-performance analyses like der8auer's 4090 review video.
jauntywundrkind · a year ago
I'm thinking more and more about putting my computer outside when I game.

I have 50 and 150 ft fiber optic DisplayPort cables. Can do 8k60, or 4k240. Can be had for like $70, work fine.

The hard part is input. I'm no stranger to USB extension cables, but I don't love them. They're bulky, and they usually need 5V injected every 50ft. And since the USB hierarchy tops out at 7 tiers and each active cable is effectively 2 hubs, you can only really chain two 50ft cables together (plus a shorter 25ft active extension, then a hub and the device... gee, since the cables are hubs, it would sure be nice if someone made an active USB extension cable that exposed all 4 ports at the end!). Here's a well-reviewed example for $42: https://www.amazon.com/Extension-Extender-Repeater-Boosters-...
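The tier arithmetic in that comment can be sketched out. Assumptions here: USB topology allows 7 tiers total (root hub at tier 1, the end device occupying the final tier), and each active extension cable embeds 2 hubs, as the comment claims.

```python
# Sketch of USB tier-depth arithmetic for chained active extension cables.
# Assumption: 7 tiers max (root hub = tier 1), 2 hubs per active cable.
MAX_TIERS = 7
ROOT_TIER = 1
HUBS_PER_ACTIVE_CABLE = 2

def device_tier(active_cables: int, extra_hubs: int = 0) -> int:
    """Tier the end device lands on after chaining cables and hubs."""
    return ROOT_TIER + active_cables * HUBS_PER_ACTIVE_CABLE + extra_hubs + 1

print(device_tier(2))                # 6 -> two 50 ft cables: still legal
print(device_tier(2, extra_hubs=1))  # 7 -> add one more hub: at the limit
print(device_tier(3))                # 8 -> three cables: exceeds 7 tiers
```

Under these assumptions, two active cables leave room for exactly one more single-hub stage before the device, which matches the "only two 50 ft cables" conclusion above.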

There are some USB4-over-fiber solutions, but they're often more than $150 for 100ft, which is kind of a drag. I've spent money on stupider things, though, so maybe I'll do it.

HPsquared · a year ago
You can turn the power level down usually without hugely reducing performance, it's quite non-linear.
solardev · a year ago
This, 100%. When I still built PCs, I'd usually throttle the heck out of the GPU and CPU fan curves to keep them off, or just barely on and inaudible. That usually meant going down to medium settings or so, but games still looked good enough (and ran at high FPS) that it didn't really matter. And it made the in-room experience a lot nicer.

I wonder how DLSS and frame gen affects power usage. Presumably they save power vs drawing real frames, but I haven't tested it...?

ghastmaster · a year ago
AMD's decision to back out of the high end cards seems even more logical given this information. High end may be trending too powerful. I used to buy the high end cards, but my next one will not be so. The cost and the power consumption are large factors in that decision. I do not need nor can I afford a super computer for gaming.
BaculumMeumEst · a year ago
Does it really? I keep thinking they could just slap 32GB of VRAM on a midrange card and they would get roaring enthusiast support. I'm guessing the only reason they don't is to avoid encroaching on their more expensive ML cards?
Ekaros · a year ago
For gaming I would be happy with 16GB card. Enough to survive three or so generations more.
Ekaros · a year ago
It seems both Nvidia and Intel are now brute-forcing things with higher energy consumption and possibly much larger dies, which is not great for either power efficiency or cost.

Topping out at a 100-150W GPU seems like the most responsible move to me, with a reasonable cost for the GPU to boot.

solardev · a year ago
Have you ever considered Geforce Now? It's incredible to use, and all the power consumption is in their data center instead of your home. Right now it's a 4080 but presumably they'll upgrade to 5080s once those are out.

I used to build my own PCs, but GFN is a much nicer (and significantly cheaper!) experience overall. I can play everything on ultra, at 4k (with DLSS) for $20/mo and don't need to worry about keeping up with new GPUs or local heat, noise, and system maintenance. For an aged, busy gamer, it's really really nice.

nothercastle · a year ago
The load times are pretty terrible and game compatibility is 80-90%, but otherwise I agree it's a good deal. Great for turn-based and RPG-type games, but very mid for first-person. I would recommend it with those reservations. Also, I don't think the high-end tier is worth it because of the bandwidth inconsistency in their data centers.
elcomet · a year ago
What games are you playing? I'm guessing FPSes don't work so well with the latency.
gtirloni · a year ago
If the trend of NVIDIA deceiving customers continues, the 5090 will be the new xx70 and the 5080 the new xx60.
PedroBatista · a year ago
This tango NVIDIA is dancing will continue as long as (local) AI hasn't reached a plateau where even most enthusiasts are happy with the AI model they have. When it comes to games, other than a few streamers and rich kids, nobody will rush to buy this; even AAA games are dying, and the appetite for running them with the latest and greatest graphics has mostly disappeared (given the cost).

From a gaming perspective, this is technically nothing like the ending days of the Voodoo cards, but somehow it gives me the same feeling.

spyder · a year ago
True, but also AI could be the reason why games would need more powerful GPUs in the future (but not yet). For example LLMs for NPC dialogs, or the wilder things like running the whole game inside a neural network, or running an AI style filter over the output of the regular game engines.

https://futurism.com/doom-running-on-neural-network

https://www.youtube.com/watch?v=udPY5rQVoW0

https://x.com/VaigueMan/status/1836802465133437056

JohnBooty · a year ago

    For example LLMs for NPC dialogs
I've been thinking about this for a while; like how feasible it might be.

What's cool about this is that (unlike so much generative AI) a game built around it would not be cutting writers out of the loop.

I'm imagining a game where the writer(s) produce backstories for the characters, and all sorts of dialogue they might possibly say. And then the LLM is sort of remixing that stuff in real time to produce novel dialogue and behaviors.

Done well, I think it could be pretty seamless and compelling...
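One minimal way to sketch the "writers author the material, the system remixes it" idea - with a stand-in selector where a local LLM (prompted with the backstory) would do the actual paraphrasing; all names and lines here are hypothetical:

```python
# Toy sketch: NPC dialogue constrained to a writer-authored pool.
# A real implementation would feed `backstory` plus the matched lines
# to a local LLM for remixing; here a tag lookup stands in for that step.
from dataclasses import dataclass, field

@dataclass
class NPC:
    name: str
    backstory: str
    lines: dict = field(default_factory=dict)  # tag -> writer-authored line

    def respond(self, tags: list) -> str:
        # Only writer-authored lines can ever be surfaced.
        for tag in tags:
            if tag in self.lines:
                return self.lines[tag]
        return self.lines.get("default", "...")

guard = NPC(
    name="Gate Guard",
    backstory="A veteran of the border wars, distrustful of mages.",
    lines={
        "greeting": "State your business, traveler.",
        "mage": "We don't get many of your kind here. Keep your hands visible.",
        "default": "Move along.",
    },
)
print(guard.respond(["mage"]))  # writer-authored line, selected by game state
```

The design point is the guardrail: the generative step only ever rephrases or recombines material a writer signed off on, rather than free-generating dialogue from scratch.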

solardev · a year ago
I mean, Nvidia ended up buying 3dfx and using some of their SLI tech for a while. But then it turned out DLSS and frame generation were much better ways of increasing FPS and quality.

PC gaming was always a niche market relative to the consoles, and it will probably remain so. But also, it's never looked better, had such a great selection (especially indie games), or been this affordable (between Gamepass, GeForce Now, and various publisher subs). For like $30/mo you can play many of the latest amazing games on max graphics without owning a GPU at all.

I hope, like servers, more and more of this moves to the cloud. It doesn't make sense to run much of this workload locally when they can be more efficiently managed and shared in data centers anyway, with better power and cooling management than most home PCs could have.

Rinzler89 · a year ago
>Nvidia ended up buying 3dfx and using some of their SLI tech for a while

The SLI Nvidia used had nothing to do with the tech from 3dfx besides the initials and the general concept, especially since by the time Nvidia dropped SLI, its GPUs worked totally differently from 3dfx's 3D accelerators.

qwytw · a year ago
> PC gaming was always a niche market relative to the consoles, and it will probably remain so.

The difference is not that huge though. Supposedly 43:57 by revenue:

https://www.visualcapitalist.com/visualizing-pc-vs-console-g...

According to other stats:

https://www.statista.com/statistics/292460/video-game-consum....

PC is even bigger.

And this is by revenue; I would guess that PC games are on average significantly cheaper, meaning that more people actually play them.

Or it might just be skewed by some highly addictive/competitive games in Asia etc. (where IIRC console gaming was never that big outside of Japan).

Joker_vD · a year ago
If this doesn't stop, then very soon we'll just end up with external video cards because this is ridiculous. We already kinda have that [0], but it's quite a hack. I wonder how well PCIe can be delivered over the ribbon cables though...

[0] https://www.razer.com/mena-en/gaming-laptops/razer-core-x

Springtime · a year ago
I saw someone who used two humongous, external watercooling radiators to cool both their GPU and CPU. Apparently these are now available to buy.
mananaysiempre · a year ago
Kinda, because USB-C/Thunderbolt eGPUs like the Razer Core have noticeably less bandwidth than the same card installed on your motherboard.
chx · a year ago
Which is why OCuLink is gaining traction. Lenovo added hot-swap capability and sells it under the name "TGX" in China, but compatibility is total in both directions.

Also, Beelink has mini PCs with external PCIe x8 slots; it's a clever trick: https://liliputing.com/beelink-gti14-ultra-is-an-intel-meteo...

nothercastle · a year ago
I would actually welcome that. It would make building small PCs way easier, and make upgrades and swaps a breeze.