imbusy111 · 2 years ago
For context, I bought an RX 6600 off of eBay for $155 half a year ago, which, as the article mentions, is substantially more power efficient for nearly the same (poor by today's standards) performance.
JohnBooty · 2 years ago
It has "poor" performance relatively speaking, I guess, but I'm playing all of my games on my RX 6600 with settings fairly high at 1080p, at 100+ fps.

Admittedly I'm not playing AAA games from the last few years.

sosodev · 2 years ago
I got an RX 6600 new for a similar price not long ago. It definitely works well for 1080p gaming, even with new titles, provided you're not expecting them to run at max settings or super high frame rates.
ta8645 · 2 years ago
The inflation we're starting to see over the same timespan makes that comparison harder to evaluate.
imbusy111 · 2 years ago
You mean the 6-month 2% core inflation? Technology has always depreciated quickly.
tibbydudeza · 2 years ago
Same here; I bought a used RX 6600 XT. GPU prices are insane. The last GPU I had was an RX 580.
borissk · 2 years ago
Looks like Intel realized they can't leave the GPU AI market to Nvidia, and that to finance their own AI GPUs they need to sell a ton of GPUs to gamers first.
HeWhoLurksLate · 2 years ago
definitely agree

also, since this is a relatively recent development for Intel, they have to get their drivers and stack competitive, and there's clearly no better way to do that than by getting telemetry on exactly which ways your products are lacking and what crashes them

edit to add:

I think there's also a really good chance that this winds up like the first generation of Ryzen systems did: not the most competitive, but quite cheap and with a few interesting features to win enough people over, and then future generations go on to compete head-to-head with offerings from other companies.

I would be more surprised by the complete failure and disappearance of Arc in five years than by it duking it out in the datacenter and gaming markets, even at the high end.

bigdict · 2 years ago
Same move as Tesla.
wlesieutre · 2 years ago
Opposite move from Tesla: they started at the expensive end to finance scaling up the cheaper ones.

Meanwhile Intel is going after the cheap ones to try and get a budget to tackle the more difficult AI/ML GPU market.

Ecco · 2 years ago
Is there any reputable website for easily comparing GPU performance? I know it depends on the specific game/benchmark, but some kind of standardized score would really help.
kyrra · 2 years ago
The Gamers Nexus YouTube channel. They put a lot of effort into making sure they do like-for-like comparisons. They also point out that certain cards have strengths in some games and not others, and that it depends on the settings. I highly recommend their videos.

https://m.youtube.com/channel/UChIs72whgZI9w6d6FhwGGHA

A_D_E_P_T · 2 years ago
Tom's has to be one of the longest-running sites on the internet. They've had a GPU hierarchy since 1997! Amazing consistency and longevity.

https://web.archive.org/web/19980109201410/http://tomshardwa...

scns · 2 years ago
My favorite is TechPowerUp's GPU database [0]. If you pick a model like the A580 [1], you'll get loads of information, including a comparison to other GPUs. Their tests are great too; I consult them to find the cards emitting the least amount of noise [2].

[0] https://www.techpowerup.com/gpu-specs/

[1] https://www.techpowerup.com/gpu-specs/arc-a580.c3928

[2] https://www.techpowerup.com/review/gigabyte-radeon-rx-6600-e...

poisonborz · 2 years ago
It is mostly a single metric, but it should give uniform results for approximate overall average performance, if that's all you need. It has CPUs and disk drives as well.

https://www.videocardbenchmark.net/

shoo · 2 years ago
Logical Increments gives relative performance comparisons of GPUs against a current fast GPU - https://www.logicalincrements.com/
pahae · 2 years ago
Catering more to a German-speaking audience, but nonetheless a pretty good overview (updated quarterly?):

Performance: http://www.3dcenter.org/artikel/fullhd-ultrahd-performance-u...

Performance/Price w/ German market prices: http://www.3dcenter.org/artikel/grafikkarten-marktueberblick...

asmor · 2 years ago
Userbenchmark is an incredibly biased site in favor of Intel (vs. AMD), so much so that the Intel subreddit banned them. The bias is less obvious in how they change the weighting of the benchmark, but their AMD reviews are downright comedic: everyone but them is bought by AMD, and they are the last bastion of honesty. They might actually be more anti-AMD than pro-Intel, considering that even their GPU reviews from before Intel entered the market are three-quarters complaints that AMD bought everyone off.

See also: Userbenchmark - the April Fools that never ends (2kliksphilip)

https://www.youtube.com/watch?v=RQSBj2LKkWg

butz · 2 years ago
What is the support for Intel video cards on Linux like nowadays?
goosedragons · 2 years ago
They're fine. At least for the other Alchemist cards (A380, A750, A770), you need kernel 6.2+ for them to work correctly out of the box; hopefully the same is true of the A580. There are some issues: there's no sparse residency support yet, so DX12 games don't run under Proton, but otherwise they work.
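
A quick way to sanity-check that kernel requirement before swapping cards (a sketch, assuming a standard version string like "6.2.0-39-generic"):

    import platform

    # Kernel release strings look like "6.2.0-39-generic"; compare the
    # numeric major.minor prefix against the 6.2 requirement mentioned above.
    def kernel_at_least(major: int, minor: int) -> bool:
        parts = platform.release().split(".")
        return (int(parts[0]), int(parts[1])) >= (major, minor)

    print("Arc should work out of the box:", kernel_at_least(6, 2))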
orwin · 2 years ago
I had to reinstall the X server (I couldn't find another way to make X use the new Intel drivers for some reason). Other than that, all is fine. Upgrade to kernel 6.2+, update the Intel drivers, reinstall X, swap cards. I also had to modify my GRUB and X configs a lot to make my old Nvidia card work, and cleaning that up took an hour.

Overall, a better experience than with Nvidia. If your computer/install is recent, maybe you won't need to reinstall X at all. I've kept my Linux install across 2 motherboards and 4 CPUs over 10 years; I usually prefer patching/fixing issues rather than reinstalling, but I think I ought to clean it up.

FirmwareBurner · 2 years ago
It has always been the best out of the three.
pjmlp · 2 years ago
The users of the Intel cards that were once based on PowerVR wouldn't say that.
bravetraveler · 2 years ago
Seems fine; I have one of these Arc cards doing transcoding (sketch of the kind of job below).

I haven't been able to prove it yet, but it seems to crash when I unplug the temporary display I have on the rack.

It's either that or unfortunate timing/panics from this old ZFS release candidate/kernel.
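
For reference, this is roughly the kind of transcode job I mean; a minimal sketch using ffmpeg's Quick Sync (QSV) path, with hypothetical file names:

    import subprocess

    # Decode and encode on the GPU via ffmpeg's QSV codecs.
    # File names are hypothetical; -global_quality sets the QSV quality target.
    subprocess.run(
        [
            "ffmpeg",
            "-hwaccel", "qsv",       # hardware decode
            "-i", "input.mp4",
            "-c:v", "hevc_qsv",      # hardware HEVC encode
            "-global_quality", "25",
            "output.mkv",
        ],
        check=True,
    )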

nubinetwork · 2 years ago
AFAIK Xe-based GPUs are waiting for the next release to get more bits working, but IGT/UHD should be "fine".
bee_rider · 2 years ago
The memory bandwidth could be interesting?

Going after pure gamers seems like a rough market for Intel to enter. But maybe with the extra compute and the memory bandwidth, they are actually going after devs and other professionals who also game sometimes? That seems like a good niche for them, at least to start chipping away at.

snuxoll · 2 years ago
Pure gamers are the perfect market to target if they can sell budget cards with ever-improving driver support. Even AMD's low end is laughably expensive these days (in terms of $/perf as well as pure cost), so there's a large market of people who buy budget cards that's being neglected by the existing market.
Dylan16807 · 2 years ago
> Even AMD's low end is laughably expensive these days (in terms of $/perf as well as pure cost)

Amazon currently has some 7600s priced at $188. At $250+ they don't do well in this competition, but at that price I'd say the 7600 wins.
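
To put numbers on it (a sketch; the fps figure is a made-up placeholder, only the prices come from this thread):

    # fps numbers here are hypothetical placeholders; only the $188 and
    # $250 price points come from this thread.
    def fps_per_dollar(avg_fps: float, price_usd: float) -> float:
        return avg_fps / price_usd

    print(f"{fps_per_dollar(100, 188):.3f} fps/$ at $188")
    print(f"{fps_per_dollar(100, 250):.3f} fps/$ at $250")  # same card, ~25% worse value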

jauntywundrkind · 2 years ago
Almost no one else does a 256-bit bus except at the upper tiers. This is a great callout; that's some ass-kicking.

It doesn't really cost them much, but most established vendors just don't want to give away memory bandwidth; they're trying to tier it up. Interesting to see Intel be like: hey, here's gobs of bandwidth for cheap.
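
Back-of-the-envelope, assuming a 256-bit bus and the ~16 Gbps GDDR6 that's typical at this tier:

    # Peak memory bandwidth = bus width (bits / 8) * per-pin data rate.
    bus_width_bits = 256
    data_rate_gbps = 16  # GDDR6, per pin (assumed)

    bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps
    print(f"{bandwidth_gb_s:.0f} GB/s")  # 512 GB/s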

bee_rider · 2 years ago
Yeah, beyond competition driving down prices generally, adding a new entrant to the market seems to have the advantage that they can make their own, totally different product differentiation decisions.
KeplerBoy · 2 years ago
What's the reference language for compute on those cards?

hiAndrewQuinn · 2 years ago
1080p feels like the sweet spot for me in terms of resolution. I have 4K screens at the office, and on my laptop itself, but it just doesn't matter very much to me.

I picked up a few 1080p screens for 80 euros apiece a few weeks ago to build out my home office and I'm very happy that the price has been driven so far down in the last 10 years.

quickthrower2 · 2 years ago
For coding I agree. Roomy enough for an IDE, but without the neck movement that larger monitors require.
sph · 2 years ago
You know they make 4K screens that are the same size as regular 1080p ones, right?

The point of 4K is clarity and legibility of text (at > 1.5x scaling), not size.

I don't get people who say 1080p is good enough for coding. They must have lower-resolution eyes than mine not to be able to tell that text at 2x scaling is so much more readable and pleasant.

badpun · 2 years ago
1920x1200 screens are the best for coding; it's a shame they're hardly made anymore.
keyle · 2 years ago
This has cost me a few thousand dollars to admit.

27" 4K is the sweet spot: the display is retina, but the size is 1080p-equivalent (rough math below).

I like fullscreen applications for better focus.
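
Rough math behind the retina claim (a sketch; both panels assumed 27" diagonal):

    from math import hypot

    # Pixels per inch = diagonal pixel count / diagonal size in inches.
    def ppi(w_px: int, h_px: int, diagonal_in: float) -> float:
        return hypot(w_px, h_px) / diagonal_in

    print(f'27" 1080p: {ppi(1920, 1080, 27):.0f} PPI')  # ~82 PPI
    print(f'27" 4K:    {ppi(3840, 2160, 27):.0f} PPI')  # ~163 PPI, 1080p-sized UI at 2x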

Alifatisk · 2 years ago
It's good that more competitors are joining the GPU market; I am tired of Nvidia's monopoly.

I really need to upgrade my 8-year-old GPU, but I can barely afford a good one because of the high prices.

ThinkBeat · 2 years ago
Is this Arc architecture one that can take on NVIDIA for AI workloads once Intel expands and extends it?