Ryzen has been more power friendly than Intel Core for a while now. I don't think consumers care much about eco friendliness, though, which is a shame. I suppose it could always be sold as "doesn't heat up the room as much", but anyone with a dedicated GPU will have a computational space heater sitting next to them regardless these days.
For mobile processors, it's one of the first things I look at, though. And, like you and others said, I don't just look at the TDP but also rely on the big testing sites for load heat tests.
Or, at least, I did until the m series processors came out and you could get a passively cooled laptop for a decent price.
Maybe not everyone, but I definitely care. Lower consumption means smaller PSU and cooling requirements and fewer issues during heatwaves (no AC). Furthermore, my computer corner, with a fair bit of electronics, is served by only one electricity socket with a bunch of extension cords that I really would rather not overload.
I think that, yes, they're more power-friendly when doing work, but they consume more power at idle than Intel CPUs. At least that used to be the case; I don't know if it still is.
As for eco-friendliness... it's something I now consider when buying electronics but more from an electricity bill point of view.
Be very careful about comparing TDPs, especially across manufacturers. It doesn't really say anything about real power consumption. The TDP difference is so big it probably represents some real difference in power consumption, but actual measurements are necessary to say anything concrete.
While you can't compare TDPs directly, basically all tests of Ryzen processors with 3D cache show them using about half as much power or less than their Intel counterparts.
So this wouldn't surprise me.
Also, chiplet-based Ryzens are notorious for higher idle power consumption than Intel's monolithic dies, and idle is where CPUs spend most of their time in consumer use (web browsing, Netflix, shitposting on Reddit, etc.) rather than running at full load.
Synthetic tests that stress everything at full bore are not realistic representations of your average consumer PC workload. Even video games don't cause as much power draw as synthetic benchmarks, which are more akin to power viruses.
In real-life daily driving, Ryzen might still be more efficient than Intel, but with a much smaller margin than such tests might lead you to believe.
Along with what others have said about comparing TDPs between vendors (or even between different product lines), keep in mind that TDP is about multi-core draw. So you'd also need to normalize for the multithreaded performance difference between the two, not just note that single-thread performance is near equal (which will draw significantly less than 65 watts on either chip).
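To make that normalization concrete, here's a rough sketch of comparing points per watt from a multithreaded benchmark against *measured* package power rather than TDP. All numbers below are made-up placeholders, not real measurements of either chip:

```python
# Perf-per-watt sketch; every number here is a placeholder, not real data.

def perf_per_watt(score: float, avg_package_watts: float) -> float:
    """Benchmark score divided by measured package power (not TDP)."""
    return score / avg_package_watts

# Hypothetical multithreaded scores and measured load power draws.
amd_eff = perf_per_watt(score=18_000, avg_package_watts=88.0)
intel_eff = perf_per_watt(score=24_000, avg_package_watts=160.0)

print(f"AMD:   {amd_eff:.1f} points/W")
print(f"Intel: {intel_eff:.1f} points/W")
print(f"Efficiency ratio: {amd_eff / intel_eff:.2f}x")
```

The point is that the chip with the higher absolute score can still lose on efficiency, and neither vendor's TDP figure appears anywhere in the calculation.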
I think the 8700G is probably more efficient, but it won't be anything like 65 vs 181 watts would lead one to believe.
The higher power consumption on the Intel side is mostly a reflection of the futility of feeding it more power. A better way to compare these would have been to dial back the power limit on the Intel CPU until it was only just as fast as the AMD part, and compare at that power level.
> But when integrated graphics push forward, it can open up possibilities for people who want to play games but can only afford a cheap desktop
Or those who play on recent handhelds. The Steam Deck, with integrated RDNA2, is very capable even with modern games. Sure, you can't crank it up to ultimate quality with hundreds of fps, but it's enough, and above the "cheap" level of performance.
Also, looking at Geekbench results, it basically matches the M2 Max in a MacBook Pro.
There’s either a mistake or this is a bad metric, perhaps.
Currently the most powerful AMD iGPU is the Radeon 780M (found in 7840U/HS and 8700G CPUs). Judging by Notebookcheck's results, the M2 Max GPU has up to 2× the fps in Borderlands, 2.5× the fps in The Witcher 3, and 3× the fps in Shadow of the Tomb Raider.
As for the benchmarks, the M2 Max GPU has 4–6× the fps in GFXBench, compared to Radeon 780M. And the RDNA3-based 780M has twice the raw compute performance, compared to Steam Deck’s RDNA2 GPU.
Unfortunately, GPUs in handhelds are always severely underpowered.
At least historically, handhelds can't be too expensive if they want to achieve wide success.
The M2 Max isn't available on the free market (as a standalone part), but if it were, it alone would likely cost noticeably more than what a full handheld can afford to cost.
Sometimes even lower with upscaling. But yes, one "advantage" of handhelds is that, thanks to their tiny screens, they can run at lower resolution without too much visible degradation.
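Raw pixel counts show why the tiny screen helps so much: at its native resolution the Steam Deck shades roughly half as many pixels per frame as a 1080p display. A quick check:

```python
# Pixel counts explain much of the handheld advantage: fewer pixels
# to shade per frame means less GPU work at the same quality settings.
deck_pixels = 1280 * 800    # Steam Deck's native panel
fhd_pixels = 1920 * 1080    # a common 1080p desktop/laptop display

print(f"1080p pushes {fhd_pixels / deck_pixels:.2f}x the pixels per frame")
```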
Can these integrated graphics units (APUs) work similarly to the Apple M series for generative AI inference, combining GPU and system memory, and give AMD a competitive advantage?
Anything without CUDA is not an existential threat, but anything that supports PyTorch and provides lots of RAM at a low price is a monopoly-profits threat.
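The "lots of RAM" part matters because model weights dominate the inference footprint, which is exactly where unified system memory can beat a discrete GPU's fixed VRAM. A back-of-envelope sketch (the 70B parameter count is just an illustrative example):

```python
# Rough memory footprint of LLM weights alone (ignores KV cache and
# activations, which add more on top). Parameter count is illustrative.

def weights_gib(n_params_billion: float, bytes_per_param: float) -> float:
    """Approximate weight storage in GiB for a given precision."""
    return n_params_billion * 1e9 * bytes_per_param / 2**30

for label, bpp in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    print(f"70B @ {label}: ~{weights_gib(70, bpp):.0f} GiB")
```

Even heavily quantized, a large model wants tens of GiB, which is cheap as system RAM and expensive as VRAM.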
With APUs, you must buy the Pro SKUs for ECC support. I specifically had to get the 5650G Pro (as opposed to the regular 5600G) for my home server for this exact reason.
Just curious, but why opt for an APU in a server config? They tend to be marketed towards gamers on a budget, so not including ECC support isn't that big of a deal.
Not being able to beat an RX 570 isn't something to brag about. Those cards go for under 50 USD on the used market and can be undervolted pretty easily.
Just having a hard time figuring out why someone wouldn't just go with a 7700.
> 65 vs 181 watt

Now that is impressive!
Less heat also allows for a more compact or mobile design. Smaller builds are gaining popularity.
Many do care about noise though!
See https://gamersnexus.net/guides/3525-amd-ryzen-tdp-explained-...
In a video encoding task, the 8700G used 30% less energy than the 14600K, but it actually used more energy than a 13400.
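That result generalizes: for a fixed task it's energy (power × time), not instantaneous power, that shows up on the bill, so a chip that draws more power but finishes sooner can still come out ahead. A sketch with hypothetical numbers:

```python
# Energy for a fixed task: average draw times duration. All numbers
# below are hypothetical, not measurements of any particular CPU.

def task_energy_wh(avg_watts: float, seconds: float) -> float:
    """Watt-hours consumed by a task at a given average draw."""
    return avg_watts * seconds / 3600.0

slow = task_energy_wh(avg_watts=65.0, seconds=600.0)   # 10-minute encode
fast = task_energy_wh(avg_watts=120.0, seconds=280.0)  # hotter, but done sooner

print(f"slow chip: {slow:.2f} Wh, fast chip: {fast:.2f} Wh")
```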
I wish Asahi would move forward much faster so I could game on it, but at this point, gaming on a Mac isn't really a thing.
Source: I have one from work that I really want to be able to game on. I tried them all: Whisky, CrossOver, Parallels.
With a caveat: at its native 1280×800 resolution.
https://www.asrockrack.com/minisite/Ryzen7000/
That is an explicit "ecc: no" from AMD on the 7640U spec page.
There is currently no mention of ECC at all on the 8700G spec page: https://www.amd.com/en/product/14066
Vendors can only do so much if AMD starts actively disabling/removing/preventing ECC, as it seems to be quietly doing now.