For three years now I've had no reason to update my desktop with its i7-6700K, because single-core performance has remained relatively flat (in fact, my particular processor still has quite a good single-core rating). I would have liked more cores, but the accompanying reduction in single-threaded performance wasn't worth it. Zen 2 changed all of that...
Now EVERY single Zen 2 chip is at least a smidge faster than my 6700K in single-threaded benchmarks AND close to 100% faster in multi-core benchmarks. So for $180 I can buy something that takes a huge meaningful shit on my (fairly nice at the time) 6700K (~$350 when I purchased it).
That's just crazy to me, how much performance I can get for so little. I'm going to buy something beastly with at least 16 cores, but! I also plan on building a little cluster using mini-ITX B550 boards and the 3300X. In fact, I'll probably build the little cluster first, because each 3300X is still faster than my 6700K and the total cost per system will be like $450!!one! I haven't gotten such a meaningful upgrade for so little since the original Core 2 Duo days. Plus, when Zen 3 comes out it's a drop-in upgrade.
AMD is delivering insane value to their consumers, and I love it. I just wish I could buy a Zen 2 chip in a laptop that doesn't look like it was made for a fourteen year old (no offense to any fourteen year olds). I heard someone say the lack of 4K and more professional-style laptops could be Intel back-channel fuckery, but... there's also a chance no one expected AMD, in a single fucking generation of CPUs, to sweep every single market.
>> I just wish I could buy a Zen 2 chip in a laptop that doesn't look like it was made for a fourteen year old (no offense to any fourteen year olds)
I just wish I could buy a decent motherboard that doesn't look like a 14yo's first attempt at drawing an F-35. My current rig does this weird glowing thing at night when it's supposed to be off. For my next machine I would honestly pay more to NOT have RGB support.
You can, but it's surprisingly hard to do so. There is the Asus Pro WS X570-ACE for current AMD consumer parts, but try to find an sTRX4 board for your $4000 CPU that doesn't look like it's going to transform into a giant robot.
I had to use an ASUS motherboard and it took a good 10-15 minutes of poking through menus (with terrible keyboard navigation) to find the "magic rgb off" incantation.
If you happen to have an ASUS Prime Z370-A motherboard, here is what you have to do:
Boot and quickly press F2 or DEL to get into the BIOS (I had to do this more than once to poise my finger over the right key)
Press F7 to enter Advanced mode
Choose the Advanced menu (4th menu across the top)
-> Onboard Devices Configuration (8th sub-menu in the list)
-> RGB LED Lighting configuration (2/3 of the way down the page)
You have to disable both:
- "when system is in working state"
- "when system is in sleep, hibernate or soft off states"
I basically tried all the obvious places, then all the menus before this one, before I stumbled upon it. It was nuts.
I like my personal desktop builds to be at least vaguely aesthetically pleasing, but I agree that RGB lights everywhere is not the way to do it. Just give me a nice black and white color scheme and maaaybe a couple of easily disabled lights here and there.
The X570 chipsets don't dissipate much heat, yet almost every current AMD motherboard (except one at $400) has one of those annoying tiny high-RPM fans that tend to die well before anything else on the motherboard.
> i7-6700k because single core performance has remained relatively flat
Well, it hasn't improved tremendously, but it hasn't remained flat either. You can get around 20% more single-core performance, if it's so important to you:
https://www.cpubenchmark.net/singleThread.html
https://browser.geekbench.com/processor-benchmarks
https://www.cpu-monkey.com/en/cpu_benchmark-cinebench_r15_si...
Lack of "acceptable high end" Zen 2 laptops is my gripe too. From what I understand, it comes down to two reasons:
1. Intel/Nvidia contracts have OEMs' hands tied (which explains the capped GPUs in most Zen 2 laptops).
2. Lack of widespread Thunderbolt 3 support on AMD.
While I fully buy reason 1, reason 2 is still hard to digest, as one could ship a laptop with a USB-C port supporting PD and DisplayPort alternate mode, call it a day, and most users wouldn't mind. I hope Zen 3 causes the power shift in laptops.
I also noticed that the current Ryzen mobile 4000 series tops out at 32GB of memory supported, which sadly excludes them from even possibly being in the 2020 16” MBP since Apple’s already been selling 64GB systems. :/
Have been thinking strongly about the ROG Zephyrus 14 (cheaper and better than the 13" MBP in almost every way). But don't think I can stomach not having the webcam built-in (do way too many Zoom calls).
I upgraded to a 3600X. I use an 8th-gen 6-core chip at work, and this 3600X murders the Intel daily. The speed is insane. At work I run off an SSD with Optane acceleration; my home machine with the 3600X and NVMe just kills my work machine, which is itself no slouch. I've never been so impressed with a brand-new machine, and I've been building them since 1990.
I don't know what's wrong with me, but I want to buy at least a €1400 24-core 3960X Threadripper, because of the insane upgradability.
The idea that in 2-3 years I could just buy a used 3970X or even a 3990X and more than double the core count is amazing to me, and knowing that keeps my mind from being blown by the insane value of the smaller Ryzens.
Pro tip: sell your used 6700k on ebay. Somewhere, someone can put it to good use, and judging by the latest closed deals on this model, you can easily sell it for something like $225.
> I just wish I could buy a Zen 2 chip in a laptop that doesn't look like it was made for a fourteen year old (no offense to any fourteen year olds). I heard someone say the lack of 4k and more professional style laptops could be Intel back-channel fuckery,
Lenovo just released a couple of ThinkPads powered by AMD Ryzen 7 Pro CPUs: the T495, T495s and X395. They don’t have 4K screens though.
https://arstechnica.com/gadgets/2019/05/lenovo-adds-amd-ryze...
https://www.anandtech.com/show/15772/new-lenovo-thinkpad-ran...
The T495 / X395 series is based on Zen+ (Ryzen 3000 mobile) chips and not Zen 2 (Ryzen 4000 mobile) - I know the naming is confusing given that the 3000 series desktop chips are Zen 2. However, Lenovo has announced ThinkPads based on Zen 2, such as models in the rebranded T14 lineup, but they have not been released yet.
But what are you going to do that your 6700K couldn’t do already? It’s only slightly faster single-threaded. More cores are great if you’re rendering video or 3D or compiling all day.
I too was rocking the Skylake series (a 6600K for me, though). However, I upgraded for two reasons: I went to the 3700X for twice as many cores, and the 6th-gen Intel chips can’t keep up with H.265 the way 7th gen and later can.
I’ve been trying to upgrade my desktop since the motherboard is dying, and I’ve found that small form factor PC (SFFPC) components are rarer and at even more of a premium than usual. Most commenters in various forums chalk it up to the plague, so now I’m hoping that once the next generation of CPUs, motherboards, and GPUs comes out, things won’t be as scarce.
It's remarkable how not only the first but also the #3, #5 and #6 positions are some Ryzen 5 ?600? CPU; the 2600X is the only one currently lower, at #12. Also remarkable is how utterly absent Intel is from this chart. The i3-9100F managed to break into #2, and then there's a Pentium Gold G5400 at #15, but both of these are sub-$75 parts, while most of the AMD parts are significantly more expensive, likely bringing more profit to AMD. Price per performance usually drops sharply as performance rises (it doesn't scale linearly), yet remarkably a $405 processor from AMD, the 3900X, is in the top 10.

Power-performance wise, AMD is also the killer: https://www.cpubenchmark.net/power_performance.html. The new 15W TDP AMD Ryzen 5/7 4xxxU CPUs take the first four spots, while most Intel chips near the top of that chart are in 4.5-7W territory. If you look for ordinary socketed chips, you'll find the Ryzen 3900, the Intel 9900T (which is a special 35W part), the 3950X, and astonishingly the EPYC 7702 at 200W beating the Intel Core i3-1005G1, a 15W part on the latest Intel node at 10nm... that's just embarrassing.
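The ranking behind charts like these is easy to reproduce yourself. A minimal sketch of the score-per-dollar ordering (the scores and prices below are illustrative placeholders, not the chart's actual data):

```python
# Rank CPUs by benchmark score per dollar, the way value charts like
# cpubenchmark.net's do. All numbers below are made-up examples.
cpus = [
    ("Ryzen 5 3600",  17800, 175.0),
    ("Core i3-9100F",  6900,  75.0),
    ("Ryzen 9 3900X", 31800, 405.0),
    ("Pentium G5400",  3700,  60.0),
]

def value_ranking(parts):
    # (name, score, price) tuples, sorted best value (score/price) first
    return sorted(parts, key=lambda p: p[1] / p[2], reverse=True)

for name, score, price in value_ranking(cpus):
    print(f"{name:15s} {score / price:6.1f} pts/$")
```

With these placeholder numbers the midrange part wins, which mirrors the chart's shape: value peaks in the middle of the lineup, not at the top.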
Some days ago I checked https://opendata.blender.org/ and noticed that the RTX 2060, RTX 2060 Super and RTX 2070 are all equal when it comes to performance in the Blender benchmark.
Also the RTX 2070 Super might be the cheapest performance card at the moment.
Charts like these can save you a lot of money. But it will depend on the usage of course.
> Charts like these can save you a lot of money. But it will depend on the usage of course.
You have to be careful though. For example, the Radeon RX 570 is right next to the GeForce GTX 1650 SUPER at the top of the chart, but the GeForce costs almost 40% more. If the GPU isn't your bottleneck you may be better off with the less expensive one and save money to buy a faster CPU or more memory. Whereas if it is, you may be better off spending more for a faster GPU than either of those.
What these charts are great for is to look at the top fifteen or so as the set of candidates, which will all have good performance/$, and then if you prioritize better performance look at the fastest ones and if not then look at the least expensive ones.
But even then you have to be careful. For example, on the CPU chart the Core i3-9100F is one of the best values. The Ryzen 3 3200G is only slightly faster for $17 more (i.e. almost 25% more). But the 3200G has an iGPU, which is worth a lot more than $17 if it means you don't have to buy a discrete GPU.
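The selection strategy described in these comments is mechanical enough to sketch: treat the top of the value chart as your candidate set, then pick fastest or cheapest depending on whether the component is your bottleneck. The parts and numbers here are illustrative, not chart data:

```python
# Candidate GPUs as (name, score, price) -- illustrative numbers only.
candidates = [
    ("Radeon RX 570",   7200, 115.0),
    ("GTX 1650 SUPER",  8900, 160.0),
    ("GTX 1660",       10100, 200.0),
]

def pick(parts, gpu_is_bottleneck):
    # If the GPU limits you, buy performance; otherwise save the money
    # for a faster CPU or more RAM.
    if gpu_is_bottleneck:
        return max(parts, key=lambda p: p[1])   # fastest
    return min(parts, key=lambda p: p[2])       # cheapest

print(pick(candidates, gpu_is_bottleneck=True)[0])
print(pick(candidates, gpu_is_bottleneck=False)[0])
```

The point is that perf/$ alone never decides the purchase; it only narrows the field to parts where either choice is defensible.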
Because their chart is determined by benchmark score / current sale price. The 1080 wasn't heavily discounted when the 2xxx series was launched, and high end GPUs are rarely value for money winners anyway as cost scales more than linearly with performance. Double the $ might get you 25% more frames.
You could argue they should have some minimum performance threshold before including cards on the chart (certainly no one should be considering an R7 260 or GTX 770 for a new system, however cheap they've gotten), but they don't.
Without a doubt. For comparison a 13” MacBook Pro can drive two 4K displays plus the internal display at 2880x1800, all at 60Hz, from an integrated GPU.
You can probably run as many 4K monitors as the 1650 has video ports.
I've had a 2600 for a year or two, and I was looking to see what I'd gain from an upgrade to the 3600. Interesting to see the 2600 still competes in the value department, now that its price has been reduced by about $50.
When I was building a workstation in late 2018, it was a close match on price to performance, even after getting a 50% discount on Intel HEDT processors from a friend! I'm happy to see AMD killing it in the CPU space.
With the proviso that the perf/$ approach works less well for GPUs, because you need to match the card to the screen resolution and refresh rate you want to drive (if gaming, I guess).
..which might easily land you in 2060/2070 territory.
It's neat to see that the 3900X sits on the Zen 2 Threadripper price/performance line, but is available at mortal consumer prices and on the low-end platform (B450). The numbering scheme makes sense in the context of this plot, even though the 3900X is not a Threadripper while all the other 39X0X parts are.
Indeed! I knew that AMD was thoroughly on top, but that list really shows by how much. The first Intel chip on that list which I'd consider for a mid-range workstation (the i5-9600KF) is way down at #22.
I've heard that Intels idle at lower power: Ryzens don't go below 18 W, but Intels, even ancient Sandy Bridges, idle at 5 W. Not sure if it's true though.
That's roughly equivalent to Cinebench points per dollar, which is unrelated to gaming performance, if that's what you're after. It doesn't mean the 3600 is bad, but it does mean these kinds of synthetic benchmarks tell you very little about gaming performance, which depends less on the speed of the cores themselves and more on memory latency, something all current Ryzen CPUs are architecturally quite bad at (20-50% worse than Intel).
I was just reading through the Tom's Hardware review of the Ryzen 3600 and I can't see what you're talking about. The gaming performance is within 3% or so of the i5-9600K in basically every game. Granted, these tests were conducted at only 1920x1080, but still, if there were such a dramatic performance difference, it should be noticeable at lower resolutions too.
This is true, I’d say the build GP describes would probably be good for someone who is short on cash and wants to build a machine that is OK/Good for gaming and Good for things like programming or photo/audio/video editing. It’s true that that build won’t knock anyone’s socks off and isn’t optimized for anything in particular, but the fact that you can make a build like that with such good parts at that price is a net positive for consumers.
The weird thing about these benchmarks is that the Cinebench score is treated as the most important metric. It's a very informative score for CG rendering use cases, but not for all use cases, like gaming or video editing. I suspect vendors/benchmarkers use it because Cinebench multithreaded scales very well with thread count (making it easier to advertise higher core-count models).
It is easily the best bang for buck CPU on the market, that allows pretty much 99% of workflows that require a PC. Gaming? Check. Programming? Check. Graphics design? Check. Video editing: also check (but on the lower end, true that).
It has also some really fantastic thermals, it does not produce that much heat. I use it in a fully fanless / passive cooling setup and it works great.
I just built a new machine around this CPU back in March, specifically for playing Half-Life: Alyx, and couldn't be happier with it.
My thermals were pretty great with the stock CPU cooler, but I ended up going with a Noctua NH-U14S which might be a little overkill, but when playing Half-Life: Alyx, I stay between 45-60c which seems pretty good to me. The game runs at a perfect framerate on High settings with the AMD 5700XT, 32GB of DDR4 RAM on the Vive Pro.
I may end up upgrading to a Ryzen 7 4XXX series when they come out, but for now, this CPU is performing great. Def best bang for your buck.
> but I ended up going with a Noctua NH-U14S which might be a little overkill
Yes. I have the same cooler* on a Threadripper 3970X with a 280W TDP. The only thing it can't handle is a sustained all-core load for more than 10-15 minutes. Even at all-core load it still turbos above base clock (but below max turbo).
* Technically a different SKU with a larger base plate, but otherwise the same design.
I built a new system with one of these last December, and it has been a great experience for everything I have thrown at it. I wasn't sure whether I'd stick with it or upgrade later - I mainly got it just to "get into" the Ryzen ecosystem, as this article suggests many do - but so far I see absolutely no reason for anything more.
One thing I didn't realize at the time is that the 3000 series is "end of the line" for B450 motherboards, which should be a real consideration for anybody eyeing a new system with this CPU now. As this article suggests it leaves you in an awkward spot. I opted to upgrade there to a X570, but only barely; a lucky decision, as it turns out!
Granted, my case doesn't have the best airflow (NZXT H210i), but I was getting pretty high CPU temps with the stock cooler: averaging mid-to-upper 80s during gaming, high 50s idle.
I went ahead and ordered a Noctua cooler which should be coming today actually. Hoping it helps.
The 3600 stock cooler is GARBAGE. Under any load at all it would eventually climb to 95 degrees. I tried re-seating it, using different thermal compound, just blowing more air at my computer, but nothing fixes the fact that there's so little material in the cooler; it's just not that functional. It can handle small, bursty loads, but any constant load easily overcomes its pitiful performance.
That said, a $30 aftermarket cooler will keep you at 50 degrees even under max load. Great chip for the money and I'm super happy with it.
IntelliJ stuff is usually slow (I like smaller form factor machines). Finally with the 3600 performance is at least reasonable and not annoying.
It took forever for these to get bundled, but you can now get Lenovo and I think HP prebuilt machines with this chip as well; for work I don't like hand-building a machine.
I don't have first hand experience with higher spec build on the single CCD Zen 2 chips, but the dual CCD chips are not particularly good in terms of thermals. Only the very best of air coolers can handle them, and are not particularly quiet doing so. Idle CPU power consumption is unimpressive and at 50-70 W exceeds what older systems achieved for the entire system.
I had more problems with the 3700X although on paper it is 65W TDP (same as the 3600). In practice it gets much hotter and draws more power. The 3600 never gave me any problems.
Streacom DB4, one of the most beautiful passive cases around. It matches Apple-level design, it's stunning.
As GPU I had the Gigabyte GTX 1650 mini and the Gigabyte GTX 1660 TI mini. The 1650 is definitely easier to put in a passively cooled system, but the 1660 TI is much more powerful.
Price for performance. It plays the latest games at high resolution without being a CPU bottleneck. In gaming benchmarks the 3600 is usually within a few fps of the 3800x. https://youtu.be/9OXbhgnHvXQ
It's more than price for performance. It's also a very high absolute performance. Especially in games, where something like a Threadripper processor is not faster.
The 3700X is very slightly lower on performance per $ but it does have 8 cores and 16 threads so if you're going to be compiling code it's totally worth it.
The 3900x has been selling for $400 which is also amazing value.
Intel CPU market is behind after many years. Long live AMD design + Taiwan manufacturing.
> Intel CPU market is behind after many years. Long live AMD design + Taiwan manufacturing.
Except Intel still delivers better and more consistent gaming performance largely based on better overall architecture (L3, IMC and what connects these).
Both Intel and AMD are billion-dollar publicly traded companies. I don't get why people would fanboy either of them. I don't want AMD to win, because then they are going to abuse their position, just like Intel did. Conversely, I don't want Intel to win. Two players in a market is already problematic; trying for one player is just suicidal.
I don't think most people want AMD to win completely, as no one expects Intel to ever fold. Also historically, Intel has been much scummier, and it's paid off for them. When AMD was previously top dog (most recently with Athlon64), I don't remember them being as anticompetitive as Intel has been.
Intel has 5x the market cap and 10x the revenue compared to AMD, and AMD is the underdog in its dGPU competition with nVidia too! If you want innovation, you'd want to equalize investment in R&D for both these two competing pairs, it makes sense to be pro-AMD.
I don't think anyone actually wants AMD to "win", regardless of how they express themselves. What they want is AMD to have a lead on Intel, since it's been so long since that's been the case, and we'll all benefit from Intel not being continuously in a hefty lead, like it has been for decades.
AMD is so far behind Intel in market share, money, and resources overall that worrying about "you don't want AMD to win (either)" is looking too far ahead.
I'm still thinking about my cheap Ryzen-based PC only as a stop-gap solution, until the "real deal", aka Intel 7nm CPUs, premieres in 2021 (fingers crossed).
I can confirm this. I'm running a 3700X with the stock cooler and it's idling at 50-60 degrees at 3.6 GHz. If I enable Game Boost mode in the motherboard (it basically pushes the frequency to 4.2 and turns up all the fans), it increases the CPU's Cinebench score from 4662 to 5087 points. That said, the CPU reaches a whopping 100 degrees!!! Probably need to get a better cooler before enabling that again.
I see this as a good thing, the chip is tuned well enough to hit its limits out of the box, without having to mess around with manual overclocking to get optimal performance.
>so if you're going to be compiling code it's totally worth it.
This always gets brought up as a use case, but how many people are building non-trivial codebases where compile times matter? If build times are under 5s, near perfect parallelization only nets you a 1.25s gain between 3700x and 3600.
Any decently sized C++ or Swift code base is going to have compile times way over 5s. Many just moderately sized projects have build times over 10 minutes.
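The back-of-the-envelope math above (perfect 6-to-8-core scaling on a 5 s build) is the limiting case of Amdahl's law; a quick sketch of how even a small serial fraction (the 10% figure here is an arbitrary illustration, e.g. linking) shrinks the gain further:

```python
def build_time(single_core_seconds, cores, serial_fraction=0.0):
    """Amdahl's law: only the parallelizable fraction speeds up with cores."""
    serial = single_core_seconds * serial_fraction
    parallel = single_core_seconds * (1.0 - serial_fraction)
    return serial + parallel / cores

# Perfect parallelization, matching the comment: a build that takes
# 30 s on one core finishes in 5 s on 6 cores and 3.75 s on 8.
gain = build_time(30, 6) - build_time(30, 8)
print(f"6 -> 8 core gain, perfect scaling: {gain:.2f} s")    # 1.25 s

# With 10% serial work, the same upgrade buys even less.
gain10 = build_time(30, 6, 0.1) - build_time(30, 8, 0.1)
print(f"6 -> 8 core gain, 10% serial:     {gain10:.3f} s")   # 1.125 s
```

For the 10-minute C++ builds mentioned above, though, the same ratio is minutes rather than seconds, which is where the extra cores pay off.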
Bought one for a new PC back in January. The value is great and I haven't noticed a single bottleneck. I don't do anything too intensive, and the games I play haven't stressed it that hard. The best part of the upgrade has been Rust compile times. Coming from an older ThinkPad, it's a huge change.
AMD is making good CPUs, but I had an issue with a Ryzen 5 1600 that left me with a bad experience. It was my first-ever build and I used this CPU. All good, but from time to time the Linux desktop I use would freeze, and a hard reboot was the only fix. I noticed this happened when I left the PC idle and the screen turned off. I tried everything and just accepted that this was a bug in KDE/Ubuntu/the kernel/the motherboard/the RAM or the GPU; not in my wildest thinking did I suspect the CPU.
Turns out it was the CPU, and how crappily AMD handles the C-states that save energy. I had to turn them completely off, and that resolved the issue (in AMD's defense, freezing the entire OS does save a lot of energy).
This was exactly 2 years ago, and I had the issue for almost a year before figuring out the fix. I wonder if the new generations have this problem too?
> Turns out it was the CPU and how crappy AMD handle C-states that save energy
From my experience, Intel is no better.
I have a 4 year old laptop with an Intel CPU that has the same problem, it was never fixed by Intel. [see EDIT]
For the first 3 years of use I just accepted the fact that my laptop would randomly freeze and I would have to reboot it.
One day, I got sick of it and started digging into forums until I found a way to avoid the specific c-state (by modifying kernel boot parameters) that caused the issue.
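For reference, capping the kernel's C-state depth on a GRUB-based distro usually looks something like the fragment below. The exact parameter varies by idle driver (`processor.max_cstate` for the generic ACPI driver, `intel_idle.max_cstate` when Intel's driver is active), so treat this as a sketch and check which driver your system uses before copying it:

```shell
# /etc/default/grub -- cap the deepest C-state the idle driver may enter.
# processor.max_cstate applies to the generic ACPI idle driver;
# use intel_idle.max_cstate instead when the intel_idle driver is active.
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash processor.max_cstate=1"

# Then regenerate the GRUB config and reboot:
#   sudo update-grub && sudo reboot
```

Note this trades idle power consumption for stability, which is why it's a workaround rather than a fix.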
I’ve just been through this exact thing with a 3200G on Ubuntu 19 and 20. It took months to diagnose. Extremely painful. I ended up replacing the RAM and motherboard before finally determining the problem was with the CPU. Disabling C-states helped but didn’t resolve the issue. I tried a bunch of other things too, none of which ultimately worked.
In the end I switched to a 3700X; so far the problem hasn’t reoccurred. No doubt it will now that I’ve posted about it (fool me twice, shame on me).
Worth pointing out that 3xxxG parts are Zen+ and not Zen 2 like all other Ryzen 3000 parts. So the CPU in the 3200G is very different from the one in a 3700X.
I stopped using Ubuntu 10 years ago, after a weird issue. Out of nowhere the computer would lock up and require a hard reset. It took a while before I figured out that the issue originated with the Intel WiFi card. I eventually found the bug: the kernel crashed whenever the adapter sensed an 802.11n packet (the protocol was very new at the time, so it wasn't everywhere yet). Ubuntu still decided to go ahead with the release, yet held off on including the freshly released major version of OpenOffice (version 3, I think) because of stability concerns. This was extremely frustrating, because there was no fix and no option to roll back to an older version. So that was when I dropped this POS and switched to a different distro. I'm wondering if this is again an Ubuntu issue and not the CPU.
I've been running a 3400G since last fall on Debian with a custom kernel to get the latest video drivers. It's been stable until a few freeze ups in the past week. I'd pin the blame on Ubuntu.
Surprisingly, the solution for the first-gen Ryzens / 1600AFs is to not allow the power supply to go into its low-current idle mode. Check your BIOS for a "Power Supply Idle Control" setting.
It turns out many PSUs don't handle low CPU power states well, and disabling C6 just prevents the CPU from being able to idle (and consequently, boost) properly. I've had perfect uptime with my 1600AF in a storage node since discovering this.
All my computers were Intel-based except for one. It was an Athlon64 X2 from ~2006, and I also felt something was wrong when running Linux. Random crashes happened quite often; I never found the cause. Performance wasn't amazing either, despite my previous computer being 4 years older.
I've never had a problem running Intel-based systems with Linux. So even though everybody's talking about how good AMD is right now, why you should buy one for your next rig, and how good AMD graphics cards are with open drivers instead of the Nvidia binary blobs... I sincerely don't see any reason to ditch what has been working OK and make the switch when preparing a new build. Should I?
Yes, you should. AMD is significantly different than it was almost 14 years ago. They wipe the floor with Intel, which is still stuck on its years-old 14 nm process. AMD simultaneously has higher performance, more cores, lower energy usage and lower prices. It is categorically better in all fields.
As far as anecdotal evidence goes, I had a Linux desktop built around a 64-bit Athlon in 2004-2006 - as the primary, it saw plenty of use, but I don't recall any random crashes.
At Ryzen's launch, mobo vendors weren't investing a lot in the platform, because they had been largely burned by previous AMD generations that failed to compete with Intel.
It wasn't until Ryzen 2000 / 3000 in particular that board vendors took the AMD stack seriously again.
I remember that at release, getting RAM working on Zen 1 was a nightmare. Nowadays you can run almost any stick at 3733 MHz on Zen 2, no problem.
Yeah, early RAM support on Zen 1 sucked. It was definitely fair for vendors not to want to plow huge amounts of money in at that point, as the Bulldozer-era CPUs sucked in a profound manner. Definitely glad for those first few AGESA updates that got things like XMP fully working on my early X370 board.
AFAIK this goes back to the AMD graphics driver; I just built my wife a Ryzen 3200G / B450 system and she has had a couple of spontaneous reboots too. I did the burn-in using an old AMD 6970 card instead of the built-in graphics and it did fine with that for me, but of course that card won't run the games she wants, so we're hoping for a fix too.
It's a pretty awesome chip. I bought one the moment they came out with a B450 motherboard. It sounds like I won't be upgrading to Zen 3 with it, but that's probably fine. It's more than enough for the tasks I do.
With that said, once the 3300X is out, I would likely go that route with a cheaper A320 motherboard. This would bring the cost of the CPU/motherboard combo to about $175, which is terrific value and leaves you with more money for other components.
I'm in the same boat with a 3700X. I'm pretty sure the 3950X will be available at a good enough price that I'll jump to that before needing a motherboard upgrade.
Ryzen 3000-series will run on B450+ and X470+, based on current specs. Obviously, at some point there will be a future motherboard release that will change the +'s to ranges.
It offers exceptional value. Pair it with a GeForce GTX 1650 SUPER, which is at the top of the Performance/$ chart for GPUs: https://www.videocardbenchmark.net/gpu_value.html
And you have a champion of a workstation right there.
This is such a welcome change from the crypto days with GPUs in short supplies and at very high prices, and stagnant Intel CPU performance!
Also the RTX 2070 Super might be the cheapest performance card at the moment.
Charts like these can save you a lot of money. But it will depend on the usage of course.
You have to be careful though. For example, the Radeon RX 570 is right next to the GeForce GTX 1650 SUPER at the top of the chart, but the GeForce costs almost 40% more. If the GPU isn't your bottleneck you may be better off with the less expensive one and save money to buy a faster CPU or more memory. Whereas if it is, you may be better off spending more for a faster GPU than either of those.
What these charts are great for is to look at the top fifteen or so as the set of candidates, which will all have good performance/$, and then if you prioritize better performance look at the fastest ones and if not then look at the least expensive ones.
But even then you have to be careful. For example, on the CPU chart the Core i3-9100F is one of the best values. The Ryzen 3 3200G is only slightly faster for $17 more (i.e. almost 25% more). But the 3200G has an iGPU, which is worth a lot more than $17 if it means you don't have to buy a discrete GPU.
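The selection heuristic described above (take the top of the value chart as candidates, then pick by your priority) can be sketched in a few lines. The part names, scores, and prices here are made-up placeholders, not the chart's real numbers:

```python
# Sketch of the "top-N by value, then pick by priority" heuristic.
# Scores and prices are illustrative placeholders, not real chart data.
parts = [
    # (name, benchmark_score, price_usd)
    ("GPU-A", 7100, 160),
    ("GPU-B", 9000, 230),
    ("GPU-C", 6200, 120),
    ("GPU-D", 12000, 420),
    ("GPU-E", 8300, 210),
]

def value(part):
    name, score, price = part
    return score / price  # performance per dollar

# Take the best-value candidates first...
candidates = sorted(parts, key=value, reverse=True)[:3]

# ...then pick the fastest if performance is the priority,
# or the cheapest if budget is.
fastest = max(candidates, key=lambda p: p[1])
cheapest = min(candidates, key=lambda p: p[2])
print(fastest[0], cheapest[0])  # → GPU-E GPU-C
```

The point of the two-step filter is that everything in `candidates` already has good performance/$, so whichever one you pick afterwards, you aren't overpaying.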
https://store.steampowered.com/hwsurvey/Steam-Hardware-Softw...
For context, their page on the 1080 (https://www.videocardbenchmark.net/gpu.php?gpu=GeForce+GTX+1...) gives it a score/$ ratio of 20, while the lowest item on that page has a ratio of 30.
You could argue they should have some minimum performance threshold before including cards on the chart (certainly no one should be considering an R7 260 or GTX 770 for a new system, however cheap they've gotten), but they don't.
* https://www.tomshardware.com/reviews/evga-nvidia-geforce-gtx...
vs.
* https://www.tomshardware.com/reviews/nvidia-gtx_1650-super-t...
Note: this is NOT for gaming, just everyday workflows and dev environments at 4K, 60 Hz.
7680x4320@120Hz [1]
[1] https://www.nvidia.com/en-us/geforce/graphics-cards/gtx-1650...
You can probably run as many 4K monitors as the 1650 has video ports.
..which might easily land you in 2060/2070 territory.
Sorta here: https://www.cpubenchmark.net/power_performance.html
Unfortunately it's hard to tell which are embedded versus traditional sockets.
https://www.tomshardware.com/reviews/amd-ryzen-5-3600-review...
It also has really fantastic thermals; it doesn't produce that much heat. I use it in a fully fanless/passive cooling setup and it works great.
My thermals were pretty great with the stock CPU cooler, but I ended up going with a Noctua NH-U14S, which might be a little overkill; when playing Half-Life: Alyx I stay between 45-60°C, which seems pretty good to me. The game runs at a perfect framerate on High settings with an AMD 5700 XT and 32GB of DDR4 RAM, on the Vive Pro.
I may end up upgrading to a Ryzen 7 4XXX series when they come out, but for now, this CPU is performing great. Def best bang for your buck.
Yes. I have the same cooler* on a Threadripper 3970X with 280W TDP. The only thing that it can't handle is a sustained all-core load for >10-15 minutes. Even at all-core load it still turbos above base clock (but less than max turbo).
* Technically a different SKU with a larger base plate, but otherwise the same design.
One thing I didn't realize at the time is that the 3000 series is "end of the line" for B450 motherboards, which should be a real consideration for anybody eyeing a new system with this CPU now. As this article suggests it leaves you in an awkward spot. I opted to upgrade there to a X570, but only barely; a lucky decision, as it turns out!
Granted, my case doesn't have the best airflow (NZXT H210i), but I was getting pretty high CPU temps with the stock cooler: averaging mid-to-upper 80s during gaming, high 50s idle.
I went ahead and ordered a Noctua cooler which should be coming today actually. Hoping it helps.
That said, a $30 aftermarket cooler will keep you at 50 degrees even during max load. Great chip for the money, and I'm super happy with it.
It took forever to get bundled, but you can get Lenovo and I think HP prebuilt machines for this as well - for work I don't like handbuilding a machine.
As GPU I had the Gigabyte GTX 1650 mini and the Gigabyte GTX 1660 TI mini. The 1650 is definitely easier to put in a passively cooled system, but the 1660 TI is much more powerful.
The 3900x has been selling for $400 which is also amazing value.
Intel is behind in the CPU market after many years on top. Long live AMD design + Taiwan manufacturing.
Except Intel still delivers better and more consistent gaming performance, largely thanks to a better overall architecture (L3 cache, IMC, and what connects them).
Both Intel and AMD are billion-dollar publicly traded companies. I don't get why people would fanboy either of them. I don't want AMD to win, because then they are going to abuse their position, just like Intel did. Conversely, I don't want Intel to win. Two players in a market is already problematic; trying for one player is just suicidal.
Intel has 5x the market cap and 10x the revenue of AMD, and AMD is the underdog in its dGPU competition with nVidia too! If you want innovation, you'd want to equalize R&D investment within both of these competing pairs, so it makes sense to be pro-AMD.
I tried a light touch of overclocking and got absolutely nowhere. (Stock cooler)
Running that close to the edge is an achievement in itself though I guess
This always gets brought up as a use case, but how many people are building non-trivial codebases where compile times matter? If build times are under 5s, near-perfect parallelization only nets you a 1.25s gain between a 3700X and a 3600.
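That 1.25s figure can be checked with back-of-the-envelope arithmetic, assuming a perfectly parallel 5-second build on the 6-core 3600 versus the 8-core 3700X:

```python
# Back-of-the-envelope check of the build-time claim above, assuming a
# perfectly parallel 5-second build on the 6-core Ryzen 5 3600.
build_time_6c = 5.0                 # seconds on 6 cores
cores_3600, cores_3700x = 6, 8

total_work = build_time_6c * cores_3600    # 30 core-seconds of work
build_time_8c = total_work / cores_3700x   # 3.75 s on 8 cores

gain = build_time_6c - build_time_8c
print(gain)  # → 1.25
```

In practice the gain would be smaller still, since real builds have serial phases (configure, link) that don't scale with core count at all.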
You want to see a preview as quickly as possible. Best if you can keep a live view on a second monitor.
Turns out it was the CPU and how crappily AMD handles the C-states that save energy; I had to turn them off completely, and that resolved the issue. (In AMD's defense, freezing the entire OS does save a lot of energy.)
This was exactly 2 years ago, and I had the issue for almost a year before figuring out the fix. I wonder if the new generation has this problem too?
From my experience, Intel is no better.
I have a 4 year old laptop with an Intel CPU that has the same problem, it was never fixed by Intel. [see EDIT]
For the first 3 years of use I just accepted the fact that my laptop would randomly freeze and I would have to reboot it.
One day, I got sick of it and started digging into forums until I found a way to avoid the specific c-state (by modifying kernel boot parameters) that caused the issue.
EDIT: Apparently, it's getting fixed now. Just 5 years late; the issue was reported in 2015: https://bugzilla.kernel.org/show_bug.cgi?id=109051
In the end I switched to a 3700X; so far the problem hasn’t reoccurred. No doubt it will now that I’ve posted about it (fool me twice, shame on me).
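For reference, the kind of kernel-boot-parameter workaround described above usually looks like this on a GRUB-based distro. The exact C-state limit that avoids the hang is machine-specific, so the value below is only an illustrative example, not a known-good setting:

```shell
# /etc/default/grub -- cap the deepest C-state the kernel will enter.
# Which limit actually avoids the freeze varies per machine; "1" here is
# just an example (and costs idle power, since deep sleep is disabled).
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash intel_idle.max_cstate=1 processor.max_cstate=1"

# Then regenerate the GRUB config and reboot:
#   sudo update-grub                                 # Debian/Ubuntu
#   sudo grub2-mkconfig -o /boot/grub2/grub.cfg      # Fedora/RHEL
```

`intel_idle.max_cstate` caps the Intel-specific idle driver, and `processor.max_cstate` caps the generic ACPI one; setting both covers whichever driver the kernel picked.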
It turns out many PSUs don't handle low CPU power states well, and disabling C6 just prevents the CPU from being able to idle (and consequently, boost) properly. I've had perfect uptime with my 1600AF in a storage node since discovering this.
I've never had a problem running Intel-based systems with Linux. So, even though everybody's talking about how good AMD is right now, why you should buy one for your next rig, and how good AMD graphics cards are with open drivers instead of the nVidia binary blobs... I sincerely don't see any reason to ditch what has been working OK and make the switch when preparing a new build. Should I?
It wasn't until Ryzen 2000 / 3000 in particular that board vendors took the AMD stack seriously again.
I remember that at release, getting RAM working on Zen 1 was a nightmare. Nowadays you can run almost any stick at 3733 MHz on Zen 2, no problem.
At home I've had a 3950x now for about six months and no such issues with the CPU at all. It just works and runs nicely. No freezes, no problems.
With that said, once the 3300X is out, I would likely go that route with a cheaper A320 motherboard. That would bring the cost of the CPU/motherboard to about $175, which is terrific value and leaves you with more money for other components.
https://images.anandtech.com/doci/15774/Ryzen%203_B550_Press...
Motherboards don't need to have the xx. There is only B550. There is no other B5xx.