Just built a new rig last week. This isn't at all surprising. Prices are so out of whack right now. $1000 used to be the ceiling for the highest end video cards (like a 1080 Ti), now $1000 is the floor. There are no good current gen (or even previous gen) cards available for under $1000. The 4080 is a horrible value and still regularly listed at $1200-$1400. The 4090 is overkill and sits around $1700-$2200. Even two-year-old 3080s are regularly selling for near a grand. AM5 motherboards are insanely priced. Want a 10Gb onboard NIC? Be prepared to shell out $1000 just for the motherboard. Add to all of that, this latest batch of CPUs are just stupid power hungry - like 240w+ under load (except for the non-x variants of AMD 7000 series, just released last month).
It used to be you could buy a lot of computer for $2-3K; now that figure is closer to $5K. Combine these prices with the fact that a lot of folks just went through this pain two years ago during the pandemic, and you aren't going to see stuff flying off the shelves any time soon.
I am sorry, but the "rule of thumb" website https://www.logicalincrements.com/ disagrees with you, heavily so. You can still buy an awful lot of computer for $2k; it's all laid out right there. The prices are real: they link -- yes, with affiliate links -- to real sales on Amazon/Newegg/etc. To quote what you can expect from their "outstanding" tier at 1628 USD:
> This expensive tier has the highest possible cards that still maintain a reasonable performance/price. Sure it is pricey, but it is luxurious!
> Expect high performance at 1440p, and solid performance at 4K even in the most demanding games.
For 1865 USD:
> Max most titles at 1440p@144Hz, and solid framerates in 4K, even with max settings.
And if you want to note that the 6900 XT card for $720 they used here is out of stock, then let me note a $700 6950 XT: https://www.newegg.com/asrock-radeon-rx-6950-xt-rx6950xt-ocf... which makes it an even better bang for your buck.
This is misleading. For nearly $2000 you had better be getting 4K with ray tracing (max settings), i.e. a top-of-the-line machine. 1440p/144Hz is midrange now.
Which that $1865 build does not provide. Even before ray tracing, most games struggle to hit the 144fps you're aiming for, and with ray tracing that drops to ~40fps (Cyberpunk 2077, for example, on that 6900 XT card). You have to enable workarounds like DLSS/FSR to make those games playable.
The only way you're getting a good framerate at 4K is without ray tracing, but you're paying $2000 and still have to worry about disabling settings? Ridiculous!
So yes, they are overpriced. For $2000 you should not be worried about having to enable FSR.
The usual excuse when this is brought up is "well, just don't play those games, they seem unoptimized", to which again the question is: why are you spending two thousand dollars to avoid playing certain games? How absurd.
You can also save on the case (cheaper options should be available) and grab a Ryzen 7900, which should have similar performance and a comparable price point to the Intel chips they used, and comes with a stock cooler, shaving off an additional ~$100. I'd also probably skip the HDD and grab 32GB of RAM.
Enterprise demand drives shipments, and the PC industry is fucked because: the surge of Windows 7 migration is over, PCs have become appliances, and everyone bought thousands of laptops in 2020/21.
The only reasons to replace business desktops are swollen batteries and Microsoft. My teams are responsible for 250k devices globally. Failure rates are <3%, and 75% of failures are batteries and power supplies. With the transition away from spinning rust complete, we have more keyboard failures than desktop PC failures. I’m taking the PC refresh savings and buying Macs, iPads and meeting room hardware.
Speaking from the smaller scale side of IT, in the past ~6 months or so, I've deployed more BeeLinks ( https://smile.amazon.com/Beelink-Graphics-Desktop-Computers-... ) or even smaller, Atom-powered boxes ( https://www.amazon.com/GMKtec-Nucbox5-Desktop-Computer-Windo... ) than Lenovos or Dells. And my clients are really happy with them, too. These are, of course, still technically PC shipments, but the amount of money involved for the manufacturers is absolutely minimal. And most office workers don't need more.
Yep, tech work, and especially remote software work, is all done with laptops and docks.
I'm writing this from my home gaming rig, which is an old, not-cool-enough-for-Win-11 (thank god), desktop. I don't know what I'll be replacing it with when it keels over and dies. Maybe a Mac tower? Maybe a Linux rig. But it'll be my PC, not Microsoft's if I can help it.
I wonder if anyone has done large-scale testing on restricting laptop charging to 80% or around that mark. I swear the large PC vendors charge these things to a full 100% for an extra 30 minutes on some shitty video endurance benchmark, and it ends up causing swollen batteries at alarming rates.
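For what it's worth, on Linux you can experiment with this yourself on machines whose firmware exposes a charge threshold. A minimal sketch, assuming a single battery at BAT0 and a vendor/kernel combo (e.g. many ThinkPads) that provides charge_control_end_threshold; run as root:

    #!/usr/bin/env python3
    """Cap laptop charging at 80% via the kernel's charge-threshold knob.
    Assumptions: battery is BAT0, and the vendor/kernel exposes
    charge_control_end_threshold (not all do). Run as root."""
    from pathlib import Path

    BATTERY = Path("/sys/class/power_supply/BAT0")
    THRESHOLD = 80  # stop charging at 80% to reduce wear and swelling

    knob = BATTERY / "charge_control_end_threshold"
    if knob.exists():
        knob.write_text(f"{THRESHOLD}\n")
        print(f"charge limit set to {THRESHOLD}%")
    else:
        print("this battery/kernel does not expose a charge threshold")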
If desktop OSes stopped being garbage, maybe they'd attract users back. Windows still has the tile crap, Linux is suffering under snaps and the usual deck-chair reshuffling, and macOS is Apple-imprisoned, and neglected regardless.
>Add to all of that, this latest batch of CPUs are just stupid power hungry - like 240w+ under load (except for the non-x variants of AMD 7000 series, just released last month).
That's because in the race to get the highest benchmark scores, both companies have set the stock clocks to a level that's way beyond what's optimal (eg. adding 100W of power consumption to get 5% higher benchmark scores). The CPUs themselves are fine, you just have to adjust the power/clock limit lower.
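On Linux with an Intel chip you don't even need the BIOS for this: the kernel's RAPL powercap interface lets you drop the long-term package limit at runtime. A rough sketch (root required; the intel-rapl:0/constraint_0 layout is the common one, but verify the 'name' files on your machine before writing anything; on AMD the equivalent is the PPT/eco-mode setting in the BIOS):

    #!/usr/bin/env python3
    """Lower the CPU package power limit via Intel RAPL sysfs (Linux, root).
    The 125 W figure is an arbitrary example, not a recommendation."""
    from pathlib import Path

    PKG = Path("/sys/class/powercap/intel-rapl/intel-rapl:0")
    LIMIT_W = 125  # target long-term (PL1) package power, watts

    print("domain:", (PKG / "name").read_text().strip())  # expect package-0
    cur_uw = int((PKG / "constraint_0_power_limit_uw").read_text())
    print(f"current PL1: {cur_uw / 1e6:.0f} W")
    (PKG / "constraint_0_power_limit_uw").write_text(str(LIMIT_W * 10**6))
    print(f"new PL1: {LIMIT_W} W")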
Honestly, you don't have to adjust the limit: the "power under load" angle gets completely overblown because people go by a reviewer's definition of load.
They might be 240W under extreme load, but I can play AAA titles on my i9 at 240hz barely cracking 50% CPU load. And that's with a 3090, so not exactly a mismatched CPU/GPU situation.
At those types of loads the CPU doesn't even try to hit boost clocks most of the time, so you're nowhere near the figures you often see touted based on benchmarks.
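This is easy to check on your own box: sample utilization and clocks while the game runs and compare against the all-core benchmark numbers. A small sketch using the third-party psutil package (pip install psutil); on some platforms cpu_freq() reports nothing, so treat it as illustrative:

    #!/usr/bin/env python3
    """Sample overall CPU load and average clock once a second while gaming,
    to see how far a real workload sits from benchmark-style full load."""
    import psutil

    for _ in range(30):  # ~30 one-second samples
        util = psutil.cpu_percent(interval=1)  # % across all cores
        freq = psutil.cpu_freq()               # average current clock, MHz
        clock = f"{freq.current:7.1f} MHz" if freq else "n/a"
        print(f"load {util:5.1f}%  clock {clock}")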
I haven't bought a non-4K monitor in over 6 years. I honestly don't know anyone who is still using 1080p monitors as daily drivers if they are also using the machine for productivity or media work. But your point is not invalid.
Grandparent seems to have forgotten about 4070 Ti (only $800, what a bargain!) but yeah, $800 is currently the floor for current-gen hardware in the sense that nothing has launched below $800 despite being almost 6 months into this product cycle. AMD's cheapest is a $900 MSRP (but starting to fall below that) and NVIDIA's cheapest is an $800 MSRP.
That space is currently filled by older, slower, less efficient, less-featured last-gen products. Both companies have some significant amounts of inventory they want to burn through after the mining thing and it's going slow because of the general declines in shipments.
Generally, though, I think people are remembering the past with rose-colored glasses. Not saying OP said this in particular, but a lot of people have latched onto the idea of the "$300 x70 tier", and the x70 tier has literally never been $300 MSRP in the entire time it's existed. It bounced back and forth between $350 and $400 even 10 years ago; $329 was the lowest price it ever launched at, and people have latched onto that one as the price the x70 has to match forever, plus a little extra. The GTX 680 (full-die GK104) was $499 10 years ago for a ~300mm^2 chip, the GTX 670 was a GK104 cutdown for $399, and the GTX Titan was where you got the full GK110, at a mere $999 (in 2012 dollars).
https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_proces...
Ampere was somewhat below that baseline; using Samsung was an attempt to make cheaper cards and push down the cost, hence $499 being a "bargain" price for the 3070 before the pandemic cost spirals got too bad. Factoring in the more expensive TSMC 5nm node, I think the realistic price for a 4070 (AD104 cutdown, whatever you call it) is probably $600-700 at this point.
So there's definitely some gouging taking place, but a lot of people are fixated on that $300 number, and that's just not going to happen. Costs have spiraled a lot more than people realize; Pascal was not cheap either, and that was 7 years ago (!) this June, and everything since then has been on older, cheaper nodes to help keep costs down... until now. Throw in the pandemic generally blowing up a lot of costs, and yeah, things are expensive now.
And yes, 4850/4870 were good cheap cards, but AMD could do that because they got onto 55nm ahead of NVIDIA, and that was back in the days when shrinking first was a real advantage, you could match a high-end card with a cheap midrange card if you got to a newer node first. That's not how it works anymore, higher wafer costs and R&D costs mean newer nodes are better but they're not really cheaper even considering you get more chips per wafer. Costs are growing fast enough to eat up the increases in density.
There are complex reasons related to patent licensing why it makes no sense to put 10GbE on a motherboard right now. If you want 10GbE, don't get a $1000 motherboard for it; get a motherboard with a free PCIe 4.0 slot and get an adapter. That'll cost an extra ~$100 today, and an extra ~$20 in August.
Fair enough - I'm clearly paying a premium to keep a cleaner interior on the build. I tend to run exactly one card - the video card, and aim to get everything else onboard, but as you pointed out that's definitely not the only way to go, and certainly not the most cost efficient way.
Posted the basic question in another post, but I'll ask here too. What are you using to saturate a 10GbE NIC? Your internet connection? A true 10Gb internet connection seems unusual to me, but I've been out of that game for a while.
The current prices are indeed high, but if I were to upgrade my desktop right now with the best components I can find, it would cost only about $1500.
The $1500 would pay for an AMD 7950X, an ASUS MB with ECC support and a decent configuration of the expansion slots (Prime X670E-Pro), a Noctua cooler suitable for 7950X (e.g. NH-U12A) and 64 GB of ECC DDR5-4800 memory (which for 64 GB costs $100 more than the non-ECC variant).
For the rest, I would continue to use the case, the SSDs, the 10 Gb/s NIC and the GPU that I am currently using.
If I also wanted to upgrade the GPU, I would buy an RTX 4070 Ti, for 3 reasons: it provides enough extra performance to make an upgrade worthwhile; it has the best performance per dollar among all recent GPUs (at MSRP a 7900 XTX would be better, but it can only be found at much higher prices); and it is the only recent GPU for which the vendor specifies idle power consumption and power consumption during light use (e.g. video decoding), with values small enough to be acceptable in a desktop that is not intended for gaming.
There’s been a lot of ink spilled about the decline of Moore’s Law and how it hasn’t yet exactly fallen for all aspects of computer engineering. I think it’s fallen for customers, though. The economics of exponential speed improvement in traditional CPU design have gone away, and the capability/complexity ratio of software collapsed with the fall of Dennard scaling. No fundamentally new applications have come out (save ML, which is not particularly suited to CPUs or even GPUs), so consumers are happy to keep chugging along with their current setup or move more load to their phones.
Even if the increase in hardware cost stays at parity with inflation, it’s tremendously more expensive than it used to be, when waiting six months could get you more machines for your budget.
Gaming, a previous driver of high-end consumer growth, has split into truly mobile, consoles, and very high end PCs. But complex games take more capital and time to develop, so recouping costs is important (except if the studio is part of a conglomerate like Microsoft that can weather temporary market forces). I’d imagine that places pressure on game developers to aim for maximum compatibility and a scalable core. So too bad for the Mac, great for phones, and great for consoles (especially with monetizing the back catalog). And new PCs will have to fight against good-enough, and lower demand funding new hardware and software.
> Add to all of that, this latest batch of CPUs are just stupid power hungry - like 240w+ under load
Eco mode is a thing for AMD CPUs. There is really no point in not using it by default: the benefits are marginal while the power cost is disproportionately huge, and AMD is skipping it more for marketing reasons, to gain some single-digit percentages in benchmarks.
So just enable it (the 105 W one) and enjoy far more reasonable and efficient power usage with basically almost the same performance. See some details here: https://www.youtube.com/watch?v=W6aKQ-eBFk0
This is almost exactly my gaming setup. I know people have different expectations but in my house this is the fastest gaming computer and can play the best games (like MSFS2020) perfectly.
My main computer is an M1 MacBook Air, which is essentially a perfect computer. It never feels slow. It was a little over a thousand bucks. In no world I can imagine could this not be considered an amazing performance value.
I bought a 1080 quite a while after they launched. I still have it. I have thought about building a new PC lately, and it seems to me that newer cards are... not as much of an upgrade as I might have been led to believe? Especially given the cost. If I did build a new PC today, I am not entirely sure I would buy a new video card. The performance/value ratio doesn't seem as appealing as jumping to the 1080 did back then.
> Want a 10Gb onboard NIC? Be prepared to shell out $1000 just for the motherboard.
A quick search turns up options for around 500 euros.
But don't be fooled: 10GbE copper is a power-hungry mess. Go with lower N-BASE-T speeds if you just want some progression after the last twenty years of networking stagnation, or invest in optical and get a real upgrade (20/40/100 Gbps).
What is the use case of 10Gb Ethernet for the regular PC user? Or even the enthusiast?
I have a sprawling homelab and home theater and scarcely need regular 1G. Last summer I transferred a media archive of ~10TB to new hardware, which completed overnight across Cat5. Is there some hot new hobby that generates multi gigabit bursts that I don't know about?
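The arithmetic backs that up. A rough calculation, assuming ~93% effective goodput after protocol overhead (real numbers depend on the disks, SMB/NFS/rsync, and tuning):

    #!/usr/bin/env python3
    """Back-of-envelope transfer times for a 10 TB archive at common
    link speeds, assuming ~93% effective throughput after overhead."""

    ARCHIVE_TB = 10
    EFFICIENCY = 0.93  # rough TCP goodput fraction, an assumption

    for name, gbps in [("1GbE", 1), ("2.5GbE", 2.5), ("10GbE", 10)]:
        bytes_per_s = gbps * 1e9 / 8 * EFFICIENCY
        hours = ARCHIVE_TB * 1e12 / bytes_per_s / 3600
        print(f"{name:>6}: ~{bytes_per_s / 1e6:4.0f} MB/s -> ~{hours:4.1f} h")
    # 1GbE: ~116 MB/s -> ~24 h; 10GbE: ~1163 MB/s -> ~2.4 h

So a 10TB copy that finishes overnight means gigabit was saturated essentially the whole time; 10GbE would turn it into a before-bed job.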
AM5 with PCIe 5, DDR5 support and 10GbE? I've already eaten the cost, but I would love a link to the board you found. I saw a lot in that price range with 2.5 or 5Gb but didn't run into any with 10GbE. That said, I may have missed one. If nothing else it may be useful for when the wife's gaming rig gets an upgrade in a few months.
As to why I'm sticking with 10GbE - I have a 24-port 10GbE switch for the house, so I'm going with kit that matches the network I already have.
Not only are the prices out of whack, but the newest games coming out all seem to have some sort of technical issue on the PC. Shader stutter is nearly a universal thing in most new releases, or the developer doesn't optimize for the platform at all (Callisto Protocol and to a lesser extent, Hogwarts Legacy). So not only are you paying more money than ever, you're experiencing certain issues that just aren't there on consoles.
I recall Jonathan Blow talking about how it's basically impossible to eliminate stutter on Windows now due to a number of design decisions in the OS driver system itself.
I'm wondering if this is the moment for Linux gaming. Valve has certainly taken it a long ways from where it was.
There are hardly any slots on motherboards these days, either. One network card may fit next to the GPU, but all kinds of limitations on NVMe SSD speeds and GPU bus width may start applying just from doing that.
Agreed. I don't mind spending extra money on GPUs (I have 2 3090s sitting next to me) because the improvements are still worthwhile for my use case, but the CPU prices have been unjustifiable, especially on the AMD side. An increased CPU price, absurd motherboard prices AND needing to buy new RAM, all for an improvement that isn't really meaningful outside very specific tasks, is not really worth it. I instead just got a 5900X for my computer and moved its 3900X into a server, retiring its 1600X (which was also sufficient for its work, although the 1600X was noticeably slower for transcoding; the 3900X is proving more than sufficient).
I don't keep up with PC component prices but I thought crypto crashing flooded the market with cheap cards? (Then again I guess BTC is back into the $20k's)
BTC hasn't been viable on GPUs for a while, either. It's the Ethereum Proof-of-Stake change that was the most exciting, but it doesn't seem to have had a significant effect, especially with newer (3-4yr old) cards.
I wonder how much of this increase is due to social media.
10-15 years ago, people would buy graphics cards to play the latest games on. Epeen was a thing, but limited to just some text in your forum signature.
Now, it seems like half the reason people buy any sort of "status" item is for the clout on Insta. It was exactly the same thing with the PS5 2 years ago - people clamouring for a useless object just to show other people they have it.
10-15 years ago people were posting their rigs, adding liquid cooling, ridiculous lighting, massive overclocking. It was definitely more than a forum signature.
I remember buying a voodoo 3 for 130 GBP. That was pretty high end at the time. Who's the target market for 1k+ cards?!
At one point you could get a decent gaming PC for about 500 or 600 GBP. I doubt that's possible now. And it's madness compared to the hardware in an Xbox Series X. Yes, I know its hardware is subsidized.
Unless your internet connection can sustain 10Gb, what's the point of a 10Gb NIC? I have gigabit fiber that rarely gets above 600-700Mbit. Is a full 10Gb internet connection that common?
Even on LAN do you have I/O that can deliver 10gb/sec to the wire?
It doesn't have to deliver 10Gb/sec to be worthwhile. 3Gbps is easy and already more than a 2.5GbE link can handle. And 2.5GbE switches still cost more than what's available used for 10Gbps.
I've got an ICX7250-24p at home that cost about $200 from ebay. That's 24 ports of PoE gigabit, plus 8 SFP+ ports for 10gig. Noise and power are quite reasonable, though it doesn't belong on the desk next to you.
Regular SATA SSDs are bottlenecked by gigabit and 2.5 gigabit. 116MB/sec is not slow per se. Usually only video editors are waiting on file loading bars frequently enough to make an upgrade worthwhile. That and moving around VM images.
I would have expected the crypto implosion to have a depressing effect on graphics card prices (certainly in the secondary market). Any theories why they remain elevated? Is it just supply chain stuff that everything is experiencing?
Reading this really points out to me what a killer deal GeForce Now is. I truly don't understand why more publishers don't allow their games on the platform.
It's not a killer deal for the consumer. That's just Nvidia fucking you over either way and still profiting off it. The market is not only bad at the high end; there aren't any ~$200 value-oriented graphics cards that provide significant upgrades over past generations, either. Just now, the GTX 1060 has stepped down from being the most popular card in Steam's hardware survey, only to be replaced by the GTX 1650, a much newer card that costs about the same and performs about the same. And with less VRAM!
Having what feels like 100ms of input delay at a minimum is pretty awful. Idk if there's some magic way to speculatively render extra frames so that input is resolved locally.
i would suspect that they either fear a loss of profit (b/c platform cut) or reputation (because latency/jitter from bad connection will be wrongly blamed on the game rather than the platform)
> McCarron shines a glimmer of light in the wake of this gloom, reminding us that overall processor revenue was still higher in 2022 than any year before the 2020s began.
Suggests a correction precipitated by panic-buying during the supply chain chaos of the pandemic era. Too soon for doom and gloom for the PC market just yet. Mobile devices have been dominating PCs since long before 2020, and if revenues were still growing in the past decade then there's nothing to suggest that this moment is suddenly the inflection point where the whole thing will come tumbling down, even if you do believe that something like that is inevitable.
I agree with the sentiment, but mobile devices have also seen a plunge in sales. Many in the industry expected 2B units/year, but it maxed out around 1.6B. The last few years have seen volatility, between 1.2-1.4B. Last year was the worst since 2016, and the next worst was 2020.
Only Apple has been relatively flat, and probably only they and Samsung are very profitable.
Most of the profitability in "PC" (GPU/CPU) has really been datacenter for 5 years. Again, Intel executed badly, but the decision to focus on datacenter was right.
> Yeah, everybody upgraded their WFH office setups in the prior two years, now no one needs a new pc. We’re going to be good for a while.
Also, it feels like phones have entered that "Core 2 Duo" PC stage where upgrades don't really matter as much any more. I know software support can still be an issue, but at least on the iPhone side, I don't feel like I need to upgrade before my phone loses OS support.
COVID/WFH panic purchasing for work and school at home.
COVID cash also put money into people's hands to buy stuff, like computers.
Crypto mining and the GPU shortage were also factors, as people were buying systems and parts for speculation. People were buying prebuilt computers to mine, or simply to get the GPU.
Scalpers made everything worse, messing with parts in the supply chain.
So there were the supply and demand factors, the extra money for consumers, and the speculation on crypto - it was a perfect storm.
The prices are simply too high for the marginal benefit they offer.
Marginal costs outweigh the benefits, so why would people buy? This is simple economics, and they know this, but they still price-fix because they must meet their minimum profits, whatever those may be.
It's a common problem with monopolies: as soon as the marketplace shrinks to only a few players, where the means of production has been concentrated, those players start dictating prices and may collude without even needing a conspiratorial agreement.
Many people also ignore the fact that Intel ME/AMT and the AMD equivalent (features that cannot be disabled, are not documented, and are prime attack targets) are becoming more well known, and in general people don't want them.
Businesses may find value in those features, but individuals see a cost (i.e. their privacy, and greater future risks that are unquantifiable).
They've broken their own market, and the rot will only get worse for them, since it's unlikely they will right the course. Many IT people wonder if there isn't some secret government mandate requiring these companies to embed management co-processors. It clearly offers only minimal value to IT, and it's seen as a cost by individuals who know about it.
They really need to reconsider their actual market instead of the fairy-magic-kingdom thinking they have been following.
You're putting the fact that modern computers have AMT up there with COVID, supply shortages, and crypto crashes in terms of sales loss? You really, really need to get out of whatever bubble you're in.
AMD ST (formerly PSP) and Intel CSME (formerly ME, not the same as AMT) are the only reason that I, and 3 close friends off the top of my head, are completely disinterested in buying any new x86 CPUs.
All 4 of us work in big tech companies and have 6-figure take-home pay even after expenses. I don't mind paying high prices due to shortages. Lockdowns were BS, and I continued to eat out regularly at restaurants from September 2020 to the present in a state with lax mask requirements. Only 1 of the 3 friends lost any money in crypto, and it was a very small amount compared to his annual income.
Just because you and your social sphere don't fully appreciate the privacy implications of CSME and ST, which again, aren't the same as AMT, doesn't mean nobody else cares about them. Have you considered that it might be you who is in the bubble?
I don't know about anyone else, but I've been patiently waiting for the Ryzen 9 7950X3D since the 5800X3D came out. The gaming performance on that chip was so good that it was competitive with more expensive chips at the time, despite being slower for productivity workloads. My 4790K is starting to show its age when playing games like Rimworld and Elden Ring.
I'm waiting for benchmarks. So far I'm not convinced the 7950X3D will be better than the 7950X, especially since there is no way the scheduler will be able to tell whether some thread benefits from more cache or from higher clocks, unless someone develops a very sophisticated one with AI-like training capabilities? I haven't seen any efforts of that sort (I'm gaming on Linux).
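Absent a smarter scheduler, the blunt workaround on Linux is to pin the game to whichever CCD you want and let everything else float. A sketch using only the standard library; both the PID and the assumption that the V-cache CCD is logical CPUs 0-15 are illustrative (the mapping varies by board/BIOS, so check lscpu first):

    #!/usr/bin/env python3
    """Pin a running game to one CCD of a dual-CCD part (e.g. 7950X3D),
    sidestepping the cache-vs-clocks scheduling question. Linux-only."""
    import os

    VCACHE_CPUS = set(range(16))  # assumption: CCD0 (cache die) = CPUs 0-15
    GAME_PID = 12345              # hypothetical PID of the game process

    os.sched_setaffinity(GAME_PID, VCACHE_CPUS)
    print(f"pid {GAME_PID} restricted to CPUs {sorted(VCACHE_CPUS)}")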
PC sales aren't tied to GPU sales, though. Gamers switch GPUs way more often than CPUs (which makes sense, since CPUs often aren't the bottleneck).
PCs have simply become fast enough over the past 10 years and there's been no "Windows Vista"-moment that forced users to migrate to new hardware en masse. A 5 year old system will simply feel pretty much the same to the average user as a brand spanking new one. There's only so many video editors and professional gamers/streamers who thirst after the latest hardware.
> Most of the downturn in shipments is blamed on excess inventory shipping in prior quarters impacting current sales.
This is in line with comments from Drew Prairie (AMD’s VP of communications): [1]
> We are shipping below consumption because there is too much inventory in the channel and that partners want to carry lower levels of inventory based on the demand they are seeing
[1] https://www.pcworld.com/article/1499957/amd-is-undershipping...
i recently built my first pc and moved my daily driver from being a thinkpad to a custom desktop.
i was already sitting at a desk, so ergonomically it’s identical.
now i can compile blender in 20 seconds and fly around with eevee. i can compile linux with custom lsm modules.
dual ssd makes it easy to dual boot. reboot into windows and i can have a magical evening.
7950x, 4090, 990 pro. it would be great if these were cheaper, then more people could afford to use them. it’s also ok that they are overpriced. c'est la vie.
to anyone spending a majority of their life on a computer and making money, the cost of your primary computer doesn’t matter unless it’s ludicrous.
the opportunity cost is far higher. what might you have learned had you tinkered with blender or a kernel when you were bored?
Lowball speculation lacking any semblance of supporting data aside, remember that this doesn't require a new product line or any new features, just removing an existing component that costs more to include, and restoring the owner-controlled CPU core initialization code that existed prior to the introduction of ME.
Not to mention that both team blue and team red could charge something of a fortune as a "privacy premium" for not including these much-despised coprocessors. Consider one of the only remotely comparable semi-modern options, IBM's POWER9 processors: midrange versions of these processors and their motherboards each start in the thousands of dollars. For a Ryzen 9 7950X with no AMD ST (PSP) or Microsoft Pluton, I'd eagerly jump at the opportunity to pay $2500.
Back to your estimate - even for consumer demand, your estimate is easily dismissed by simply looking at the existence of companies like RaptorCS, Purism, and the like. In addition, there is enterprise purchasing demand in the hundreds of millions, easily. Google was attempting to remove all proprietary blobs from their servers a few years ago (2017, if memory serves correctly), and ME/PSP was the big barrier they couldn't overcome at the time.
Make no mistake, there isn't a single tech company that wouldn't leap on such coprocessor-free compute to protect sensitive corporate secrets.
> Sincerely, - A larger chunk of your potential customers than you think
I wish this were true, but in reality it's pretty far from that. What do tens or hundreds of thousands of potential customers count for amongst a sea of hundreds or thousands of millions of real customers? A rounding error.
A huge chunk of their market _wants_ these features, another huge chunk just doesn't care, and there's a small (and at times very passionately vocal) minority that cares and doesn't want them.
But this is all setting up for a response to:
> Until then, I will never buy a new x86 processor ever again.
Don't hold your breath; it's not likely you'll ever have the opportunity to spend your money on x86 with the given conditions.
I've addressed this argument further down in the comments. TL;DR: Google wants servers with no Intel (CS)ME. If you don't think other big tech corporations feel the same way, or that the collective purchasing power of a few of the world's big tech companies' server budgets is a significant amount of money, I don't know what will convince you.
There may only be tens or hundreds of thousands of customers, but tens of those customers are going to want hundreds of thousands of individual units. I have no doubt annual sales for modern X86 silicon without the "security" coprocessors would be in the millions of units for the first few years, at a minimum.
I think their processors do not include a management engine, so you are safe to buy one. The management engine included in the chipset can be switched off permanently.
In general usage, it does not matter as long as you use third-party-controlled CAs, distro repositories and automatic updates, not to speak of microsoft, google, nvidia, valve, mozilla spyware that can do anything with your data anytime they (or US/EU government agencies) want.
>I think their processors do not include a management engine, so you are safe to buy one.
There are no new AMD or Intel processors that come without ST (formerly PSP) or CSME (formerly ME).
>The management engine included in the chipset can be switched off permanently.
This is factually incorrect. me_cleaner cannot neutralize or disable modern CSME, there is no way to verify the HAP bit does anything at all, nor that the included TCP/IP stack on the Minix OS cannot accept remote commands to disable the HAP bit, if set. To our current knowledge, only the onboard GbE controller is accessible to CSME's TCP/IP stack, but we're working with extremely limited information. These are closed-source, hardened opaque-boxes that are deliberately designed to be inauditable and tamper-proof. Adding firmware support for other ethernet controllers or wireless cards would conceivably be trivial.
>In general usage, it does not matter as long as you use third-party-controlled CAs, distro repositories and automatic updates,
I compile from source. OS, drivers, browser - all of it. I don't care if you think this is "unrealistic for the average user", my objective is not to have the security model that the average user has.
>not to speak of microsoft, google, nvidia, valve, mozilla spyware that can do anything with your data anytime they (or US/EU government agencies) want.
I do not run Windows, I do not use chromium (or firefox) based browsers, I do not use a discrete GPU, I don't have anything remotely gaming related (like steam) installed.
What I do have is a constitutional right to privacy that does not end where my CPU begins, and an unshakeable resolve wherein I refuse to voluntarily cede that right to privacy just because so many others do.
Right now, I use a Power 9 (PPC64 arch) processor made by IBM for my main workstation. It is 100% open source - every bit of the firmware, and hardware schematics too. I have a few old laptops with Intel and AMD processors that predate the age of ME and PSP, respectively, but they are not powerful enough for running multiple VMs, background services, a few dozen browser tabs, streaming and decoding video over SMB, etc like I do on my workstation.
Windows 11 requires TPM 2.0. Windows 11 is the garbage one everyone is going to skip, but will Windows 12 require the same?
What's the failure rate of Macs?
Someone on reddit did a power analysis of the 13900K and I reposted it here https://news.ycombinator.com/item?id=34404683 https://www.reddit.com/r/hardware/comments/10bna5r/13900k_po... and it shows the CPU at 100W has 75% of the performance at 40% of the power consumption...
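Put in perf-per-watt terms, that's striking. A quick worked example, taking the reposted 75%-at-100 W figure at face value and assuming a ~250 W stock draw (round numbers, not measurements):

    #!/usr/bin/env python3
    """Perf-per-watt gain implied by '75% of the performance at 40% of
    the power' for a 13900K. Stock wattage is an assumed round number."""

    stock_w, capped_w = 250.0, 100.0      # assumed package power, watts
    perf_stock, perf_capped = 1.00, 0.75  # relative performance

    gain = (perf_capped / capped_w) / (perf_stock / stock_w)
    print(f"perf/W at 100 W vs stock: {gain:.2f}x")  # -> 1.88x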
Intel Arc A750s are under $300 and are decent cards at 1080p, and do well at 1440p. DX9 support has greatly improved since release.
Going up a little in price, AMD's 6650 and 6750 are $300-400.
6800 XTs are under $600.
4 years ago you could get a 1070 Ti in Canada for $400 new.
The exchange rate is about the same as well.
Wait, what? Yes there are! An AMD Radeon RX 6800 will set you back $480.
What is your criteria for "good" here?
My computer would be, at today's prices, about $1400.
AMD Ryzen 9 5900X 12 core | 32GB DDR4-3600 | 2 x 1TB PCIe gen 4 | Radeon RX 6700 XT 12GB | (Corsair case, PSU and AIO, MSI X570)
Just buy an Intel PCIe 10Gb card. They're like $100. Slots on motherboards are meant to be used.
xx80s and xx90s are in the "if you need to ask the price, you can't afford it" class.
The 4070 Ti is <$1000.
> Want a 10Gb onboard NIC? Be prepared to shell out $1000 just for the motherboard.
Why not just a PCI-e 10GbE NIC on a regular motherboard?
> now that figure is closer to $5K.
I see plenty of prebuilt PCs (e.g. CyberPowerPC) with 3070 Tis for $2k. 4070 Tis for $2.5k.
For example, the Radeon 6700 XT 12GB was commonly $900+ during the crypto boom, but is regularly around $350 now. That's a pretty big drop.
"Current generation" - only very expensive high end models have been announced (and some aren't selling as low as MSRP yet.)
RTX 4070 Ti $800 | RX 7900 XT $900 | RX 7900 XTX $1000 | RTX 4080 $1200 | RTX 4090 $1600
You have to stick to last generation for excellent performance with less insane pricing.
this tech is bad. awful. only worthwhile for casual games, which are already well served by common hardware locally.
it's all a ruse to try to sell idle capacity in their gpu farms because the AI model training hype never materialized
These figures exclude ARM CPUs; another possibility is that lots of people are switching to/buying ARM devices.
Is the simulation just extremely CPU-intensive?
They are artificially keeping GPU prices high, so people don't want to buy GPUs.
And if they don't want to buy a GPU then they don't want the thing that the GPU goes in.
Pretty sure hardware speed is not the main thing holding people back from doing this. Apple silicon is also just getting too good.
I will buy a new CPU when you offer one without Intel (CS)ME / AMD Secure Technology (formerly PSP).
Until then, I will never buy a new x86 processor ever again.
Sincerely, - A larger chunk of your potential customers than you think
Reality is probably 2000 people max. Wouldn't even be worth having a meeting to discuss a new product line.