jessfyi · 3 years ago
These prices are beyond insulting and frankly I'm glad they're going to take a hit competing with the flood of miner cards entering the market.

Also note that nothing is preventing Optical Flow Acceleration [0] (and subsequently the DLSS 3.0 models that they claim are exclusive to the 40 series) from running on RTX 20- or 30-series cards. Just like RTX Voice and other gimmick "exclusives", I expect it to become available on older cards the moment they realize their backlog of 30-series cards isn't clearing as quickly as they thought.

They're competing against an over-bloated secondhand market, AMD, Intel, much better integrated GPUs, and comparatively cheaper consoles that maintain sky-high demand with subsidized games via new subscription programs. They're vastly overestimating their brand loyalty (think Microsoft v Sony after the 360) and EVGA's exit makes more sense now than ever.

[0] https://developer.nvidia.com/opticalflow-sdk

rndmize · 3 years ago
IMO they still have too many 3xxx cards and don't want to cannibalize sales, hence adding them to the top of the stack instead of replacing the 3080/ti (and renaming their 4070 as 4080 12gb).

The feeling I get from the whole thing is NV doesn't care that much about consumer graphics anymore, and doesn't care about providing competitive value. More of their revenue is coming from business applications, all of which is growing, and gaming looks like it's down to 1/3 and shrinking.

ChuckNorris89 · 3 years ago
>gaming looks like it's down to 1/3 and shrinking

Most of my PC gaming friends and I gave up on this hobby in the past two or three years due to the insane GPU prices. I moved to the Oculus Quest 2 and they moved to the PS5 and Steam Deck, since the price of a modern GPU alone buys you a whole stand-alone gaming console today. Crazy stuff.

I assume the 4000 series is basically a paper launch designed to make the existing 3000 inventory look attractive and make the people who were waiting on the 4000 series being good value rethink and just buy a 3000 instead.

dcow · 3 years ago
Tom at MLID pretty much confirmed as much in his recent two Broken Silicon podcasts. Nvidia is pricing this way so they can run the 40 and 30 series lines concurrently.
jhanschoo · 3 years ago
Regarding

> gaming looks like it's down to 1/3 and shrinking

I believe that you're looking at the same chart that I was thinking of when I saw this line, which likely incorporates crypto GPUs into the "gaming" segment.

quacker · 3 years ago
> These prices are beyond insulting

The 3080 released for $800 USD in Jan 2021, which one calculator[1] says is $905 USD with inflation. So a 4080 is $900. At least they are consistent.

edit: 3080 12 GB, that is. The original 3080 was $700 USD in Sep 2020 which is $800 in 2022. So the base 4080 is priced above that.

> They're competing against an over-bloated secondhand market, AMD, Intel, much better integrated GPUs

For the 4080/4090 pricing, the only competition is AMD.

Just glancing at some benchmarks, the Intel ARC A380 is worse than a GTX 1650[2] and the best integrated graphics I know of is the Ryzen 7 5700G which is worse than a 1050[3]. I don't see why these would affect 4080/4090 pricing at all.

1. https://www.bls.gov/data/inflation_calculator.htm

2. https://www.pcgamer.com/official-intel-arc-380-benchmarks/

3. https://www.tomshardware.com/reviews/amd-ryzen-7-5700g-revie...

enlyth · 3 years ago
The $900 one is the 12GB version, which is basically a 4070 compared to the 16GB version: the memory is not the only difference, it also has a lower core count, a narrower memory bus, and different clock speeds, which is frankly even more insulting.

The "actual" 4080 16GB is $1200.

trident5000 · 3 years ago
PC gaming today: pay absurd prices, take time setting everything up, sit in an uncomfortable desk chair hovering over a mouse.

Console gaming: cheaper, plug it into your TV and you're done, sit on your couch, use a controller.

I use a PC for a lot of things but gaming will never be one of them. The console looks great as it is.

kllrnohj · 3 years ago
> They're competing against [..] Intel

No they aren't. All signs point to ARC's first outing being a swing and a miss, and Intel itself didn't even have any plans to come anywhere close to competing at the top end. Intel's highest-end ARC card, the A770, was positioned against the RTX 3060. Intel has missed every release date it set for itself, the drivers are a disaster, and rumors are that Intel is going to shutter the entire division and focus only on data center compute cards.

So used cards are a factor, yes, but if this really is a 2x performance jump then they aren't going to matter here. None of the $1000+ GPU-buying market is going to care about saving $100-300 to get half the performance, after all.

That just leaves AMD as competition, and only if RDNA 3 comes out a winner. The rumor mill is saying it will be, but AMD's graphics track record is spotty, to put it mildly. I'm sure Nvidia's internal plan is to just cut prices if RDNA 3 surprisingly competes at the top end instead of just the midrange, but in the meantime it is happy to take those fat, fat margins.

meragrin_ · 3 years ago
> That just leaves AMD as competition, and only if RDNA 3 comes out a winner.

The RX 6650 XT is in the same price bracket as the RTX 3050. I'm not sure they would be considered competition even if the RX 77-whatever were in the same price bracket as the RTX 3050. Nvidia just seems to push the right buttons on the software features which AMD cannot seem to get right.

Don't get me wrong. I would love AMD to be competition. They should frankly team up with Intel. AMD has the hardware and Intel can provide the software. I don't see that happening, considering AMD went alone on their FP8 proposal and everyone else joined up on a separate paper.

tehsauce · 3 years ago
Hard to blame them for pricing when they can sell the exact same chip to a datacenter for $30k.
belval · 3 years ago
Bingo. Nvidia isn't making enough A100s and A10Gs to fill demand from GCP/Azure/AWS. The idea that they are worried about their position in the gaming industry is off the mark.

What's happening is that with mining dying they know the gaming market won't magically expand; there will be a few years of reduced interest. The only way they can make it look good on their balance sheet is if the enterprise segment grows, and that means reassigning their capacity towards it even if it means reducing volume on the consumer side.

From a business point of view this makes perfect sense.

bitL · 3 years ago
In that case AMD should blow them out of the water in gaming, as they have RDNA for gamers and CDNA for datacenters.
ftufek · 3 years ago
While I do wish they priced lower, when you put things into perspective, it blows my mind that I can buy a card with 76 billion transistors(!) for just $1,600. I suspect the demand for top-of-the-line RTXs will come from enthusiast gamers and ML researchers/startups. Most gamers would be more than fine with a 3090, which handles almost anything. The market will be very interesting to watch.
mechanical_bear · 3 years ago
> handles almost anything

What is something it can’t handle?

glitchc · 3 years ago
Sorry, what are the prices exactly? When I attempt to buy I just get an option to be notified. Do I need to sign up just to see the price?
babel_ · 3 years ago
$1599 for 4090, $1199 for 4080 16GB, $899 for 4080 12GB (aka 4070). See Nvidia's News page for the announcement.

Edit: Apologies, got the 4080 numbers wrong, accidentally used GBP, corrected above. For the curious, the 4080 16GB is £1269, the 4080 12GB is £949. Meanwhile, the 4090 is £1679.

fsn4dN69ey · 3 years ago
I think it's fine, honestly - remember we have inflation and a semiconductor shortage, and 3090s retailed at the same price like two years ago. Nobody honestly needs this level of hardware for "gaming" - like, honestly, 12-24 GB of VRAM? I'm not saying you shouldn't buy it if you want it, but it's definitely a luxury product, or for non-partnered AI work.
njdvndsjkv · 3 years ago
I think this hits the nail on the head. Nobody "deserves" a top-of-the-line gaming GPU. And if most gamers can't afford it, game developers will not develop for it as a minimum. Especially for multiplayer games, where the total number of players is integral to the game being played at all (which is why many are free to play).
dudeinhawaii · 3 years ago
This is not true, unfortunately. For high-end 4k gaming you can easily become memory throttled on 8 or 10GB of VRAM. This is the performance cliff you see on cards like the 8GB RTX 3070. Granted, it's not for everyone, but it's certainly something 4k gamers would want.
brokenmachine · 3 years ago
I bought an 8GB GPU recently and was slightly surprised to find that I needed to turn down settings to save VRAM in newer games.
fomine3 · 3 years ago
Some mod users and VRChat users really want VRAM because that content isn't optimized well, but they don't need high-end compute.
zokier · 3 years ago
Is market bearing pricing really beyond insulting?
tyjen · 3 years ago
Seems like it's higher than it "should" be with GPU mining winding down due to difficulty maintaining profitability. Then again, companies will generally behave with asymmetric pass-through if they figure it won't impact sales enough by pricing out consumers -- a prices-rise-like-a-rocket, fall-like-a-feather type of deal.

Let's see though:

- 1080 Ti (2017) = $699 = $845 ('22 price counting inflation)

- 2080 Ti (2018) = $999 = $1,178 ('22 price counting inflation)

- 3080 Ti (2021) = $1,199 = $1,311 ('22 price counting inflation)

- 4080 16GB (2022) = $1,199

I don't know when GPU mining really started to popularize, but there was a significant leap in pricing from the 1080 Ti to the 2080 Ti. When you roughly account for inflation, it's about on par with current pricing.

Looks like they could trim prices to 1080 TI levels, but I don't see that happening unless sales slow a bit.

Also, I'd like to add: I used the general Consumer Price Index to figure this out, so it could vary if inflation did not impact GPU supply, manufacturing, transport, and other influencing factors similarly.
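For anyone who wants to redo the adjustment themselves, here's a minimal sketch of the same calculation in Python; the CPI values are approximate annual averages I'm assuming for illustration (the BLS calculator is the authoritative source, and using month-specific CPI will shift the results a bit):

  # Rough CPI adjustment to 2022 dollars. CPI-U values are approximate
  # annual averages; the BLS calculator gives more precise, month-based figures.
  cpi = {2017: 245.1, 2018: 251.1, 2021: 271.0, 2022: 292.7}

  def to_2022_dollars(price, year):
      return price * cpi[2022] / cpi[year]

  for name, price, year in [("1080 Ti", 699, 2017),
                            ("2080 Ti", 999, 2018),
                            ("3080 Ti", 1199, 2021)]:
      print(f"{name}: ${price} ({year}) -> ~${to_2022_dollars(price, year):.0f} in 2022")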

neogodless · 3 years ago
Cannot wait to see if the market bears these prices without ETH mining!

arduinomancer · 3 years ago
> Also note that nothing is preventing Optical Flow Acceleration [0] (and subsequently the DLSS 3.0 models that they claim are exclusive to the 40 series) from running on either 2/3 RTX cards.

Are you sure about that?

The page here says it uses "Optical Flow Accelerator of the NVIDIA Ada Lovelace architecture" which sounds like a hardware feature?

https://www.nvidia.com/en-us/geforce/news/dlss3-ai-powered-n...

stevenlafl · 3 years ago
You're right, at least for me. I've been waiting for a reason to switch, given that 4G decoding capability is available on my AM4 mobo, which would give this a further boost in performance and available VRAM.

NVidia can suck it and go solely target cloud computing - they already dedicated a significant portion of the announcement to just that. Why didn't the facts that A) this is extremely lucrative and B) they already dominate there reduce their prices for regular consumers? Corporate greed.

If I see NVidia cards start shipping in consoles, I'll know they decided to rip us off so they could give manufacturers great deals on the same hardware. A total middle finger to us PC gamers.

I'm getting an AMD vs Intel vibe like I did from years back. I switched then, I'll switch again.

/rant

foobarian · 3 years ago
Say, where might one be able to get one of these cheap miner cards? Asking for a friend.
tonightstoast · 3 years ago
eBay. Make sure the seller has a long history of good reviews and is in your country/region. There are some sellers with over 100 cards they’re trying to offload.
Timshel · 3 years ago
And those are the Founders Edition prices ...

Partner prices were like $200 more last time, I believe.

forrestthewoods · 3 years ago
I think it’s great the high end keeps getting bigger and faster.

Nvidia has primarily been associated with gaming. With Machine Learning that’s starting to change. With tools like Stable Diffusion every single artist is going to want a $1600 GPU and honestly that’s a bargain for them!

I think it’d be sweet if you could spend $100 to $10,000 and get a linear increase in cores, RAM, and performance. That’s not how it works. But philosophically I have no problem with it.

Hopefully AMD steps up and brings some real competition.

ChuckNorris89 · 3 years ago
>With tools like Stable Diffusion every single artist is going to want a $1600 GPU and honestly that’s a bargain for them!

Most artists who are not tech-savvy enough to spin up their own local Stable Diffusion instances are probably going to pay for one of the online services like DALL-E 2 or something.

jackmott42 · 3 years ago
Where do you work? Does your company set prices to be nice, or do you set them to maximize profit?
bornfreddy · 3 years ago
Sarcasm?

iLoveOncall · 3 years ago
> These prices are beyond insulting

? There are no prices.

liquidise · 3 years ago
Prices are on the individual card pages[1][2].

  4090 - $1600
  4080 (16GB) - $1200
  4080 (12GB) - $900
1. https://www.nvidia.com/en-us/geforce/graphics-cards/40-serie...

2. https://www.nvidia.com/en-us/geforce/graphics-cards/40-serie...

neogodless · 3 years ago
https://www.theverge.com/2022/9/20/23362653/nvidia-rtx-4080-...

RTX 4090 $1,599

RTX 4080 16GB $1,199

RTX 4080 12GB $899

capableweb · 3 years ago
4080 12GB: $900

4080 16GB: $1200

4090: $1600

When you click on "Learn more" at the bottom.

thefz · 3 years ago
> competing with the flood of miner cards entering the market.

Remember: don't buy miner cards; let them rot. They artificially generated scarcity, raising prices for everybody during the pandemic. And don't believe that a card that has been on 24/7 is safe to use. "But it was undervolted!" means nothing.

iforgotpassword · 3 years ago
Just let them become e-waste and buy something new. Turn on the stove and open all the windows when not at home while you're at it. SMH.

And that "not safe to use"? You think it's gonna explode? Worst case it doesn't work, but even that's probably pretty low risk. Hardware doesn't "wear" that way that quickly.

mckirk · 3 years ago
I would actually be quite interested in seeing some numbers to back that up, but I'm not sure if there have been studies on the damage of mining yet.

In some ways, letting a GPU run at a constant load (and thus constant temperature) is less damaging than a typical gaming workload, I imagine, where the temperature fluctuates often. In other ways it probably isn't (e.g. electromigration, atoms of the chip 'migrating' over time with the constant electron flow, though I have no idea how much of an issue that is with modern chips).

zokier · 3 years ago
> They artificially generated scarcity

I don't think there was anything artificial about the scarcity generated by miners; the demand was genuine. Scalpers are a different matter....

> And don't believe that a card that has been on 24/7 is safe to use

Are you claiming that it is somehow unsafe to use?

belthesar · 3 years ago
This is but one test, but Linus Media Group did some testing and found that mining GPUs worked just fine. Fan wear and thermal pad degradation are about the only real detractors from buying a used GPU. https://www.youtube.com/watch?v=hKqVvXTanzI

Of course, if a card has been running at thermal throttle for long periods of time, that has a chance of damaging it, but given that most mining cards are run underclocked, the likelihood that you'll run into that problem is low.

Melatonic · 3 years ago
This is a horrible take and I will definitely be buying a card at rock bottom prices once miners start dumping them big time.
Ekaros · 3 years ago
I don't think there was anything artificial about the demand... Just mining demand combined with demand from limited other spending options.

For Nvidia it made no sense not to produce as many chips as they could.

saati · 3 years ago
So, generate artificial scarcity?
mrb · 3 years ago
So the 4090 is 38% faster (in FP32 FLOPS) than the current top server GPU (H100) and 105% faster than the current top desktop GPU (3090 Ti). And it's also more than twice as efficient (in FLOPS per watt) as all current top GPUs, even compared to the H100 which is manufactured on the same TSMC 4N process. This is impressive.

The computing power of these new GPUs (in FP32 TFLOPS) is as below:

  Nvidia RTX 4090:      82.6 TFLOPS (450 watts¹)
  Nvidia RTX 4080-16GB: 48.8 TFLOPS (320 watts¹)
  Nvidia RTX 4080-12GB: 40.1 TFLOPS (285 watts¹)
Compared to Nvidia's current top server and top desktop GPUs:

  Nvidia H100 (SXM card): 60.1 TFLOPS (700 watts)
  Nvidia RTX 3090 Ti:     40.0 TFLOPS (450 watts)
Compared to AMD's current top server and top desktop GPUs:

  AMD Instinct MI250X:    47.9 TFLOPS (560 watts)
  AMD Radeon RX 6950 XT:  23.7 TFLOPS (335 watts)
¹ I didn't see the wattage listed in this page by Nvidia; my source is https://www.digitaltrends.com/computing/nvidia-rtx-4090-rtx-...
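If anyone wants to sanity-check the efficiency claim, here's a quick sketch that just divides the figures listed above (TDP is only a proxy for real power draw under a given workload, so treat this as very rough):

  # FP32 TFLOPS per watt from the numbers above (TDP used as a stand-in for power).
  gpus = {
      "RTX 4090":      (82.6, 450),
      "RTX 4080-16GB": (48.8, 320),
      "RTX 4080-12GB": (40.1, 285),
      "H100 (SXM)":    (60.1, 700),
      "RTX 3090 Ti":   (40.0, 450),
      "MI250X":        (47.9, 560),
      "RX 6950 XT":    (23.7, 335),
  }

  for name, (tflops, watts) in gpus.items():
      print(f"{name}: {tflops / watts:.3f} TFLOPS/W")

  # The 4090 comes out around 0.184 TFLOPS/W vs ~0.086 for the H100 and ~0.089
  # for the 3090 Ti, i.e. a bit more than 2x on this (crude) metric.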

adrian_b · 3 years ago
The FP32 comparisons give only a very incomplete picture of the RTX 40 series vs. server GPUs.

Both the NVIDIA and the AMD server GPUs are about 30 times faster at FP64 operations. At equal FP32 Tflops, the server GPUs would be 32 times faster at FP64 operations.

A factor of 16 comes from having a 1:2 FP64/FP32 speed ratio instead of a 1:32 ratio, and an additional factor of 2 comes because the server GPUs have FP64 matrix operations, which do not exist in consumer GPUs.

So the difference in FP64 speed is really huge. Moreover, the server GPUs have ECC memory. When you have a computation that takes a week, you would not want it to provide wrong results, forcing you to redo it.
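Spelling the arithmetic out, using the ratios cited above (a sketch; the exact FP64:FP32 ratios differ between specific products):

  # FP64 gap implied by the cited ratios, at equal FP32 throughput.
  consumer_fp64_fp32 = 1 / 32   # FP64:FP32 ratio on the consumer cards (as cited)
  server_fp64_fp32   = 1 / 2    # FP64:FP32 ratio on the server cards
  matrix_factor      = 2        # server-only FP64 matrix operations

  vector_gap = server_fp64_fp32 / consumer_fp64_fp32   # 16x
  total_gap  = vector_gap * matrix_factor               # 32x
  print(vector_gap, total_gap)  # 16.0 32.0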

mrb · 3 years ago
Of course, server GPUs remain kings of the hill on FP64 workloads, as they have extra FP64 units. I focus on FP32 performance because it is representative of general compute performance on most other workloads, whether FP32, FP16, FP8, or integer.
immmmmm · 3 years ago
Are they still using the same chip in the server and gaming ranges, and crippling FP64 with a fuse or in software?
throwabro1000 · 3 years ago
The more significant changes have not been detailed.

- What will MIG look like on these GPUs?

- Are there multiple graphics queues now?

Otherwise, the RAM remains the greatest limitation of NVIDIA’s platform. They should really design a RAM add-in standard and decouple it from the chip.

IMO NVIDIA’s greatest competition is against Apple, who have achieved a unified memory architecture and a naturally multi queue (actually parallel) graphics and compute API. Everyone from normals to the most turbo “ML boys” do not comprehend yet how big of a deal the M1 GPU architecture is.

mountainb · 3 years ago
It seems like they're really emphasizing the difference in RT performance from the previous generation, but I think the gaming market at least will care more about the difference in raw frames and memory size from the previous generation and AMD's offerings.

Personally, I like using RT for some single-player showpiece games with a 3080 Ti (RT was useless on my 2080), but the games that I play the most do not use RT at all. DLSS is always great on any title that offers it, but again the real issue is that most of the games that people put real time into are the types of games for which RT is irrelevant. Graphical showpiece stuff is just a lot less relevant to the contemporary gaming market than it used to be.

aseipp · 3 years ago
Eh, the current 3080 can already do 60FPS @ 4k HDR on every AAA title I've thrown at it, and that's with an mITX undervolt build (cliffing the voltage at a point). "60FPS @ 4k is readily achievable" has been sort of the gold standard we've been careening towards for years since displays outpaced GPUs, and we're just about there now. The raw frame difference and memory size is nice especially if you're doing compute on these cards, but these weren't holding the gaming market back, at least. So for those segments, you need some juice to go on top of things. I can see why they advertise it this way.

Personally, people say RT is a gimmick but I find it incredible on my 3080 for games that support it. In a game like Control the lighting is absolutely stunning, and even in a game like Minecraft RTX, the soft lighting and shadows are simply fantastic. (Minecraft IMO is a perfect example of how better soft shadows and realistic bounce lighting aren't just for "ultra realism" simulators.) It's already very convincing when implemented well. So I'm very happy to see continued interest here.

shostack · 3 years ago
Try MSFS in VR on ultra settings. With mostly high and ultra I'm sometimes able to get 30fps on my quest 2. Nevermind something like the Pimax 8k or upcoming 12k.

That sim eats graphics cards for breakfast.

TwoNineA · 3 years ago
> the current 3080 can already do 4k @ 60FPS HDR on every AAA title I've thrown at it

Have you tried Cyberpunk 2077?

avrionov · 3 years ago
The games are going to adapt to the new cards and 3080 performance will go down with the latest and greatest. It always happens.
nullify88 · 3 years ago
Control was a great looking game (even on my 1080 Ti), and its art direction very unique. Another top release by Remedy.
arecurrence · 3 years ago
I suspect they also highlight RT performance (and AI acceleration, which is more focused on a different market than these gaming cards) because it is their key differentiator from competitors.

Most upper market cards can already run most games well at 1440p or 4k.

trafficante · 3 years ago
I’d be interested in better RT performance for the purposes of VR gaming but, unfortunately, the high fidelity PCVR gaming market died with the Quest 2.
mrguyorama · 3 years ago
It died because people who build $5000 gaming computers aren't numerous enough to actually base a business on.
fomine3 · 3 years ago
No, average people will never play VR with a wired headset and a high-end PC, so that market can't grow. The Quest 2 is an acceptable device, and it can be upgraded via wireless Oculus Link to a high-end PC.
nomel · 3 years ago
PSVR2 will be our savior.
jeffcox · 3 years ago
While RT is a single player gimmick I mostly turn off, it won't take many more increases like this to make it a very real feature. What will we see when developers start targeting these cards, or their successors?
SketchySeaBeast · 3 years ago
If it's actually another 2x performance improvement, we're really only another generation away at most. On my 3080, RT caused a performance drop, but I was able to decide the trade-off for myself; often it was still playable, I just needed to decide whether I wanted high refresh rates or not. Another doubling or two and it'll barely be a thought at all. The caveat is that it still requires DLSS for best effect, which is barely a trade-off in my experience.
izacus · 3 years ago
What does "single player gimmick" in your sentences here mean?
hoffs · 3 years ago
Not even close to a gimmick...
pornel · 3 years ago
Not everyone is a pro CS:GO player. I really like the RTX shadow improvements and wish more games supported them. I've bought games for their "showpiece stuff".
seanp2k2 · 3 years ago
I wish Valve would put a bit of investment into TF2, as it's still a very popular title with ~100k players on average[1], yet it's 32-bit ONLY and stuck on DirectX 9. The Mac version doesn't even work anymore because it was never updated to be 64-bit. The dedicated server, SRCDS, is also 32-bit only and struggles with 32 players and some fairly light admin mods. It's also mostly single-threaded, so it tends to run better on Intel CPUs with high clock speed.

1. https://steamcharts.com/app/440

iLoveOncall · 3 years ago
> but I think the gaming market at least will care more about the difference in raw frames and memory size

I'm not sure I understand why memory is important for gaming? For most games, with every setting maxed out, it'll be a stretch if it uses 6GB of VRAM.

For other applications than gaming, 100% agree, but for gaming I can't imagine it's important.

capableweb · 3 years ago
> I'm not sure I understand why memory is important for gaming? For most games, with every setting maxed out, it'll be a stretch if it uses 6GB of VRAM.

Depends on what kind of gaming you do; not all games consume the same type of resources from the GPU. Simulation games will usually be limited by RAM size/speed, VRAM size/speed and CPU rather than the actual clock speed of the GPU. Action games care more about latency and less about VRAM/RAM size, and so on.

mountainb · 3 years ago
For the majority of games, no it doesn't matter at all. For a small number of games, it's not hard to eat up all the VRAM with certain settings and especially with 4K gaming. I use a 1440p monitor so it doesn't really matter that much to me, but it does matter more to people who spend a lot on good 4K displays.
spywaregorilla · 3 years ago
I've yet to see a game that my laptop 2080 can't handle on max settings. The only games I've played that asked for ray tracing were Control and Resident Evil 8.

The pool of games that ask for extremely high performance is very small and pretty easy to ignore by accident.

EugeneOZ · 3 years ago
> DLSS is always great on any title that offers it

It is very subjective. DLSS for me is always a blurry mess. I turn it off in every game - I'd rather turn off some RTX features than suffer that blurriness torture.

zzixp · 3 years ago
4080 -> $900/1200, 4090 -> $1600

What the hell Nvidia. Post EVGA breakup, this is a bad look. Seems like they're setting MSRP ridiculously high in order to undercut board partners down the line.

Philip-J-Fry · 3 years ago
That pricing is absolutely insane.

Reminder that the 780ti was $700. The top spec SKU.

Nvidia has taken the GPU shortage pricing and set it as the norm. And people will eat it up like suckers.

bigmattystyles · 3 years ago
It’s funny, I just had to have 40 feet of wood fence replaced and paid ~$4K for parts and one day of labor. They also took away the old fence. My point: electronics are cheap. I know this point will not be received well, though.
tromp · 3 years ago
> And people will eat it up like suckers.

Pricing can't be insane and cause huge demand at the same time...

quacker · 3 years ago
> Reminder that the 780ti was $700. The top spec SKU.

$700 USD in 2013 is worth $900 USD in 2022. Almost a decade of inflation.

Also, the GTX Titan was $1000 USD msrp, which was essentially the 780Ti with double the memory.

postalrat · 3 years ago
Is $1600 that insane when people are paying $1100+ for a phone?
behnamoh · 3 years ago
I honestly hope nvidia’s monopoly crashes and burns to ashes…
segadreamcast · 3 years ago
For comparison you can buy 3x Xbox Series S (which will play almost all current gen AAA games) for the price of 1x 4080. Absolutely preposterous.
partomniscient · 3 years ago
As a side note, the Xbox Series S was the cheapest and most effective way for me to buy a 4K UHD Blu-ray player.

I did try to do some CUDA based AI rendering stuff on my 2080S but 8GB didn't seem to be enough.

It's weird to comprehend the 'stretching' of technology advances over time as I age, especially on the value side. Nothing since has ever had the same 'feel' for me as the first leap from software-rendered Quake to a 3D card, despite the various advances, although I remember bump mapping as another massive leap.

It's totally unfair in some ways. As an example the control you have over lighting a scene (either in a game or something you're rendering) is way beyond what multi-million dollar studios were using in my younger years.

What happens to society/reality when technology capable of producing video content indistinguishable from reality is affordable to many? It's already happening. It's going to become more commonplace.

The problem of 'truth' becomes massive - is the thing presented to you something that actually happened or was it fabricated?

capableweb · 3 years ago
For comparison, a 4080 has a lot more uses than a Xbox Series S, which I think you can only realistically use for gaming.
icu · 3 years ago
I completely agree. In the past I dropped a ton of money on gaming rig hardware that aged like milk. With a console you get the advantage of exclusives, the majority of PC game releases, and a longer upgrade cycle versus a gaming rig. If you own a PS5, you've got the PS VR2 coming out soon at a decent price point. If you own an Xbox, add in the amazing value of Xbox Game Pass, and I just don't see the need to be subsidising hardware manufacturers' bottom lines anymore.
jgtrosh · 3 years ago
Even though these GPUs are severely overpriced, you can't compare them with the console sold at the largest loss
cma · 3 years ago
Subsidized by a hidden cost multiplayer subscription and store lock-in cut on all games played on it.
honkycat · 3 years ago
I can't do music production and game development on an xbox series s though.
apazzolini · 3 years ago
You forgot to mention "at 60 FPS".
mechanical_bear · 3 years ago
I can’t train ML models on the Xbox.
LegitShady · 3 years ago
Raising MSRP doesn't undercut board partners, it allows board partners to charge more which is what they want.

Personally I think what they're doing is maximizing launch profit without totally crashing the 30x0 market which has a supply glut, something else board partners are worried about.

I'm really not sure why you think this is bad for board partners - you haven't really explained your reasoning and what you did say doesn't make much sense. This is all positive for board partners. It's the cheap founders edition cards that undercut board partners.

madamelic · 3 years ago
> Raising MSRP doesn't undercut board partners, it allows board partners to charge more which is what they want.

My understanding regarding the EVGA debacle is that Nvidia sets a really high MSRP while charging partners for the chips and giving itself a head start, and the partners can't go above the MSRP; then Nvidia comes in and slashes prices to levels only Nvidia can compete at (since it can give itself discounts on the chips, along with all of its other advantages).

What board partners likely want is one of two things: either Nvidia's Founders cards are time-limited to the first few months, or Nvidia allows partners to sell cards for more money while Nvidia makes nothing but Founders cards.

Aunche · 3 years ago
GPU prices will trend towards market prices either way. I'd rather they somewhat overshoot MSRP than undershoot, which is what happened with the 30-series GPUs. It's not fair that scalpers were getting rich while EVGA was operating at a loss.
coolspot · 3 years ago
I am pretty sure EVGA sold directly to miners, likely at a good margin: my day-one EVGA queue was never fulfilled, while miners showed warehouses full of EVGA cards.
ls612 · 3 years ago
The Titan in 2013 was $1,000; with the ~25% inflation since then and the 25% tariff, that is almost exactly $1,600.
terafo · 3 years ago
But 4090 isn't the Titan. It doesn't have Titan drivers and everything.
Philip-J-Fry · 3 years ago
The 780ti was effectively the Titan and was $700
Nokinside · 3 years ago
Yields are low at the beginning of a new process, and the 4090 Ti might be delayed. By setting the price high, you manage demand and avoid empty shelves and scalpers.

The 3090 is still just fine and generates profit.

The price is right if you sell all you produce. Empty shelves mean you charge too little.

adrian_b · 3 years ago
While I expected about $1600 for 4090, for the other two, whose GPU sizes are 9.5/16 and 7.5/16 of the top model, I expected prices proportional to their sizes, i.e. about $1000 and $700.

However, NVIDIA added $200 for both smaller models making them $1200 and $900.

It is debatable whether 4090 is worth $1600, but in comparison with it the two 4080 models are grossly overpriced, taking into account their GPU sizes and their 2/3 and 1/2 memory sizes.

Philip-J-Fry · 3 years ago
They will price the 4080 at $1200 to push people over the line for the 4090. After all, the performance difference for "just" $400 is quite significant.
omegalulw · 3 years ago
They are probably working towards cleaning out the 30 series stock given that crypto demand is dying off. It takes a lot to put 3060 prices on a marketing slide for 40 series.
sheepybloke · 3 years ago
That sort of makes sense... I was thinking of upgrading when the 40 series came out, but looking at these prices, it makes the 30 series look a lot more affordable for what I need, even if it's less performant.
mnd999 · 3 years ago
I’ll pass. Needs to be 1/4 of that to be worth considering. You can have a whole PC for less.
seanp2k2 · 3 years ago
>You can have a whole PC for less.

Not with even 50% of the FLOPS. The 4090 won't be the best perf/$, but considering how much perf/$ and perf/watt you can get these days, I think it's pretty hard to complain that these are available. If you don't play games, get a Ryzen 5 7600X for $300 with an iGPU more than capable of high-quality desktop computing.

You can also get a brand new laptop for $130 right now https://www.bestbuy.com/site/asus-14-0-laptop-intel-celeron-... , which is $57.37 in 1990 dollars. (according to https://www.usinflationcalculator.com/ ). The high-end is more expensive than ever, and the low-end is cheaper than ever. The Apple Lisa was $10k in 1983, which is about $30k today. You can get a Boxx Apex with multiple $10k Quadro GV100 GPUs and 8TB SSDs for that money today, or roughly a dozen high-end (but not TOTL) gaming PCs.

Melatonic · 3 years ago
Yea this is insane. Screw Nvidia and wait for the second hand market to hit rock bottom (I would estimate this will start to happen in 1 to 3 months) and buy a used card for dirt cheap.
paulmd · 3 years ago
they're setting prices high (and delaying the launch of the mainstream and lower-tier cards) because they have huge stockpiles of Ampere chips they need to burn through. Launch the stuff at the top to get the sales from the whales (VR, enthusiast gaming, etc) who are willing to pay for performance and delay the lower cards that would compete with your ampere inventory.

It's the Turing strategy all over again - when you have a stockpile of older cards, you don't make your new cards too attractive. And yes, they also have a big TSMC allocation too - but they gotta get rid of the Ampere stuff first, the longer it sits the more it will have to be marked down, launching new cards that obsolete the inventory they're trying to sell would just make things worse.

AMD is going to be doing the same thing - they pushed back the midrange Navi 33 and will be launching only the high-end Navi 31 until that miner inventory burns through a bit. Similarly to NVIDIA, that likely implies they'll be launching at a high price, they'll undercut NVIDIA by $100 or $200 or something and take the margins but they're not gonna be the heroes of the $700 market either.

---

The EVGA thing is a tempest in a teapot though; the losses he's talking about are something that happened in the last month (he supposedly decided to quit back at the start of the year) and are not representative of the (large, 10x normal) margins that board partners have been making in recent years. I personally didn't see much evidence of "price caps" with partners' first-party storefronts selling 3080s at 2.5-3x FE MSRP either.

And yes, jensen is an asshole and being a partner is a low-margin business, everyone already knows that.

EVGA is losing money because of some of its CEO's ridiculous prestige side-projects (custom motherboards, enthusiast monitors, video capture cards that turned out to be falsely advertised, PCIe sound cards, etc.), generous warranty support (long and transferable, with absurdly cheap extended warranties), a higher-than-average failure rate (because they contract out assembly), and a generally lower-than-industry margin (because they contract out assembly) - and they're just being drama queens on the way out.

Someone else with a personal axe to grind (EVGA tried to blacklist him for a critical review), but relaying some commentary from the other board partners: https://www.igorslab.de/en/evga-pulls-the-plug-with-loud-ban...

sylens · 3 years ago
I might be priced out of the market when my current rig goes. For the price of a 4080 you could have both consoles or a console and a Steam Deck
perryizgr8 · 3 years ago
You can get a 3060 Ti that performs better than both consoles combined. You don't need to go to the 40 series if you care about value for money at all.
sheepybloke · 3 years ago
Especially when they've had lower than expected revenue because of the crypto crash. You'd think that they would drop the price a bit to entice the people who've been waiting for GPU prices to drop to buy the new version.
izacus · 3 years ago
Didn't partners demand higher prices so they could capitalize on high GPU demand?
datalopers · 3 years ago
They must think there are still GPU crypto miners out there.
leetcrew · 3 years ago
meh. after the last couple years, I'd rather have something expensive on the shelf than something "reasonably priced" but perpetually unavailable.
mdorazio · 3 years ago
450 watt TDP? I feel like a crazy person every time a new generation of GPU comes out and raises the bar on power consumption and heat generation. How is this ok?
aseipp · 3 years ago
The 3090 Ti series was already pushing 450W so this isn't new.[1] And it's because they clock these things incredibly high, well beyond the efficiency curve where it makes sense. Because that's what gaming customers expect, basically. On the datacenter cards they quadruple or octuple the memory bus width, and they drop the clocks substantially and hit iso-perf with way, way better power. But those high-bandwidth memory interfaces are expensive and gaming workloads often can't saturate them, leaving the compute cores starved. So they instead ramp the hell out of the clocks and power delivery to make up for that and pump data into the cores as fast as possible on a narrow bus. That takes a lot of power and power usage isn't linear. It's just the nature of the market these things target. High compute needs, but low-ish memory/bus needs.

This isn't necessarily a bad or losing strategy, BTW, it just is what it is. Data paths are often very hot and don't scale linearly in many dimensions, just like power usage doesn't. Using smaller bus widths and improving bus clock in return is a very legitimate strategy to improve overall performance, it's just one of those tough tradeoffs.

Rule of thumb: take any flagship GPU and undervolt it by 30%, saving 30% power/heat dissipation, and you'll retain +90% of the performance in practice. My 3080 is nominally 320/350W but in practice I just cliff it to about 280W and it's perfectly OK in everything.

[1] Some people might even be positively surprised, since a lot of the "leaks" (bullshit rumors) were posting astronomically ridiculous numbers like 800W+ for the 4090 Ti, etc.

PragmaticPulp · 3 years ago
> The 3090 Ti series was already pushing 450W so this isn't new.[1]

Every time this comes up, people forget that:

1) Flagship cards like the 3090 Ti and 4090 consume significantly more power than typical cards. You should not buy a flagship card if power is a concern.

2) You don't have to buy the flagship card. Product lineups extend all the way down to tiny single-fan cards that fit inside of small cases and silent HTPCs. It's a range of products and they all have different power consumption.

3) You don't have to run any card at full power. It's trivial to turn the power limit down to a much lower number if you want. You can drop 1/3 of the power and lose only ~10-15% of the performance in many cases due to non-linear scaling.

4) The industry has already been shipping 450W TDP cards and it's fine.

If power is an issue, buy a lower-power card. The high-TDP cards are an option, but that doesn't mean that every card consumes 450W.

seanp2k2 · 3 years ago
Yeah, I thought those leaks were pretty insane, like are we really going to all need 1KW+ PSUs to run flagship GPUs? I picked up an 850W Seasonic Prime a few upgrades ago and even running benchmarks on an OC 12900KS + 3090 with 8 case fans it's totally fine. I was hoping to not have to upgrade PSUs for another few years.
ftufek · 3 years ago
For what it's worth, you can power limit them and I'd highly recommend it if you plan on running a few of these. In the past we've power limited RTX 3090s to 250W (100W lower than original) while losing negligible amount of performance.
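For anyone curious how that's done in practice, here's a minimal sketch using the stock nvidia-smi tool (it assumes nvidia-smi is on the PATH, the process has the admin/root rights nvidia-smi requires for changing limits, and that the target wattage is within the range reported by "nvidia-smi -q -d POWER"; 250 is just the example figure from this comment):

  # Cap the board power limit of GPU 0 at 250 W via nvidia-smi.
  import subprocess

  subprocess.run(["nvidia-smi", "-i", "0", "-pl", "250"], check=True)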
Melatonic · 3 years ago
Do you just limit using something like MSI Afterburner?

I was undervolting and underclocking my Titan XP for a long time (with a separate curve for gaming) but lost the config and have been lazy to go back. I mainly did this because it has a mediocre cooler (blower design that is not nearly as good as the current ones but also not terrible) and even at low usage it was producing a lot of heat and making some noise (the rest of my system I specifically designed to have super low noise).

thamer · 3 years ago
This is not a new development with the 40 series. The RTX 3090 Ti already had a 450W TDP: https://www.techpowerup.com/gpu-specs/geforce-rtx-3090-ti.c3...

What you describe is accurate, the 3090 Ti produces a tremendous amount of heat under load in my experience and I would expect the same with these new cards.

anonporridge · 3 years ago
It's quite funny seeing this and the 2000 watt society on the HN frontpage at the same time.

https://news.ycombinator.com/item?id=32899656

cjbgkagh · 3 years ago
When I first saw that headline I thought it was a society for people who have 2000 watt PSUs.
paulmd · 3 years ago
Dennard Scaling/MOSFET Scaling is over and it's starting to really bite. Power-per-transistor still goes down, but density is going up faster. Meaning an equal-sized chip on an old node vs a new node... the power goes up on the newer chip.

Physics is telling you that you need to let the chip "shrink" when you shrink. If you keep piling on more transistors (by keeping the chip the same size) then the power goes up. That's how it works now. If you make the chip even bigger... it goes up a lot. And NVIDIA is increasing transistor count by 2.6x here.

Efficiency (perf/w) is still going up significantly, but the chip also pulls more power on top of being more efficient. If that's not acceptable for your use-case, then you'll have to accept smaller chips and slower generational progress. The 4070 and 4060 will still exist if you absolutely don't want to go above 200W. Or you can buy the bigger chips and underclock them (setting a power limit is like two clicks) and run them in the efficiency sweet spot.

But, everyone always complains about "NVIDIA won't make big chips, why are they selling small chips at a big-chip price" and now they've finally gone and done a big chip on a modern node, and people are still finding reasons to complain about it. This is what a high-density 600mm2 chip on TSMC N5P running at competitive clockrates looks like, it's a property of the node and not anything in particular that NVIDIA has done here.

AMD's chips are on the same node and will be pretty spicy too - rumors are around 400W, for a slightly smaller chip. Again, TDP being more or less a property of the chip size and the node[0], that's what you'd expect. For a given library and frequency and assuming "average" transistor activity: transistor count determines die size, and die size determines TDP. You need to improve performance-per-transistor and that's no longer easy.

[0] an oversimplification ofc but still

That's the whole point of DLSS/XeSS/Streamline/potentially a DirectX API, get more performance-per-transistor by adding an accelerator unit which "punches above its weight" in some applicable task and pushes the perf/t curve upwards. But, people have whined nonstop about that since day 1 because using inference is a conspiracy from Big GPU to sell more tensor cores, or something, I guess. Surely there is some obvious solution to TAAU sample weighting that doesn't need inference, and it's just that every academic and programmer in the field has agreed not to talk about it for the last 20 years, right?

amelius · 3 years ago
Hardware designers produce more Watts where software developers create more bloat.
capableweb · 3 years ago
I'm not sure what else to expect? Is it so crazy that they make the card even faster and bigger than before, and it uses more power? What else is the next generation of cards supposed to do, have the same performance but make them more energy efficient? Not sure how many people would buy cards that have the same performance but less TDP.
bentcorner · 3 years ago
In a generation or two we're going to start bumping up against regular US outlet limits. Gamers might start setting up their rigs in the kitchen to use the 20A outlets.
msk-lywenn · 3 years ago
It is. I remember when I bought my GTX 260. It was crazy how power-hungry it was at the time for a mid-market card. I had to buy a - drumroll - 450W PSU. Years later, I upgraded to a GTX 960. At first I thought "I probably need an even bigger PSU now...". I was shocked and then very happy to learn that the GTX 960 consumed less than my 7-generations-older GPU. They can do it right when they care.
ollien · 3 years ago
> What else is the next generation of cards supposed to do, have the same performance but make them more energy efficient? Not sure how many people would buy cards that have the same performance but less TDP.

This is what Intel did for a while with their "tick tock" strategy https://en.wikipedia.org/wiki/Tick%E2%80%93tock_model

disconcision · 3 years ago
i mean, the option is there depending on what your requirements are. i recently went from a 980ti to a 3070, doubling performance while decreasing TDP... as someone who has the card for casual use (only really make use of it a few times a month), i actually did just look for the fastest card that ran less hot than my current one (i've come to value quiet a bit more)
perryizgr8 · 3 years ago
Power is a non issue for most people buying these cards for gaming. I couldn't care less if my desktop draws 500W or 1000W while gaming. If I get 144 fps vs 100 fps, I'll gladly accept the extra power consumption.
caycep · 3 years ago
unless you're Apple and have your cake and eat it too...sometimes I wonder if maybe the design goals at NVIDIA and AMD are misplaced a bit
tepmoc · 3 years ago
Nowadays I find that watts per FPS is a better metric. And Nvidia is just milking performance by drawing more current (less efficiently).
jltsiren · 3 years ago
It's good to remember that you don't have to buy the most powerful GPU model just because you can afford it.

Some people are probably in the target audience for the 4090. Others may prefer the 4080 models, which have a slightly lower TDP than the 3080 models but still get a nice performance boost from much higher clock rates.

jackmott42 · 3 years ago
People want to render more triangles. What are you gonna do?
fomine3 · 3 years ago
I wonder when Nvidia will start selling lower-powered models like Intel's "T" CPUs. They're just underclocked/undervolted and slightly binned chips, but some consumers like them. Energy Star will like them too.
behnamoh · 3 years ago
Apple M1 showed us that it’s possible to increase performance while keeping the power consumption low—laptop level low.
cma · 3 years ago
They bought out exclusive access to the most efficient node, making it impossible for others rather than showing it is possible for them (though it presumably would have been possible if they outbid Apple at the time).
nfRfqX5n · 3 years ago
Market doesn’t seem to care yet
izacus · 3 years ago
As a gamer... Should I care?
TylerE · 3 years ago
Depends, do you like your PC sounding like an airplane from all the fans spinning up?
leetcrew · 3 years ago
imo people get a bit too worked up over the peak power draw of parts that are going to be fully loaded only a couple hours per day.

the bigger issue is where all that additional heat is going and how. I'll be paying close attention to the SPL metrics for this generation.

aordano · 3 years ago
If you're on 110V you should care about the current limits
rimunroe · 3 years ago
I have bigger things to worry about personally as far as my energy consumption is concerned, but the main way this impacts me is that my wife and I probably need to get some electrical work done in order to support our devices being in the same room. Our two gaming computer towers theoretically have a maximum combined power draw of 1800 Watts, which is well above the 1450 W which I think is the maximum sustained (not burst) load for a 15 A residential circuit in the US. This is ignoring the few hundred extra watts for monitors, speakers, laptops, and a couple lamps.
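For reference, the sustained figure comes from the usual 80% continuous-load rule for US branch circuits; a rough sketch assuming 120 V nominal (actual breaker and code details vary):

  # Headroom on a US 15 A / 120 V branch circuit under the 80% continuous-load rule.
  volts, amps = 120, 15
  peak_watts       = volts * amps       # 1800
  continuous_watts = 0.8 * peak_watts   # 1440

  towers = 1800  # combined nameplate maximum of the two PCs mentioned above
  print(peak_watts, continuous_watts, towers > continuous_watts)  # 1800 1440.0 True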
jaimehrubiks · 3 years ago
Some people might need to upgrade the PSU
causi · 3 years ago
NVIDIA has unleashed its next-gen GeForce RTX 4080 series graphics cards that come in 16 GB & 12 GB flavors at $1199 & $899 US pricing.

Welp, it looks like I'm buying a Series X because fuck that noise. PC gamers got put through hell for the last two years because Nvidia preferred selling to Chinese crypto miners, and this is our reward?

nvllsvm · 3 years ago
Why not just buy a less powerful GPU? The 4080 is likely far, far more powerful than the series X.
groovybits · 3 years ago
As a PC gamer who's slowly losing interest in favor of console gaming...

I don't mean to sound ignorant, but how is any GPU comparable to a console?

For $500, I can get: a Series X and a controller, ready to play out of the box. With GamePass (currently $1 first month), I get instant access to a large library of games.

A current GPU (20, 30, or 40 -series) costs more than that, and I also need to purchase the additional components to put it all together.

I built a PC in college because it used to be the most cost-effective option. But that doesn't seem to be the case anymore. These 40-series prices will define the cost for the next generation of cards, and IMO it's too much to justify for casual gaming.

causi · 3 years ago
Sure, but with console optimization the Series X is roughly equivalent to a 3070 in real-world performance, and I get the value-add of a controller, a UHD Blu-ray drive, and compatibility with my library of 360 games. My GTX 1070 already plays 90% of my game library at the highest details and framerate on my 60fps TV. Also, never having to deal with Steam's screwed-up gamepad framework again is probably worth two hundred bucks in frustration alone.
PartiallyTyped · 3 years ago
NB: the 4080 16GiB has 9728 CUDA cores and a 2.21/2.51 base/boost clock (in GHz), whereas the 12GiB version has 7680 and 2.31/2.61 (in GHz).

In other words, the name is utter bollocks, because the 16GiB model has approximately 1.27 times the CUDA cores of the 12GiB version at approximately 0.96 times the boost clock. Assuming no other bottleneck exists, the 16GiB version should be about 1.22 times faster. That is supposed to be the xx70 vs xx80 difference.
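Back-of-the-envelope version of that estimate (a sketch that assumes throughput scales with cores x boost clock and ignores the memory bus and cache differences):

  # Naive shader-throughput ratio, 4080 16GiB vs 4080 12GiB.
  cores_16, boost_16 = 9728, 2.51   # GHz
  cores_12, boost_12 = 7680, 2.61   # GHz

  core_ratio  = cores_16 / cores_12    # ~1.27
  clock_ratio = boost_16 / boost_12    # ~0.96
  print(core_ratio * clock_ratio)      # ~1.22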

raydev · 3 years ago
The Series X performance is roughly equivalent to an Nvidia RTX2080. Why would you compare it to a 4080?
Arrath · 3 years ago
I briefly flirted with the idea of going back to console town, but I just love modding my games too much.
izacus · 3 years ago
If you want 3050-level performance, you can get that cheap as well.
balls187 · 3 years ago
Try Xbox Cloud Gaming on your PC over your browser.
SketchySeaBeast · 3 years ago
I find it very frustrating that they have two 4080s with very different specs, with only the difference in memory size indicated.
radicalbyte · 3 years ago
The 12GB is the 3070 replacement, they've just changed the label to try to hide the price hike.
SketchySeaBeast · 3 years ago
But then they'll also create a 4070 here. And then a 4070 Ti, and a 4080 Ti, and then... But that's another complaint. The 4080 is a different problem entirely - not enough names. Somehow they have too many names but also not enough, so they can only differentiate via GB.
crthpl · 3 years ago
Their Performance section shows that the 12GB is 50%-150% faster than the 3080 Ti (so better than 3090 Ti!), and it also has more cuda cores and memory than the 3070.
sidewndr46 · 3 years ago
Sounds like Intel SKUs, I think I could get some part time contract work just explaining the difference between two SKUs to folks.
sdenton4 · 3 years ago
Compared to the 3090, the 4090 has about 60% more CUDA cores (16k vs 10k), runs at ~2.2GHz (up from 1.4GHz) and eats about an extra 100W of power.

Over the last couple weeks, it's been possible to get 3090s on sale for juuuust under $1k (I picked one up after watching the prices come down from like $3k over the last couple years). The 4090 is priced at $1,600... (and for me at least would require a new power supply.)

xbar · 3 years ago
The RTX 3090 was $1,500 at launch, and was a 50% boost over the previous gen at best. The RTX 4090 is $1,600 and is asserted to be ~1.9x the performance, to be taken with a grain of salt.

You can buy a 3090 for $999 today at Best Buy. That's not a bad price per frame. $1,600 for the 4090, if it really is 1.9x the speed for raster, is not a bad price per frame either.

Both are stupid. Anyone spending $2k for video games graphics cards is wasting money in a stupid way: they are big, hot, shipped from far away, and turn into toxic landfill garbage in a matter of years. They mark the worst form of destructive consumerism that late-stage capitalism has produced, on par with leasing a new vehicle every 2 years: both are bad as products that turn into needless waste and both consume energy and produce unnecessary heat.

I am definitely going to buy a 4090 on October 12.

svnt · 3 years ago
Your comment is a perfect superposition.
enlyth · 3 years ago
>They mark the worst form of destructive consumerism that late-stage capitalism has produced

Wait till you hear about disposable vapes with perfectly rechargeable lithium-ion batteries