If that's true, how would one rank pumping supercritical liquid nitrogen at high rates through a heatsink? Super-grandmaster?
Seems like the heat flow would be substantially impeded by any boiling of the LN2.
Or, for that matter, simply using a chilled copper ingot as the heat sink? There must be some threshold at which the limiting problem is getting the heat out of the die, not getting the heat out of the chip's package.
Carrier freeze-out actually makes this unworkable -- there's a limit to how cold you can make CMOS devices before they stop functioning. To say nothing of the specific heat of liquid helium, which is minuscule compared to LN2's.
Actually, we didn't learn that today. They don't mention it being stable:
"It's a lot easier to control a benchmark which is always the same", explains Rywak. "A game makes the whole process less predictable in terms of hardware load which could lead to stability issues while working on frozen hardware."
Human eyes don't have a refresh rate, per se. I recall hearing that they can discern differences in frequency up to 3 kHz or so; though obviously you can go much slower than that and still feel smooth and relatively responsive.
What is interesting, from the screenshot, is that the game is actually CPU bound, contrary to an often-held belief in high-end video game optimization circles.
No. I just bought a 4k screen only to find out that 12k is coming down the pipe. I do not want to think about what 1000hz 12k screens will cost. Stop this madness now.
Something is always coming down the pipe. Fortunately you have no need to walk the treadmill, and if you're only just getting a 4K screen now, then you're probably not the kind of insane early adopter who does.
12k is possible but it's hardly around the corner. Even 4K has both content and hardware issues around it. 8K is going to be the next mass market push but we're not even done with the 4K party.
Also, 4K display devices are available at a modest price point now. The bigger issue is content. We're mostly there with mass-market media, but if you want to drive a AAA video game at 4K resolution you have to make compromises and spend a lot on the hardware to drive it.
They're going to keep making new things. And the new things are going to have bigger numbers. It's okay.
12k? Never heard of it. Let's go with 8k, which is 7680x4320.
Assuming our current standard of 8 bits per color with no alpha (3 bytes per pixel), which may be too low if you care so much about your monitor, your required bandwidth becomes:
7680 * 4320 * 3 * 1000 = 99532800000
99532800000 / 1024 / 1024 / 1024 = 92 gigabytes per second of bandwidth you will consume just to pump stuff to your monitor. Better not use integrated graphics!
To give a comparison, here's 4k@60hz:
3840 * 2160 * 3 * 60 = 1492992000
1492992000 / 1024 / 1024 = 1423 MB/s.
Also notice that 8k monitors already employ tactics such as "visually lossless compression" (which means: lossy compression but they think you won't notice) and other stuff aimed at trying to not really submit full frames all the time.
Forget your 12k. It will only be useful to increase your energy bill.
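For what it's worth, here's a minimal sketch of that arithmetic in Python (my own illustration, not anything from the article); it reproduces the ~92 GB/s and ~1423 MB/s figures above:

    def uncompressed_bandwidth(width, height, refresh_hz, bytes_per_pixel=3):
        # Raw bytes per second needed to push full 24-bit frames to a display.
        return width * height * bytes_per_pixel * refresh_hz

    for label, (w, h, hz) in {"8K @ 1000Hz": (7680, 4320, 1000),
                              "4K @ 60Hz": (3840, 2160, 60)}.items():
        bps = uncompressed_bandwidth(w, h, hz)
        print(f"{label}: {bps:,} B/s = {bps / 1024**2:,.0f} MB/s = {bps / 1024**3:,.1f} GB/s")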
In real life it would be subsampled 4:2:0, and 1000Hz is nonsense because it's not divisible by 24 or 30. So a more reasonable 8K@960Hz 4:2:0 is (7680 * 4320 * 960) * (8 + 2 + 2) bits = ~356Gb/s, or only 35.6Gb/s if you pick a reasonable 96Hz.
By 960Hz even a lossless delta coding scheme on the wire could reduce the bandwidth by over 10X for any normal footage.
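A quick sketch of that 4:2:0 version of the math in Python (again just my own back-of-the-envelope, assuming 8 bits of luma plus 2+2 bits of chroma per pixel on average); it lands on the ~356Gb/s and ~35.6Gb/s figures:

    def subsampled_bandwidth_bits(width, height, refresh_hz, bits_per_pixel=8 + 2 + 2):
        # Bits per second for 4:2:0 video: chroma is stored at quarter resolution,
        # so the average cost per pixel is 8 (luma) + 2 + 2 (chroma) = 12 bits.
        return width * height * refresh_hz * bits_per_pixel

    for hz in (960, 96):
        print(f"8K 4:2:0 @ {hz}Hz: {subsampled_bandwidth_bits(7680, 4320, hz) / 1024**3:,.1f} Gb/s")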
For comparison, Netflix was just barely able to saturate a 100Gbps link (that's gigabits per second, so only 12.5 gigabytes per second) from one computer, and that's just pumping data without having to render anything.
At some point, it's not worth upgrading resolution. I don't know what that point is for you, but eyes only have a certain angular resolution, beyond which any additional pixel density is meaningless.
For me, that’s a bit more than 1440p at 3 feet at 27”.
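As a rough check (my own back-of-the-envelope in Python, using the ~60 pixels-per-degree figure commonly quoted for 20/20 acuity; the pixels_per_degree helper is just something I made up for illustration):

    import math

    def pixels_per_degree(diag_in, res_w, res_h, distance_in):
        # Horizontal pixels packed into one degree of visual angle at the screen centre.
        width_in = diag_in * res_w / math.hypot(res_w, res_h)  # physical panel width
        ppi = res_w / width_in                                  # pixels per inch
        inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
        return ppi * inches_per_degree

    # 27" 2560x1440 panel viewed from 3 feet (36 inches): roughly 68 ppd,
    # just past the ~60 ppd often cited for 20/20 vision.
    print(f"{pixels_per_degree(27, 2560, 1440, 36):.0f} pixels per degree")

That comes out around 68 pixels per degree, which is consistent with 1440p at that size and distance being roughly where extra pixels stop paying off.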
I had a quick look about to see what the eye/brain can actually perceive and the below is interesting. We can appreciate frame rates far higher than I thought.
A pilot identifying a plane displayed for 1/220th of a second (reddit link) is pretty impressive.
There is no proven limit to how many frames per second our eyes can see, and I'm sure you would be able to discern a difference between 144Hz and 1kHz. You may not be able to fully comprehend each still image, but the progression would almost certainly appear smoother, especially for fast-moving objects.
This is probably good enough in practice, although you can see differences even beyond 1000Hz by observing the phantom array effect of flickering signals during fast eye movement.
Nonsense. We wouldn't cope with it, in the same sense that we can't cope with the millions of colors in a 24-bit color space. Do we distinguish each individual color? No, but the full spectrum enables continuous color flow.
When it comes to such a high framerate, the upgrade is akin to going from 256-color palette-swapping VGA to 24-bit HD, or rather from stop motion to realistic motion blur.
I also learned that literally pouring liquid nitrogen over a CPU from a cup is a grandmaster overclocker move.
https://www.qats.com/cms/2019/08/02/quantum-computing-coolin...
Of course, the real grandmaster move is to use liquid helium - its boiling point is about 70C colder than nitrogen :)
Edit: seems mobile and desktop have a different crop of that image. Here is the image that shows 720: https://images.ctfassets.net/rporu91m20dc/1XYHhlYZzNI1NxRRJl...
That would make sense as well in their discussion about CPU power - that resolution would require a lot of it compared to, say, 4k.
It weighed a tonne and took up quite a lot of desk.
An RTX 2080 Ti can't even push 144fps at 1440p at max settings, much less 4k.
https://www.digitaltrends.com/computing/asus-rog-swift-360hz...
If you have a few million dollars to spare, you could jump to 16k: https://www.techradar.com/news/sonys-16k-crystal-led-display...
https://www.quora.com/What-is-the-highest-frame-rate-fps-tha...
https://www.reddit.com/r/askscience/comments/1vy3qe/how_many...
https://blurbusters.com/blur-busters-law-amazing-journey-to-...