sdenton4 · 5 years ago
TIL that there's such a thing as a 'grandmaster overclocker.'

I also learned that literally pouring liquid nitrogen over a CPU from a cup is a grandmaster overclocker move.

ISL · 5 years ago
If that's true, how would one rank pumping supercritical liquid nitrogen at high rates through a heatsink? Super-grandmaster?

Seems like the heat flow would be substantially impeded by any boiling of the LN2.

Or, for that matter, simply using a chilled copper ingot as the heat sink? There must be some threshold at which the limiting problem is getting the heat out of the die, not getting the heat out of the chip's package.

VectorLock · 5 years ago
Exposing the die directly and removing heat spreaders is increasingly common in "grandmaster" overclocking circles.
rcpt · 5 years ago
Quantum computer research might be super-grandmaster

https://www.qats.com/cms/2019/08/02/quantum-computing-coolin...

pvarangot · 5 years ago
They also screwed a cool little tower on top of the CPU and are wearing a Doom helmet. Don't underrepresent their mastery.
fuzxi · 5 years ago
With no gloves, to boot!

Of course, the real grandmaster move is to use liquid helium - its boiling point is about 70C colder than nitrogen :)

gsteinb88 · 5 years ago
Carrier freeze-out actually makes this unworkable -- there's a limit to how cold you can make CMOS devices before they stop functioning. To say nothing of the specific heat of liquid helium, which is minuscule compared to LN2's.
sdenton4 · 5 years ago
Yeah, my favorite part of the article was the warning to please use proper safety precautions with liquid N, right under the no-gloves picture...
cheerlessbog · 5 years ago
How does even N2 not crack and warp the packages and boards? If I plunged my TV into liquid nitrogen I would expect it to not work - in a violent way.

saagarjha · 5 years ago
TIL processors can actually hit a stable 6.6 GHz if you just pump liquid nitrogen through them…
system2 · 5 years ago
Today we didn't learn actually. They don't mention it being stable:

“It's a lot easier to control a benchmark which is always the same,” explains Rywak. “A game makes the whole process less predictable in terms of hardware load, which could lead to stability issues while working on frozen hardware.”

SketchySeaBeast · 5 years ago
Is it overly picky of me to wish they specified resolution and game settings?
leddt · 5 years ago
You can see on the photo at the top that they were running at 1280x720. As for graphics settings I would guess minimum.

Edit: seems mobile and desktop have a different crop of that image. Here is the image that shows 720: https://images.ctfassets.net/rporu91m20dc/1XYHhlYZzNI1NxRRJl...

sedatk · 5 years ago
So, does it count if I push 5000fps on 320x200?
SketchySeaBeast · 5 years ago
Thanks - I even looked in the images for it.

That would make sense as well in their discussion about CPU power - that resolution would require a lot of it compared to, say, 4k.

Whatarethese · 5 years ago
Cant wait for my 1000hz monitor!
gerdesj · 5 years ago
My last CRT Iiyama monitor had quite a refresh rate - 120Hz or more - and the picture looked absolutely gorgeous.

It weighed a tonne and took up quite a lot of desk.

petterparker · 5 years ago
Monitors with high refresh rates (usually 144Hz but up to 240Hz) are quite common among gamers these days.
Ziggy_Zaggy · 5 years ago
Can't wait for my 1000hz eyeballs!
moonchild · 5 years ago
Human eyes don't have a refresh rate, per se. I recall hearing that they can discern differences in frequency up to 3 kHz or so; though obviously you can go much slower than that and still feel smooth and relatively responsive.
stephc_int13 · 5 years ago
What is interesting, from the screenshot, is that the game is actually CPU bound. Contrary to an often held belief in the high-end video games optimization circles.
654wak654 · 5 years ago
You can just lower graphics settings to get more out of the GPU, but there is no CPU equivalent to that like "lowering AI quality".
endergen · 5 years ago
I’d want motion blur on then for super natural looking motion
dwighttk · 5 years ago
But can it run Crysis?
sandworm101 · 5 years ago
No. I just bought a 4k screen only to find out that 12k is coming down the pipe. I do not want to think about what 1000hz 12k screens will cost. Stop this madness now.
smabie · 5 years ago
No one is gonna make 12k at 1000hz for decades, at least. And if they do, it would be irrelevant without a comparable GPU.

An RTX 2080 Ti can't even push 144fps on 1440p at max, much less 4k.

eterm · 5 years ago
Just the monitor cables alone would never handle the bandwidth required.
nobodyshere · 5 years ago
It easily pushes more in doom.
hellotomyrars · 5 years ago
Something is always coming down the pipe. Fortunately, you don't need to walk the treadmill, and if you're only just getting a 4K screen now, you're probably not an insane early adopter anyway.

12k is possible but it's hardly around the corner. Even 4K has both content and hardware issues around it. 8K is going to be the next mass market push but we're not even done with the 4K party.

Also 4K display devices are available at a modest price point now. The bigger issues are content. We're mostly there with mass-market media, but if you want to drive a AAA video game at 4K resolution you're having to make compromises and spend a lot on the hardware to drive it.

They're going to keep making new things. And the new things are going to have bigger numbers. It's okay.

recursive · 5 years ago
You aren't obligated to own the highest spec hardware in existence.

theandrewbailey · 5 years ago
Frames per second is not the same thing as your display refresh rate. Further, I'm not aware of any monitor capable of 1000hz operation.
mikeyouse · 5 years ago
Fastest consumer ones seem to max out at 360Hz right now:

https://www.digitaltrends.com/computing/asus-rog-swift-360hz...

dyingkneepad · 5 years ago
12k? Never heard of. Let's go with 8k, which is 7680x4320.

Assuming our current standard of 8 bits per color with no alpha (3 bytes per pixel), which may be too low if you care so much about your monitor, your required bandwidth becomes:

7680 * 4320 * 3 * 1000 = 99532800000

99532800000 / 1024 / 1024 / 1024 = 92 gigabytes per second of bandwidth you will consume just to pump stuff to your monitor. Better not use integrated graphics!

To give a comparison, here's 4k@60hz:

3840 * 2160 * 3 * 60 = 1492992000

1492992000 / 1024 / 1024 = 1423 MB/s.

Also notice that 8k monitors already employ tactics such as "visually lossless compression" (which means: lossy compression but they think you won't notice) and other stuff aimed at trying to not really submit full frames all the time.

Forget your 12k. It will only be useful to increase your energy bill.

Edit: fix calculations.
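
The arithmetic above can be checked with a few lines of Python (a quick sketch; the helper name is mine, and GB/MB here mean binary units, matching the figures in the comment):

```python
# Back-of-the-envelope check of uncompressed display bandwidth:
# 24-bit RGB (3 bytes/pixel), no alpha, no compression.

def raw_bandwidth_bytes(width, height, bytes_per_pixel, refresh_hz):
    """Bytes per second needed to send full frames with no compression."""
    return width * height * bytes_per_pixel * refresh_hz

# 8K at 1000 Hz
bw_8k = raw_bandwidth_bytes(7680, 4320, 3, 1000)
print(bw_8k / 1024**3)   # ~92.7 GB/s

# 4K at 60 Hz, for comparison
bw_4k = raw_bandwidth_bytes(3840, 2160, 3, 60)
print(bw_4k / 1024**2)   # ~1424 MB/s
```
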

verall · 5 years ago
In real life it would be subsampled 4:2:0 and 1000Hz is nonsense because it's not divisible by 24 or 30. So a more reasonable 8k@960Hz 4:2:0 is (7680 * 4320 * 960) * (8 + 2 + 2) = ~356Gb/s, or only 35.6Gb/s if you pick a reasonable 96Hz.

By 960Hz even a lossless delta coding scheme on the wire could reduce the bandwidth by over 10X for any normal footage.
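
The 4:2:0 figures can be sanity-checked the same way (a quick sketch; the helper name is mine, and Gb here means 2^30 bits to match the numbers in the comment):

```python
# Bit-rate check for 8K with 4:2:0 chroma subsampling:
# 8 bits of luma plus 2 + 2 bits of chroma per pixel, on average.

def subsampled_bitrate_gb(width, height, refresh_hz, bits_per_pixel=12):
    """Gigabits (2**30 bits) per second for full frames at the given average bit depth."""
    return width * height * refresh_hz * bits_per_pixel / 2**30

print(subsampled_bitrate_gb(7680, 4320, 960))  # ~356 Gb/s
print(subsampled_bitrate_gb(7680, 4320, 96))   # ~35.6 Gb/s
```
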

Izkata · 5 years ago
> 12k? Never heard of. Let's go with 8k, which is 7680x4320.

If you have a few million dollars to spare, you could jump to 16k: https://www.techradar.com/news/sonys-16k-crystal-led-display...

moonchild · 5 years ago
> 92 gigabytes per second of bandwidth

For comparison, netflix was just barely able to saturate a 100gbps (that's gigabits per second, so only 12.5 gigabytes) network link from one computer, and that's just pumping data without having to render anything.

t-writescode · 5 years ago
At some point, it’s not worth upgrading resolution. I don’t know what that point is for you; but eyes only have a certain angular resolution, beyond which any additional pixel density is meaningless.

For me, that’s a bit more than 1440p at 3 feet at 27”.
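
For a rough sense of where that threshold sits, here's a small Python sketch (the helper name is mine; it assumes the commonly quoted ~60 pixels per degree for 20/20 acuity, which is a ballpark rather than a hard perceptual limit):

```python
import math

# Rough pixels-per-degree estimate for the setup described above:
# a 27" 16:9 panel at 2560x1440, viewed from about 3 feet (36").

def pixels_per_degree(diag_in, h_res, v_res, distance_in):
    """Horizontal pixels per degree of visual angle, averaged across the screen width."""
    aspect = h_res / v_res
    width_in = diag_in * aspect / math.hypot(aspect, 1)
    h_angle_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))
    return h_res / h_angle_deg

# ~70 ppd, comfortably past the ~60 ppd often cited for 20/20 vision
print(pixels_per_degree(27, 2560, 1440, 36))
```
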

anticensor · 5 years ago
Neither your eyes nor your brain would be capable of coping with that.
lostlogin · 5 years ago
I had a quick look about to see what the eye/brain can actually perceive and the below is interesting. We can appreciate frame rates far higher than I thought. A pilot identifying a plane displayed for 1/220th of a second (reddit link) is pretty impressive.

https://www.quora.com/What-is-the-highest-frame-rate-fps-tha...

https://www.reddit.com/r/askscience/comments/1vy3qe/how_many...

tripnull · 5 years ago
There is no proven limit on how many frames per second our eyes can see, and I'm sure you would be able to discern a difference between 144Hz and 1kHz. You may not be able to fully comprehend each still image, but the motion would almost certainly appear smoother, especially for fast-moving objects.
mrob · 5 years ago
1000fps on a 1000Hz display gives you blurless motion without needing flicker:

https://blurbusters.com/blur-busters-law-amazing-journey-to-...

This is probably good enough in practice, although you can see differences even beyond 1000Hz by observing the phantom array effect of flickering signals during fast eye movement.

blueboo · 5 years ago
Nonsense. We'd cope with it in the same sense we cope with the millions of colors in a 24-bit color space. Do we distinguish each individual color? No, but the full spectrum enables continuous color gradients.

When it comes to such a high framerate, the upgrade is akin to going from 256-color palette-swapped VGA to 24-bit HD, or rather from stop motion to realistic motion blur.