JonathanFly · a year ago
So this is a new method that simulates a CRT and genuinely reduces motion blur on any type of higher-framerate display, starting at 120 Hz. But it doesn't dim the image like black frame insertion, which is the only current method that comes close to the clarity of a CRT. But it also simulates other aspects of CRT displays, right?

Can you use this method just to reduce blur, without reducing brightness, on any game? They mention reducing blur for many things other than retro games in "Possible Use Cases of Refresh Cycle Shaders", but does reducing blur in a flight simulator also make it visually look like a CRT, with phosphors?

delusional · a year ago
They do mention that it does reduce brightness. The selling point compared to strobing seems to be less eyestrain. I'd expect it to lose more brightness than strobing, considering the lower relative pixel on-time.
stevage · a year ago
I do not understand at all what this is talking about or why. Is it some elaborate joke?

Don't visual effects people go to lots of effort to add motion blur? Why would you want to remove it?

Why are they trying to simulate old CRT displays?

Can someone explain what this is about?

Springtime · a year ago
This is about improving motion clarity, so each displayed frame of moving content looks crisp rather than blurred (something monitors can struggle with even at high refresh rates).

Most good reviews of high-Hz displays (e.g. 120 Hz+) take fast photographs of moving objects (typically from Blur Busters' 'Test UFO' web page) to demonstrate how well or poorly a monitor handles fast-moving content.

One technique for significantly improving motion clarity is inserting frames of pure black in the display output (aka BFI, black frame insertion). A downside is that some people are sensitive to this and it causes eyestrain.

This CRT beam simulating shader is said to be similarly effective to BFI at improving motion clarity, but with the benefit of reduced eyestrain. However, from what I understand, the current version is limited to simulating a lower-Hz display and requires a higher-Hz monitor.

All this is distinct from the kind of in-media motion blur that can be enabled in games or seen in recorded video. It's instead about the monitor not being able to render fast-moving content clearly enough, which leads to non-crisp output frames.
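To make the contrast concrete, here's a toy sketch (not the article's actual shader; the sub-frame count, row count, and decay constant are made up for illustration) of per-refresh brightness for plain BFI versus a rolling-scan CRT simulation:

    # Toy model: 60 FPS content on a 240 Hz panel, so each content frame
    # spans 4 refresh cycles ("sub-frames"). Python for illustration only.
    SUBFRAMES = 4   # refresh cycles per content frame (240 Hz / 60 FPS)
    ROWS = 8        # tiny vertical resolution for illustration

    def bfi_weight(subframe: int) -> float:
        """BFI: show the frame on the first refresh, black on the rest."""
        return 1.0 if subframe == 0 else 0.0

    def crt_sim_weight(subframe: int, row: int, decay: float = 0.2) -> float:
        """Rolling scan: a band of rows is lit each refresh; rows the beam
        passed earlier keep a crude exponential 'phosphor' afterglow."""
        band = ROWS // SUBFRAMES
        beam_start = subframe * band
        if beam_start <= row < beam_start + band:
            return 1.0                       # row currently being "drawn"
        if row < beam_start:
            age = (beam_start - row) / band  # how many bands ago it was lit
            return decay ** age              # decayed afterglow
        return 0.0                           # beam hasn't reached it yet

    for sf in range(SUBFRAMES):
        rows = [round(crt_sim_weight(sf, r), 2) for r in range(ROWS)]
        print(f"refresh {sf}: BFI={bfi_weight(sf):.0f}  CRT rows={rows}")

Both keep each pixel lit for only a fraction of the frame time (hence the clarity), but the CRT simulation spreads the light output across the refresh cycles instead of concentrating it in one full-screen flash, which is the claimed eyestrain advantage.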

stevage · a year ago
Thank you, that's a really great explanation.
noduerme · a year ago
What is the method used on newer TVs that attempts to double the framerate / interpolate frames / make everything shot on film look like an overlit soap opera? I find it impossible to watch; it destroys the lighting and the performances. My recollection of CRT TVs was that they had a lot of blur, both motion and spatial, and that was kind of what made them feel warmer and more analog / less overly crispy.
haunter · a year ago
The first thing I turn off in every single game is motion blur. It's only useful in racing sims, to give more of a sense of speed, but even that's personal taste.

Motion blur made a bit more sense on the 30fps Xbox 360 and PS3 games.

martini333 · a year ago
Why exactly do you think motion blur is added?
cubefox · a year ago
Our eyes are constantly and mostly unconsciously tracking moving objects in our field of view in order to keep them still relative to our eyes. It's called Smooth pursuit: https://en.wikipedia.org/wiki/Smooth_pursuit

This is because our retina has a very low "refresh rate", which means things can easily blur together. Smooth pursuit prevents that. However, modern sample-and-hold displays like LCD and OLED work against smooth pursuit. If you watch anything moving on a screen (including "still" objects moving on screen due to camera movement), your eye will automatically track those objects if they are momentarily the focus of attention, which should make them still relative to your eyes and thus appear sharp.

However, since the tracked object is still relative to your eyes and the individual frames on screen are still relative to your screen, the frames move (are not still) relative to your eyes. This means they appear blurry during smooth pursuit, when in reality they should be perfectly sharp.

For example, your eyes track a sign that moves on the screen due to camera movement. Say it moves 10 pixels per frame horizontally. This means you will see a 10-pixel-wide horizontal blur on the sign, which could make it unreadable. In reality (with a real sign instead of a screen) the sign would appear perfectly clear.

On CRT screens this doesn't happen (to the same extent) because the frame is not displayed for the entire frame time (e.g. 1/60th of a second) but much more briefly. The CRT just very quickly flashes the frames and is dark in between; strobing/flickering, basically. So if the tracked object moves 10 pixels per frame, the frame might only be visible for (say) 1/5th of that frame time, which means it moves only 2 pixels while the frame is actually on screen. So you get only 2 pixels of blur, which is much less.
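That arithmetic can be stated as a one-liner (a sketch using only the numbers from this comment):

    # Perceived blur during smooth pursuit:
    # blur width = tracking speed per frame x fraction of frame time lit.
    def pursuit_blur_px(speed_px_per_frame: float, duty_cycle: float) -> float:
        return speed_px_per_frame * duty_cycle

    print(pursuit_blur_px(10, 1.0))  # sample-and-hold LCD/OLED: 10 px blur
    print(pursuit_blur_px(10, 0.2))  # CRT lit 1/5 of frame time: 2.0 px blur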

Of course, at 60 FPS you might instead get some degree of perceptible flicker (computer CRTs therefore often ran at higher than 60 Hz), and in general the overall achievable screen brightness will be lower, since the screen is black for most of each frame time. CRTs had a low maximum brightness. But they had very little of the "persistence blur" that plagues sample-and-hold screens like OLED and LCD.

The motion blur intentionally introduced by video games is there to make moving objects that are not tracked by our eyes appear smoother. In that case motion blur is natural (since smooth pursuit doesn't try to remove it). So some forms of motion blur are undesirable and others are desirable.

The optimal solution would be to run games (and video content in general) at an extremely high frame rate (like 1000 FPS), which would introduce natural perceptible motion blur where it naturally occurs and remove it where it doesn't (during smooth pursuit). But obviously that would be a computationally extremely inefficient way to render games.

By the way, if you have a screen with 120+ Hz you can test the above via this black frame insertion demo, which emulates how CRTs work:

https://testufo.com/blackframes

On my 120 Hz OLED screen, the 40 FPS (1 frame + 2 black frames) UFO looks as clear as the native 120 Hz UFO. A real 60 or even 80 Hz CRT screen would be even better in terms of motion clarity. Perhaps better than a 240 or even 480 Hz OLED.
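A quick sanity check of those numbers (a sketch using nothing beyond what the demo setup states):

    # The demo's pattern on a 120 Hz panel: 1 lit refresh + 2 black refreshes.
    refresh_hz = 120
    pattern = [1, 0, 0]                      # lit, black, black
    content_fps = refresh_hz / len(pattern)  # 40.0 FPS content
    lit_time_s = sum(pattern) / refresh_hz   # 1/120 s of persistence per frame
    print(content_fps, lit_time_s)           # same persistence as a native
                                             # 120 Hz sample-and-hold frame,
                                             # hence the equal motion clarity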

fishermanbill · a year ago
Yeah, they are two different effects. There's motion blur on individual objects, which you want (as human eyes see/have it), and then there's full-screen motion blur due to the display technology (LCD, OLED, etc.), which you don't want (as human eyes don't see/have it). CRTs don't have this motion blur, as the screen is blank most of the time; see the Slow Mo Guys on YouTube for CRT displays.
7734128 · a year ago
Because I hate it.
fishermanbill · a year ago
We need display manufacturers to provide a refresh cycle that is agnostic of the incoming signal Hz sent down the cable AND to either provide shader support (ideally) at the display's Hz OR to implement this shader.

There really would be no need for an expensive RetroTink if we had this. Some manufacturer must be able to do it, and the rest would follow.

nopurpose · a year ago
With about half of the screen black, could it also boost FPS by not spending GPU time on pixels in those areas, if integrated deep into the engine?
kevingadd · a year ago
You could definitely do this, but a lot of modern rendering techniques rely on having full copies of previous frames lying around, like TXAA, screen-space reflections, etc.
rzzzt · a year ago
The images are briefly persisted and averaged in the back of the viewer's eye.
Scene_Cast2 · a year ago
I don't think this is how it works.

The technique is for when you have X FPS content and a Y FPS-capable monitor, where Y > X. In games, you'll still render at your old FPS cap, but this shader relatively cheaply generates the extra frames that make the content look smoother/sharper.
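A minimal sketch of that mapping (the constants and function name here are made up for illustration):

    # Content rendered at X FPS, panel refreshing at Y Hz, Y > X. Each panel
    # refresh reuses the latest content frame; only the cheap CRT-simulation
    # pass varies per refresh.
    CONTENT_FPS = 60   # X: what the game actually renders
    PANEL_HZ = 240     # Y: what the monitor displays

    def refresh_plan(refresh_index: int) -> tuple[int, int]:
        """Return (content frame to sample, sub-refresh phase 0..Y/X-1
        for the beam-simulation pass)."""
        per_frame = PANEL_HZ // CONTENT_FPS
        return refresh_index // per_frame, refresh_index % per_frame

    for i in range(8):
        frame, phase = refresh_plan(i)
        print(f"refresh {i}: content frame {frame}, scan phase {phase}")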

fishermanbill · a year ago
Is display inversion the real problem this shader gets around?

From Gemini: "Display inversion is the process of alternating the voltage between positive and negative for each pixel on an LCD screen to prevent damage. This process is called polarity inversion."

If display manufacturers knocked that on the head for certain scenarios, then surely we could just have a simple horizontal block of the screen scrolling down the display at high refresh rates?

Phosphor falloff, as far as can be seen in the Slow Mo Guys footage, is quite a small effect, not on the scale of this shader.

P_I_Staker · a year ago
Refresh cycle shaders were something my last team really nailed. The key challenge was that during the day there is a lot of sun. Adjustments can be made to the location; it really pays dividends.
vlovich123 · a year ago
Does this mean that the original duck hunt gun might work again?
AnthonBerg · a year ago
There’s a really interesting discussion of precisely this in the comments under the article! Recommended. Might have to dig to see it.
vlovich123 · a year ago
I don’t see any comments under the article. Maybe have to be logged in?