zamadatix · 2 years ago
A bit of a let down that the video demoing SDR->HDR conversion is itself only published in SDR. Makes as much sense as demoing a colorization tool in a grayscale video!
sharperguy · 2 years ago
At this point, with any new model I think it makes sense to wait until you can run the model on your own input before making any assumptions based on cherry picked examples.
mysteria · 2 years ago
If they were serious about showing this tech off they should've provided a video file download, and indicated that it's an HDR file that should only be viewed on an HDR display. YouTube is just making this look bad, as people won't see a difference.
zamadatix · 2 years ago
YouTube supports HDR video, no need for a separate download.
CamperBob2 · 2 years ago
YouTube tends to post a downscaled SD version first, then they encode and post the higher-res versions when they get around to it. This can take days in some cases. Meanwhile the creator catches the flak...
unsane · 2 years ago
Creators don't publish videos until the high-res versions are done processing.
zamadatix · 2 years ago
You don't need high res for HDR on YouTube (144p HDR is a thing there, oddly enough), and the 4K version had already processed when I posted that comment (with no change in HDR availability since). Usually, for media announcements, large channels pre-upload the video so it's ready when they actually want to publish, which avoids that kind of issue.
jiggawatts · 2 years ago
4K processing takes just minutes, but HDR processing can take anywhere from over a month to… never. There is no indication of this at all, no progress bar or ETA. You just have to check manually every few days!

This is why everyone is giving up on HDR, it’s just too painful because the content distributors are all so bad at it, with Netflix being the sole exception.

kevingadd · 2 years ago
HDR video playback in the browser is pretty unreliable unless you're on a Mac.
ffsm8 · 2 years ago
It's pretty unreliable on Mac too...

It's more reliable than on Linux though, and Windows has been doing "auto HDR" for videos for years, so it's kinda hard to tell there whether something is actually HDR or not.

zamadatix · 2 years ago
In what way? I've been doing it without issue on PC longer than I've even owned a Mac.
devwastaken · 2 years ago
HDR through YouTube appears to work fine even on my non-HDR-certified HDR monitor.

Sparkyte · 2 years ago
I am frequently disappointed by such videos.
rado · 2 years ago
Ridiculous. Like when James Cameron promoted Avatar HDR with an SDR YouTube video, while YT is perfectly capable of HDR playback.
babypuncher · 2 years ago
At least as of a couple of years ago, HDR support on YouTube has been pretty bad[1]. I know they've been working to improve things since, but I kind of don't blame people for walking away from that mess.

1. https://www.youtube.com/watch?v=DwDWQyBF9II

kelseyfrog · 2 years ago
I guess. There are a lot of details we don't know that would change the calculus on this.

To use an analogous workflow, it could be like saying, "It's pointless to shoot video in 10-bit log if it's going to be displayed on Rec.709 at 8 bits." That completely leaves out the transforms and manipulations available in HDR that have a noticeable impact even when SDR is the target.

Again, we can't know if it's important given the information that's available, but we can't know if it's pointless either.
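That headroom argument can be sketched numerically: applying a transform in high precision and quantizing once at the end preserves gradation that is destroyed if you quantize to 8 bits first. A minimal NumPy sketch, with a made-up exposure gain standing in for the grading transform:

```python
import numpy as np

# Dark linear ramp standing in for shadow detail in a scene.
ramp = np.linspace(0.0, 0.25, 1000)
gain = 4.0  # illustrative exposure boost applied during grading

# Path A: quantize to an 8-bit master first, then apply the transform.
a = np.round(ramp * 255) / 255
a = np.round(np.clip(a * gain, 0, 1) * 255).astype(np.uint8)

# Path B: apply the transform in high precision, quantize once at the end.
b = np.round(np.clip(ramp * gain, 0, 1) * 255).astype(np.uint8)

# Path A ends up with far fewer distinct levels, i.e. visible banding.
print(len(np.unique(a)), len(np.unique(b)))
```

Even though both paths end on an 8-bit display, the version graded in high precision keeps the full range of output levels.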

skottenborg · 2 years ago
I could see a future where this works really well. It doesn't seem to be the case right now though.

The "super resolution" showcased in the video seemed almost identical to adjusting the "sharpness" in any basic photo editing software. That is to say, perceived sharpness goes up, but actual conveyed details stays identical.

brucethemoose2 · 2 years ago
Note that YouTube is really bad for these demos due to the re-compression, even in zoomed in stills.
thefourthchime · 2 years ago
Allegedly the new OnePlus phone does this trick in real time, as well as upscaling and interframe motion interpolation. Mrwhosetheboss seems impressed, but I don't really trust his judgment on these things yet.

https://www.youtube.com/watch?v=J9-9fP_pcEc&t=1107s

nomel · 2 years ago
The iPhone has also done this for a few years now. It was, surprisingly, a one-sentence mention in the keynote/release notes.
moondev · 2 years ago
Whatever special sauce the Nvidia Shield uses is honestly incredible. Real-time upscaling of any stream, and it's not just optimized for low-res sources; it's like a force multiplier on content that is already HD. Supposedly the Windows drivers do it as well, but the effect seems less noticeable to me in my tests.
aantix · 2 years ago
I'm curious - what's the best open-source video upscaling library out there?

I looked back about a year ago, and it didn't seem like there were any good open-source solutions.

adzm · 2 years ago
Topaz is light years ahead of any open source solution unfortunately.
cf100clunk · 2 years ago
An HN search of ''Deep Space Nine'' and ''Topaz'' will show some great discussions here covering the dearth of such upscaling solutions, as well as some huge efforts from before AI was commonplace.
Dylan16807 · 2 years ago
I found this single discussion? https://news.ycombinator.com/item?id=19453745

And that's while avoiding the word "topaz", for which I see no story results with discussions and not many comment results.

justinclift · 2 years ago
It's not exactly what you're after, as it's anime-specific and you need to process the video yourself (e.g. disassemble it into frames, run the upscaler, then reassemble the frames into a movie file), but Real-ESRGAN is very good for cleaning up old, low-resolution anime:

https://github.com/xinntao/Real-ESRGAN/
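That disassemble/upscale/reassemble loop can be sketched roughly as below. This is only a sketch: it assumes ffmpeg and the Real-ESRGAN ncnn-vulkan binary (`realesrgan-ncnn-vulkan`) are on your PATH, and the model name, frame rate, and file names are illustrative — check them against your own build and footage.

```python
from pathlib import Path

def build_pipeline(src: str, fps: int = 24):
    """Commands for the disassemble -> upscale -> reassemble loop."""
    Path("frames").mkdir(exist_ok=True)
    Path("upscaled").mkdir(exist_ok=True)
    # Step 1: dump every frame as a numbered PNG.
    extract = ["ffmpeg", "-i", src, "frames/frame_%06d.png"]
    # Step 2: batch-upscale the directory (flags per the ncnn-vulkan release).
    upscale = ["realesrgan-ncnn-vulkan", "-i", "frames", "-o", "upscaled",
               "-n", "realesrgan-x4plus-anime"]
    # Step 3: re-encode, mapping the original audio if present ("1:a?").
    reassemble = ["ffmpeg", "-framerate", str(fps), "-i", "upscaled/frame_%06d.png",
                  "-i", src, "-map", "0:v", "-map", "1:a?",
                  "-c:v", "libx264", "-pix_fmt", "yuv420p", "out.mp4"]
    return extract, upscale, reassemble

for cmd in build_pipeline("old_anime.mp4"):
    print(" ".join(cmd))  # swap print for subprocess.run(cmd, check=True) to execute
```

Note the frame dump can eat a lot of disk for long videos, so it's worth testing on a short clip first.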

Adverblessly · 2 years ago
If you want to avoid manual processing, Anime4K runs in real time as a GLSL shader you can load into mpv (or Plex, or something called IINA, according to the readme) and still gives great results.

https://github.com/bloc97/Anime4K
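For reference, loading a shader chain into mpv looks roughly like this. The `--glsl-shaders` option is a real mpv flag, but the specific preset file names below come from the Anime4K releases and should be verified against your copy; this just builds the invocation rather than claiming a recommended preset.

```python
def mpv_command(video: str, shaders: list[str]) -> list[str]:
    """Build an mpv invocation that applies an Anime4K shader chain."""
    # mpv takes a ':'-separated list (';' on Windows) in one --glsl-shaders option.
    return ["mpv", "--glsl-shaders=" + ":".join(shaders), video]

# Example preset file names as shipped in Anime4K releases (illustrative).
SHADERS = ["Anime4K_Clamp_Highlights.glsl", "Anime4K_Upscale_CNN_x2_M.glsl"]

print(" ".join(mpv_command("episode01.mkv", SHADERS)))
```

You can also put the same shader list in `mpv.conf` so it applies to every playback rather than one invocation.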

two_in_one · 2 years ago
It depends on what you mean by 'open-source'. Along with training materials and the full setup? That will be hard to find. Upscaling was popular about 10 years back, which is why there isn't much interest today. Training in the old style isn't that hard, but artifacts pop up in all the videos I've seen.
varispeed · 2 years ago
That seems like a gimmick, and I actually prefer SDR video that is not upscaled. There is something ugly about those AI-treated videos. They look fake.
ls612 · 2 years ago
The RTX video upscaling feature works really well. There's a bug in the Firefox implementation that lets you switch between native and upscaled side by side, and the difference is striking. I don't have an HDR monitor, so I can't tell you how well this new HDR feature works.
deergomoo · 2 years ago
They are fake. Ultimately it’s not recovering lost detail, it’s making shit up
zeusk · 2 years ago
This reminds me of the Samsung debacle where the camera recognized the moon and pasted a high-quality texture of it into the image it shot.
4d4m · 2 years ago
Exactly. This is akin to upscaling or frame rate interpolation. Consumers don't want this; they turn it off in settings.
nomel · 2 years ago
I don't think making things up is the problem, it's if it's believable. If it's indistinguishable to a viewer, then who cares. I never would have thought the HDR of the clouds was "made up".
rixrax · 2 years ago
I recently had some old Super 8 films shot by my parents scanned into 1080p resolution in ProRes HQ. Because of the poor optics of the original camera, imperfect focus when shooting, poor lighting conditions, and general deterioration of the film stock, most of the footage won't get anywhere near what 1080p could deliver.

What I'd like to try at some point is to let some AI/ML model process the frames and, instead of necessarily scaling it up to 4K etc., 'just' add (a.k.a. magic in) the missing detail in the 1080p version and generally unblur it.

Is there anything out there, even in the research phase, that can take existing video stock and hallucinate into it detail that was never there to begin with? What Nvidia is demoing here seems like a step in that direction...

I did test out Topaz Video and DaVinci's built-in super resolution feature, both of which gave me a 4K video with some changes to the original, but not the magic I am after.

anjc · 2 years ago
I also restored some Super 8 footage recently and had great success. The biggest win I had wasn't resolution, but slowing down the speed to be correct in DaVinci, and interpolating frames to make it 60fps using the RIFE algorithm in FlowFrames. I then used Film9 to remove shake, colour-correct, sharpen and so on.

Correcting the speed and interpolating frames brought out an amazing amount of detail that wasn't perceptible to me in the originals (although it was there).

All of this processing does remove some of the charm of the medium, so I'll be keeping the original scans in any case.

dreamcompiler · 2 years ago
How did you do the original scanning? I have a ton of Super 8 that needs to be scanned.
actionfromafar · 2 years ago
An interesting thing about Super 8: the resolution is generally very poor, but it can have quite the dynamic range. Also, with film in general (and video too, but it's easier with film because you have a global shutter) you can compensate for motion blur and get more detail out that isn't visible when you look at the film frame by frame. And none of this needs AI.

Regarding hallucination, I agree with the sibling comment, the problem is that faces change. And with video, I'm not even sure the same person would have the same face in various parts of the video...

baq · 2 years ago
there is AI tech to do this already. it has a slight problem, though: it adds detail to faces (this is marketing speak for "completely changes how people look").
UberFly · 2 years ago
Something like this will always change the original, as it's guessing what should be there as it upscales. Only time will improve the guessing.
poglet · 2 years ago
You could look into RTX Video Super Resolution
kwanbix · 2 years ago
The HDR transformation was really impressive. The upscaling not so much, at least not on my monitor.
DrNosferatu · 2 years ago
Speaking of which, Nvidia has built-in live AI upscaling on the Shield TV Android box.

- Is there any stand-alone live AI upscaling / 'enhance' alternative for Android or any other platform?

lagadu · 2 years ago
The Shield is kind of an extreme outlier in today's environment. A device from 2015 that 9 years later is still one of the top tier choices in its (consumer) market is almost unheard of.

In fact, it's reportedly the currently supported Android device with the longest support lifetime[0]; it's crazy that mine still gets updates.

[0]https://www.androidcentral.com/android-longest-support-life-...

moondev · 2 years ago
It really is awesome. I also enjoy the UI that lets you compare a stream side by side; the difference is insane.

I have been meaning to see how well it handles streaming a desktop via Moonlight to the Shield to upscale a second monitor's content in real time. I assume it's trained on video footage, not static UI components. The RTX Windows drivers don't seem to upscale as well as the Shield does.

maxglute · 2 years ago
I'm interested in this too. I replaced my Shield with a Steam Link to a desktop that does the upscaling, which is very clunky.
DrNosferatu · 2 years ago
So, should one buy a Shield TV today?

It’s pricey, and being so old, I fear it will soon be obsolete…

bendergarcia · 2 years ago
I think they should rephrase: it makes SDR appear to be HDR. It’s just making up information, no? It’s not actually making it HDR; it just appears to be HDR.
Alghranokk · 2 years ago
Making up information? The same can be said for most commonly used modern compressed video formats: just low-bitrate streams of data that get interpolated and predicted into producing what looks like high-resolution video. AV1 even has an entire system for synthesizing film grain.

The way I see it, if the AI-generated HDR looks good, why not? It wouldn't be any more fake or made up than the rest of the video.