crazygringo · a year ago
The "AI" versions of the movie stills are darker and "greener" or "bluer" in all cases in this article, which is NOT the case when you watch the movie. It's a mistake on the part of whoever put together the image comparisons.

The culprit here is that the non-AI screenshots are taken from presumably 1080p non-HDR sources, while all the AI screenshots are taken from presumably 4K HDR sources. The "AI" images are all displayed in the completely wrong color space -- the dark+green/blue is exactly what HDR content looks like when played on software that doesn't correctly support decoding and displaying HDR content.

It's a shame that the creator of the comparison images doesn't know enough image processing to understand that you can't grab stills from HDR content from a player that doesn't properly support HDR.

On the other hand, the state of HDR support is a mess right now in software players. Playing HDR content in common players like VLC, QuickTime, IINA, and Infuse will give you significantly different results between all of them. So I can't actually blame the creator of the images 100%, because there isn't even a documented, standardized way to compare HDR to non-HDR content side-by-side, as far as I know (hence why each player maps the colors differently).
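
For what it's worth, "properly supporting HDR" for a screenshot means undoing the PQ transfer function, converting from BT.2020 to BT.709 primaries, and picking some mapping into SDR range before writing out an sRGB still. A minimal sketch of that pipeline, assuming an HDR10/PQ source (the 203-nit reference white is one common convention, not the only choice):

    import numpy as np

    # Minimal sketch of a correct-ish still-grab pipeline for an HDR10 frame
    # (PQ-encoded code values, BT.2020 primaries).

    # ST 2084 (PQ) EOTF constants
    M1, M2 = 2610 / 16384, 2523 / 4096 * 128
    C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

    def pq_to_nits(e):
        """PQ code value (0..1) -> absolute luminance in cd/m^2."""
        p = np.power(np.clip(e, 0, 1), 1 / M2)
        return 10000 * np.power(np.maximum(p - C1, 0) / (C2 - C3 * p), 1 / M1)

    # BT.2020 -> BT.709 primaries, applied to *linear* light
    BT2020_TO_BT709 = np.array([[ 1.6605, -0.5876, -0.0728],
                                [-0.1246,  1.1329, -0.0083],
                                [-0.0182, -0.1006,  1.1187]])

    def hdr_frame_to_sdr_still(rgb_pq, sdr_white_nits=203.0):
        """PQ decode -> gamut conversion -> scale to SDR white -> clip -> sRGB gamma."""
        linear = pq_to_nits(rgb_pq) / sdr_white_nits              # 1.0 == SDR reference white
        linear = np.einsum('ij,...j->...i', BT2020_TO_BT709, linear)
        linear = np.clip(linear, 0.0, 1.0)                        # HDR highlights clip to white
        return np.where(linear <= 0.0031308,
                        12.92 * linear,
                        1.055 * np.power(linear, 1 / 2.4) - 0.055)

    # The broken path is simply treating the PQ/BT.2020 values as if they were sRGB:
    # PQ packs most of its code range into low luminances, so mid-tones come out far
    # too dark, and skipping the gamut conversion skews the hues.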

jacobolus · a year ago
The upscaled versions also screwed up the camera focus blur by artificially removing it, and unnecessarily took out the film grain. Even leaving the grain and blur aside, the texture of the objects depicted is also getting seriously screwed up, with unrealistic looking smoothing and weird places of heightened contrast unrelated to the original scene.

More generally, automatically second guessing the artistic choices of whoever originally color graded the film, for the sake of adding narratively gratuitous features like HDR or extremely high resolution, is a nasty thing to do. There might be moderately more dynamic range in the original physical film than there was in a digital copy; if so, by all means try to capture it in a new digitization while trying to match the visual impression of a theater projection to the extent practical. The "AI" features demonstrated here are incredibly tacky though.

People put in charge of making new versions of beloved artworks need to have good taste and exercise some discretion. Art from different times and places looks different due to changes in both the medium and the culture, and we should enjoy those differences rather than trying to erase them all.

chefandy · a year ago
Right. That director almost certainly chose that particular film stock because its particular grain lent the film the vibe they were looking for. Directors always look for ways to implement their artistic vision within the bounds of the medium, but the end result is almost certainly their artistic vision. A lot of that missing information is missing for a reason. Surely someone could use AI to invent things that go in all of the little shadows in Nosferatu, or steady the handheld video camera footage David Lynch was into for a while, but those things serve a purpose.

If people want to AI-enhance movies for their own viewing pleasure, then great. Watch it in reversed color or with a face-swap for all I care. But "improving" the original by inventing detail that the director probably never wanted to show to begin with is a parlor trick devoid of any artistic merit.

dkjaudyeqooe · a year ago
> second guessing the artistic choices

This is the original sin of this process. They are tampering with the artistic object, it's no longer what the artists intended.

Some might say this is being pedantic, but the quality of the image has a huge impact on the feel of a movie. For some films, such as Blade Runner, the mood and feel created by the (dark, obscured) look of the film is easily half of the impact. Changing that film would be a crime against humanity, yet I imagine it's only a matter of time before it gets regraded so people can see the film "properly".

dml2135 · a year ago
> On the other hand, the state of HDR support is a mess right now in software players. Playing HDR content in common players like VLC, QuickTime, IINA, and Infuse will give you significantly different results between all of them.

This is the main thing that has kept me from adopting HDR in my media library. I'd expect a feature like HDR to be progressive (if that's the right term), i.e., in a non-supported player it would look exactly like the non-HDR version of the content, and simply add more dynamic range in an environment that supports it. Without that, I'm not going to grab a version of a media file that might look worse if I'm playing it in a non-ideal context.

Does anyone know why this isn't the case? Is it not technically possible for some reason I'm not thinking of, or was it really just a fumble of standards design?

QualmsAplenty · a year ago
The shame is in comparing compressed-to-hell streaming versions to Blu-Ray in the first place, and commenting on how the Blu-Ray is "sharper." ANY Blu-Ray version should be much better than streaming.

The NYT isn't free. I don't respect such shoddy feature-writing.

eru · a year ago
> ANY Blu-Ray version should be much better than streaming.

That should change over time: Blu-Ray is a format released about 20 years ago, and our streaming bandwidth (and encoding algorithms) keep improving.

tio4j32oi434324 · a year ago
It's not surprising. The usual quality metrics that video-encoder people use tend to be positively correlated with saturation (and in fairness, this is what people think is better 'quality').
moomin · a year ago
Glad you pointed that out, because literally all the “after” images look dreadful.

But then I'm also of the opinion that most HDR content is too damn dark anyway.

crazygringo · a year ago
> most HDR content is too damn dark anyway.

It's not, though -- it's entirely the problem with players. Not just software players but also TVs.

There are lots of TV shows now that are available in 1080p SDR and in 4K HDR. When you play them both on any player or TV, they should have the same brightness in "regular" scenes like two people talking in a room. They're meant to. HDR is only meant to make a handful of elements brighter -- glowing lightsabers, glints of sunlight, explosions. HDR is never meant to make any of the content darker.

Unfortunately, far too many video players and televisions (and projectors) map HDR terribly. The TVs aren't actually meaningfully brighter, but they want to advertise HDR support, so they map the wider HDR brightness range onto their limited brightness, so that lightsabers etc. get the full brightness but everything else is way too dark. This is the disaster that makes everyone think HDR content is too dark.

The correct thing to do would be to display HDR content at full brightness so it matches SDR for normal scenes, and let things like lightsabers, glints of sunlight, etc. just get blown out to white. That would mean admitting the hardware has brightness limitations and should never have been advertised as HDR in the first place.

So the problem isn't with HDR content. The problem is with HDR players, TVs, and projectors. It's mostly on the hardware side, with manufacturers that want to advertise HDR support when that's a lie, because they're not actually producing hardware that is bright enough for HDR.
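
A deliberately oversimplified sketch of those two strategies, just to make the difference concrete (the 203-nit SDR reference white and 1000-nit content peak are assumptions, and real TVs use knee curves rather than a straight divide):

    import numpy as np

    # Toy comparison on linear luminance in nits. Assumes ~203 nits for SDR reference
    # white and a display that can't actually go much brighter than that.

    def clip_map(nits, sdr_white=203.0):
        """Keep normal scenes at SDR brightness; let 1000-nit highlights blow out to white."""
        return np.clip(np.asarray(nits) / sdr_white, 0.0, 1.0)

    def squash_map(nits, content_peak=1000.0):
        """What many 'HDR' TVs effectively do: squeeze the whole 0..1000-nit range into
        the panel's limited output, so a normally-lit face ends up far too dark."""
        return np.clip(np.asarray(nits) / content_peak, 0.0, 1.0)

    scene = [5.0, 80.0, 150.0, 1000.0]   # shadows, skin tone, bright wall, lightsaber
    print(clip_map(scene))               # roughly [0.02, 0.39, 0.74, 1.0] -- matches SDR
    print(squash_map(scene))             # roughly [0.005, 0.08, 0.15, 1.0] -- everything dark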

zo1 · a year ago
That is a matter of opinion, and in all the comparison images, I think the AI upscaled one always looks better.
contravariant · a year ago
Yeah, if the difference is this obvious in a small thumbnail of a heavily compressed image then something probably went wrong. As you said, the colour space doesn't look quite right, though I can't identify exactly which mix-up took place.

I'm also not too sure about comparing it with some unidentified 'streaming' version. That's like comparing a high resolution digital audio file with a phone call.

Selective sharpening does have a tendency to overemphasize clear edges while leaving the 'unclear' parts untouched, which can give a bit of a plasticky effect, especially on skin.

dusted · a year ago
It used to be that (rational) people would be fairly in agreement that one cannot extract information which is not there. That's why we laugh at the "zoom -> 9 pixels -> enhance -> clear picture -> zoom ->.." trope in movies..

AI does not change this.. It adds stuff that's not there, sure, it adds stuff that might be there, or could have been there, but it literally cannot know, so it just paints in plausibility.. Which is horrible and gross already in principle.

I imagine we're not far from the first conviction based on AI enhanced images.. give the model a blurry CCTV frame and a list of suspects and it will enhance the image until it looks like one of them, and since people are apparently this stupid, someone is going to be locked up for something they didn't do.

Back to movies, just, fuck no! There's no enhancements to make, the movie is _DONE_, it looks like it does, leave it.

yreg · a year ago
> That's why we laugh at the "zoom -> 9 pixels -> enhance -> clear picture -> zoom ->.." trope in movies..

> AI does not change this..

It kind of changes it in some cases. The information might be there, but superhuman pattern recognition might be needed to extract it.

And of course, in case factuality doesn't matter, the missing information can be generated and filled in. This obviously doesn't work when you are looking for a terrorist's license plate or when you want to watch an original performance in a movie.

dml2135 · a year ago
"Information" in terms of, what does this thing look like -- could maybe be determined from other shots -- yes, sure.

But I think "information" in the context of film here refers to the indexical mark of light upon the image sensor, and in that case no. If it's not recorded, you can't extract it. And whatever you do put there is of little interest to the film buff to whom "image quality" means a more faithful reproduction of the negative that was seen in theaters.

dusted · a year ago
I tend to disagree. Even with superhuman pattern matching, what makes a frame unique is everything in it which does NOT follow the pattern; the way the grain is distributed, the nth-order reflections and shadows, are what make it what it is.
Mahnahnohnah · a year ago
That's not true.

When you have moving film, you have 24fps from the same scene, a lens and a compression algorithm.

There is a higher chance that pixel a is colour y if, over a span of x frames and their compression artifacts, that pixel keeps showing value z.

You will also have the chance to track details visible in frames/stills/pictures from seconds or even minutes ago if the actor is just wearing the same clothes.

And you can estimate the face of the actor across the movie and build up a consistent face.

Nonetheless, beyond this type of upscaling, if an AI is trained to upscale based on the statistics of the real world, and the movie is shot in the real world, the result is still more than just random.
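
A toy sketch of that temporal idea, assuming the frames have already been motion-aligned (which is the genuinely hard part):

    import numpy as np

    # Per-pixel median over a short stack of motion-aligned frames: compression noise
    # that flickers from frame to frame gets suppressed, while detail that was actually
    # photographed in every frame survives. Nothing is invented.
    def temporal_denoise(aligned_frames: np.ndarray) -> np.ndarray:
        """aligned_frames: (num_frames, height, width, 3) uint8 -> (height, width, 3) uint8."""
        return np.median(aligned_frames, axis=0).astype(np.uint8)

Real restoration and super-resolution tools do a learned version of this, plus sub-pixel alignment, which is where genuine extra detail can come from.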

Btw, there have been plenty of moviemakers who made movies the way they did because that's the only way they were able to make them. Low-light, high-end camera equipment is expensive.

And if you watch The Matrix unedited on a 60" or larger OLED, it looks off because greenscreen details are more noticeable than they were before. It's absolutely valid to revisit original material to adjust it to today (at least for me).

dml2135 · a year ago
It's frustrating that the article doesn't mention this relatively simple, well understood, and very relevant principle. (Unless I just missed it somewhere, someone please correct me if I'm wrong).
pimlottc · a year ago
There's also the law of large numbers here. When you make hundreds of thousands of guesses, even with really high accuracy, you're going to get some wrong. For a 4K movie at 24 frames a second, you're talking hundreds of millions of guesses per second, and trillions over the length of an entire film. It's inevitable you're going to get weird glitches, artifacts, or just parts that look "off".

It's better to use a light touch for techniques like this rather than just blindly applying them to the entire film.
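
Rough numbers, treating each output pixel as one guess (a simplification, since real upscalers predict patches rather than isolated pixels):

    # Back-of-envelope numbers for a full-film AI upscale.
    pixels_per_frame = 3840 * 2160      # ~8.3 million pixels in a UHD frame
    frames_per_second = 24
    film_seconds = 2 * 60 * 60          # a two-hour film

    guesses = pixels_per_frame * frames_per_second * film_seconds
    print(f"{guesses:.2e}")             # ~1.4e+12, i.e. over a trillion per-pixel predictions

    # Even if only one prediction in a million is visibly wrong, that still leaves
    # over a million bad pixels scattered across the film.
    print(f"{guesses / 1_000_000:.1e}") # ~1.4e+06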

antifa · a year ago
I wonder how effective some laser focused AI would be for fixing really bad CGI.
shepherdjerred · a year ago
> AI does not change this.. It adds stuff that's not there, sure, it adds stuff that might be there, or could have been there, but it literally cannot know, so it just paints in plausibility.. Which is horrible and gross already in principle.

That's essentially how the brain works, too.

Free-is-Freedom · a year ago
The difference is that people can express why they made a decision
r_c_a_d_t_s · a year ago
I agree. "AI-enhanced" is just the latest badge to put on content in the hope that some suckers will pay more money for the "same" content they already have. There's no striving for a better version here, just a drive for more profit.
procflora · a year ago
Wow, that tweet they link to with a super punched in shot looks really really bad! Hard to believe Cameron thought this looked better than just a normal 4k transfer, yikes. Was really looking forward to a UHD release of The Abyss but now I'm not so sure...

https://twitter.com/RazorwireRyan/status/1735753526167347232

kombookcha · a year ago
Wild that somebody thought this was an improvement.
SrslyJosh · a year ago
What the actual FUCK is that???
ornel · a year ago
The "enhanced" image looks a lot like those retouched Russian photos from the Stalinist era
Dwedit · a year ago
The one on the left looks normal.

The one on the right looks like you ran an edge-directed upscaler on it. Those things have distinct artifacts, and sometimes it looks like all curves turn into snakes. Or it can make new diagonal curves out of random noise.

Not knocking edge-directed upscalers though, they can work in real time and are very good for line-art graphics. You can even inject them into games that have never had that feature before.

echelon_musk · a year ago
The automated HD 'remaster' of Buffy is the prime example of how badly this can go wrong. A great breakdown of the problems is on YouTube here: https://youtube.com/watch?v=oZWNGq70Oyo
J_Shelby_J · a year ago
Great video and a travesty. I'm not a huge fan, but I respect that people are. That they ruined the show, and that it's hard to find in its un-ruined form…
Rinzler89 · a year ago
> Hard to believe Cameron thought this looked better

I doubt he even looked, he's too busy with his blue monkeys these days. Most likely someone talked him into the AI upscaling, he signed off on it without looking, and the movie studio just shipped the output without QA to save time and money, because it's going to streaming, not cinemas.

xenospn · a year ago
He’s insanely detail oriented. I’m almost certain he either approved everything, or, he no longer has right of refusal.

bmacho · a year ago
> Hard to believe Cameron thought this looked better than just a normal 4k transfer, yikes.

Torrent people do this too, and movies are not their day job but their hobby. I guess some people prefer the machine-denoised look.

globular-toast · a year ago
Cameron doesn't give a crap. He just needs his latest shot of money.
jstummbillig · a year ago
It seems a little wild to assume the maker of a movie would care less about the movie than a random mob on the internet (instead of maybe just having a different opinion), but the assumption does feel very internet.
spacecadet · a year ago
The Abyss is only good on VHS, or laserdisc if you have the setup...
CyberDildonics · a year ago
Really? VHS, with its grainy 240 x 320 resolution, is better than a Blu-ray?
Kranar · a year ago
I don't see how these can be compared. They are totally different color schemes. It's possible that if you took the right image and displayed it using the same color scheme as the left image that it would look better than the left image.
matteason · a year ago
Here's a quick and dirty attempt to even up the colours:

https://www.matteason.co.uk/share/true-lies-colours.png

And the full-size original for comparison:

https://pbs.twimg.com/media/GBaikRhX0AEcjuy?format=jpg&name=...

(I just overlaid the original on the left over the upscaled version on the right and set the blending mode to Colour in Photoshop)
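
For anyone without Photoshop, a rough equivalent with Python/Pillow (file names are made up, and a YCbCr swap isn't exactly what the Colour blend mode does, but it's close enough for an eyeball comparison):

    from PIL import Image

    # Keep the upscale's detail (luma) but put the original grade's colours (chroma)
    # back on top. Both stills are assumed to show the same frame.
    upscaled = Image.open("upscaled_still.png").convert("YCbCr")
    original = Image.open("original_still.png").convert("YCbCr").resize(upscaled.size)

    y, _, _ = upscaled.split()      # luminance / detail from the AI version
    _, cb, cr = original.split()    # colour from the original grade

    Image.merge("YCbCr", (y, cb, cr)).convert("RGB").save("recoloured.png")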

actionfromafar · a year ago
But Arnold looks like some smeared painting. Look at it in black and white, it still looks like shit.
xenospn · a year ago
This is very bad.
dmitrybrant · a year ago
My biggest gripe with these AI film enhancements is that they are adding information that was never there. You're no longer watching the original film. You no longer have a sense of how contemporary film equipment worked, what its limitations were, how the director dealt with those limitations, etc.
cout · a year ago
I don't think that's universally true of all AI enhancement though. Information that is "missing" in one frame might be pulled in from a nearby frame. As others have pointed out, we are in the infancy of video enhancement and the future is not fundamentally limited.

If that takes away from the artistic nature of the film I understand the complaint, but I look forward to seeing this technology applied where the original reel has been damaged. In those cases we are already missing out on what the director intended.

pimlottc · a year ago
In part, we need more vocabulary to distinguish different techniques. Everything is just "AI" right now, which could mean many different things.

Standard terminology would help us discuss what methods are acceptable for what purposes and what goes too far. And it has to be terminology that the public can understand, so they can make informed decisions.

ClassyJacket · a year ago
> Information that is "missing" in one frame might be pulled in from a nearby frame

Yeah - does anyone know if anyone is actually doing this? Like some sort of DLSS for video? I'd love to read about it.

janalsncm · a year ago
If there is a movie which is only shot in 1080p and I have a 4k TV, it seems like there are three options. One, watch it in the original 1080p with 3/4 of the screen as black border. Two, stretch the image, making it blurry. Three, upscale the image. If you give me the choice, I'm choosing 3 every time.

Sorry if it sounds crass, but I feel the process of shooting the movie is less important than the story it is trying to tell.

planede · a year ago
Upscaling exactly 2x is also an option.
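
i.e. plain nearest-neighbour integer scaling, which needs no guessing at all. A one-function sketch:

    import numpy as np

    # Exact 2x integer scaling: every source pixel becomes a crisp 2x2 block, so a
    # 1920x1080 frame exactly fills a 3840x2160 panel with no blur and nothing invented.
    def upscale_2x(frame: np.ndarray) -> np.ndarray:
        """frame: (height, width, channels) array -> (2*height, 2*width, channels)."""
        return np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)
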
MyFedora · a year ago
Most people don't care. Photographers had a great time pointing out that Samsung literally AI-replaced the moon, but some Samsung S21 Ultra users were busy bragging about how great "their" moon pictures turned out. Let's judge AI enhancements like sound design: noticeably good if done well, unnoticeable if done satisfactorily, and noticeably distracting if done poorly. The article shows a case of noticeably distracting, so they're better off with the original version.
dml2135 · a year ago
It's a fundamentally different concept of photography though, one that becomes more similar to a painting or collage than a captured frame of light. Regardless of what the merits of one over the other are for the purposes of storytelling, it's a bit worrisome when the distinction is lost on people altogether.

fnordpiglet · a year ago
I get how a film buff might care, and agree the original version should be available, but isn't there space for people who just want to see the story but experience it with modern levels of image quality? The technical details of the technology at some point in time are definitely interesting to some people, but as, say, the writer or someone else associated with the creative and less technical aspects of a film, I may find that the technical limitations make the story less accessible to people used to more modern technology and quality.
dml2135 · a year ago
What does "modern levels of image quality" mean in this context?

The article is about AI upscaling "True Lies", which was shot on 35mm film. 35mm provides a very high level of detail -- about equivalent in resolution to a 4k digital picture. We're not talking about getting an old VHS tape to look decent on your TV here.

The differences in quality between 35mm film and 4k digital are really more qualitative than quantitative, such as dynamic range and film grain. But things like lighting and dynamic range are just as much directorial choices as script, story, any other aspect of a film. It's a visual medium, after all.

Is the goal to have all old movies have the same, flatly lit streaming "content" look that's so ubiquitous today?

I think the argument against "isn't there space for people who just want to see the story but experience it with modern levels of image quality" is that such a space is ahistorical -- it's a space for someone that doesn't want to engage with the fact that things were different in the (not even very distant) past, and (at the risk of sounding a bit pretentious) it breeds an intellectually lazy and small-minded culture.

orbital-decay · a year ago
The problem with that is the content is usually shot with a certain definition in mind. If you don't reshoot certain scenes from scratch, they can end up looking weird in higher definition, simply because certain tricks rely on low definition/poor quality, or because you get a mismatch between old VFX and new resolution, for example.

It's a widespread issue with the emulation of old games that have been made for really low resolution/different ratio screens and slow hardware, especially early 3D/2D combinations like Final Fantasy, and those that relied on janky analog video outputs to draw their effects.

icehawk · a year ago
For anything that's not just "grab a camera and shoot the movie" the format that it is shot in is absolutely taken into account. I don't think you can separate the story from how the image is captured.
wolverine876 · a year ago
One perspective:

'Film buff' responses are common to every major change in technology and society. People highly invested in the old way have an understandably conservative reaction - wait! slow down! what happens to all these old values?! They look for and find flaws, confirming their fears (a confirmation bias) and supporting their argument to slow down.

They are right that some values will be lost; hopefully much more will be gained. The existence of flaws in beta / first-generation applications doesn't correlate with future success.

Also, they unknowingly mislead by reasoning with what is also an old sales disinformation technique: List the positive values of Option A, compare them to Option B; B, being a different product, inevitably will differ from A's design and strengths and lose the comparison. The comparison misleads us because it omits B's concept and its strengths that are superior to A's; with a new technology, those strengths aren't even all known - in this case, we can see B's far superior resolution and cleaner image. We also don't know what creative, artistic uses people will come up with - for example, maybe it can be used to blend two very different kinds of films together.

These things happen with political and social issues too. It's just another way of saying the second step in what every innovator experiences: 'first they laugh at you, then they tell you it violates the orthodoxy, then they say they knew it all along'.

Cthulhu_ · a year ago
Where would you draw the line though? What is acceptable non-AI remastering?

I'm 99% confident that similar issues were raised with e.g. recolored films, HD upscales, etc.

mrob · a year ago
I draw the line at edits that consider semiotic meaning. Edits are acceptable if they apply globally (e.g. color correction to compensate for faded negatives), or if they apply locally based on purely geometric considerations (e.g. sharpening based on edge detection), but not if they try to decide what some aspect of the image signifies (e.g. red eye removal, which requires guessing which pixels are supposed to represent an eye). AI makes no distinction between geometric and semiotic meaning, so AI edits are never acceptable.
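
To make the "purely geometric" side of that line concrete, a minimal sketch of a classic sharpen (on a single-channel image):

    import numpy as np
    from scipy.ndimage import gaussian_filter

    # A classic unsharp mask: boost detail by adding back the high-frequency residual.
    # It only looks at local contrast -- it has no idea whether an edge belongs to an
    # eye, to film grain, or to a lightsaber, which is what makes it "geometric".
    def unsharp_mask(img: np.ndarray, sigma: float = 2.0, amount: float = 0.5) -> np.ndarray:
        """img: 2-D float array (single channel) with values in [0, 1]."""
        blurred = gaussian_filter(img, sigma=sigma)
        return np.clip(img + amount * (img - blurred), 0.0, 1.0)
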
cratermoon · a year ago
Yes, back in the mid-late 80s Turner Entertainment colorized a huge number of old films in their vaults to show on cable movie channels. It was almost universally panned. It was seen at first as a way to give mediocre old films with known stars a brief revival, but then Turner started to colorize classic, multi-award-winning films like The Asphalt Jungle and the whole idea was dismissed as a meretricious money-grab.
JumpCrisscross · a year ago
> how contemporary film equipment worked, what its limitations were, how the director dealt with those limitations, etc.

Non-film buffs, i.e. most viewers, don't care about this.

caconym_ · a year ago
Any art and/or media production executed well enough to be culturally significant rests on an enormous depth of artistic and technical choices that most audiences have zero awareness of—and yet, if you took them all away, you would have nothing left. Every change takes you further from the original artist's vision, and if all you want to do is Consume Slop then that's fine I guess, but the stewards of these works should aim higher.
spiderxxxx · a year ago
I agree, most people watch the movie for the story that unfolds. Few are looking at things like framing the subject, the pull of the focus, subtle lighting differences between scenes, they are interested in the story, not the art of filmmaking. The people offended by this are the ones that are crying about the art being taken out of it.
nottorp · a year ago
> Non-film buffs, i.e. most viewers, don't care about this.

... consciously ...

Assuming competent cinematography, it will have an effect on the viewer whether they can analyze it or not.

Unfrozen0688 · a year ago
They do, but they don't know it.

See the new Netflix show Ripley.

All shot in B&W, beautifully shot.

nostromo · a year ago
The originals still exist and you’re free to watch those instead.

This just provides a new way to watch older movies should you choose to do so. Or not.

caconym_ · a year ago
> The originals still exist and you’re free to watch those instead.

This is far from certain, unless "you" are willing to engage in piracy. It's often difficult or impossible to legitimately buy (or even rent) the original, unadulterated versions of older films.

rightbyte · a year ago
To watch Star Wars as it was originally you need to break US law.
Mahnahnohnah · a year ago
It's definitely an interesting point you bring up with the contemporary film equipment.

Nonetheless, I do believe that most filmmakers actually want to make a film, not work around contemporary limitations.

rullelito · a year ago
Do most people care? I just want to eat popcorn and watch a movie.
kelseyfrog · a year ago
How do you feel about extended cuts?
chilmers · a year ago
I think this kind of AI "enhancement" is where CGI was in the 90s. It might be state of the art tech, but it's still very unrefined, and in ten or twenty years these remasters will look painfully dated and bad.
gdubs · a year ago
I dunno – I look at the original Jurassic Park and it still looks pretty amazing to me. Same with Terminator II. In many ways I feel like as directors got more and more capabilities with the tools they became comically overused. I don't think it's the sophistication of the tools, but the way that they're wielded that will make them look dated, or timeless.
Arainach · a year ago
The original Jurassic Park made extensive use of practical effects and models rather than relying primarily on CGI.
bamboozled · a year ago
I find most movies' CGI these days almost impossible to watch, it looks so crap.

Then I watch Star Wars and I just cannot believe they did this back in the day.

mrob · a year ago
I think it's worse than bad CGI. With bad CGI, you can use your imagination and interpret as what it would have looked like if they had unlimited time and budget. You can't do that with bad AI "enhancement", because it's an automation of that same imaginative process. You'd have to somehow mentally reverse the AI processing before imagining a better version, which is much more difficult.
mhh__ · a year ago
Even now some CGI still looks very poor despite big spending (e.g. most marvel movies)
josefx · a year ago
People who knew what they were doing could pull off some timeless art with 90s CGI, and a decade of improvements did not stop people from ruining otherwise good movies with bad CGI either. AI is just another tool that needs to be used correctly.
samsk · a year ago
Maybe I'm old, but do we really watch movies because they are sharper, have vivid colors, etc., or because of the story?

On the other hand, I would probably pay for an AI that will 'undark' those new super-artistic movies, because some of them have worse lighting than The Blair Witch Project...

grujicd · a year ago
Super dark scenes might work on an OLED screen, but every projector I've seen, including those in theatres, can't display real black, and the darkest shades are always a problem. It's not a problem if you have bright parts in the same scene, since the eye will adapt. But if everything is dark, it won't look good in a cinema. That affected how movies were shot and lit, and which kinds of scenes were filmed at all. I wonder if some of those newer movies are intended to look better on modern TVs than in a movie theater?
bamboozled · a year ago
You're not old, people just have a hammer and are looking for a nail.

We'll see so much of this in the next few years, optimizing everything to the point of boring. Perfect pop songs, perfect movies, perfect novels and on it goes.

mrguyorama · a year ago
How many people watched Dune 1 and 2 because of a story that has been around for decades and already had one interesting film interpretation? How many people watched Avatar for the story?

The existence of IMAX should be a hint that there is value in very crisp visuals.

However, this stupid tiktok filter is NOT "very crisp".

acuozzo · a year ago
> Maybe I'm old, but do we really watch the movies because they are sharper, have vivid colors, etc or because of the story ?

Where would, e.g., Koyaanisqatsi fit then?

tiptup300 · a year ago
A lot of the people who purchase a lot of films buy them because they are sharper and have more vivid colors.

Who knows if those whales are even watching most of their own collection.