I did my PhD in Atomic, Molecular, and Optical (AMO) physics, and despite "optical" being part of that I realized midway that I didn't know enough about how regular cameras worked!
It didn't take very long to learn, and it turned out to be extremely important in the work I did during the early days at Waymo and later at Motional.
I wanted to pass along this fun video from several years ago that discusses HDR: https://www.youtube.com/watch?v=bkQJdaGGVM8 . It's short and fun, I recommend it to all HN readers.
Separately, if you want a more serious introduction to digital photography, I recommend the lectures by Marc Levoy from his Stanford course: https://www.youtube.com/watch?v=y7HrM-fk_Rc&list=PL8ungNrvUY... . I believe he runs his own group at Adobe now after leading a successful effort at Google making their Pixel cameras the best in the industry for a couple of years. (And then everyone more-or-less caught up, just like with most tech improvements in the history of smartphones).
Try capturing fire with a non-Sony phone and a Sony phone. Samsung, at least, doesn't color-correct blackbodies right, and the flame looks nothing like reality.
This gets to a gaming rant of mine: Our natural vision can handle these things because our eyes scan sections of the scene with constant adjustment (light-level, focus) while our brain is compositing it together into what feels like a single moment.
However, certain effects in games (e.g. "HDR" and Depth of Field) instead reduce the fidelity of the experience. These features limp along only while our gaze is aimed at the exact spot the software expects. If you glance anywhere else around the scene, you instead perceive an unrealistically wrong coloration or blur that frustratingly persists no matter how much you squint. These problems will remain until gaze-tracking support becomes standard.
So ultimately these features reduce the realism of the experience. They make it less like being there and more like you're watching a second-hand movie recorded on flawed video-cameras. This distinction is even clearer if you consider cases where "film grain" is added.
It's crazy that post is 15 years old. Like the OP and this post get at, HDR isn't really a good description of what's happening. HDR often means one or more of at least 3 different things (capture, storage, and presentation). It's just the sticker slapped on advertising.
Things like lens flares, motion blur, film grain, and shallow depth of field are mimicking cameras and not what being there is like--but from a narrative perspective we experience a lot of these things through tv and film. It's visual shorthand. Like Star Wars or Battlestar Galactica copying WWII dogfight footage even though it's less like what it would be like if you were there. High FPS television can feel cheap while 24fps can feel premium and "filmic."
Often those limitations are in place so the experience is consistent for everyone. Games will have you set brightness and contrast--I had friends that would crank everything up to avoid jump scares and to clearly see objects intended to be hidden in shadows. Another reason for consistent presentation is to prevent unfair advantages in multiplayer.
> Things like lens flares, motion blur, film grain, and shallow depth of field are mimicking cameras and not what being there is like
Ignoring film grain, our vision has all these effects all the same.
Look in front of you and only a single plane will be in focus (and only your fovea produces any sort of legibility). Look towards a bright light and you might get flaring from just your eyes. Stare out the side of a car or train when driving at speed and you'll see motion blur, interrupted only by brief clarity if you intentionally try to follow the motion with your eyes.
Without depth of field simulation, the whole scene is just a flat plane with completely unrealistic clarity, and because it's comparatively small, too much of it is smack center on your fovea. The problem is that these are simulations that do not track your eyes, and make the (mostly valid!) assumption that you're looking nearby or in front of whatever you're controlling.
Maybe motion blur becomes unnecessary given a high enough resolution and refresh rate, but depth of field either requires actual depth or foveal tracking (which only works for one person). Tasteful application of current techniques is probably better.
> High FPS television can feel cheap while 24fps can feel premium and "filmic."
Ugh. I will never understand the obsession with this effect. There is no such thing as a "soap opera effect," as people like to call it, only a slideshow effect.
The history behind this is purely a series of cost-cutting measures, entirely unrelated to the user experience or artistic qualities. 24 fps came about because audio was slapped onto the film, and it was the slowest speed at which the audio track was acceptably intelligible, saving costly film stock - the sole priority of the time. Before that, we used to record content at variable frame rates but play it back at 30-40 fps.
We're clinging to a cost-cutting measure that was a significant compromise from the time of hand-cranked film recording.
</fist-shaking rant>
I'm with you on depth of field, but I don't understand why you think HDR reduces the fidelity of a game.
If you have a good display (eg an OLED) then the brights are brighter and simultaneously there is more detail in the blacks. Why do you think that is worse than SDR?
The “HDR” here is in the sense of “tone mapping to SDR”. It should also be said that even “H”DR displays only have a stop or two more range, still much less than a real-world high-contrast scene.
Hell yeah, this is one of many issues I had with the first Avatar movie. The movie was so filled with cool things to look at, but none of it was in focus. 10 minutes in I had had enough and was ready for a more traditional movie experience. Impressive, yes, for 10 minutes, then exhausting.
I had a similar complaint with the few 3D things I watched when that has been hyped in the past (e.g., when Avatar came out in cinemas, and when 3D home TVs seemed to briefly become a thing 15 years ago). It felt like Hollywood was giving me the freedom to immerse myself, but then simultaneously trying to constrain that freedom and force me to look at specific things in specific ways. I don't know what the specific solution is, but it struck me that we needed to be adopting lessons from live stage productions more than cinema if you really want people to think what they're seeing is real.
Stereo film has its own limitations. Sadly, shooting for stereo was expensive and often corners were cut just to get it to show up in a theater where they can charge a premium for a stereo screening. Home video was always a nightmare--nobody wants to wear glasses (glassesless stereo TVs had a very narrow viewing angle).
It may not be obvious, but film has a visual language. If you look at early film, it wasn't obvious if you cut to something that the audience would understand what was going on. Panning from one object to another implies a connection. It's built on the visual language of still photography (things like rule of thirds, using contrast or color to direct your eye, etc). All directing your eye.
Stereo film has its own limitations that were still being explored. In a regular film, you would do a rack focus to connect something in the foreground to the background. In stereo, when there's a rack focus people don't follow the camera the same way. In regular film, you could show someone's back in the foreground of a shot and cut them off at the waist. In stereo, that looks weird.
When you're presenting something you're always directing where someone is looking--whether it's a play, movie, or stereo show. The tools are just adapted for the medium.
I do think it worked way better for movies like Avatar or How to Train Your Dragon and was less impressive for things like rom coms.
These effects serve the artistic intent of the game. The same goes for movies, and it has nothing to do with "second-hand movies recorded on flawed cameras" or with "realism" in the sense of how we perceive the world.
This is why I always turn off these settings immediately when I turn on any video game for the first time. I could never put my finger on why I didn’t like it, but the camera analogy is perfect
It seems like a mistake to lump HDR capture, HDR formats and HDR display together, these are very different things. The claim that Ansel Adams used HDR is super likely to cause confusion, and isn’t particularly accurate.
We’ve had HDR formats and HDR capture and edit workflows since long before HDR displays. The big benefit of HDR capture & formats is that your “negative” doesn’t clip super bright colors and doesn’t lose color resolution in super dark color. As a photographer, with HDR you can re-expose the image when you display/print it, where previously that wasn’t possible. Previously when you took a photo, if you over-exposed it or under-exposed it, you were stuck with what you got. Capturing HDR gives the photographer one degree of extra freedom, allowing them to adjust exposure after the fact. Ansel Adams wasn’t using HDR in the same sense we’re talking about, he was just really good at capturing the right exposure for his medium without needing to adjust it later. There is a very valid argument to be made for doing the work up-front to capture what you’re after, but ignoring that for a moment, it is simply not possible to re-expose Adams’ negatives to reveal color detail he didn’t capture. That’s why he’s not using HDR, and why saying he is will only further muddy the water.
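A minimal sketch of that "re-expose after the fact" freedom, assuming the capture is stored as unclipped linear floating-point values (illustrative numpy only, not any particular raw converter's pipeline):

    import numpy as np

    def re_expose(hdr_linear, stops):
        # Push or pull exposure after capture: +1 stop doubles the light.
        # This only works because highlights weren't clipped at capture time.
        return np.clip(hdr_linear * (2.0 ** stops), 0.0, 1.0)

    # A value of 4.0 (far "above white") survives in an HDR float capture, so pulling
    # by -2 stops brings it back to 1.0 instead of losing it to clipping.
    print(re_expose(np.array([0.05, 0.5, 4.0]), -2.0))   # roughly [0.0125, 0.125, 1.0]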
Arguably, even considering HDR a distinct thing is itself weird and inaccurate.
All mediums have a range, and they've never all matched. Sometimes we've tried to calibrate things to match, but anyone watching SDR content for the past many years probably didn't do so on a color-calibrated and brightness calibrated screen - that wouldn't allow you to have a brightness slider.
HDR on monitors is about communicating content brightness and monitor capabilities, but then you have the question of whether to clip the highlights or just map the range when the content is mastered for 4000 nits but your monitor manages 1000-1500 and only in a small window.
This! Yes I think you’re absolutely right. The term “HDR” is in part kind of an artifact of how digital image formats evolved, and it kind of only makes sense relative to a time when the most popular image formats and most common displays were not very sophisticated about colors.
That said, there is one important part that is often lost. One of the ideas behind HDR, sometimes, is to capture absolute values in physical units, rather than relative brightness. This is the distinguishing factor that film and paper and TVs don’t have. Some new displays are getting absolute brightness features, but historically most media display relative color values.
The term "HDR" arguably makes more sense for the effect achieved by tone mapping multiple exposures of the same subject onto a "normal" (e.g. SRGB) display. In this case, the "high" in "HDR" just means "from a source with higher dynamic range than the display."
> but your monitor manages 1000-1500 and only in a small window.
Owning a display that can do 1300+ nits sustained across a 100% window has been the biggest display upgrade I think I have ever had. It's given me a tolerance for LCD, a technology I've hated since the death of CRTs, and turned me away from OLED.
There was a time I would have said I'd never own a non-OLED display again. But a capable HDR display changed that logic in a big way.
Too bad the motion resolution on it, especially compared to OLED, is meh. Again, at one point motion was the most important aspect to me (it's why I still own CRTs), but this level of HDR... transformative, for lack of a better word.
Adams adjusted heavily with dodging and burning, even working to invent a new chemical process to provide more control when developing. He was great at determining exposure for his process as well. A key skill was having a vision for what the image would be after adjusting. Adams talked a lot about this as a top priority of his process.
> It's even more incredible that this was done on paper, which has even less dynamic range than computer screens!
I came here to point this out. You have a pretty high dynamic range in the captured medium, and then you can use the tools you have to darken or lighten portions of the photograph when transferring it to paper.
> The claim that Ansel Adams used HDR is super likely to cause confusion
That isn't what the article claims. It says:
"Ansel Adams, one of the most revered photographers of the 20th century, was a master at capturing dramatic, high dynamic range scenes."
"Use HDR" (your term) is vague to the point of not meaning much of anything, but the article is clear that Adams was capturing scenes that had a high dynamic range, which is objectively true.
https://www.kimhildebrand.com/how-to-use-the-zone-system/
My interpretation is colored by the experience of making high quality prints and viewing them under different conditions, particularly poor illumination quality, though you could also count "small handheld game console" or "halftone screened and printed on newsprint" as other degraded conditions. In those cases you might imagine that the eye can only differentiate between 11 tones, so even if an image has finer detail it ought to connect well with people if the colors were quantized. (I think about concept art from Pokémon Sun and Moon, which looked great printed with a thermal printer because it was designed to look great on a cheap screen.)
In my mind, the ideal image would look good quantized to 11 zones but also has interesting detail in texture in 9 of the zones (extreme white and black don't show texture). That's a bit of an oversimplification (maybe a shot outdoors in the snow is going to trend really bright, maybe for artistic reasons you want things to be really dark, ...) but Ansel Adams manually "tone mapped" his images using dodging, burning and similar techniques to make it so.
> It seems like a mistake to lump HDR capture, HDR formats and HDR display together, these are very different things.
These are all related things. When you talk about color, you can be talking about color cameras, color image formats, and color screens, but the concept of color transcends the implementation.
> The claim that Ansel Adams used HDR is super likely to cause confusion, and isn’t particularly accurate.
The post never said Adams used HDR. I very carefully chose the words, "capturing dramatic, high dynamic range scenes."
> Previously when you took a photo, if you over-exposed it or under-exposed it, you were stuck with what you got. Capturing HDR gives the photographer one degree of extra freedom, allowing them to adjust exposure after the fact.
This is just factually wrong. Film negatives have 12 stops of useful dynamic range, while photo paper has 8 stops at best. That gave photographers exposure latitude during the printing process.
> Ansel Adams wasn’t using HDR in the same sense we’re talking about, he was just really good at capturing the right exposure for his medium without needing to adjust it later.
There's a photo of Ansel Adams in the article, dodging and burning a print. How would you describe that if not adjusting the exposure?
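Taking the figures above at face value, each stop is a doubling of light, so the gap is easy to make concrete:

    # 12 stops on the negative vs ~8 stops on the paper, expressed as contrast ratios:
    print(2 ** 12, 2 ** 8)   # 4096 256  (i.e. ~4096:1 vs ~256:1)
    # The ~4 extra stops are the latitude available when deciding how to print.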
> Film negatives have 12-stops of useful dynamic range
No, that’s not inherently true. AA used 12 zones, that doesn’t mean every negative stock has 12 stops of latitude. Stocks are different, you need to look at the curves.
But yes most modern negatives are very forgiving. FP4 for example has barely any shoulder at all iirc.
I agree capture, format and display are closely related. But HDR capture and processing specifically developed outside of HDR display devices, and use of HDR displays changes how HDR images are used compared to LDR displays.
> The post never said Adams used HDR. I very carefully chose the words
Hey I’m sorry for criticizing, but I honestly feel like you’re being slightly misleading here. The sentence “What if I told you that analog photographers captured HDR as far back as 1857?” is explicitly claiming that analog photographers use “HDR” capture, and the Ansel Adams sentence that follows appears to be merely a specific example of your claim. The result of the juxtaposition is that the article did in fact claim Adams used HDR, even if you didn’t quite intend to.
I think you’re either misunderstanding me a little, or maybe unaware of some of the context of HDR and its development as a term of art in the computer graphics community. Film’s 12 stops is not really “high” range by HDR standards, and a little exposure latitude isn’t where “HDR” came from. The more important part of HDR was the intent to push toward absolute physical units like luminance. That doesn’t just enable deferred exposure, it enables physical and perceptual processing in ways that aren’t possible with film. It enables calibrated integration with CG simulation that isn’t possible with film. And it enables a much wider range of exposure push/pull than you can do when going from 12 stops to 8. And of course non-destructive digital deferred exposure at display time is quite different from a print exposure.
Perhaps it’s useful to reflect on the fact that HDR has a counterpart called LDR that’s referring to 8 bits/channel RGB. With analog photography, there is no LDR, thus zero reason to invent the notion of a ‘higher’ range. Higher than what? High relative to what? Analog cameras have exposure control and thus can capture any range you want. There is no ‘high’ range in analog photos, there’s just range. HDR was invented to push against and evolve beyond the de-facto digital practices of the 70s-90s, it is not a statement about what range can be captured by a camera.
But the article even shows Adams dodging/burning a print, which is 'adjusting the exposure' of the film's high dynamic range in a localised fashion, effectively revealing detail in the LDR of the resulting print that otherwise wouldn't have been visible.
If I look at one of the photography books on my shelf, they are even talking about 18 stops and such for some film material, and how this doesn't translate to paper, and all the things that can be done to render it visible in print, and how things behave at both extreme ends (towards black and white). Read: Tone-mapping (i.e. trimming down a high DR image to a lower DR output medium) is really old.
The good thing about digital is that it can deal with color at decent tonal resolutions (if we assume 16 bits, not the limited 14 bit or even less) and in environments where film has technical limitations.
No, Adams, like everyone who develops their own film (or RAW digital photos), definitely worked in HDR. Film has much more DR than photographic paper, as noted by the TFA author (and large digital sensors have more than either SDR or HDR displays), especially if you're as much a master of exposure as Adams; trying to preserve the tonalities when developing and printing your photos is the real big issue.
Good question. I think it depends. They are kind of different concepts, but in practice they can overlap considerably. RAW is about using the camera’s full native color resolution, and not having lossy compression. HDR is overloaded, as you can see from the article & comments, but I think HDR capture is conceptually about expressing brightness in physical units like luminance or radiance, and delaying the ‘exposure’ until display time. Both RAW and HDR typically mean using more than 8 bits/channel and capturing high quality images that will withstand more post-processing than ‘exposed’ LDR images can handle.
HDR on displays is actually largely uncomfortable for me. They should reserve the brightest HDR whites for things like the sun itself and caustics, not white walls in indoor photos.
As for tone mapping, I think the examples they show tend way too much towards flat low-local-contrast for my tastes.
HDR is really hard to get right apparently. It seems to get worse in video games too.
I'm a huge fan of Helldivers 2, but playing the game in HDR gives me a headache: the muzzle flash of weapons at high RPMs on a screen that goes to 240hz is basically a continuous flashbang for my eyes.
For a while, No Man's Sky in HDR mode was basically the color saturation of every planet dialed up to 11.
The only game I've enjoyed at HDR was a port from a console, Returnal. The use of HDR brights was minimalistic and tasteful, often reserved for certain particle effects.
For a year or two after it launched, The Division 2 was a really, really good example of HDR done right. The game had (has?) a day/night cycle, and it had a really good control of the brightness throughout the day. More importantly, it made very good use of the wide color gamut available to it.
I stopped playing that game for several years, and when I went back to it, the color and brightness had been wrecked to all hell. I have heard that it's received wisdom that gamers complain that HDR modes are "too dark", so perhaps that's part of why they ruined their game's renderer.
Some games that I think currently have good HDR:
* Lies of P
* Hunt: Showdown 1896
* Monster Hunter: World (if you increase the game's color saturation a bit from its default settings)
Some games that had decent-to-good HDR the last time I played them, a few years ago:
* Battlefield 1
* Battlefield V
* Battlefield 2042 (If you're looking for a fun game, I do NOT recommend this one. Also, the previous two are probably chock-full of cheaters these days.)
I found Helldivers 2's HDR mode to have blacks that were WAY too bright. In SDR mode, nighttime in forest areas was dark. In HDR mode? It was as if you were standing in the middle of a field during a full moon.
A lot of this is poor QA. When you start to do clever things like HDR, you have to test on a bunch of properly calibrated devices from different vendors, etc. And if you're targeting Windows, you have to accept that HDR is a mess for consumers: even if their display supports it and their GPU supports it, they might still have their drivers and color profiles misconfigured (and many apps are doing it wrong or weird, even when they say they support it).
Also (mostly) on Windows, or on videos for your TV: a lot of cheap displays that say they are HDR are a range of hot garbage.
Most "HDR" monitors are junk that can't display HDR. The HDR formats/signals are designed for brightness levels and viewing conditions that nobody uses.
The end result is a complete chaos. Every piece of the pipeline doing something wrong, and then the software tries to compensate for it by emitting doubly wrong data, without even having reliable information about what it needs to compensate for.
What we really need is some standards that everybody follows. The reason normal displays work so well is that everyone settled on sRGB, and as long as a display gets close to that, say 95% sRGB, everyone except maybe a few graphics designers will have an equivalent experience.
But HDR is a minefield of different display qualities, color spaces, and standards. It's no wonder that nobody gets it right and everyone feels confused.
HDR on a display that has peak brightness of 2000 nits will look completely different than a display with 800 nits, and they both get to claim they are HDR.
We should have a standard equivalent to color spaces. Set, say, 2000 nits as 100% of HDR. Then a 2000 nit display gets to claim it's 100% HDR. An 800 nit display gets to claim 40% HDR, etc. A 2500 nit display could even use 125% HDR in its marketing.
It's still not perfect - some displays (OLED) can only show peak brightness over a portion of the screen. But it would be an improvement.
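A toy sketch of that labeling idea, using the 2000-nit figure proposed above purely as an example reference:

    def hdr_label(peak_nits, reference_nits=2000):
        # Express a display's peak brightness as a percentage of a chosen reference level.
        return f"{100 * peak_nits / reference_nits:.0f}% HDR"

    print(hdr_label(2000), hdr_label(800), hdr_label(2500))   # 100% HDR 40% HDR 125% HDR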
There's a pretty good video on YouTube (more than one, actually) that explains how careless use of HDR in modern cinema is destroying the look and feel of cinema we used to like.
Everything is flattened, contrast is eliminated, lights that should be "burned white" for a cinematic feel are brought back to "reasonable" brightness with HDR, really deep blacks are turned into flat greys, etc. The end result is the flat and washed out look of movies like Wicked. It's often correlated to CGI-heavy movies, but in reality it's starting to affect every movie.
The washed out grey thing was an error that became a style!
Because HDR wasn’t natively supported on most displays and software, for a long time it was just “hacked in there” by squashing the larger dynamic range into a smaller one using a mathematical transform, usually a log function. When viewed without the inverse transform this looks horribly grey and unsaturated.
Directors and editors would see this aesthetic day in, day out, with the final color grade applied only after a long review process.
Some of them got used to it and even liking it, and now here we are: horribly washed out movies made to look like that on purpose.
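A rough illustration of why ungraded log footage looks gray: a log curve lifts shadows and compresses highlights so a wide range fits in the file, and viewing it without the inverse transform (or a proper grade) shows exactly that flat rendering. The curve below is a generic log shape for illustration, not any camera vendor's actual transfer function.

    import numpy as np

    def log_encode(linear, a=50.0):
        # Generic log-style encoding: squeezes a wide linear range into [0, 1].
        return np.log1p(a * linear) / np.log1p(a)

    def log_decode(encoded, a=50.0):
        return np.expm1(encoded * np.log1p(a)) / a

    x = np.array([0.0, 0.02, 0.18, 1.0])   # linear scene values (0.18 = mid gray)
    print(log_encode(x))                    # mid gray lands near 0.59: blacks lifted, contrast flattened
    print(log_decode(log_encode(x)))        # the inverse transform recovers the original values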
I've found this especially a problem with those AI systems trying to add HDR to existing images/videos. The worst instance I've seen was playing one of the recent SpongeBob platformer games and having his eyes glow like giant suns on the menu screen. I have a TV capable of a fairly high maximum brightness, and it was dimming the rest of the image just to make sure SpongeBob's eyes lit up my living room like it was midday.
It feels like to some photographers/cinematographers/game designers, HDR is a gimmick to make something look more splashy/eye catching. The article touches on this a bit, with some of the 2000s HDR examples in photography. With the rise of HDR TVs, it feels like that trend is just happening again.
HDR is when you’re watching a dark film at night, looking at the subtle nuances between shades of dark and black in the shadows on the screen, making out the faint contours the film director carefully curated, and the subtitles gently deposit 40W of light into your optical nerves with “♪”.
As a photographer, I get the appeal of (this new incarnation of) HDR content, but the practical reality is that the photos I see posted in my feeds go from looking normal on my display to searing my retinas, while other content that was uniform white a second prior now looks dull gray.
It's late night here so I was reading this article in dark mode, at a low display brightness - and when I got to the HDR photos I had to turn down my display even more to not strain my eyes, then back up again when I scrolled to the text.
For fullscreen content (games, movies) HDR is alright, but for everyday computing it's a pretty jarring experience as a user.
While it isn't touched on in the post, I think the issue with feeds is that platforms like Instagram have no interest in moderating HDR.
For context: YouTube automatically edits the volume of videos that have an average loudness beyond a certain threshold. I think the solution for HDR is similar penalization based on log luminance or some other reasonable metric.
I don't see this happening on Instagram any time soon, because bad HDR likely makes view counts go up.
As for the HDR photos in the post, well, those are a bit strong to show what HDR can do. That's why the Mark III beta includes a much tamer HDR grade.
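For a sense of what such a metric could look like: a sketch of average-log-luminance gating, loosely analogous to loudness normalization. The Rec. 709 weights are standard, but the threshold and the gain curve are invented here for illustration, not anything a platform actually uses.

    import numpy as np

    def mean_log_luminance(rgb_nits):
        # Average log2 luminance of an image whose pixels are in absolute nits.
        lum = 0.2126 * rgb_nits[..., 0] + 0.7152 * rgb_nits[..., 1] + 0.0722 * rgb_nits[..., 2]
        return np.mean(np.log2(lum + 1e-6))

    def hdr_penalty_gain(rgb_nits, max_avg_log2_nits=7.0):
        # If the average exceeds ~2^7 = 128 nits (arbitrary threshold), return a gain < 1
        # that would bring the average back under the limit.
        excess = mean_log_luminance(rgb_nits) - max_avg_log2_nits
        return 1.0 if excess <= 0 else 0.5 ** excess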
> YouTube automatically edits the volume of videos that have an average loudness beyond a certain threshold.
For anyone else who was confused by this, it seems to be a client-side audio compressor feature (not a server-side adjustment) labeled as "Stable Volume". On the web, it's toggleable via the player settings menu.
I didn't realize this was a thing until just now, but I'm glad they added it because (now that I think about it) it's been awhile since I felt the need to adjust my system volume when a video was too quiet even at 100% player volume. It's a nice little enhancement.
Instagram has to allow HDR for the same reason that Firefox spent the past twenty years displaying web colors like HN orange at maximum display gamut rather than at sRGB calibrated: because a brighter red than anyone else’s draws people in, and makes the competition seem lifeless by comparison, especially in a mixed-profiles environment. Eventually that is regarded as ‘garishly bright’, so to speak, and people push back against it. I assume Firefox is already fixing this to support the latest CSS color spec (which defines #rrggbb as sRGB and requires it to be presented as such unless stated otherwise in CSS), but I doubt Instagram is willing to literally dim their feed; instead, I would expect them to begin AI-HDR’ing SDR uploads in order that all videos are captivatingly, garishly, bright.
> I think the solution for HDR is similar penalization based on log luminance or some other reasonable metric.
I completely understand the desire to address the issue of content authors misusing or intentionally abusing HDR with some kind of auto-limiting algorithm similar to the way the radio 'loudness wars' were addressed. Unfortunately, I suspect it will be difficult, if not impossible, to achieve without also negatively impacting some content applying HDR correctly for artistically expressive purposes. Static photos may be solvable without excessive false positive over-correction but cinematic video is much more challenging due to the dynamic nature of the content.
As a cinephile, I'm starting to wonder if maybe HDR on mobile devices simply isn't a solvable problem in practice. While I think it's solvable technically and certainly addressable from a standards perspective, the reality of having so many stakeholders in the mobile ecosystem (hardware, OS, app, content distributors, original creators) with diverging priorities makes whatever we do from a base technology and standards perspective unlikely to work in practice for most users. Maybe I'm too pessimistic but as a high-end home theater enthusiast I'm continually dismayed how hard it is to correctly display diverse HDR content from different distribution sources in a less complex ecosystem where the stakeholders are more aligned and the leading standards bodies have been around for many decades (SMPTE et al).
Another related parallel trend recently is that bad AI images get very high view and like counts, so much so that I've lost a lot of motivation for doing real photography because the platforms cease to show them to anyone, even my own followers.
Why is nobody talking about the standards development? They (OS vendors, image formats) could just say that all content is assumed to be SDR by default, and if a media file explicitly calls for HDR, even then it cannot have sharp transitions except in special cases, with the software blocking or clamping any non-conforming images. The OS should have had something like this for sound about 25-30 years ago. For example, a brightness-aware OS/monitor combo could just outright disallow anything above x nits, and disallow certain contrast levels in the majority of content.
Btw, YouTube doesn't moderate HDR either. I saw one video of a child's violin recital that was insanely bright, and probably just by accident of using a bad HDR recorder.
Completely agree. To me, HDR feels like the system is ignoring my screen brightness settings.
I set my screen brightness to a certain level for a reason. Please don’t just arbitrarily turn up the brightness!
There is no good way to disable HDR on photos for iPhone, either. Sure, you can turn off the HDR on photos on your iPhone. But then, when you cast to a different display, the TV tries to display the photos in HDR, and it won't look half as good.
> To me, HDR feels like the system is ignoring my screen brightness settings.
You might be on to something there. Technically, HDR is mostly about profile signaling and therefore about interop. To support it in MPEG-DASH or HLS media you need to make sure certain codec attributes are mentioned in the XML or m3u8, but the actual media payload stays the same.
Any bit or bob being misconfigured or misinterpreted in the streaming pipeline will result in problems ranging from a slightly suboptimal experience to nothing working.
Besides HDR, "spatial audio" formats like Dolby Atmos are notorious for interop issues.
> To me, HDR feels like the system is ignoring my screen brightness settings.
On both Android & iOS/macOS it's not that HDR is ignoring your screen brightness; rather, the brightness slider is controlling the SDR range, and then yes, HDR can exceed that - that's the singular purpose of HDR, to be honest. All the other purported benefits of HDR are at best just about HDR video profiles and at worst just nonsense bullshit. The only thing HDR actually does is allow for brighter colors vs. SDR. When used selectively this really enhances a scene. But restraint is hard, and most forms of HDR content production are shit. The HDR images that newer iPhones and Pixel phones are capturing are generally quite good because they are actually restrained, but then ironically both of them have horrible HDR video that's just obnoxiously bright.
I’m under the impression this is caused by the use of “HDR mode”(s) and poor adaptive brightness implementations on devices. Displays such as the iPad Pro w/ OLED are phenomenal and don’t seem to implement an overactive adaptive brightness. HDR content has more depth without causing brightness distortion.
In contrast, my TV will change brightness modes to display HDR content and disables some of the brightness adjustments when displaying HDR content. It can be very uncomfortably bright in a dark room while being excessively dim in a bright room. It requires adjusting settings to a middle ground resulting in a mixed/mediocre experience overall. My wife’s laptop is the worst of all our devices, while reviews seem to praise the display, it has an overreactive adaptive brightness that cannot be disabled (along with decent G2G response but awful B2W/W2B response that causes ghosting).
Apple’s method involves a good deal of what they call “EDR”, wherein the display gamma is ramped down in concert with ramping the brightness up, so that the brighter areas get brighter while the non-bright areas remain dark due to gamma math; that term is helpful for searching their WWDC developer videos for more details.
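A conceptual sketch of that headroom idea, working in linear light for simplicity; the gamma manipulation described above is how the same compensation plays out on the encoded signal, and none of this is Apple's actual pipeline:

    def panel_drive(value_linear, headroom):
        # value_linear: content in linear light, where 1.0 = SDR white and values up to
        # `headroom` are HDR highlights. The panel is driven `headroom` times brighter
        # than SDR white, so SDR content is scaled down to look unchanged.
        return min(value_linear / headroom, 1.0)

    # With headroom = 4 (panel at 4x SDR white):
    print(panel_drive(1.0, 4))   # 0.25 -> SDR white renders at the same nits as before
    print(panel_drive(4.0, 4))   # 1.0  -> an HDR highlight uses the full, brighter panel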
That's not inherent to HDR though. BFV (unless I'm confusing it with something else) has an HDR adjustment routine where you push a slider until the HDR white and the SDR white are identical. The same could be done for desktop environments. In my experience, HDR support is very lacking in PCs atm. You can't even play Dolby Vision on Windows, which is the only widely-used HDR format with dynamic metadata.
If you mean https://i.imgur.com/0LtYuDZ.jpeg that is probably the slider GP wants but it's not about matching HDR white to SDR white, it's just about clamping the peak HDR brightness in its own consideration. The white on the left is the HDR brightness according to a certain value in nits set via the bottom slider. The white on the right is the maximally bright HDR signal. The goal of adjusting the slider is to find how bright of an HDR white your display can actually produce, which is the lowest slider value at which the two whites appear identical to a viewer.
Some games also have a separate slider https://i.imgur.com/wenBfZY.png for adjusting "paper white", which is the HDR white one might normally associate with matching to SDR reference white (100 nits when in a dark room according to the SDR TV color standards, higher in other situations or standards). Extra note: the peak brightness slider in this game (Red Dead Redemption 2) is the same knob as the brightness slider in the above Battlefield V screenshot.
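To make the two knobs concrete, here's a hypothetical sketch of how a renderer might apply them, assuming linear scene values where 1.0 equals SDR reference white. The roll-off curve is one simple choice among many, not what Battlefield V or Red Dead Redemption 2 actually do:

    def scene_to_nits(scene_linear, paper_white_nits=200.0, peak_nits=1000.0):
        # "Paper white" decides how bright SDR reference white (and typically the UI) gets.
        nits = scene_linear * paper_white_nits
        # Roll highlights off toward the calibrated peak instead of hard clipping.
        knee = 0.75 * peak_nits
        if nits <= knee:
            return nits
        over = nits - knee
        span = peak_nits - knee
        return knee + span * (over / (over + span))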
I experience the same thing you do — but my take on it is different. Being hit with HDR images (and videos on YouTube), while unsettling, makes me then realize how just damned dull the SDR world I had been forced to succumb to has been.
Let the whole experience be HDR and perhaps it won't be jarring.
This seems more like a "your feeds" problem than an HDR problem. Much in the same way people screencap and convert images willy-nilly. I suggest blocking non-HDR content.
> A big problem is that it costs the TV, Film, and Photography industries billions of dollars (and a bajillion hours of work) to upgrade their infrastructure. For context, it took well over a decade for HDTV to reach critical mass.
This is also true for consumers. I don't own a single 4k or HDR display. I probably won't own an HDR display until my TV dies, and I probably won't own a 4k display until I replace my work screen, at which point I'll also replace one of my home screens so I can remote into it without scaling.
Pretty much any display you can buy today will be HDR capable, though that doesn't mean much.
I think the industry is strangling itself putting "DisplayHDR 400" certification on edgelit/backlit LCD displays. In order for HDR to look "good" you either need high resolution full array local dimming backlighting (which still isn't perfect), or a panel type that doesn't use any kind of backlighting like OLED.
Viewing HDR content on these cheap LCDs often looks worse than SDR content. You still get the wider color gamut, but the contrast just isn't there. Local dimming often loses all detail in shadows whenever there is something bright on the screen.
HDR marketing on monitors almost seems like a scam. Monitors will claim HDR compatibility when what they actually mean is they will take the HDR data stream and display it exactly the same as SDR content, because they don't actually have the contrast and brightness ability of a proper HDR monitor.
Few things are in absolutes. Yes, most consumers won't have HDR or 4K on every screen, but most consumers use a modern smartphone, and just about every modern smartphone from the past half decade or more has HDR of some level.
I absolutely loathe consuming content on a mobile screen, but the reality is that the vast majority are using phones and tablets most of the time.
Funny enough, HDR content works absolutely perfectly as long as it stays on a device that has both HDR recording and display tech, aka smartphones.
The problem starts with sending HDR content to SDR-only devices, or even just other HDR-standards. Not even talking about printing here.
This step can inherently only be automated so much, because it's also a stylistic decision on what information to keep or emphasize.
This is an editorial process, not something you want to burden casual users with. What works for some images can't work for others. Even with AI, the preference would still need to be aligned.
I have to think you are the 1-3% outlier though. Everyone I know has an HDR screen, even my friend who never buys anything new, but he did run out and buy an HDR TV to replace his old one that he gave to his son.
I honestly do not know if I have any screen that supports HDR. At least I've never noticed any improved image quality when viewing HDR video content and comparing the image on my M3 MacBook Pro screen vs. an old external IPS monitor. Maybe my eyes are just broken?
To demonstrate some contrast (heh) with another data point from someone closer to the other extreme, I’ve owned a very HDR-capable monitor (the Apple Pro Display XDR) since 2020, so that’s 5 years now. Content that takes full advantage of it is still rare, but it’s getting better slowly over time.
I have a screen which is "HDR" but what that means is when you turn the feature on it just makes everything more muted, it doesn't actually have any more dynamic range. When you turn HDR on for a game it basically just makes most things more muddy grey.
I also have a screen which has a huge gamut and blows out colors in a really nice way (a bit like the aftereffects of hallucinogens, it has colors other screens just don't) and you don't have to touch any settings.
My OLED TV has HDR and it actually seems like HDR content makes a difference while regular content is still "correct".
Don't feel like you have to. I bought a giant fancy TV with it, and even though it's impressive, it's kinda like ultra-hifi-audio. I don't miss it when I watch the same show on one of my older TVs.
If you ever do get it, I suggest going for a TV that you watch with your full attention, and watching TV / movies in the dark. It's not very useful on a TV that you might turn on while doing housework; but very useful when you are actively watching TV with your full attention.
I totally love HDR on my OLED TV, and definitely miss it on others.
Like a lot of things, it’s weird how some people are more sensitive to visual changes. For example:
- At this point, I need 120hz displays. I can easily notice when my wife’s phone is in power saver mode at 60hz.
- 4k vs 1080p. This is certainly more subtle, but I definitely miss detail in lower res content.
- High bitrate. This is way more important than 4k vs 1080p or even HDR. But it’s so easy to tell when YouTube lowers the quality setting on me, or when a TV show is streaming at a crappy bitrate.
- HDR is tricky, because it relies completely on the content creator to do a good job producing HDR video. When done well, the image basically sparkles, water looks actually wet, parts of the image basically glow… it looks so good.
I 100% miss this HDR when watching equivalent content on other displays. The problem is that a lot of content isn’t produced to take advantage of this very well. The HDR 4k Blu-ray of several Harry Potter movies, for example, has extremely muted colors and dark scenes… so how is the image going to pop? I’m glad we’re seeing more movies rely on bright colors and rich, contrasty color grading. There are so many old film restorations that look excellent in HDR because the original color grade had rich, detailed, contrasty colors.
On top of that, budget HDR implementations, ESPECIALLY in PC monitors, just don’t get very bright. Which means their HDR is basically useless. It’s impossible to replicate the “shiny, wet look” of really good HDR water if the screen can’t get bright enough to make it look shiny. Plus, it needs to be selective about what gets bright, and cheap TVs don’t have a lot of backlighting zones to make that happen very well.
So whereas I can plug in a 4k 120hz monitor and immediately see the benefit in everything I do for normal PC stuff, you can’t get that with HDR unless you have good source material and a decent display.
I don't see the point of a 4K TV vs a 1080p TV either. To me it's just marketing; I have both a 4K and a 1080p TV at my house, and from a normal viewing distance (that is, 3-4 meters) you don't see differences.
Also, in my country (Italy) TV transmissions are 1080i at best, and a lot are still 576i (PAL resolution). Streaming media can be 4K, if you have enough bandwidth to stream it at that resolution, which I don't have at my house. Sure, if you download pirated movies you can find them in 4K, if you have the bandwidth for it... sure.
But even there, sometimes a well-done 1080p movie is better than a hyper-compressed 4K one, since you see compression artifacts.
To me 1080p, and maybe even 720p, is enough for TV viewing. Well, sometimes I miss CRT TVs: they were low resolution, but they had, for example, much better picture quality than most modern 4K LCD TVs, where black scenes are gray (I know there is OLED, but it's too expensive and has other issues).
> This is also true for consumers. I don't own a single 4k or HDR display. I probably won't own an HDR display until my TV dies, and I probably won't own a 4k display until I replace my work screen, at which point I'll also replace one of my home screens so I can remote into it without scaling.
People in the HN echo chamber over-estimate hardware adoption rates. For example, there are millions of people who went straight from CDs to streaming, without hitting the iPod era.
A few years ago on HN, there was someone who couldn't wrap their brain around the notion that even though VCRs were invented in the early 1960s, in 1980 not everyone owned one, or if they did, they only had one for the whole family.
Normal people aren't magpies who trash their kit every time something shiny comes along.
> A few years ago on HN, there was someone who couldn't wrap their brain around the notion that even though VCRs were invented in the early 1960s, in 1980 not everyone owned one, or if they did, they only had one for the whole family.
Point of clarification: While the technology behind the VCR was invented in the '50s and matured in the '60s, consumer-grade video tape systems weren't really a thing until Betamax and VHS arrived in 1975 and 1976 respectively.
Early VCRs were also incredibly expensive, with prices ranging from $3,500 to almost $10,000 after adjusting for inflation. Just buying into the VHS ecosystem at the entry level was a similar investment to buying an Apple Vision Pro today.
>there are millions of people who went straight from CDs to streaming, without hitting the iPod era
Who?
There was about a decade there where everyone who had the slightest interest in music had an mp3 player of some kind, at least in the 15-30 age bracket.
> AI cannot read your mind, so it cannot honor your intent.
This. I can always tell when someone "gets" software development when they either understand (or don't) that computers can't read minds or infer intent like a person can.
Such a blast from the past, I used to spend so much time just clicking that button!
> If you have a good display (eg an OLED) then the brights are brighter and simultaneously there is more detail in the blacks. Why do you think that is worse than SDR?
HDR in games would frequently mean clipping highlights and adding bloom. Prior to the "HDR", exposure looked rather flat.
Sure, you need a good HDR-capable display and a native HDR-game (or RTX HDR), but the results are pretty awesome.
These are all related things. When you talk about color, you can be talking about color cameras, color image formats, and color screens, but the concept of color transcends the implementation.
> The claim that Ansel Adams used HDR is super likely to cause confusion, and isn’t particularly accurate.
The post never said Adams used HDR. I very carefully chose the words, "capturing dramatic, high dynamic range scenes."
> Previously when you took a photo, if you over-exposed it or under-exposed it, you were stuck with what you got. Capturing HDR gives the photographer one degree of extra freedom, allowing them to adjust exposure after the fact.
This is just factually wrong. Film negatives have 12-stops of useful dynamic range, while photo paper has 8 stops at best. That gave photographers exposure latitude during the print process.
> Ansel Adams wasn’t using HDR in the same sense we’re talking about, he was just really good at capturing the right exposure for his medium without needing to adjust it later.
There's a photo of Ansel Adams in the article, dodging and burning a print. How would you describe that if not adjusting the exposure?
No, that's not inherently true. AA used 12 zones, but that doesn't mean every negative stock has 12 stops of latitude. Stocks are different; you need to look at the curves.
But yes, most modern negatives are very forgiving. FP4, for example, has barely any shoulder at all, IIRC.
> The post never said Adams used HDR. I very carefully chose the words
Hey I’m sorry for criticizing, but I honestly feel like you’re being slightly misleading here. The sentence “What if I told you that analog photographers captured HDR as far back as 1857?” is explicitly claiming that analog photographers use “HDR” capture, and the Ansel Adams sentence that follows appears to be merely a specific example of your claim. The result of the juxtaposition is that the article did in fact claim Adams used HDR, even if you didn’t quite intend to.
I think you’re either misunderstanding me a little, or maybe unaware of some of the context of HDR and its development as a term of art in the computer graphics community. Film’s 12 stops is not really “high” range by HDR standards, and a little exposure latitude isn’t where “HDR” came from. The more important part of HDR was the intent to push toward absolute physical units like luminance. That doesn’t just enable deferred exposure; it enables physical and perceptual processing in ways that aren’t possible with film. It enables calibrated integration with CG simulation that isn’t possible with film. And it enables a much wider range of exposure push/pull than you can do when going from 12 stops to 8. And of course non-destructive digital deferred exposure at display time is quite different from a print exposure.
Perhaps it’s useful to reflect on the fact that HDR has a counterpart called LDR, which refers to 8-bit-per-channel RGB. With analog photography there is no LDR, and thus zero reason to invent the notion of a ‘higher’ range. Higher than what? High relative to what? Analog cameras have exposure control and thus can capture any range you want; there is no ‘high’ range in analog photos, there’s just range. HDR was invented to push against and evolve beyond the de-facto digital practices of the ’70s–’90s; it is not a statement about what range can be captured by a camera.
Reminded me of the classic "HDR in games vs HDR in photography" comparison[0]
[0] https://www.realtimerendering.com/blog/thought-for-the-day/
The good thing about digital is that it can deal with color at decent tonal resolution (assuming 16 bits, not the more limited 14 bits or fewer) and in environments where film has technical limitations.
As for tone mapping, I think the examples they show tend way too much towards flat low-local-contrast for my tastes.
I'm a huge fan of Helldivers 2, but playing the game in HDR gives me a headache: the muzzle flash of weapons at high RPMs on a screen that goes to 240 Hz is basically a continuous flashbang for my eyes.
For a while, No Man's Sky in HDR mode was basically the color saturation of every planet dialed up to 11.
The only game I've enjoyed in HDR was a console port, Returnal. Its use of HDR brights was minimalistic and tasteful, often reserved for certain particle effects.
I stopped playing that game for several years, and when I went back to it, the color and brightness had been wrecked to all hell. I have heard that it's received wisdom that gamers complain that HDR modes are "too dark", so perhaps that's part of why they ruined their game's renderer.
Some games that I think currently have good HDR:
* Lies of P
* Hunt: Showdown 1896
* Monster Hunter: World (if you increase the game's color saturation a bit from its default settings)
Some games that had decent-to-good HDR the last time I played them, a few years ago:
* Battlefield 1
* Battlefield V
* Battlefield 2042 (If you're looking for a fun game, I do NOT recommend this one. Also, the previous two are probably chock-full of cheaters these days.)
I found Helldivers 2's HDR mode to have blacks that were WAY too bright. In SDR mode, nighttime in forest areas was dark. In HDR mode? It was as if you were standing in the middle of a field during a full moon.
Also (mostly) on Windows, or with videos for your TV: a lot of cheap displays that say they are HDR are varying grades of hot garbage.
The end result is complete chaos: every piece of the pipeline does something wrong, and then the software tries to compensate for it by emitting doubly wrong data, without even having reliable information about what it needs to compensate for.
https://docs.google.com/document/d/1A__vvTDKXt4qcuCcSN-vLzcQ...
But HDR is a minefield of different display qualities, color spaces, and standards. It's no wonder that nobody gets it right and everyone feels confused.
HDR on a display that has peak brightness of 2000 nits will look completely different than a display with 800 nits, and they both get to claim they are HDR.
We should have a standard equivalent to color spaces. Set, say, 2000 nits as 100% of HDR. Then a 2000-nit display gets to claim it's 100% HDR, an 800-nit display gets to claim 40% HDR, and so on. A 2500-nit display could even use 125% HDR in its marketing.
It's still not perfect - some displays (OLED) can only show peak brightness over a portion of the screen. But it would be an improvement.
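A sketch of that rating scheme; the 2000-nit reference is the suggestion above, and the sustained-window fudge factor is my own assumption, not part of any existing certification:

```python
REFERENCE_NITS = 2000.0  # proposed 100% point; not an existing standard

def hdr_percent(peak_nits, sustained_fraction=1.0):
    """Express peak brightness as a percentage of the 2000-nit reference.

    `sustained_fraction` is a hypothetical extra knob: the fraction of the
    screen over which the panel can actually hold that brightness, since
    many OLEDs only hit their peak on small windows.
    """
    return 100.0 * (peak_nits / REFERENCE_NITS) * sustained_fraction

print(hdr_percent(2000))       # 100.0% HDR
print(hdr_percent(800))        # 40.0% HDR
print(hdr_percent(2500))       # 125.0% HDR
print(hdr_percent(1000, 0.1))  # 5.0% HDR: bright, but only on a 10% window
```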
Everything is flattened, contrast is eliminated, lights that should be "burned white" for a cinematic feel are brought back to "reasonable" brightness with HDR, really deep blacks are turned into flat greys, etc. The end result is the flat and washed out look of movies like Wicked. It's often correlated to CGI-heavy movies, but in reality it's starting to affect every movie.
Because HDR wasn’t natively supported on most displays and software, for a long time it was just “hacked in there” by squashing the larger dynamic range into a smaller one using a mathematical transform, usually a log function. When viewed without the inverse transform this looks horribly grey and unsaturated.
Directors and editors would see this aesthetic day in, day out, with the final color grade applied only after a long review process.
Some of them got used to it and even liking it, and now here we are: horribly washed out movies made to look like that on purpose.
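A minimal sketch of that squash-and-forget-the-inverse failure mode, using a generic log curve rather than any specific vendor's log format:

```python
import numpy as np

A = 999.0  # curve steepness; purely illustrative, not a real camera log spec

def log_encode(linear):
    """Squash linear scene light in [0, 1] into [0, 1] with a log curve."""
    linear = np.clip(np.asarray(linear, dtype=float), 0.0, 1.0)
    return np.log1p(A * linear) / np.log1p(A)

def log_decode(encoded):
    """The inverse transform the final grade is supposed to apply."""
    encoded = np.clip(np.asarray(encoded, dtype=float), 0.0, 1.0)
    return np.expm1(encoded * np.log1p(A)) / A

# 18% grey lands around 0.75 in the log encoding. Display that value raw,
# without log_decode or a proper grade, and the whole image looks lifted,
# grey, and low contrast: the washed-out look described above.
print(log_encode(0.18), log_decode(log_encode(0.18)))
```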
It feels like to some photographers/cinematographers/game designers, HDR is a gimmick to make something look more splashy/eye catching. The article touches on this a bit, with some of the 2000s HDR examples in photography. With the rise of HDR TVs, it feels like that trend is just happening again.
It's late night here so I was reading this article in dark mode, at a low display brightness - and when I got to the HDR photos I had to turn down my display even more to not strain my eyes, then back up again when I scrolled to the text.
For fullscreen content (games, movies) HDR is alright, but for everyday computing it's a pretty jarring experience as a user.
For context: YouTube automatically edits the volume of videos that have an average loudness beyond a certain threshold. I think the solution for HDR is similar penalization based on log luminance or some other reasonable metric.
I don't see this happening on Instagram any time soon, because bad HDR likely makes view counts go up.
As for the HDR photos in the post, well, those are a bit strong to show what HDR can do. That's why the Mark III beta includes a much tamer HDR grade.
For anyone else who was confused by this, it seems to be a client-side audio compressor feature (not a server-side adjustment) labeled as "Stable Volume". On the web, it's toggleable via the player settings menu.
https://support.google.com/youtube/answer/14106294
I can't find exactly when it appeared but the earliest capture of the help article was from May 2024, so it is a relatively recent feature: https://web.archive.org/web/20240523021242/https://support.g...
I didn't realize this was a thing until just now, but I'm glad they added it because (now that I think about it) it's been a while since I felt the need to adjust my system volume when a video was too quiet even at 100% player volume. It's a nice little enhancement.
I completely understand the desire to address the issue of content authors misusing or intentionally abusing HDR with some kind of auto-limiting algorithm similar to the way the radio 'loudness wars' were addressed. Unfortunately, I suspect it will be difficult, if not impossible, to achieve without also negatively impacting some content applying HDR correctly for artistically expressive purposes. Static photos may be solvable without excessive false positive over-correction but cinematic video is much more challenging due to the dynamic nature of the content.
As a cinephile, I'm starting to wonder if maybe HDR on mobile devices simply isn't a solvable problem in practice. While I think it's solvable technically and certainly addressable from a standards perspective, the reality of having so many stakeholders in the mobile ecosystem (hardware, OS, app, content distributors, original creators) with diverging priorities makes whatever we do from a base technology and standards perspective unlikely to work in practice for most users. Maybe I'm too pessimistic, but as a high-end home theater enthusiast I'm continually dismayed by how hard it is to correctly display diverse HDR content from different distribution sources even in a less complex ecosystem where the stakeholders are more aligned and the leading standards bodies have been around for many decades (SMPTE et al).
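To make the auto-limiting idea concrete, here is a sketch of penalizing by average log luminance, as suggested a few comments up; the 200-nit cap and the geometric mean are my own illustrative choices, not anything YouTube, Instagram, or any standard actually does:

```python
import numpy as np

MAX_AVG_LOG_NITS = np.log10(200.0)  # hypothetical cap on average luminance

def limit_hdr_frame(luminance_nits):
    """Scale a frame down if its mean log-luminance exceeds the cap.

    `luminance_nits` is per-pixel absolute luminance. Averaging in log
    space (a geometric mean) keeps a few specular highlights from
    dominating, much like loudness normalization averages a whole track.
    """
    lum = np.maximum(np.asarray(luminance_nits, dtype=float), 1e-4)
    avg_log = np.mean(np.log10(lum))
    if avg_log <= MAX_AVG_LOG_NITS:
        return lum  # within budget; leave the creator's grade alone
    # One global gain that brings the average back down to the cap.
    gain = 10.0 ** (MAX_AVG_LOG_NITS - avg_log)
    return lum * gain

frame = np.full((4, 4), 800.0)       # a frame that is just uniformly blinding
print(limit_hdr_frame(frame).max())  # ~200.0: pulled back to the cap
```

The hard part, as noted above, is that a single global gain like this will also punish content that uses extreme brightness deliberately; distinguishing abuse from intent is where the false positives come from.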
Another related parallel trend recently is that bad AI images get very high view and like counts, so much so that I've lost a lot of motivation for doing real photography, because the platforms cease to show real photos to anyone, even my own followers.
I set my screen brightness to a certain level for a reason. Please don’t just arbitrarily turn up the brightness!
There is no good way to disable HDR on photos for iPhone, either. Sure, you can turn off HDR for photos on your iPhone. But then, when you cast to a different display, the TV tries to display the photos in HDR, and it won't look half as good.
You might be on to something there. Technically, HDR is mostly about profile signaling and therefore about interop. To support it in MPEG-DASH or HLS media you need to make sure certain codec attributes are mentioned in the XML or m3u8, but the actual media payload stays the same.
Any bit or bob being misconfigured or misinterpreted in the streaming pipeline will result in problems ranging from a slightly suboptimal experience to nothing working at all.
Besides HDR, "spatial audio" formats like Dolby Atmos are notorious for interop issues.
On both Android and iOS/macOS it's not that HDR is ignoring your screen brightness; rather, the brightness slider is controlling the SDR range, and yes, HDR can exceed that. That is the singular purpose of HDR, to be honest. All the other purported benefits of HDR are at best just about HDR video profiles and at worst just nonsense bullshit. The only thing HDR actually does is allow for brighter colors vs. SDR. When used selectively this really enhances a scene. But restraint is hard, and most forms of HDR content production are shit. The HDR images that newer iPhones and Pixel phones are capturing are generally quite good because they are actually restrained, but then ironically both of them have horrible HDR video that's just obnoxiously bright.
In contrast, my TV will change brightness modes to display HDR content and disables some of the brightness adjustments when displaying HDR content. It can be very uncomfortably bright in a dark room while being excessively dim in a bright room. It requires adjusting settings to a middle ground resulting in a mixed/mediocre experience overall. My wife’s laptop is the worst of all our devices, while reviews seem to praise the display, it has an overreactive adaptive brightness that cannot be disabled (along with decent G2G response but awful B2W/W2B response that causes ghosting).
Some games also have a separate slider https://i.imgur.com/wenBfZY.png for adjusting "paper white", which is the HDR level one would normally match to SDR reference white (100 nits in a dark room according to the SDR TV color standards, higher in other situations or standards). Extra note: the peak brightness slider in this game (Red Dead Redemption 2) is the same knob as the brightness slider in the above Battlefield V screenshot.
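For reference, this is roughly what a paper-white knob does when SDR-referred content (game UI, an SDR video layer) is composed into an HDR output. A simplified sketch with hypothetical names, assuming linear light and the 100-nit dark-room SDR reference mentioned above; it is not any game's actual code:

```python
SDR_REFERENCE_WHITE_NITS = 100.0  # dark-room SDR reference white

def sdr_to_hdr_nits(sdr_linear, paper_white_nits=200.0, peak_nits=1000.0):
    """Map SDR-relative linear values (1.0 = SDR white) to absolute nits.

    `paper_white_nits` is the slider: it decides where SDR/UI white lands in
    the HDR output. Values above 1.0 (HDR highlights) may go brighter, but
    are clipped at the display's reported peak.
    """
    return min(sdr_linear * paper_white_nits, peak_nits)

# With the slider at 200 nits, SDR white sits at 2x the dark-room reference,
# and an 8x highlight clips at the display's 1000-nit peak.
print(sdr_to_hdr_nits(1.0) / SDR_REFERENCE_WHITE_NITS)  # 2.0
print(sdr_to_hdr_nits(8.0))                             # 1000.0
```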
I think it's because no one wants it.
Let the whole experience be HDR and perhaps it won't be jarring.
This is also true for consumers. I don't own a single 4k or HDR display. I probably won't own an HDR display until my TV dies, and I probably won't own a 4k display until I replace my work screen, at which point I'll also replace one of my home screens so I can remote into it without scaling.
I think the industry is strangling itself putting "DisplayHDR 400" certification on edgelit/backlit LCD displays. In order for HDR to look "good" you either need high resolution full array local dimming backlighting (which still isn't perfect), or a panel type that doesn't use any kind of backlighting like OLED.
Viewing HDR content on these cheap LCDs often looks worse than SDR content. You still get the wider color gamut, but the contrast just isn't there. Local dimming often loses all detail in shadows whenever there is something bright on the screen.
I absolutely loathe consuming content on a mobile screen, but the reality is that the vast majority are using phones and tablets most of the time.
The problem starts with sending HDR content to SDR-only devices, or even just other HDR-standards. Not even talking about printing here.
This step can inherently only be automated so much, because it's also a stylistic decision about what information to keep or emphasize. This is an editorial process, not something you want to burden casual users with. What works for some images can't work for others. Even with AI, the result would still need to be aligned with the viewer's preferences.
[edit]
Some googling suggested I check in the Netflix app; at least Netflix thinks my phone does not support HDR. (Unihertz Jelly Max)
I also have a screen with a huge gamut that blows out colors in a really nice way (a bit like the aftereffects of hallucinogens; it has colors other screens just don't), and you don't have to touch any settings.
My OLED TV has HDR and it actually seems like HDR content makes a difference while regular content is still "correct".
Don't feel like you have to. I bought a giant fancy TV with it, and even though it's impressive, it's kinda like ultra-hifi-audio. I don't miss it when I watch the same show on one of my older TVs.
If you ever do get it, I suggest going for a TV that you watch with your full attention, and watching TV / movies in the dark. It's not very useful on a TV that you might turn on while doing housework, but very useful when you are actively watching TV with your full attention.
Like a lot of things, it’s weird how some people are more sensitive to visual changes. For example:
- At this point, I need 120 Hz displays. I can easily notice when my wife’s phone is in power saver mode at 60 Hz.
- 4k vs 1080p. This is certainly more subtle, but I definitely miss detail in lower res content.
- High bitrate. This is way more important than 4k vs 1080p or even HDR. But it’s so easy to tell when YouTube lowers the quality setting on me, or when a TV show is streaming at a crappy bitrate.
- HDR is tricky, because it relies completely on the content creator to do a good job producing HDR video. When done well, the image basically sparkles, water looks actually wet, parts of the image basically glow… it looks so good.
I 100% miss this HDR watching equivalent content on other displays. The problem is that a lot of content isn’t produced to take advantage of this very well. The HDR 4k Blu-ray of several Harry Potter movies, for example, has extremely muted colors and dark scenes… so how is the image going to pop? I’m glad we’re seeing more movies rely on bright colors and rich, contrasty color grading. There are so many old film restorations that look excellent in HDR because the original color grade had rich, detailed, contrasty colors.
On top of that, budget HDR implementations, ESPECIALLY in PC monitors, just don’t get very bright. Which means their HDR is basically useless. It’s impossible to replicate the “shiny, wet look” of really good HDR water if the screen can’t get bright enough to make it look shiny. Plus, it needs to be selective about what gets bright, and cheap TVs don’t have a lot of backlighting zones to make that happen very well.
So whereas I can plug in a 4k 120hz monitor and immediately see the benefit in everything I do for normal PC stuff, you can’t get that with HDR unless you have good source material and a decent display.
Also, in my country (Italy) TV transmissions are 1080i at best, and a lot are still 576i (PAL resolution). Streaming media can be 4K, if you have enough bandwidth to stream it at that resolution, which I don't have at my house. Sure, if you download pirated movies you can find 4K, again assuming you have the bandwidth for it.
But even then, a well-done 1080p movie is sometimes better than a hyper-compressed 4K one, since you can see the compression artifacts.
To me 1080p, and maybe even 720p, is enough for TV viewing. Well, sometimes I miss CRT TVs: they were low resolution but, for example, had much better picture quality than most modern 4K LCD TVs, where black scenes are gray (I know there is OLED, but it's too expensive and has other issues).
People in the HN echo chamber over-estimate hardware adoption rates. For example, there are millions of people who went straight from CDs to streaming, without hitting the iPod era.
A few years ago on HN, there was someone who couldn't wrap their brain around the notion that even though VCRs were invented in the early 1960s, in 1980 not everyone owned one, or if they did, they only had one for the whole family.
Normal people aren't magpies who trash their kit every time something shiny comes along.
Point of clarification: While the technology behind the VCR was invented in the '50s and matured in the '60s, consumer-grade video tape systems weren't really a thing until Betamax and VHS arrived in 1975 and 1976 respectively.
Early VCRs were also incredibly expensive, with prices ranging from $3,500 to almost $10,000 after adjusting for inflation. Just buying into the VHS ecosystem at the entry level was a similar investment to buying an Apple Vision Pro today.
Who?
There was about a decade there where everyone who had the slightest interest in music had an mp3 player of some kind, at least in the 15-30 age bracket.
This. I can always tell whether someone "gets" software development by whether they understand that computers can't read minds or infer intent the way a person can.