Wow, that's crazy, it actually works -- just tried out the sample videos in QuickTime on my 2016 13" MBP (P3 gamut, running Big Sur) and confirmed working.
Basically: if I set my display to ~75% brightness and open the video, the whites in the video are 100% brightness, way brighter than #FFF interface white on the rest of my screen.
But if I increase my display brightness to 100%, the whites in the video are the same as the interface white, because it obviously can't go any brighter.
If I decrease my display brightness to 50%, the whites in the video are no longer at maximum 100% brightness, maybe more like 75%.
But it's also kind of buggy -- after messing around with system brightness a bit, the video stops being brighter and I've got to quit QuickTime and relaunch it for the effect to come back. Also, opening and closing the video file makes my cursor disappear for a couple of seconds!
I'm wondering if it switches from hardware dimming to software dimming when the video is opened and closed, and if that switch is why the cursor disappears. If so, though, it's flawlessly undetectable in terms of brightness -- the interface white and grays don't change at all.
Interestingly, confirming it: taking a screenshot of the video while the screen is at 75% brightness shows massive brightness clipping in the video, since it's "overexposed" relative to the interface's color range. But taking a screenshot while the screen brightness is at 100% shows no clipping, because the video is no longer "overexposed" relative to the interface.
I'm just so surprised -- I had utterly no idea macOS worked like this. I'd never heard of this feature until now.
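To make the numbers concrete, here's a back-of-the-envelope sketch of what seems to be going on. All nit values here are made up, and I'm assuming the backlight scales roughly linearly:

```swift
// Rough sketch of the headroom effect described above.
// All nit values are hypothetical; assumes the backlight scales ~linearly.
let maxBacklightNits = 500.0         // panel at 100% brightness (assumed)
let brightnessSetting = 0.75         // user slider at ~75%

// SDR "white" (#FFF) is rendered at the brightness the user chose...
let sdrWhiteNits = maxBacklightNits * brightnessSetting   // ~375 nits

// ...so the rest of the backlight range is headroom for HDR highlights.
let headroom = maxBacklightNits / sdrWhiteNits            // ~1.33x

// In EDR terms SDR white is pixel value 1.0 and HDR highlights can go up
// to `headroom`. A screenshot lands in an SDR buffer, so anything above
// 1.0 clamps -- which is the "clipping" visible at 75% brightness and
// the lack of it at 100%, where the headroom collapses to 1.0.
let hdrHighlight = 1.3
let screenshotValue = min(hdrHighlight, 1.0)              // clipped to 1.0

print(sdrWhiteNits, headroom, screenshotValue)
```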
> it's flawlessly undetectable in terms of brightness -- the interface white and grays don't change at all.
In order to pull this off, you need to know exactly how many nits bright the display is, and you need complete software control of the actual hardware brightness. On Windows, you have neither. Enabling HDR mode completely throws off your current colors and desktop brightness; you have to reset your physical monitor settings and then dial in the new desktop white point with a software slider Microsoft buried in the advanced HDR settings (which almost nobody knows how to use), hoping to land somewhere in the vicinity of what you had before.
When it comes to display technology, vertical integration is a huge benefit. Look at high DPI: the state of the art on Windows in 2020 is nowhere near as good, in software implementation or actual user experience, as Apple's was on day one when it introduced Retina MacBooks back in 2012.
Mac OS has also had system-wide color management and calibration (ColorSync) since the early 90s, part of its legacy of being the preferred platform for desktop publishing.
On Windows the systemwide "color management" basically consists of assigning a default color profile that applications can choose to use - which is generally only done by professional design/photo/video software, and not by the desktop or most "normal" apps.
So basically until displays advertise their physical specifications to Windows, and the Windows display stack takes advantage of that to auto-calibrate, it cannot match this kind of output.
I wonder what happens if you try this on the LG 5K hooked up to a Mac. It is physically the same panel that's in the iMac, so in theory it can present the same range. But if the OS needs to know the exact physical abilities of the display, it might not be able to detect that LG display. Or maybe Apple does detect it, because they partnered with LG.
I'm with you on vertical integration. I've been looking at haptic feedback mice, and there are some from gaming companies, but no one has pulled it off well enough to enrich the gaming experience.
I have a Dell UP2414Q, which is a wide-gamut 4K monitor with a relatively high 375 nits peak brightness and 10-bit input.
It's not technically HDR, but if I view HDR videos with it in SDR mode with the brightness turned up to maximum, there is definitely a bit of an "HDR effect", similar to what Apple has achieved.
However, under Windows, without support from the operating system, this doesn't really work. The colours are shifted and the brightness of the content is way too low.
Microsoft could add something similar, but this is a company that thinks that colour is a feature in the sense of:
Colour: Yes.
We're approaching 2021 rapidly. I suspect we'll have to wait until 2030 before Microsoft gets its act together and copies even half of what Apple has in their display technology now.
So is this the reason why they removed the glowing Apple logo on newer MacBooks? Since it was powered by the backlight, it'd give the trick away real quick.
They removed it because the screens were getting too thin, and in bright conditions you could start to see through it - leaving a brighter blob in the middle.
What's cool is: 1) 4-year-old Macs have become HDR laptops, and 2) the implementation is subtle -- you get full brightness in the HDR video without having the rest of the UI blast to full brightness.
I don't get it. Are they compressing the range of SDR display for SDR content so that HDR content can now use the full SDR range of SDR display?
As in "we crippled your SDR display so that you can more enjoy the HDR content and buy an expensive HDR display to be back at the normal SDR range for SDR content"?
No, at least not on my Mac. Nothing's crippled, don't worry.
If my screen is already at 100% brightness then there's no HDR effect. 100% brightness is true 100% brightness, the max capability of my backlight.
The only difference is that if my screen is at less than 100% brightness, the HDR content can be brighter than the rest of the screen because it has the headroom. Does that make sense?
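If anyone wants to poke at this from code, AppKit exposes the headroom directly. A minimal sketch -- the exact semantics of these NSScreen properties are my reading of the docs, so treat the comments as assumptions:

```swift
import AppKit

// Sketch: query how much EDR headroom the current screen exposes.
if let screen = NSScreen.main {
    // Reportedly > 1.0 only while the system is actually holding headroom
    // open (e.g. EDR content is on screen).
    let current = screen.maximumExtendedDynamicRangeColorComponentValue
    // What the hardware could offer at the current brightness setting.
    let potential = screen.maximumPotentialExtendedDynamicRangeColorComponentValue
    print("EDR headroom now: \(current)x, potential: \(potential)x")
    // At 100% backlight there is no room left, so both tend toward 1.0,
    // matching the "no HDR effect at full brightness" observation above.
}
```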
I wonder how a large organization does something like this successfully. Like you need your OS video driver team working with the application team and so on.
My impression is that PMing this is really hard. And then each of the other guys is going to have an opinion that this shouldn't be done because it's so rare, etc.
Something must be going organizationally right for engineering this capable to have succeeded on such a barely noticeable feature.
I love it when products casually have cool things like this. Not quite the same scope, but IntelliJ's subpixel hinting option renders each element of the dropdown with the hinting it describes. You don't have to pick an option to see it; you can just preview it directly.
I'm not sure it's as difficult as you suggest. As long as you don't let applications touch the pixel data of decoded images, it's all within libraries, and you can change the bit depth of buffers and change how they map to physical pixels easily.
Most frameworks don't let applications touch pixel data without jumping through some hoops, because by restricting it you can implement things like lazy loading, GPU JPEG decoding, GPU resizing, etc.
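As a rough illustration of the idea (not any particular framework's internals), Image I/O on macOS hands you an opaque image handle and can defer the actual decode; the file path below is just a placeholder:

```swift
import Foundation
import ImageIO

// Because the app only holds an opaque CGImage handle, the framework is
// free to defer the expensive JPEG decode until the image is drawn, or to
// do it on the GPU / at a different bit depth without the app noticing.
let url = URL(fileURLWithPath: "/tmp/example.jpg")       // hypothetical path
if let source = CGImageSourceCreateWithURL(url as CFURL, nil) {
    let options = [kCGImageSourceShouldCacheImmediately: false] as CFDictionary
    let image = CGImageSourceCreateImageAtIndex(source, 0, options)
    // No decoded pixel buffer necessarily exists at this point.
    print(image as Any)
}
```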
> My impression is that PMing this is really hard. And then each of the other guys is going to have an opinion that this shouldn't be done because it's so rare, etc.
It's easier when you control all parts of the stack. No way they could have pulled that one off with NVidia, who were "famous" for breaking with Apple years ago when Apple demanded to write the drivers themselves... for valid reasons, when one looks at the quality of their Windows and Linux drivers. The Windows ones are buggy as hell, and the Linux ones barely integrate with Linux because NVidia refuses to follow standards.
I still use a mac with an nvidia GPU, and the driver appears to be leaking memory a lot. Unless I reboot it about once a week, it becomes noticeably laggy.
It seems to me that Microsoft really makes no effort to improve the subtle aspects of Windows and hardware integration. In fact, their OS is in shambles and is a disorganized mess with respect to UI consistency. They introduced Vista, Metro, and now Fluent, yet there's almost no coherence, and the UI is now a mish-mash of XP-, Metro-, and Fluent-era elements. By the time they announce their next UI, you can bet you'll see yet another ingredient added to the jumbled soup.
It really is beyond belief that an organization with so many employees can fail to adhere to a uniform vision and standard and focus on correcting details.
I'm a lifelong Windows and Android user. But honestly, seeing articles like this, and how smooth the macOS UI is and how uniformly they apply new updates and UI changes, makes me extremely jealous and resentful that Microsoft is so bad at something so basic.
Features are great, but users interact with the UI first. They need to fix that before anything else.
Now they want to give you the option to run Android apps on Windows through emulation. This is just going to create a bigger jumbled mess.
>It seems to me that Microsoft really makes no effort to improve subtle aspects of Windows and hardware integration.
The latest Windows 10 iteration is by far the snappiest OS I've used in a long time, since it uses GPU acceleration for the desktop window manager. You can check this in Task Manager. The icing on the cake, if you have a laptop with 2 GPUs (Optimus), is that you can run a demanding 3D app like a game in windowed mode in parallel with other stuff like watching videos on YouTube, and you can see in Task Manager how Windows uses the external GPU to render the game and the integrated GPU to accelerate your web browser, all running buttery smooth.
>In fact, their OS is in such shambles and is a disoriented mess with respect to UI consistency.
True, but that's what you get with 30 years' worth of built-in backwards compatibility. I can run a copy of Unreal Tournament 1999 that was just copied off an old PC with no sweat, right after ripping and tearing in Doom Eternal. Can you run 20-year-old software on current Apple hardware without emulation? Apple can afford to innovate in revolutionary ways when it dumps older baggage whenever it feels like it and starts from a fresh drawing board without looking back -- see the Intel to Apple Silicon transition. In 2 years x86 apps will be considered legacy/obsolete on Mac hardware. Microsoft can't really do this with Windows, so yeah, it's a mess of new GUI elements for the simple stuff and Windows 2000-era GUI elements for the deep pro settings. The advantage is that if you're an old-time Windows user you can easily find your way using the "old" settings, and if you're new to Windows you can do most configuration through the "new" GUI without touching the scary-looking "old" settings.
Gotta be a point at which you decide to do a complete rewrite and ship a copy of the old OS in an emulation layer. That’s what Apple did with early OS X. Ship with a very well integrated Windows 2000 or whatever in an emulated container and be done with it
The long-standing problem with Microsoft's internal culture is that it rewards people and teams for 'innovating' where innovation in many cases is just reinvention.
I worked at Facebook for years and it now has a similar problem. Developers are evaluated every six months on their 'impact', which results in many dropping boring work and joining teams that are doing new things, even if they aren't needed.
Similar with Google. I'm amazed at how Apple can take something as old as typography, breathe new life into it, and make everyone shocked and awed. Another one: I have yet to use a trackpad as good as theirs. It's good because it doesn't have any moving parts. It's all haptic feedback.
> It really is beyond belief that an organization with so many employees can fail to adhere to a uniform vision and standard and focus on correcting details.
If you haven't spent time in a huge company, this might seem to be the case. But really, Apple's uniform standard is the exception. I'm sure there are other organizational costs for this, such as it being harder to take risks with products or execute quickly... but gosh, they are good at producing a cohesive, mostly consistent set of products. I deeply appreciate their attention to detail and long-term commitment.
Even on their own hardware (Surface Pro) I've run into really annoying issues with DPI scaling, especially when docking/undocking from a secondary monitor.
> This EDR display philosophy is so important to Apple that they are willing to spend battery life on it. When you map “white” down to gray, you have to drive the LED backlight brighter for the same perceived screen brightness, using more power. Apple has your laptop doing this all the time, on the off chance that some HDR pixels come along to occupy that headroom.
This is a bit misleading. The backlight isn’t at a higher level than necessary for sRGB content all the time, just whenever any HDR encoded videos or EDR apps are open. When you open an HDR video you can see the highlights getting brighter as the backlight gets pushed.
> just whenever any HDR encoded videos or EDR apps are open
Yup, I think this has to be the case. But the crazy thing is, I can't perceive any shift in UI brightness whatsoever, not even a flicker, when I open/close an EDR video.
I would have thought that there would be some slight mismatch at the moment the backlight is brightened and pixels are darkened -- whether it would be a close but not perfect brightness match, or a flicker while they're not synced. But nothing.
As I mentioned in another comment, the only giveaway is that my cursor (mouse pointer) disappears for a second or two. I have to guess that adjusting its brightness happens at a different layer in the stack that can't be so precisely synced.
Yes, this is what I came to see. It gradually increases the brightness over 5 seconds on my Catalina MacBook. It's very impressive that there are no visible brightness changes on the rest of the screen while the backlight brightness increases.
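For the curious, this is the kind of compensation you'd expect the compositor to apply while the backlight ramps -- a sketch only, assuming a simple 2.2 gamma, not Apple's actual code:

```swift
import Foundation

// When the backlight is pushed up by some factor, SDR pixels must be
// scaled down by the same factor in *linear light* so their emitted
// luminance (backlight x panel transmission) stays constant.
func compensatedPixel(_ encoded: Double, backlightBoost: Double) -> Double {
    let linear = pow(encoded, 2.2)        // decode the gamma-encoded value
    let dimmed = linear / backlightBoost  // give back what the backlight added
    return pow(dimmed, 1.0 / 2.2)         // re-encode for the panel
}

let boost = 1.33                          // e.g. backlight pushed from 75% to 100%
print(compensatedPixel(1.0, backlightBoost: boost))   // ~0.88: "white mapped down to gray"
// Perceived SDR luminance: 1.33 * pow(0.88, 2.2) is ~1.0, i.e. unchanged,
// which is why nothing visibly flickers as long as the two stay in sync.
```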
>When you open an HDR video you can see the highlights getting brighter as the backlight gets pushed.
But what about blacks? If you have a dark scene with bright highlights (e.g. a campfire at night), do the black parts of the scene get blown out because of backlight bleed?
I wonder if this contributed to Apple removing the light-up Apple logo on the backs of MacBook screens. If it were still there, it would give it away when the backlight brightness is changing and potentially become distracting.
I’ve had the Pro Display XDR since June, and use it primarily for dev. I just trusted Apple in its focus on this direction. Suffice it to say I am very happy with it.
What I like about the conclusions of this piece is that they point to how strategically Apple is thinking, leveraging the breadth of its distribution of advanced hardware and software.
Apple is able to set entire new standards of expectation for customers that are _very_ hard for competitors to follow.
While competitors fixate on some single feature like night photo quality, Apple is also subtly chipping away at something like this.
There's no purer expression of a certain class of cloistered Apple-thought than this - the competitors "focus on one feature, photos at night" (come on, really?), in comparison to the focus Apple shows on a __six thousand dollar display__, which is being lauded for...checks notes...using a display-specific ICC profile, which is _table stakes_ for every modern operating system and premium hardware manufacturer when it comes to setting colorspaces and tuning per-display at the factory, plus a color profile that uses an artificially low white point. A banal, baseline feature of displays and OSes 1/6th the cost is laundered into a declaration of supremacy, and we all lose the cool information about the underlying tech and color science we could have been discussing.
> Apple is able to set entire new standards of expectation for customers that are _very_ hard for competitors to follow.
The cheaper version of this display has a price tag of $5k, the more expensive one $6k.
I never spent even remotely as much money on a display, so I cannot speak from first-hand experience. But a quick search for graphics-design displays yields several competitor products that have similar or better specs, at generally even lower price tags.
Apple does many things, but certainly not bleeding-edge innovation. Of course it likes to sell its products as such, but I guess that's just how marketing works.
> I never spent even remotely as much money on a display, so I cannot speak from first-hand experience.
Translation: I don't understand this display.
> But a quick search for graphics-design displays yields several competitor products that have similar or better specs, at generally even lower price tags.
Translation: Despite my lack of knowledge, I feel qualified to say Apple sucks.

Typical HN.
> Apple does many things, but certainly not bleeding-edge innovation.
What? Having been in the inner sanctum of engineering within Apple for years, that's exactly the engineering priority for groups I saw and worked within. I'm genuinely curious why you assert otherwise, find it surprising.
When the XDR came out it was competing with monitors that were 40,000 USD and up (with some compromises, like the stand, which is just an art piece, and the viewing angles being a bit off). If the competition is now priced competitively then that’s very good for consumers.
The main issue with comparing the specs is that the differences are not really shown in those specs, e.g. luminance and chromaticity uniformity (aka Delta E), among other things. Consumer-grade monitors usually do not have these, and when you combine that with wide color gamut and _measured_ color space (not claimed), we're talking about at least higher-grade EIZO monitors, which are not less than $1k.
The monitors that are comparable to the Pro Display XDR are the ASUS PA32UCX ($4500) or the EIZO CG319X ($6000), which usually require full recalibration after a certain amount of use.
There are both cheaper and far more expensive displays used for graphics design to special effects and major film mastering. In this case, Apple seems to be trying to bridge the gap with the XDR and create a new middle tier of price and functionality.
As for bleeding edge, Apple has pioneered plenty of technology. It's true that it builds upon foundations of technical designs and scientific discoveries by others but that applies to every other company as well. Very few organizations are capable of going straight from invention in a science lab to large scale commercial product all by themselves. If you judge by how much "new" technology has actually reached consumers though, Apple is clearly leading the field.
The Windows HDR implementation is complete crap, exactly because they don't have full control of the hardware stack as Apple does and can't change all the monitor settings they would need to.
When you toggle HDR on Windows, the desktop becomes dull gray and desaturated exactly because they pull the previous desktop brightness down to something less than 255. So you then have to adjust your monitor's brightness up to compensate. The monitor's brightness effectively sets the upper cap of HDR brightness, so let's say your brightness was set at 50% before: now you've got to fiddle with the monitor to boost the screen brightness to 100% to allow HDR to function and to get back your previous desktop white brightness (you'll probably also have to adjust the software "desktop white" point slider you mentioned, since MS has no clue what the correct monitor brightness and SDR pull-down amount should be, so good luck matching your previous desktop colors and brightness). In my experience very few people successfully manage to set up Windows HDR correctly, and even if you do, there's no way to "seamlessly" switch between the two modes (which you have to do, since tons of stuff on Windows doesn't work properly when HDR mode is enabled). I haven't checked Surface or other MS hardware; perhaps they're able to do something more clever there?
What Apple does is this: when your display brightness is at 50% and you display HDR content, the HDR content will seamlessly appear at a brightness somewhere between 75% and 100% of the maximum screen brightness. That is a seamless HDR effect, giving you the whiter-than-white experience next to your other windows, and it just works.
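To put rough numbers on the difference, here's an illustrative sketch. The figures are assumptions -- including the ~80-nit scRGB reference white Windows is commonly described as compositing against -- not measurements:

```swift
// Illustrative bookkeeping only; all numbers are assumed.
let monitorPeakNits = 600.0          // hypothetical HDR monitor
let previousSDRWhiteNits = 220.0     // roughly how bright the desktop looked before

// Windows: the user has to find the "SDR content brightness" setting that
// lands desktop white back near 220 nits, starting from an ~80-nit
// reference white (assumed), with no feedback other than eyeballing it.
let sdrScaleToFind = previousSDRWhiteNits / 80.0              // ~2.75x over reference white

// macOS: the OS already knows the panel's peak nits and the brightness
// slider position, so SDR white simply stays put and HDR highlights get
// whatever headroom is left above it.
let headroomForHDR = monitorPeakNits / previousSDRWhiteNits   // ~2.7x

print(sdrScaleToFind, headroomForHDR)
```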
I haven't encountered such issues with LG C9. I don't need to touch the settings on my TV when enabling/disabling Windows HDR (which puts my TV in HDR mode), the previous SDR-mode desktop brightness is achieved in HDR mode just fine.
Though I remember having read that the Windows HDR stuff works slightly differently for internal monitors (e.g. in laptops), is your experience with those?
It can also be argued that Apple’s approach is a bit user-hostile. If someone wants their display at a 50% brightness ceiling (e.g. working in a darker room) it would be jarring to see HDR content overriding that, especially as such content becomes more prevalent.
The huge Windows flaw is that it isn't color managed by default. It means that colors in most applications look extra saturated on wide-gamut displays. I wonder if the same flaw applies to HDR... apps would look extra bright.
Macs and their apps have been properly color managed for decades. That's why the transition from SDR to HDR monitors has been painless. Apps have been ready for it for a long time.
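A tiny sketch of what "properly color managed" buys you, using Core Graphics; the numeric result is approximate:

```swift
import CoreGraphics

// Why unmanaged apps look oversaturated on a wide-gamut panel: a fully
// saturated sRGB red should map to a *less* extreme value in Display P3.
// An unmanaged app hands (1, 0, 0) straight to the panel and gets the
// panel's much redder primary instead.
let srgbRed = CGColor(srgbRed: 1, green: 0, blue: 0, alpha: 1)
if let p3 = CGColorSpace(name: CGColorSpace.displayP3),
   let managed = srgbRed.converted(to: p3, intent: .relativeColorimetric, options: nil) {
    // Roughly [0.92, 0.20, 0.14, 1.0] -- the color-managed result.
    print(managed.components ?? [])
}
// The unmanaged result is simply (1, 0, 0) interpreted as P3: visibly more saturated.
```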
The "non-HDR" displays are actually high-quality, wide gamut displays. They may not have been certified for HDR when they were made, but it's quite likely to be HDR400 capable.
In particular, https://gitlab.freedesktop.org/swick/wayland-protocols/-/blo... was linked, which discusses how a Wayland compositor would have to display mixed "HDR" and SDR content on the same display. This document even has references to EDR. Ultimately this would end up achieving a similar result to what's described in the blog post here.
If you're interested in the technical details of what may be necessary to achieve something like this, the Wayland design document might be a good read.
I noticed this 'extra bright whites in video' effect on my iPhone 12 just after upgrading. I had to go and shoot 'normal' + HDR video of the same thing a few times back and forth to be sure I knew what I was seeing.
The first thing you think is: how was I OK with this terrible standard video to begin with? The HDR version just looks SO MUCH better, and the standard one looks so flat next to it. Like comparing an old non-HDR photo with an HDR one.
> The operating system is complicit in this trickery, so the Digital Color Meter eyedropper shows “white” as 255, as do screenshots.
Digital Color Meter shows both UI white and video white as exactly #fff despite the video white being much brighter!
Even at full screen brightness, the video white was noticeably brighter
https://youtu.be/4xgx4k83zzc
https://appleinsider.com/articles/20/08/03/what-hdr-hdr10-an...
That plus P3 gamut means a video playing on a display is closer to what a filmmaker intended.
https://en.wikipedia.org/wiki/DCI-P3
That video can have a very bright sky, for instance. You can have a bright sky and a blinding text box or neither in other OSs.
It’s also a very bright display in its own right, with 1600 nits vs the 300-400 of a regular one. And 1,000,000:1 contrast as well.
https://hbr.org/2020/11/how-apple-is-organized-for-innovatio...
> The latest Windows 10 iteration is by far the snappiest OS I've used in a long time, since it uses GPU acceleration for the desktop window manager.
Uh, Windows started doing that in Vista.
Windows struggles to correctly display Adobe RGB (wide gamut but not HDR) JPG images I've downloaded in 1999, now over 21 years ago and counting.
They'll get there... eventually. Maybe next decade, by which I mean 2030, not 2021.
> When you open an HDR video you can see the highlights getting brighter as the backlight gets pushed.
So that's why Lunar[1] reads a much higher brightness and makes all external monitors as bright as ten thousand suns when HDR content is played.
I wonder if there's any way to detect this and read or compute the SDR brightness instead.
[1] https://github.com/alin23/Lunar/issues/86
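One hypothetical way to compensate (not Lunar's actual approach, just a sketch assuming the reading and the headroom refer to the same built-in display) would be to divide the boosted reading by the screen's current EDR headroom:

```swift
import AppKit

// Hypothetical compensation: if a luminance reading is inflated while HDR
// content is up, dividing by the current EDR headroom should get back to
// roughly the "SDR" brightness the user actually set.
func estimatedSDRBrightness(measuredLuminance: Double, screen: NSScreen) -> Double {
    let headroom = Double(screen.maximumExtendedDynamicRangeColorComponentValue)
    // headroom is ~1.0 with no EDR content on screen, > 1.0 while HDR plays
    return headroom > 1.0 ? measuredLuminance / headroom : measuredLuminance
}

if let screen = NSScreen.main {
    print(estimatedSDRBrightness(measuredLuminance: 480, screen: screen))
}
```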
> The monitors that are comparable to the Pro Display XDR are the ASUS PA32UCX ($4500) or the EIZO CG319X ($6000)
The closest competitor I’ve found is the PA32UCX at $4500, and supposedly its fan is noisy enough that I’m hesitant to buy it.
> it’s strange and new, and possibly unique to Apple.
But that is exactly how Windows 10 does it with HDR displays, too. So not really unique to Apple. To the article's benefit, it did say "possibly" :)
I'm using Win 10 myself with an HDR display, and HDR white appears brighter than "desktop" white just like in the article photo.
There is also a slider in Win10 HDR settings that allows you to bring SDR/"desktop" white up if you wish to oversaturate SDR.
You have the HDR/SDR brightness balance to set and the monitor's own contrast adjustments.
No, apps on Windows HDR look normal unless they are HDR-aware and use the "extra brightness".