I really wish more people wanted screens that looked as good as their cellphone.
Bright, sharp text, great color. We've had the excellent Apple Studio Display for years now; it's about time others came along to fix some of its shortcomings, like the 27" size, 60 Hz refresh rate, and lack of HDMI ports for use with other systems.
So many of us have to stare at a screen for hours every day and having one that reduces strain on my eyes is well worth $1-3k if they'd just make them.
The company I work at gives all new developers a pair of 1080p displays that could have come right out of 2010.
It amazes me, and it’s so sad. They have no idea what they’re missing. I’m sure high PPI would pay off fast in eye strain. And it’s not like monitors need replacement yearly. Tons of time to recoup that small cost.
I’m not arguing for $2k 37” monitors, just better than $200 ones.
Even $200 will already buy a 4K 27" (LG), and those aren't even bad. I swear by HiDPI as well, but my work is the same: 1080p displays, with really bad contrast too. Definitely not TN (they're not that bad) and not VA (VA tends to have way better contrast than IPS). Probably just bottom-of-the-barrel IPS.
Just about every company does something like this.
At one point in my career, I just started buying my own monitors and bringing them into work.
I remember when ~19 or 20" was the norm, and I bought a Dell 30" 2560x1600 monitor. Best $1400 I ever spent; I used it for years and years.
(I still have it, although I retired it a few years back because it uses dual-link DVI, which is not easily supported anymore.)
I think if you are an engineer, you should dive headlong into what you are. Be proactive and get the tools you need. Don't wait for some management signoff that never comes while you suffer daily, and are worse at your job.
I work for a white-shoe law firm in Boston. I and most of my peers have total compensation approaching $500k.
And we have 1280x1024 monitors from the 00s, and we're not allowed to have anything better, even out of our own pockets, because "that's what we use here".
I don't think people care all that much about phones. It's just that phones are power-constrained, so manufacturers wanted to move to OLEDs to save on backlight; and because the displays are small, the tech was easier to roll out there than on 6k 32-inch monitors.
But premium displays exist. IPS displays on higher-end laptops, such as ThinkPads, are great - we're talking stuff like 14" 3840x2160, 100% Adobe RGB. The main problem is just that people want to buy truly gigantic panels on the cheap, and there are trade-offs that come with that. But do you really need 2x32" to code?
The other thing about phones is that you still have your old phone with you when you buy a new one, so without even really meaning to, you're probably doing a direct side-by-side comparison, and improvements to display technology are a much bigger sales motivator.
Outside ThinkPads, IPS is basically the cheap/default option on laptops, with OLED being the premium choice. With ThinkPads, TN without sRGB coverage is the cheap/default option, with IPS being the premium choice.
A fast color e-ink display would be possible, but development would be very expensive for an unknown market. It would be a perfect anti-eye-strain second monitor, though.
Is there a shortlist of top-of-the-line utilitarian monitors that you can just buy, without researching or being some niche gamer?* Something similar to LG's G-series TVs. The Apple Studio Display and Dell UltraSharp seem to be on that list. Any others?
*Struggling for words, but I'm looking more for the expedient solution rather than the "craft beer" or "audiophile" solution.
Keep in mind that normal OLEDs are quite bad for typical development tasks: lots of text with high contrast. Here is an example that would be unbearable for me: [1]. For text, IPS rules so far. For video and games, definitely OLED.
If you truly don't want to research, use Rtings' "best monitor for X" articles, find your budget, and buy that one. If you feel the need to compare further, pop the model numbers into your favourite LLM.
I have 27" 5K monitors at home since I WFH. One reason I don't really want to RTO is because these monitors aren't standard yet even if they have been out for more than a decade now (and my FAANG employer won't spring for the good stuff). That and my mechanical keyboard would never work in an open office :P.
Isn't the main difference glossy vs matte? With glossy you usually get bright, great color, and that's what you get on cellphones and MacBooks as well. For some reason matte is still the preference when it comes to monitors, and you can't escape their muted color palette.
I have trouble making out details on my 45" UWQHD (3440x1440) displays, so I don't see much point. Maybe slightly easier-to-read typefaces, but I'm already zoomed in 25% most of the time.
On the plus side, I can comfortably fit my editor on half the screen and my browser on the other half.
I have a Studio Display and would also love it if it had a much higher refresh rate, but only because I play WoW on it. Why isn't 60 Hz enough for programming? I don't think I notice the refresh rate at all when I'm not playing a video game or watching videos.
It seems rather silly to assume all people universally have the same needs, desires, and expenses. We don't live in the world of The Giver. I can accept that firefighters need a truck much more advanced and expensive than I ever will. It would be odd to compare that expense to how many pizzas I order each year.
> So many of us have to stare at a screen for hours every day and having one that reduces strain on my eyes is well worth $1-3k if they'd just make them.
I'm 53 y/o and didn't have glasses until 52. And at 53 I only use them sporadically. For example atm I'm typing this without my glasses. I can still work at my computer without glasses.
And yet I spent 10 hours a day in front of computer screens since I was a kid nearly every day of my life (don't worry, I did my share of MX bike, skateboarding, bicycling, tennis, etc.).
You know the biggest eye-relief for me? Not using anti-aliased fonts, no matter the DPI. Crisp, sharp, pixel-perfect fonts only for me. Zero AA.
So a 110 / 120 ppi screen is perfect for me.
Not so if you do use anti-aliased fonts (and most people do); I understand the appeal of smaller pixels for more subtle AA.
But yup: pixel perfect programming font, no anti-aliasing.
A 38" ultra-wide, curved monitor. Same monitor since 2017, and it's my dream. My wife, OTOH, prefers a three-monitor setup.
So: people have different preferences and that is fine. To each his own bad tastes.
Anecdata, but I played games at 4K on a 4 GHz Haswell (2013) + 1080 Ti (2017). Definitely faster at 2K, but 4K was serviceable. It's probably less true now that I'm a year-plus removed from that hardware, but 4K gameplay is surprisingly accessible on modest hardware, IMO.
So, good news: there are a fair number of super-high-resolution monitors coming soon that offer a "dual mode," which trades lower resolution for a higher refresh rate. They are pretty cool.
That's not really true. I tried out 5K, which you'd reasonably expect to be quite heavy to drive, but honestly with DLSS it's super viable. The gaming versions of these displays also have dual modes, and in that mode a 6K display is going to be less demanding to run than a 4K one, and a 5K display is going to run at 1440p.
Apple has an updated Studio Display and XDR coming out soon; they just filed for regulatory approval in China, which historically has meant 1-3 months until release [1]. The updated models are expected to have a 2000+ zone Mini LED backlight and a 120 Hz refresh rate, while the LG is old technology - 60 Hz and edge-lit - plus the matte coating is anecdotally quite blurry and loses a lot of the detail of the 6K panel. It may be worth waiting to see what Apple has in store for this product category before putting down thousands of dollars.
As an owner of the XDR, I find it ironic that you're saying "maybe wait to see before putting down thousands of dollars" for this $2K offering when there's no way that a refreshed XDR is going to be less than $6K.
Also, as an owner of two 27" 4K HDR 144Hz monitors that Apple rendered pointless to make the XDR work in the first place: "Wow, Apple's done some magic to make the display bandwidth work!" = "Apple fucked DP 1.4 users post-Catalina and will not admit it." I and countless other users saw Macs that could drive those setups under Catalina get limited to 60Hz HDR / 95Hz SDR from Big Sur on. Hell, we got better performance if we told our displays to downgrade to DP 1.2.
And let's do the math:
- 6016 x 3384
- at 120 Hz
- in 10-bit HDR
- 4:4:4 Chroma
works out to just shy of 80 Gbit/s.
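As a sanity check, the raw pixel-data rate for those numbers can be computed directly (a sketch; actual link bandwidth adds blanking overhead on top, which is what pushes the total toward 80 Gbit/s):

```python
# Raw (uncompressed) video bandwidth for the specs listed above.
width, height = 6016, 3384       # 6K resolution
refresh_hz = 120
bits_per_pixel = 10 * 3          # 10-bit per channel, 4:4:4 (no subsampling)

raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"raw pixel data: {raw_gbps:.1f} Gbit/s")  # ~73.3 Gbit/s
# Blanking intervals (e.g. CVT reduced-blanking timings) add several
# percent on top, landing the required link rate just shy of 80 Gbit/s
# before any DSC compression.
```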
Oh... plus, if it's like the XDR with three additional ports that we'd assume should be at least 10 Gbit/s each, we're at 120 Gbit/s.
Not that the XDR supports HDMI, but you'd need at least HDMI 2.2, which isn't on any Mac right now.
And you'd need a full-speed TB5 setup, so M4 Pro or Max (which, I'll grant, should be the least of your concerns if you're laying out $6-7,000 for your display).
But saying that the XDR and this are comparable offerings is strange. "Hmm, why would I buy this $2,000 display now, when at some point in the near future Apple might have a better one for only triple the price!"
Fantasy price for your personal usage or the personal usage of most average consumers/ software engineers, sure. It bears repeating: you're not the target audience.
They've been in the display game a long time. For people that need the product capabilities for their specific job, like color grading, they seem to price them quite well, given everywhere I used to see $30,000-$50,000 reference monitors, I see Studio Displays now.
Other manufacturers are likely to use the same panel as the new XDR and Studio Display. The peak display tech this year will be the same glossy, high-refresh-rate Mini LED used in the Apple displays, sold by third parties for a more reasonable price. You have to compromise a bit on the design, but in return you get a sensible price, much greater input connectivity, and a "dual mode" that's useful if you also want to use it for gaming.
Basically what they just said; last week I put in a ton of time researching and came to the exact same conclusion.
But of course, you read that. So, I’ll take a hack at guessing what’s on your mind past that.
I bet it won't be much more expensive than the LG list price, $2,000. LG seems hellbent on making margins on this panel in its first-party monitor; e.g., you can get the same panel, but better quality, in the Asus ProArt for $1,300.
The key thing to watch here is whether Apple's is also 6K. If so, are they getting better panels than LG gives itself or Asus? (Likely.)
Regardless, it's a shitshow with this panel. I'd rather get a used UltraFine 5K than one of LG's. I'd try the Asus if it were easy for me to return. The only new option that's better than a 10-year-old UltraFine 5K for my fellow HiDPI nerds is… the $6K XDR display :/
(N.b. I'm not being precious either; this is an extremely painful conclusion I have every incentive to avoid. Tl;dr: I abandoned an UltraFine 5K to the trash heap because I didn't have time to figure out how to move it 3,000 miles, and assumed _surely_ there was a good option between 24-32" in HiDPI...)
It amazes me it’s so hard to find monitors around 210+ PPI. Glad this is one.
That said I would be scared to buy this. I’ve heard so many horror stories about the LG UltraFine 5k and the ports breaking and then having to send it in for repair for a long time.
At this point I don’t trust their build quality for monitors.
In general though, I am so glad to see big high DPI monitors have more than one or two options finally.
For a long time, the only very high DPI (>200 PPI) monitors on the market were Apple's first-party ones and the LG UltraFine, the former being stupidly overpriced and the latter having, as you say, reliability horror stories. I assume the dearth of other options was because macOS doesn't do fractional scaling, only 2x, so only Apple users really needed 5K-at-27" or 6K-at-32", whereas Windows/Linux users can be OK with 150% scaling.
But that's finally changing: several high-DPI monitors came out last year, and even more are coming this year, which should force manufacturers to do better re: both price and reliability. Last year I got a pair of the Asus ProArt 5K monitors, plus a CalDigit Thunderbolt hub, and have been very happy with this setup.
> I assume the dearth of other options was because macOS doesn't do fractional scaling
Except it does? I have a 14" MBP with a 3024x1964 display. By default, it uses a doubling for an effective 1512x982, but I can also select 1800x1169, 1352x878, 1147x745, or 1024x665. So it certainly does have fractional scaling options.
As a Linux user, I am confused when I hear other people talking about "scaling" and even more when they talk about being able to use only a restricted set of values for "scaling".
For much more than a decade, I have not used any monitor with a resolution less than 4k with Linux. I have never used any kind of "scaling" and I would not want to use any kind of "scaling", because that by definition means a lower image quality than it should be.
In the X Window System, and in any other decent graphical interface system, the sizes of graphic elements, e.g. fonts or document pages, should be specified in length units such as typographic points, millimeters, or inches.
The graphic system knows the dots-per-inch value of the monitor (either a value configured by the user or the value read from the monitor's EDID when it is initialized). When graphic elements such as letters are rasterized, the algorithm uses the dimensions in length units together with the DPI value to generate the corresponding bitmap.
"Scaling" normally refers to the scaling of a bitmap into another bitmap with a greater resolution, which can be done either by pixel interpolation or by pixel duplication. This is the wrong place for increasing the size of an image that has been generated by the rasterization of fonts and of vector graphics. The right place for dimension control is during the rasterization process, because only there this can be done without image quality loss.
Thus there should be no "scaling", one should just take care that the monitor DPI is configured correctly, in which case the size of the graphic elements on the screen will be independent of the resolution of the connected monitor. Using a monitor with a higher resolution must result in more beautiful letters, not in smaller letters.
Windows got this wrong, with its scaling factor for fonts, but at least in Linux XFCE this is done right, so I can set whatever DPI value I want, e.g. 137 dpi, 179 dpi, or any other value.
If you configure the exact DPI value of your monitor, then the dimensions of a text or picture on the screen will be equal to those of the same text or picture when printed on paper.
One may want to have a bigger text on screen than on paper, because you normally stay at a greater distance from the monitor than the distance at which you would hold a sheet of paper or a book in your hand.
For this, you must set a bigger DPI value than the real one, so that the rasterizer will believe that your screen is smaller and it will draw bigger letters to compensate for that.
For instance, I set 216 dpi for a Dell 27 inch 4k monitor, which will magnify the images on screen by about 4/3 in comparison with their printed size. This has nothing to do with a "scaling". The rasterizer just uses the 216 dpi value, for example when rasterizing a 12 point font, in such a way that the computed bitmap will have the desired size, which is greater than its printed size by the factor chosen by me.
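The 4/3 figure checks out; here is a quick sketch of the arithmetic, assuming the Dell is a standard 3840x2160 panel:

```python
import math

# Native pixel density of a 27" 3840x2160 panel.
w_px, h_px, diag_in = 3840, 2160, 27.0
native_dpi = math.hypot(w_px, h_px) / diag_in
print(f"native density: {native_dpi:.0f} dpi")   # ~163 dpi

# Configuring 216 dpi makes the rasterizer draw everything larger
# than its printed size by this factor:
magnification = 216 / native_dpi
print(f"magnification: {magnification:.2f}x")    # ~1.32x, i.e. about 4/3
```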
I've found it much easier to increase your viewing distance for an equivalent effect. All else being equal this provides the additional benefit of reduced eye strain from a decrease in parallax.
For example, I've settled on ~160 PPI viewed at 100cm as my optimal desktop solution. It has an identical perceived pixel density as ~220 PPI viewed at 75cm.
I was forced to do this as my eyes have aged and I can't focus at 5K 27" at a reasonable distance, and can't read the text when I sit far enough back to focus. Hence why 4K 27" (~160 ppi) has become perfect for me.
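Those two setups do come out nearly identical in perceived density; a small sketch of the pixels-per-degree arithmetic behind the comparison:

```python
import math

def ppd(ppi: float, distance_cm: float) -> float:
    """Pixels per degree of visual angle at a given viewing distance."""
    distance_in = distance_cm / 2.54
    # Physical width subtended by one degree of vision, times pixel density.
    return 2 * distance_in * math.tan(math.radians(0.5)) * ppi

print(f"{ppd(160, 100):.0f} PPD")  # ~110
print(f"{ppd(220, 75):.0f} PPD")   # ~113
```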
Would be nice if Apple supported non-integer scaling so I could just dynamically resize everything (without the current technique and performance hit/blurriness of upscaling then downsizing).
I bought a launch LG UltraFine 5K that was in the batch of defective units but I was too lazy to return it. Somehow, it's held up just fine a decade later; only color bleeding is an issue.
I am still using an LG UltraFine 5K since launch. I experienced flickering in the first month and had the monitor replaced by the supplier, and it's been amazing ever since! Also, this DPI is perfect for having both crisp text and correctly sized elements on screen (in macOS).
I just bought an Asus 27" 5K to replace my Samsung 27" 5K S9 that bit the dust. They are around 700 bucks now. You can get a 4K 28" for $200-300, but those are only 157 PPI.
Short story: not worth it because LG has terrible quality control. I bought two copies and experienced bad light uniformity / banding with both, color repro wasn't great, and the matte finish was a bit fuzzy. Many others have had this issue.
I wound up buying the Asus ProArt 32" and out of the box it had good light uniformity, a better matte finish, better color accuracy (using the M-Model P3 profile), and was much cheaper.
I bought the Asus 6K ProArt at launch, replacing an older 4K 27" Dell monitor. The new monitor is definitely an upgrade, but not as great as I was hoping. The matte coating is by far the worst part of the monitor. It's not bad enough to return it, but the graininess is noticeable on white windows. I've definitely enjoyed having the extra screen real estate over the 27", and the extra resolution has been very helpful for having a bunch of windows open in Unity.
This year at CES there were a number of new monitors unveiled that compete in this space. There's a new Samsung monitor (G80HS) that is a 32" 6k with a higher refresh rate than the LG or Asus. Unfortunately it has the matte coating instead of glossy, so clarity will suffer.
Also of interest are the new OLED offerings with true RGB stripe subpixel layout. This should fix text rendering problems on systems with subpixel antialiasing. Both Samsung and LG are making these OLED monitors with the true RGB layout. There will almost certainly be glossy coatings offered with these panels, and they'll have higher refresh rates than IPS.
Debating getting that ProArt to replace my 27" 4K. Do you find the productivity benefits to be meaningful? I'm wondering if I'll just end up making everything bigger and not benefiting, or having to move my head too much.
Pointless, superficial review, standard for Wired, The Verge, etc.: no brightness uniformity check, no color uniformity check, no color accuracy check, no coating grain check.
Meanwhile a good number of reports mention terrible uniformity issues with that model.
Rtings publishes charts in abundance, but the subjective quality of a monitor is more important. For example, a chart will tell you a monitor has low color deviation from sRGB after calibration, but won't tell you that the monitor UI takes 10 laggy clicks to switch from sRGB to DCI-P3 and will reset your selection every time you toggle HDR mode.
I admire Rtings' attempts to add more and more graphs to quantify everything from VRR flicker to raised black levels. They were helpful when I last shopped for a monitor. But the most valuable information came from veteran monitor review sites such as Monitors Unboxed and TFTCentral.
I've just bought one. The color uniformity is terrible across the screen. It's tolerable to me, but not what I'd expect at the price. Everything else is quite fantastic though.
I bought and returned this display. The panel I got was practically unusable if you have even just a little care for color. All four corners were significantly dimmer than the center, and color accuracy dropped off toward the bottom of the screen. Somehow the macOS dock icons were washed-out and dim.
Unfortunately, this seems to be a common issue with this display, not a one-off panel discrepancy.
Do yourself a favor and wait for whatever Apple has upcoming, at least if you’re in the Apple ecosystem already.
If you're in the market for a new monitor, I recommend one with USB-C connectivity with power delivery - so convenient to just have one cable (and works with phones too).
Unsurprisingly this is not a motivating factor to come back to the office, given I have a 220 PPI 6K at home.
But we have gray-on-gray to compensate. One even has a choice: do you want light or dark eye strain?
:(
I expect people are VERY sensitive to mobile phone screen quality, to the point that it's a big factor in phone choice.
[1] https://www.savanozin.com/projects/qod
If you're a gamer, QD-OLED is best. If you do office work, just get whatever is high resolution and makes text sharp.
The larger screen size of a monitor is more likely to reflect lights than a mobile phone screen.
But 1440p at 45" is not good PPI. That could be why you're struggling to see text clearly.
Even for programming, 60 Hz is not enough, man.
Plus more peripheral ports.
[1] https://www.macrumors.com/2026/01/15/new-studio-display-or-p...
Here's the third-party version of the upcoming 5K Studio Display refresh: the MSI 271KRAW16, a 5K 27" 165 Hz glossy panel with a Mini LED backlight: https://www.tweaktown.com/news/109565/msi-unveils-worlds-fir...
You're in luck; several 5120 × 2880, 600 mm × 340 mm monitors at high refresh rates were announced at CES a couple weeks ago.
And many more.
If you connect a 4K (2160p) monitor, you can go down or up from the default 1080p doubling (https://www.howtogeek.com/why-your-mac-shows-the-wrong-resol...). If you select 2560x1440 on a 4K screen, that's 1.5x scaling rather than 2x (https://appleinsider.com/inside/macos/tips/what-is-display-s..., see the image comparing "native 2x scaling" to "appears like 2560x1440").
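Concretely (my understanding of how macOS handles the non-2x settings, not taken from the linked articles): it renders the "looks like" resolution into a 2x backing store and then downsamples to the panel, which is where the performance cost and slight softness come from.

```python
# "Looks like 2560x1440" on a 3840x2160 panel.
panel = (3840, 2160)
looks_like = (2560, 1440)

# macOS draws into a 2x backing store, then downsamples to the panel.
backing = (looks_like[0] * 2, looks_like[1] * 2)   # 5120 x 2880
scale = panel[0] / looks_like[0]                   # effective UI scale
print(backing, f"{scale:.2f}x")  # (5120, 2880) 1.50x
```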
Use a PPD (pixel per degree) calculator to find a setup that suits your needs: https://qasimk.io/screen-ppd/
32" 6K is very tempting!
Here are photos of what I saw:
https://www.reddit.com/media?url=https%3A%2F%2Fpreview.redd....
haven't found anyone who compares
I originally found them because they were one of the only sources that tested for PWM flicker in monitors.