piker · a month ago
The problem with using this kind of monitor for any work that others will view on their own monitors is that your perception of what looks good will be way off. For example, it's clear that a lot of the Rust UI framework developers have been working on Macs for the last few years. The font rendering in many of those looks bad once you plug them into a more normal-DPI monitor. If they hadn't been using Macs with Retina displays, they would have noticed.
guhcampos · a month ago
This is more widespread than we like to admit.

Developers writing software on 64GB M4 Macs often don't realize the performance bottlenecks of the software they write.

Developers working over 1gbps Internet connections often don't realize the data gluttony of the software they write.

Developers writing services on unlimited cloud budgets often don't realize the resource waste their software incurs.

And to extend this to society in general:

Rich people with nice things often alienate themselves from the reality of the majority of people in the world.

zerkten · a month ago
You can nerf network performance in the browser devtools or underprovision a VM relatively easily on these machines. People sometimes choose not to, and others are ignorant. Most of the time, though, they're simply dealing with too many vague demands, which makes it difficult to prioritize seemingly less important things.
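
Throttling is also scriptable through the DevTools protocol, not just the DevTools UI. A minimal sketch, assuming Selenium 4 with a Chromium-based driver; the slow-connection profile below is only illustrative:

```python
# Emulate a slow connection via the Chrome DevTools Protocol.
# Assumes Selenium 4 with a Chromium-based driver; the ~1.5 Mbps down /
# 750 kbps up / 300 ms RTT profile is purely illustrative.
from selenium import webdriver

driver = webdriver.Chrome()
driver.execute_cdp_cmd("Network.emulateNetworkConditions", {
    "offline": False,
    "latency": 300,                        # added round-trip time, in ms
    "downloadThroughput": int(1.5e6 / 8),  # bytes per second
    "uploadThroughput": int(750e3 / 8),    # bytes per second
})
driver.get("https://example.com")          # now loads under the throttled profile
```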

A number of times I've had to have a framing discussion with a dev that eventually gets to me asking, "What kind of computer do your (grand)parents use? How might X perform there?" around some customer complaint. Other times, I've heard devs comment negatively after the holidays when they've tried their product on a family computer.

Moto7451 · a month ago
> Developers working over 1gbps Internet connections often don't realize the data gluttony of the software they write.

As a developer and AirBnB owner, what I’ve also noticed is the gluttony of the toolchain. I’ve had complaints about a 500/30 connection from remote-working devs (very clear from the details they give), which is the fastest you can get for much of the metro I am in.

At home I can get up to 5/5 on fiber because we’re in a special permitting corridor and AT&T can basically do whatever they want with their fiber, using an old discontinued sewer run as their conduit.

I stick to the 1/1 and get 1.25 for “free” since we’re so over-provisioned. The fastest Xfinity provides in the same area as my AirBnB is an unreliable 230/20 which means my “free” excess bandwidth is higher than what many people near me can pay for.

I expect that, as a result of all this, developers on very fast connections end up with enough layers of corporate VPN, poorly optimized pipelines, dependencies on external servers, etc. that by the time you’re connected to work, a 1/1 connection is down to about 300/300 (at least mine is). So the expectation is silently set that very fast internet is required just to get by on-corp, and that the off-corp experience is what everyone else has.

guerrilla · a month ago
I wish we could have this as a permanent sticky for this website. It's out of control, especially with web stuff.

Spotify's webapp, for example, won't even work on my old computer, whereas YouTube and other things that you'd think would be more resource intensive work without any issue whatsoever.

jonhohle · a month ago
I tend to use older hardware and feel like I’m constantly fighting this battle. It’s amazing the hardware we have, and yet I have to wait dozens of seconds to start an app or load a web page.
npteljes · a month ago
I agree, but developers don't have freedom over the product. Product managers are the ones who have a say, and even then, they are in a strict hierarchy, often ending at "shareholders". So, many of the wrongs come from the system itself. It's either systemic change (at least an upgrade), or no meaningful change.
bombcar · a month ago
You need two "classes" of developers, which may be the exact same people: those on the fastest, biggest hardware money can buy, but also some time spent running on nearly the worst hardware you can find.
rangestransform · a month ago
At a "rich world" company that wants to make money, it's completely rational to not give a shit about "poor world" people that won't make you much money (relatively speaking) anyways. It basically only makes sense to milk the top leg of the K-shaped economy.

Conversely, it opens up a niche for "poor world" people to develop local solutions for local challenges, like mobile payments in India and some of Africa.

caycep · a month ago
On the other hand there's the Fremen Mirage and why the Sardaukar are unrealistic https://acoup.blog/2020/01/17/collections-the-fremen-mirage-...
pmbanugo · a month ago
+1000
nine_k · a month ago
As a designer, one should keep a couple of cheap, low-res monitors reset to the factory defaults for proofing what many users are going to see.
eb0la · a month ago
I must confess I felt a lot of lust looking at the self color calibration feature.

It is extremely useful if your work ends up on paper. For photography (edit: film and broadcast, too) it would be great.

My use case are comics and illustration, so a self-color-correcting cintiq or tablet would be great for me.

zerkten · a month ago
This is probably one of the few things I think works better in an office environment. There was older equipment hanging around, with space to set it up in a corner so people could sit down and just go. When mobile came along there would be a sustainable lending program for devices.

With more people being remote, this either doesn't happen, or is much more limited. Support teams have to repro issues or walk through scenarios across web, iOS, and Android. Sometimes they only have their own device. Better places will have some kind of program to get them refurb devices. Most times though people have to move the customer to someone who has an iPhone or whatever.

sim7c00 · a month ago
This exactly. Same as people do for sound: listen in the car, over shitty headphones, etc. That's just quality control, not the fault of any piece of equipment.
SeasonalEnnui · a month ago
Yes! I’m glad to see this pointed out - when working on UIs, I regularly move them between 3 monitors with varying resolution and DPI: 4K @ 200%, 2K @ 125%, and 2K @ 100%. This reveals not only design issues but application-stack issues with DPI support.
dragonwriter · a month ago
> The problem with using this kind of monitor for any work that others will view on their own monitors is that your perception of what looks good will be way off.

That’s not a problem of using this monitor for creating the work; it’s a problem of not also using a more typical monitor (or, better, an array covering the common use cases, though which is practical depends on whether you are talking about a solo creator or a bigger team) for validating the work.

Just as with software, developers benefit from a more powerful machine for developing, but the product benefits from also being tested on machines more like the typical end-user setup.

stephenr · a month ago
Conversely if you only use a ~110 DPI display you won't know how bad it looks on a ~220 DPI display.

The solution here is wide device testing, not artificially limiting individual developers to the lowest common denominator of shitty displays.

mrbungie · a month ago
Yeah sure, as long as you have a lot of resources for testing widely.

Still, if you were to make an analogy, you should target a few devices that represent the "average", just as it's done for (most) pop music production.

MBCook · a month ago
I can’t tell you how often I see this. Brand new designs or logos in 2024 or 2025 that look abysmal on a retina monitor because no one bothered to check.

Stands out like a sore thumb.

mschuster91 · a month ago
This is just as valid for mobile app and website development.

When all you use for testing is Browserstack, local emulators and whatnot, and only the latest iPhone and Samsung S-series flagship, your Thing will be unusable for large parts of the population.

Always, always use at the very least the oldest iPhone Apple still supports, the cheapest and oldest (!) Samsung A-series models still being sold in retail stores as "new", and at least one Huawei and Xiaomi device. And then, don't test your Thing only on wifi backed by your Gbit Wifi 7 router and uplink. Disable wifi and limit mobile data to 2G or whatever is the lowest your phone provider supports.

And then, have someone from QA visit the countryside with long stretches of no service at all or serious degradation (think packet loss rates of 60% or more, latencies of 2 seconds+). If your app survives this with minimal loss of functionality, you did good.

A bunch of issues will only crop up in real-world testing. Stuff like opening a fresh SSL connection for each interaction instead of keeping a single socket to the mothership open is the main bummer... latency really eats such bottlenecks alive. Forgotten async handling leading to non-responsiveness of the main application. You won't catch that, not even with Chrome's network inspector - you won't feel the sheer rage of the end user who has a pressing need and is let down by your Thing. Even if you're not responsible for their shitty phone service, they will associate the bad service with your app.
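
To make the connection-reuse point concrete, here's a minimal sketch of fresh-per-request connections vs. a pooled keep-alive session using Python's requests library; the endpoint URL is just a placeholder. On a high-latency link, every extra TCP+TLS handshake costs several round trips:

```python
import time
import requests

URL = "https://api.example.com/health"   # hypothetical endpoint, placeholder only

def fresh_connections(n: int) -> float:
    """One brand-new TCP + TLS handshake per request."""
    start = time.perf_counter()
    for _ in range(n):
        requests.get(URL, timeout=10)
    return time.perf_counter() - start

def reused_session(n: int) -> float:
    """A single pooled keep-alive connection, reused for every request."""
    start = time.perf_counter()
    with requests.Session() as s:
        for _ in range(n):
            s.get(URL, timeout=10)
    return time.perf_counter() - start

if __name__ == "__main__":
    print(f"fresh connections: {fresh_connections(20):.2f}s")
    print(f"reused session:    {reused_session(20):.2f}s")
```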

Oh, and also test out getting interrupted while using your Thing on the cheap-ass phones. Whatsapp and FB Messenger calls, for example - these gobble so much RAM that your app or browser will get killed by OOM or battery saver, and when the user has finished their interruption, if you didn't do it right, your Thing's local state will have gotten corrupted or removed, leaving the user to start from scratch!

ponector · a month ago
Also, it's not only about the screen resolution. Developers use powerful Macs and users have old Windows machines - the usability is different, but devs usually don't care. Works fine on my machine!

I've reported many issues where, to reproduce them, they needed to enable 10x throttling in the browser. Or use a Windows machine.

throw0101d · a month ago
> Developers use powerful Macs and users have old Windows machines - the usability is different, but devs usually don't care. Works fine on my machine!

Part of what QA testing should be about: performance regressions.

bn-l · a month ago
This was (is?) the issue with Zed, and maybe the cause as well.
sz4kerto · a month ago
This is exactly how sound studios do mixing. They don't just use top-end monitors -- they generally also listen on low-end speakers that color sound in a way that's representative of what people have at home (hello, Yamaha NS-10).
Intermernet · a month ago
People used to buy NS-10s because they knew professional studios used them. They were then underwhelmed when they sounded worse than the hifi speakers they had at home.

Many audio engineers live by the mantra "if it sounds good on NS-10s, it'll sound good on anything".

We need such a touchstone for software engineers.

cosmic_cheese · a month ago
I make a point of keeping my secondary monitor a "normal" DPI 2560x1440 display precisely to avoid this kind of problem. The loss of legibility has little impact on secondary monitor use cases, and I can easily spot-check my UI and UI graphics work by simply dragging the window over.

High quality normal DPI monitors are so cheap these days that even if multi-monitor isn't one's cup of tea there's not really a good reason to not have one (except maybe space restrictions, in which case a cheap ~16" 1080p/1200p portable monitor from Amazon will serve the purpose nicely).

freejazz · a month ago
>The problem with using this kind of monitor for any work that others will view on their own monitors is that your perception of what looks good will be way off.

Really? It's not a problem for photo retouchers, for whom a monitor like this is basically designed.

ec109685 · a month ago
I don’t get marketing people. The only link in the press release is to Adobe’s Creative Cloud. Why aren’t there two taps to buy the monitor with Apple Pay and have it shipped when it’s available?

> The redemption period ends August 31, 2026. For full details, visit https://www.asus.com/content/asus-offers-adobe-creative-clou....

Well, the monitor is €8,999, so maybe it’d be more than two taps for me:

> The monitor is scheduled to be available by October 2025 and will costs €8,999 in Europe (including VAT)

pjerem · a month ago
Buy a 9k€ monitor and get 3 months of a cloud subscription for free. What a deal!
ryanjshaw · a month ago
If you’re not careful, that adobe creative cloud sub will cost you more than the monitor when you try to cancel
gigatexal · a month ago
Too rich for me. Also I don’t need a creative cloud sub. But I’m the wrong customer for such a monitor.

I’ll wait till 8k becomes more of the norm for say 1-1.5k

nine_k · a month ago
Human eye resolution is about 1 arcminute. The comfortable field of view is about 60°, or 3600 arcminutes. A 4K display should mostly suffice %)
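
A quick back-of-the-envelope version of that arithmetic:

```python
# ~1 arcminute of acuity across a ~60 degree comfortable field of view
field_of_view_deg = 60
arcmin_per_degree = 60

pixels_across = field_of_view_deg * arcmin_per_degree   # 3600
print(pixels_across)   # 3600 -> a 3840-pixel-wide (4K) panel roughly covers it
```
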
martinald · a month ago
A friend and I were just chatting about how annoying it is that monitors stalled out at 4K. I think I got my first set of 4K monitors ~15 years ago (!) and there have been no improvements since then, resolution-wise, apart from high-end pro monitors.

Why is this? 5k/6k at 27" would be the sweet spot for me, and potentially 8k at 32". However, I'm not willing to drop $2k per monitor to go from a very nice 27" 4k to 27" 5k.

You can get 8K TVs for <$1000 now. And a Quest 3 headset has 2 displays at far higher PPI for $600.

throw0101d · a month ago
> A friend and I were just chatting about how annoying it is that monitors stalled out at 4K.

There's been a bit of a 'renaissance' of 5K@27" in the last ~year:

> In just the past few months, we've taken a look at the ASUS ProArt Display 5K, the BenQ PD2730S, and the Alogic Clarity 5K Touch with its unique touchscreen capabilities, and most recently I've been testing out another new option, the $950 ViewSonic VP2788-5K, to see how it stacks up.

* https://www.macrumors.com/review/viewsonic-vp2788-5k-display...

There are 15 monitors discussed in this video:

* https://www.youtube.com/watch?v=EINM4EysdbI

The ASUS ProArt PA27JCV is USD 800 (a lot less than $2k):

* https://www.youtube.com/watch?v=ojwowaY3Ccw

wooger · a month ago
And it has terrible reviews...

5k is appealing, but I'll take a 4k with a much nicer panel, much better anti-reflective coating, faster refresh rate etc.

Aurornis · a month ago
> You can get 8K TVs for <$1000 now.

8K at jumbo TV size has relatively large pixels compared to an 8K desktop monitor. It’s easier to manufacture.

> And a Quest 3 headset has 2 displays at far higher PPI for $600

Those displays are physically tiny. Easier to deal with lower yields when it’s only taking a few square inches.

Ultra high resolution desktop monitors would exist in the middle: Very small pixel sizes but also relatively large unit area.

However, the demand side is also not there. There are already a number of 5K, 6K, and 8K monitors on the market. They’re just not selling well. Between difficult software support for scaling legacy apps, compatibility issues with different graphics cards and cables, and the fact that normal monitors are good enough, the really high resolution monitors don’t sell well. That doesn’t incentivize more.

If we get to a place where we could reliably plug a 6K monitor into any medium to high end laptop or desktop and it just works, there might be more. Until then, making a high res monitor is just asking for an extremely high return rate.

Kon5ole · a month ago
>> You can get 8K TVs for <$1000 now.

>8K at jumbo TV size has relatively large pixels compared to an 8K desktop monitor. It’s easier to manufacture.

I don't think that's true.

I've been using a 8k 55" TV as my main monitor for years now. It was available for sub-800 USD before all such tv's vanished from the market. Smaller pixels were not more expensive even then, the 55"s were the cheapest.

4K monitors can be had for sub-200 USD, and selling 4x the area of the same panel should cost at most 4x that price. And it did, years ago.

So they were clearly not complicated or expensive to manufacture - but there was no compelling reason for having 8k on a TV so they didn't sell. However, there IS a compelling reason to have 8K on a desktop monitor!

That such monitors sell for 8000 USD+ is IMO a very unfortunate situation caused by a weird incompetence in market segmentation by the monitor makers.

I firmly believe that they could sell 100x as many if they cut the price to 1/10th, which they clearly could do. The market that never appeared for tv's is present among the world's knowledge workers, for sure.

nicoburns · a month ago
> There are already a number of 5K, 6K, and 8K monitors on the market. They’re just not selling well. Between difficult software support for scaling legacy apps, compatibility issues with different graphics cards and cables, and the fact that normal monitors are good enough, the really high resolution monitors don’t sell well.

They're available, but they never seem to have become a mass-market product at mass-market prices. The cheapest 5k monitor is at least double the price of the cheapest 4k monitor. And it was more like 4x until recently.

You're probably right that we're starting to hit the point where people don't care though.

rickdeckard · a month ago
Because the vast majority of monitor sales volume is (public) tenders from companies buying huge volumes, and those companies still mostly look for monitors <4K (without fancy specs and without e.g. USB-C).

If 4K reaches mass market for those, the specs will shift down and there will be room in the (much smaller) premium-tier monitor segment.

Heck, even if you just want USB-C and an integrated webcam on an average display, the price hike compared to one without is crazy, because everything except those basic office monitors is still niche production...

4ggr0 · a month ago
As a gamer, 8K makes me sweat because I can't imagine what kind of hardware you'd need to run a game :O Probably great for text-based work, though!
Aurornis · a month ago
Once you get into the high pixel densities you stop running everything at native resolution. You have enough pixel density that scaling the output doesn’t produce significant visible artifacts.

With 8K small pixels you could pick a number of resolutions up to 4K or higher and you wouldn’t even notice that the final product was scaled on your monitor.

People with Macs with retina displays have been doing this for years. It’s really nice once you realize how flexible it is.

pornel · a month ago
You don't really need 8K for gaming, but upscaling and frame generation have made game rendering resolution and display resolution almost independent.
swiftcoder · a month ago
> and potentially 8k at 32"

What's your actual use-case for this? I run a 32" 4K, and I have to stick my nose within a foot (~30cm) of the display to actually spot individual pixels. Maybe my eyesight isn't what it used to be

I'd kill for a 40" 5k or 6k to be available - that's significantly more usable desktop real estate, and I still wouldn't be able to see the pixels.

ak217 · a month ago
Pixels are very noticeable at 32" 4K. If you don't notice them, your eyes still do - they try to focus on blurry lines, causing eye strain. You might not notice, but it adds up over the years.

It's simple math. A 32" 4K monitor is about 140 PPI. Retina displays (where you could reasonably say the pixels are not noticeable, and the text is sharp enough to not strain the eyes) start at 210 PPI.
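
For what it's worth, the PPI arithmetic is easy to check (sizes and resolutions as commonly sold):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch along the diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 32)))   # 32" 4K -> ~138 PPI
print(round(ppi(5120, 2880, 27)))   # 27" 5K -> ~218 PPI ("Retina" territory)
print(round(ppi(7680, 4320, 32)))   # 32" 8K -> ~275 PPI (this ProArt)
```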

Subjectively, the other problem with 32" 4K (a very popular and affordable size now) is that the optimal scaling is a fractional multiple of the underlying resolution (on macOS - bizarrely, I think Windows and Linux both know how to do this better than macOS). Which again causes blur and a small performance hit.

I myself still use an old 43" 4K monitor as my main one, but I know it's not great for my eyes and I'd like to upgrade. My ideal would be a 40" or 42" 8K. A 6K at that size would not be enough.

I am very excited about this 32" 6K Asus ProArt that came out earlier this year: https://www.asus.com/displays-desktops/monitors/proart/proar... - it finally gets Retina-grade resolution at a more reasonable price point. I will probably switch to two of these side-by-side once I can get them below $1K.

FuriouslyAdrift · a month ago
This is the only large true monitor I know of. It used to be branded by Acer, but now it is branded through Viewsonic. We have a bunch at work and everyone loves them. $570 for 43" 4K

https://www.viewsonic.com/us/vx4381-4k-43-4k-uhd-monitor-wit...

Aurornis · a month ago
> I'd kill for a 40" 5k or 6k to be available

There are a number of 40” 5K wide monitors on the market. They have the same vertical resolution as a 4K but with more horizontal pixels.

littlestymaar · a month ago
> Me and a friend were just chatting how annoying it is monitors stalled out at 4K. I think I got my first set of 4k monitors ~15 years ago (!) and there's been no improvements since then apart from high end pro monitors resolution wise.

It's mostly because the improvement over 4k is marginal. In fact, even from 1920x1080 it's not so big of a deal, which is why people keep buying such monitors in 2025.

And the worst part is that the highest-spending consumer segment of PC parts, the gamers, can't really use high-resolution displays at their full potential because it puts such a burden on the GPU (DLSS helps, but the result is even less of an improvement over 1920x1080 than regular 4K is).

layer8 · a month ago
The likelihood of dead pixels increases quadratically with resolution, hence panel yield drops correspondingly. In addition, the target audience who has hardware (GPUs) that can drive those resolutions is smaller.
ebbi · a month ago
One of the best things I've done for my setup is convert old 5K iMacs to work as external displays.

Only downside is the massive borders by today's standards, but it still has the Apple aesthetics, the 5K resolution is beautiful for my use cases (spreadsheets, documents, photo editing), and it has HDMI inputs so I can play PS5 on it.

Paianni · a month ago
The Asus PA27JCV is rather less than $2k...
prpl · a month ago
30 or 32" 5K is what I'd love - maybe 6K at 32".
mschuster91 · a month ago
> A friend and I were just chatting about how annoying it is that monitors stalled out at 4K. I think I got my first set of 4K monitors ~15 years ago (!) and there have been no improvements since then, resolution-wise, apart from high-end pro monitors.

Multiple reasons.

The first one being yield - yes you can get 8K screens, but the larger they get, the more difficult it is to cut a panel with an acceptably low rate of dead/stuck pixels out of a giant piece of glass. Dead pixels are one thing and bad enough, but stuck-bright pixels ruin the entire panel because they will be noticeable in any dark-ish movie or game scene. That makes them really darn expensive.

The second reason is the processing power required to render the video signal to the screen, aka display controllers. Even if you "just" take regular 8-bit RGB - each frame takes up 33 million pixels, so 796,262,400 bits. Per frame. Per second? Even at just 30 FPS, you're talking about 23,887,872,000 bits per second - roughly 24 gigabits/s. It takes an awful, awful lot of processing power just to shuffle that data from the link SerDes around to all the control lines and to make sure they all switch their individual pixels at the very same time.

The third is transferring all the data. Even if you use compression and sub-sampling, you still need to compress and sub-sample the framebuffer on the GPU side, transfer up to 48 GBit/s (HDMI 2.1) or 77 GBit/s (DP 2.1) of data, and then uncompress it on the display side. If it's HDCP-encrypted, you need to account for that as well - encrypting and decrypting at such line speeds used to be unthinkable even two decades ago. The fact that the physical transfer layer is capable of delivering such data rates over many meters of copper cable of varying quality is nothing short of amazing anyway.
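
For a rough sense of the raw numbers behind the last two paragraphs, here's a quick sketch of the uncompressed pixel payload (ignoring blanking, audio and protocol overhead):

```python
def raw_gbit_per_s(width: int, height: int, bits_per_pixel: int, fps: int) -> float:
    """Uncompressed pixel payload only; real links also carry blanking and overhead."""
    return width * height * bits_per_pixel * fps / 1e9

print(raw_gbit_per_s(7680, 4320, 24, 30))   # ~23.9 Gbit/s -- the figure above
print(raw_gbit_per_s(7680, 4320, 24, 60))   # ~47.8 Gbit/s -- right at HDMI 2.1's 48
print(raw_gbit_per_s(7680, 4320, 30, 60))   # ~59.7 Gbit/s -- 10-bit HDR at 60 Hz
```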

And the fourth is generating all the data. You need absurdly high definition textures, which requires lots of VRAM, lots of regular RAM, lots of disk I/O, lots of disk storage (your average AAA game is well beyond 100GB of data at-rest for a reason!), and then render power to actually render the scene. 8K has 16x (!) the pixels of regular FullHD (1080p).

What's stopping further progress? Other than yield and simple physics (similar to microchips, the finer the structures get the more difficult and expensive it is to make them), the most pressing issue is human visual acuity - even a human with very good vision can only make useful sense of about 74 of the theoretical 576 megapixels [1]. As we already established, 8K is at 33-ish megapixels, so the usual quadratic increase would already be far too detailed for 99.999% of humans to perceive.

Yes, you could go for intermediate sizes. 5K, 6K, weird aspect ratios, whatever - but as soon as you go there, you'll run into issues with video content because it can't be up- or downscaled to such intermediates without a perceptible loss in quality and, again, a lot of processing power.

[1] https://clarkvision.com/articles/eye-resolution.html

Aurornis · a month ago
> And the fourth is generating all the data. You need absurdly high definition textures, which requires lots of VRAM, lots of regular RAM, lots of disk I/O, lots of disk storage (your average AAA game is well beyond 100GB of data at-rest for a reason!), and then render power to actually render the scene. 8K has 16x (!) the pixels of regular FullHD (1080p).

You don’t need to scale everything up to match the monitor. There are already benefits to higher resolution with the same textures for any object that isn’t directly next to the player.

This isn’t a problem at all. We wouldn’t have to run games at 4K.

zamadatix · a month ago
~half of these reasons state sub $2000 8k TVs shouldn't exist, but they do.
znpy · a month ago
Ah yes. It’s the same with memory… 8gb/16gb is incredibly common, even though 16gb memory was a thing in like 2008 already. It’s only with high end machines that you get 64/128gb memory, which should be much more common in my opinion.
qaq · a month ago
The 6K 32" ProArt model PA32QCV might be more practical for the YN crowd at 1,299 USD, vs. the 8-9K USD the PA32KCX will run you.
retrac98 · a month ago
An aside - this monitor is proving surprisingly difficult to buy in the UK. Everywhere I look it seems to be unavailable or out of stock, and I’ve been checking regularly.

Relatedly, I also don’t understand why a half-trillion-dollar company makes it so hard to give them my money. There’s no option to order directly from ASUS on the UK site. I’m forced to check lots of smaller resellers or Amazon.

brzz · a month ago
Struggling with the exact same issue myself. If you do find a place to buy it, please let me know
qaq · a month ago
Was same in US till maybe 2-3 weeks ago. Maybe they are slowly rolling out to various markets
ErneX · a month ago
Same in Spain, I got tired of looking for it.
tom_alexander · a month ago
I'm not buying a new monitor with a decade-old version of DisplayPort. Non-OLED monitors are products that last a long time (at least a decade), so if I bought this monitor, I'd still be using DisplayPort 1.4 from 2016 in 2036. I need UHBR20 on a new monitor so I can rest assured that I will have some lanes available for my other peripherals. I've already lived the hell of needing to dedicate all 4 lanes to DisplayPort, leaving only a single USB 2.0 connection for all my other peripherals to share[0][1].

[0] https://media.startech.com/cms/products/gallery_large/dk30c2...

[1] https://i.imgur.com/iGs0LbH.jpeg

simoncion · a month ago
> I'm not buying a new monitor with a decade-old version of DisplayPort.

With the greatest of respect, this is a deeply silly way to think of it.

The way you should be thinking of it is:

> I'm not buying a new monitor that requires DSC to run at native resolution. That's fucking garbage.

Since DP 1.4, the only thing the DisplayPort version indicates that an end-user gives a shit about is the maximum supported link speed. So, if all you need is HBR3 to drive a display at its native resolution, refresh rate, and maximum bit depth without fucking DSC, then DisplayPort 1.4 will be just fine. And if DSC doesn't bother you, then your range of acceptable displays is magically widened!
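
A rough sketch of the link budget behind that point, using the published per-lane rates and coding efficiencies (payload only, ignoring blanking and protocol overhead):

```python
# Effective payload rates over 4 lanes (line rate x coding efficiency)
hbr3   = 4 * 8.1 * (8 / 10)        # DP 1.4, 8b/10b    -> ~25.9 Gbit/s
uhbr20 = 4 * 20.0 * (128 / 132)    # DP 2.1, 128b/132b -> ~77.6 Gbit/s

# Uncompressed 8K60 at 10 bits per channel, pixel payload only
eight_k_60_10bit = 7680 * 4320 * 30 * 60 / 1e9   # ~59.7 Gbit/s, before blanking

print(eight_k_60_10bit <= hbr3)     # False -> DP 1.4 can only do it with DSC
print(eight_k_60_10bit <= uhbr20)   # True  -> UHBR20 has headroom without DSC
```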

Aurornis · a month ago
I also wish it had something newer, but for that price I’d gladly deal with a second cable for high speed USB devices or the purchase of a dock to handle breakout duties.
zokier · a month ago
I'd imagine for most people the HDR perf difference is more noticeable than the resolution. This new monitor can do 1200 nits peak with local dimming, PA32QCV can only do 600 nits peak with no local dimming. Also Dolby Vision.
qaq · a month ago
I'd imagine most people can't spend 9,000 USD on a monitor
lwhsiao · a month ago
I second this, I recently switched [1] and have been delighted by the crisp fonts.

[1]: https://luke.hsiao.dev/blog/pa32qcv/

WilcoKruijer · a month ago
I've been enjoying the PA32QCV in the last couple months. It's definitely not perfect, but the 220 PPI at 32 inch is just amazing to code on.
tombert · a month ago
I swore a blood oath that I would never buy an Asus product ever again, after three terrible laptops from them in a row, but holy hell do I kind of want this monitor.

My main "monitor" right now is an 85" 8K TV, that I absolutely love, but it would be nice to have something smaller for my upstairs desk.

mnw21cam · a month ago
I have a fantastic Asus laptop that is 8 years old now and (after an easy battery replacement) easily does everything I want from it and feels nice and solid. I was so impressed that I recommended Asus to someone else, and what they got was pretty awful.

So basically, YMMV. They make good stuff, and they make awful stuff.

8cvor6j844qw_d6 · a month ago
What would you pick for your next laptop if you had to buy one?

I had an Asus laptop, but the frequent security firmware updates for one of the Dell laptops I had make me think Dell might be a good candidate in terms of keeping up with security updates.

Not sure about the current latest models from Asus/Dell/HP/etc., but I liked the fact that disassembly manuals are provided for older Dell and HP machines. I can hardly find disassembly manuals for Asus when I have to do maintenance such as swapping out thermal paste/pads and clearing out the heatsink fins.

mrguyorama · a month ago
My girlfriend's 2 year old Asus Zenbook had easy to find repair manuals and was pretty repairable. Though consumer laptop naming conventions make googling for it error prone.

The main problem was parts. She had a fan that was defective and noisy, and the Asus parts store didn't have it in stock, and there was one on ebay for $30.

But the replacement was easy, the construction was solid, and there have been no issues since.

>Asus when I have to do maintenance such as swapping out thermal paste/pads and clearing out the heatsink fins.

If you have to do this more than once or twice over a ten year lifespan of a laptop, you probably should invest in air cleaning systems. Mid range consumer laptops are way less thermally constrained than they used to be. Ryzen CPUs are essential for that, though I think Intel now has usable cool laptop CPUs

speedgoose · a month ago
I’m only one data point, but I also swear that I would never buy an Asus laptop again. If you are fine with the operating system, a MacBook Pro is the best in my opinion. It’s not even close.

Otherwise I had okay Dell or Lenovo laptops. Avoid HP, even the high end Zbook ones. A framework might be worth a try if you have a lot of money.

tombert · a month ago
I am a pretty huge fan of Thinkpads. I bought mine a year ago and love it.

ssivark · a month ago
What are the cons of having a large TV as a monitor? I've been considering something like this recently, and I wonder why is this not more common.
bee_rider · a month ago
Someone mentioned the latencies for gaming, but also I had a 4K TV as a monitor briefly that had horrible latency for typing, even. Enough of a delay between hitting a key and the terminal printing to throw off my cadence.

Only electronic device I’ve ever returned.

Also they tend to have stronger than necessary backlights. It might be possible to calibrate around this issue, but the thing is designed to be viewed from the other side of a room. You are at the mercy of however low they decided to let it go.

tombert · a month ago
I'm sure there are reasons with regards to games and stuff, but I don't really use this TV for anything but writing code and Slack and Google Meet. Latency doesn't matter that much for just writing code.

I really don't know why it's not more common. If you get a Samsung TV it even has a dedicated "PC Mode".

swiftcoder · a month ago
Depending on the specific TV, small details like text rendering can be god-awful.

A bunch of TVs don't actually support 4:4:4 chroma subsampling, and at 4:2:2 or 4:2:0 text is bordering on unreadable.

And a bunch of OLEDs have weird sub-pixel layouts that break ClearType. This isn't the end of the world, but you end up needing to tweak the OS text rendering to clean up the result.
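
For context, a quick way to see how much color information each scheme actually keeps per pixel (one full-resolution luma sample plus two chroma samples shared across a block):

```python
def samples_per_pixel(scheme: str) -> float:
    """Average stored samples per pixel: 1 luma + 2 chroma shared across N pixels."""
    pixels_sharing_chroma = {"4:4:4": 1, "4:2:2": 2, "4:2:0": 4}[scheme]
    return 1 + 2 / pixels_sharing_chroma

for scheme in ("4:4:4", "4:2:2", "4:2:0"):
    print(scheme, samples_per_pixel(scheme))   # 3.0, 2.0, 1.5 samples per pixel
```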

xeonax · a month ago
I have been using a 43-inch TV as a monitor for the last 10 years, currently an LG. You get a lot of screen space, and you can sit away from the desk and still use it. Just increase the zoom.
terribleperson · a month ago
If you play video games, display latency. Most modern TVs offer a way to reduce display latency, but it usually comes at the cost of various features or some impact to visual quality. Gaming monitors offer much better display latencies without compromising their listed capabilities.

Televisions are also more prone to updates that can break things and often have user hostile 'smart' software.

Still, televisions can make a decent monitor and are definitely cheaper per inch.

sim7c00 · a month ago
High latency on TVs makes them bad for games etc., as anything that's sensitive to I/O timing can feel a bit off. Even 5ms compared to 1 or 2ms response times is very noticeable in hand-eye coordination across the I/O -> monitor loop.
jmarcher · a month ago
For me, on macOS, the main thing is that the subpixel layout is rarely the classic RGB (side by side) layout, which is the only one macOS supports for text antialiasing.

If I were to use a TV, it would be an OLED. That being said, the subpixel layout is not great: https://pcmonitors.info/articles/qd-oled-and-woled-fringing-...

112233 · a month ago
For me it's eye fatigue. When you put a large 4K TV far enough away that it covers the same viewing angle as a 27" desk monitor, you're almost 1.5m away from it.
monkpit · a month ago
Usually refresh rate and sometimes feature set. And it’s meant to be viewed from further away. I’m sure someone else could elaborate but that’s the gist.
fleventynine · a month ago
No mention of 120Hz; I'm waiting for a 6k or higher-density display that can do higher refresh rates.
dietr1ch · a month ago
I was going to joke about 8k@120Hz needing like 4 video cables, but it seems we are not too far from it.

[8k@120Hz Gaming on HDMI 2.1 with compression](https://wccftech.com/8k-120hz-gaming-world-first-powered-by-...)

> With the HDMI 2.2 spec announced at CES 2025 and its official release scheduled for later this year, 8K displays will likely become more common thanks to the doubled (96 Gbps) bandwidth.

FootballMuse · a month ago
Uncompressed, absolutely - we need another generation bump with over 128Gbps for 8K@120Hz with HDR. But with DSC, HDMI 2.1 and the more recent DisplayPort 2.0 standards make it possible; support just isn't quite there yet.

Nvidia quotes 8K@165Hz over DP for their latest generation. AMD has demoed 8K@120Hz over HDMI but not on a consumer display yet.

https://en.wikipedia.org/wiki/DisplayPort#Refresh_frequency_...

https://en.wikipedia.org/wiki/HDMI#Refresh_frequency_limits_...

https://www.nvidia.com/en-gb/geforce/graphics-cards/compare/

ternus · a month ago
My primary monitor is the Samsung 57" 8Kx2K 240Hz ultrawide. That's the same amount of bandwidth, running over DisplayPort 2. It mostly works!
ranger_danger · a month ago
> 4 video cables

The IBM T220 4k monitor required 4 DVI cables.

https://en.wikipedia.org/wiki/IBM_T220/T221_LCD_monitors

Dylan16807 · a month ago
Also as far as 6k goes, that's half the bandwidth of 8k.
kondro · a month ago
Thunderbolt 5 supports up to 120Gbps one-way.
ryukoposting · a month ago
I wouldn't hold my breath. Competing models seem to top out around 120 Hz but at lower resolutions. I don't imagine there's a universal push for higher refresh rates in this segment anyway. My calibrated displays run at 60 Hz, and I'm happy with that. Photos don't really move much, y'know.
eviks · a month ago
> Photos don't really move much, y'know.

They do when you move them (scroll)

klausa · a month ago
I imagine your mouse still moves plenty though.
cheema33 · a month ago
There is a lot of marketing material at the linked page, but there is no mention of price and available sizes. Also, there is no link to purchase one. This is November. I can look these things up, but why link to a PR fluff piece if there's something more substantial available?
dklsf · a month ago
Here's some specs: https://www.asus.com/displays-desktops/monitors/proart/proar...

8K, 32-inch, 275 PPI, 60 Hz, 2x Thunderbolt 4, 1x DisplayPort 2.1

pdpi · a month ago
> But there is no mention of price and available sizes

No idea about prices, but, assuming they follow the usual conventions for model codes, that's a 32" unit.

omnibrain · a month ago
Sadly it's just 16:9. Not even 16:10.

I now run two 3:2 displays (BenQ RD280U) at home (more in the office, but I never go there) and love my vertical real estate. (No, portrait mode won't work out.)

Flockster · a month ago
Thank you for this suggestion!

For laptops, 16:10 (and with the Framework and Surface even 15:10/3:2) is already quite common, while in the desktop market 16:9 and these ultrawides are dominant.
With 16:9, the whitespace on websites, even with tabs on the side, is simply too much ;)
With 16:9 the whitespace on websites even with Tabs on the side is simply to high ;)