proverbialbunny · 6 years ago
I'm an early adopter of 4k60. If you write code and you're not on 4k, you don't know what you're missing. 4k is great.

Back then DisplayPort ran at 60fps, and HDMI ran at 30fps. My hardware has changed and moved on, but I still default to using a DisplayPort cable. It's rare to find a graphics card without DisplayPort, and it's rare to find a monitor without one, so I've never had a reason to try HDMI. As far as I can tell it's a stubborn format that continues to fight to live on. Frankly, I don't really get why we still have HDMI today.

Polylactic_acid · 6 years ago
HDMI exists because the corporations behind the HDMI group are the ones that make most of the TVs and dvd boxes. So by pushing HDMI they get to use it for free while the competitors have to pay extra and are at a disadvantage. Nvidia and AMD are not on the HDMI group which is why on pro and enthusiast hardware displayport is pushed much more. My current gpu has 3 displayport and 1 hdmi.

On the tv side, the benefits of displayport are less important since most users are not watching 4k 144hz content and hdmi is now capable of playing 4k at 60hz. So hdmi is not only pushed by the corporations but is also the most convenient option the users want since all of their other hardware uses hdmi and it works fine.

Honestly I think GPUs should drop the hdmi port entirely and ship with a displayport to hdmi cable which works fine because displayport gpus also support hdmi and dvi over the dp port.

ChuckNorris89 · 6 years ago
You forgot to mention that besides royalties, the main reason HDMI is used in consumer electronics is that it also streams audio with CEC, as opposed to DisplayPort, which only does video.

For most consumers, plugging in just one cable that does everything is a lot more convenient.

Edit: sorry, didn't know DP can also stream audio, my bad

dmos62 · 6 years ago
> If you write code and you're not on 4k, you don't know what you're missing.

I really don't. Would anyone care to share their experience? Currently, I'm thinking that a vertical monitor would be an improvement for me, but I don't see a reason to get a screen that's several times denser.

Kim_Bruning · 6 years ago
Simple rule of thumb:

* More pixels -> more characters on screen.

* More characters on screen -> easier to glance between different parts of your code, open manuals, etc.

A 4K screen has 3840x2160 pixels. You could see that as either fitting 4 regular 1080p screens in a rectangular pattern, or just over 3 regular screens in vertical orientation (which is what you were considering).

Of course, if your current HD screen is only barely comfortable for you (likely), you will end up needing to buy a screen with larger physical dimensions (findable) to actually be able to use the extra pixels that way.

Alternatively, you could still go for that higher pixel density, in which case you end up with more readable characters. This does alleviate eye strain!

Given enough pixels at your disposal (and a large enough screen), you can have the luxury of determining your own favorite trade-off between pixel density and amount-of-code-on-screen. You can do so on the fly, as tasks/whims/fatigue demand.

This is why -at least for now- I'd say you can never have enough pixels.

I'm sure there's a point of diminishing returns somewhere, but I think we have a long way to go before we reach it.
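
To put rough numbers on that rule of thumb, here's a toy calculation. The 9x18-pixel character cell is an assumption (it matches a figure a commenter gives further down); real fonts and terminals differ:

    # More pixels -> more character cells, assuming a fixed 9x18 px cell.
    def char_grid(h_px, v_px, cell_w=9, cell_h=18):
        return h_px // cell_w, v_px // cell_h

    print(char_grid(1920, 1080))   # (213, 60)  columns x rows on a 1080p screen
    print(char_grid(3840, 2160))   # (426, 120) columns x rows on a 4K screen, 4x the cells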

mlyle · 6 years ago
They're probably suggesting to get a bigger monitor at the same time. You can have somewhat higher DPI and smoother text, and more real estate.

If you're used to 24" 1080P, going to 32" 4K gets you 1.5x the DPI and 1.77x the screen area. For what's now a $300 monitor, that is a pretty significant improvement overall.

The DPI means my characters go from 9x18 pixels to 13x27, which is a big difference in text rendering fidelity and just feels a little nicer. And the additional screen real estate speaks for itself.
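
If you want to reproduce those figures, the arithmetic is just pixel counts over physical size; the 16:9 aspect ratio is assumed:

    # Quick PPI / screen-area comparison for 16:9 panels.
    import math

    def ppi(diag_in, h_px, v_px):
        return math.hypot(h_px, v_px) / diag_in

    def area_sq_in(diag_in, aspect=16/9):
        height = diag_in / math.hypot(aspect, 1)
        return height * aspect * height          # width * height

    print(ppi(24, 1920, 1080))              # ~91.8 PPI
    print(ppi(32, 3840, 2160))              # ~137.7 PPI, ~1.5x
    print(area_sq_in(32) / area_sq_in(24))  # ~1.78x the screen area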

zachmu · 6 years ago
I code on a 27" 4K monitor at 60 Hz. All the text is just so sharp and crisp and clear. I no longer get any noticeable eye fatigue after a day's work.
joshstrange · 6 years ago
Can't speak to 4K but I recently switched to 3 2K monitors (2 vertical on either side of a horizontal monitor). And I quite like the vertical monitors (My setup resembles a tie-fighter) even though I had resisted trying them for over a decade now.

I still have 1 1080p screen attached (just for NVR viewing) and if I drag my IDEA (code editor) window from a vertical monitor to my 1080 then it takes up over 3/4ths of the width. Just to restate that a different way: I gave up less than 1/4th of my screen width but got ~3x the height (just for the 2 vertical screens).

My current setup looks like this with 1 & 3 being vertical, 2 being horizontal, and 4 being my old 1080p horizontal.

    |   | | 4 | |   |
    | 1 ||  2  || 3 |
    |   |       |   |

I have my 2 vertical monitors divided into 3rds (top, middle, bottom) and have keyboard shortcuts to move windows between each region (coupled with a shortcut to move/resize all windows to their predefined default locations). My code is always on monitor 3 taking up the bottom 2/3rds (I find that using the whole height requires too much head/eye movement). I like to use the top 3rd of both vertical monitors for things like chat/reference.

TomVDB · 6 years ago
For my aging eyes, 4K 32” is the sweet spot. They’re cheap too (you can get one for $380.)

I don’t use the full screen for editing (it’s too wide for that), but the edit window is in the middle and stuff that happens in the background (compiling etc.) sits in the virtual background, showing up in the border around the wait window. So I see immediately when things have stopped scrolling.

After using this configuration at work, I upgraded all my home setups as well.

yboris · 6 years ago
I'm using a 55" 4K TV for my coding at home. It's wonderful that I always have enough space to put all my windows (using Win 10).

The other benefit is that I can sit farther away from the screen (I make the font larger to compensate). Feels like it's better for the eyes not to have to focus up-close.

bsenftner · 6 years ago
Back in the early 90's I had a "Sony News" newspaper layout monitor with hardware that displayed a full 2-page newspaper spread at 16 grey-levels. I got addicted and mourned the loss of viewing giant amounts of source code and terminal/app windows. When 4K came out, my dream returned. I'm currently working with four 4K monitors, no scaling, that is also Input Director linked to a second system with two more unscaled 4K monitors, but those are portrait (tall). I'm in heaven, again.
darkr · 6 years ago
At work my primary monitor is 4K @32”. At home my primary monitor is 2560x1600 @30”.

Though the 4K monitor is sharper, specifically for the task of writing code, I actually prefer my home monitor.

cm2187 · 6 years ago
43in 4k monitor (Philips BDM4350UC). Not a particularly high dpi but LOTS of screen real estate, and not very expensive. You can have two documents open side by side plus stackoverflow and the rendering of your website. But it requires 60Hz, because otherwise moving the mouse over such a large surface feels laggy and uncomfortable.
bhauer · 6 years ago
It has been several years since I wrote "4K is for Programmers" [1], and in the time since, we've migrated to 4K 60 Hz LG panels using DisplayPort. But the upsides of a large 4K for programming remain the same today.

[1] https://tiamat.tsotech.com/4k-is-for-programmers

alias_neo · 6 years ago
If your eyesight is good (no aids needed), the sharpness of coding at 4K at a like-for-like scale is a much more pleasant experience.

What's less pleasant is how unsharp all of the icons and images look in apps that haven't provided high-density assets.

fyfy18 · 6 years ago
I bought a 4K monitor a few years ago. It's beautiful when displaying 4K, but I'm still waiting for Linux to catch up. I used it on a desktop and that worked mostly ok, but on a laptop with different scalings on the internal and external monitors it is hopeless. I'm not even trying to use both monitors at once, I just want Firefox to be scaled from 2x to 1x when I disconnect the external monitor (and vice versa).

I recently bought a new monitor for my office and explicitly avoided a 4K monitor for this reason, it turns out there aren't that many 27" 1440p monitors nowadays. I ended up getting a Lenovo P27h (comes with USB-C) and it works great.

p1mrx · 6 years ago
> it turns out there aren't that many 27" 1440p monitors nowadays.

Huh? 1440p is the third most popular resolution on the Steam hardware survey[1], after 1080p and 1366x768.

[1] https://store.steampowered.com/hwsurvey/Steam-Hardware-Softw...

alias_neo · 6 years ago
I have the best part of a decade's experience with this. Until recently I was running 3 4K displays on my desktop.

The real issue is with DEs and not Linux per se. Gnome handles 4K (Hi-DPI) better than most other DEs I've tried but it's still pretty horrid with Hi-DPI and non-Hi-DPI displays mixed.

My rule of thumb has been to try to keep all displays at the same resolution. For a desktop that's easy; for a laptop: buy a laptop that's 4K (not always cost/battery effective), turn off its display when using an external one (loss of screen real estate), or change the external monitor to 1080p (not ideal, loss of resolution).

G4E · 6 years ago
I'm always confused by those statements that it's rare to find a monitor without DP. I live in France, and HDMI is actually the norm here. All low-range and mid-range screens are HDMI only, and it's actually rare to find DP. It's especially sad when you want to use FreeSync on Linux, as mesa (the gpu driver) added support for it over DP in 2018, but it's still not usable over HDMI.
diffeomorphism · 6 years ago
My impression in Germany and the US was that cheap monitors and TVs come with HDMI, but not necessarily displayport. Anything higher priced or aimed at design, graphics, office work comes with display port and (often also) hdmi. For example all dell ultrasharp models, hp z27, samsung's uhd monitors, all 4k LG ones.

In addition there are now several monitors offering usb-c connection for video+power+usb hub functionality (e.g. see the wirecutter recommended 4k monitors), which seems convenient.

proverbialbunny · 6 years ago
Wow. That's surprising to hear that there are 4k60 monitors without DP. I'm sorry to hear you can't find a proper monitor in France.
sq_ · 6 years ago
IIRC, Freesync will only ever work over DP because it relies on features that the HDMI protocol doesn't/won't ever have.

Also, it's very interesting that things are so different in France with regard to display cables. I would've assumed that, since it's mostly the same manufacturers making everything around the world, they'd have more or less standard models across regions.

detrino · 6 years ago
Do you use Mac OS X?

When using Windows or Linux I don't find much benefit in text rendering on a 4k display.

But as Mac OS X has no sub-pixel rendering or grid fitting, text looks terrible without a high-ppi display.

proverbialbunny · 6 years ago
I do use OSX, but I'm also on Linux Mint. Both are sharp and smooth as butter. I'm uncertain why your experience has been what it has with Linux. Could be drivers or something. I did have to set Mint to work for 4k60. It did not work out of the box properly. (HiDPI was off. Hardware vsync was off.) Mint has never had sub pixel rendering as far as I know. It looks crisp and great.
jjoonathan · 6 years ago
When did that happen? Back in my day, OSX was the one with sub pixel rendering and Windows users would constantly complain that it looked fuzzy.
chrisweekly · 6 years ago
This! MacOS (mbp15r) with seemingly any non-Apple external monitor, text looks just awful regardless of font or resolution settings.
grosswait · 6 years ago
I've had to use this fix for non-Apple monitors https://www.mathewinkson.com/2013/03/force-rgb-mode-in-mac-o...
m463 · 6 years ago
I was thinking macos always had subpixel rendering, but maybe it does not? I am not running mojave

https://news.ycombinator.com/item?id=17476873

Also, apple has a tendency to support fancy features ONLY on its own hardware. I know apple retina displays allowed display scaling, but non-apple displays only let you set the absolute display resolution.

PretzelFisch · 6 years ago
On Windows with a 27in 4k I definitely see a difference, much better than the 2560x1440 or 2560x1600 I had in the past.
AnthonBerg · 6 years ago
macOS looks fine on a 32” 4K
seanmcdirmid · 6 years ago
I prefer to write code on a 5K iMac, it is the best 27” screen out there, and code looks paper-like in resolution. 220 PPI is an ideal pixel density.
m463 · 6 years ago
Apple also enables fancy features and tweaks on its own hardware.
arvinsim · 6 years ago
It's unfortunate that the price difference between 27" 4k and 27" 5k is enormous.
jhallenworld · 6 years ago
Do you really need 60 Hz for writing code?

I've found that on cheap laptops it helps to set the refresh to 40Hz in the name of stability: it makes recovery from suspend more reliable for some reason.

proverbialbunny · 6 years ago
I haven't used 40Hz so I can't speak for it but at 30Hz even my typing is slower, let alone mouse movement, which feels like it's pulling on a short string.

60Hz is a good sweet spot, but as people say 120Hz is noticeably better. At least at 60Hz I can type at full speed, scroll at full speed, and my mouse accuracy when clicking on buttons is good enough.

paulddraper · 6 years ago
Using a mouse on 30Hz is like molasses.
rhinoceraptor · 6 years ago
Yes, I would even want 120hz. It makes using the computer feel so much more responsive and fluid.
winrid · 6 years ago
You don't. But if you have a 144hz+ monitor at home it's just plain annoying. You notice the lag.

I'm currently fighting this. The trouble is I guess they move people around a lot and I noticed everyone has the same monitor...

mantap · 6 years ago
You need 60Hz if you are making any user facing software for sure.
zenlot · 6 years ago
> If you write code and you're not on 4k, you don't know what you're missing. 4k is great.

I am on 4k60 on 43". I spend most of my day in terminal (ubuntu, i3-regolith) - it's nice, but don't think I would be missing out much without it.

seanmcdirmid · 6 years ago
4K on a 43” is a pretty low PPI. If you had something worse than that for a screen of the same size, you would be missing out.
clarkmo · 6 years ago
You're not reaping the benefits of high pixel density at that size. It's a shame 5K 27" monitors aren't more prevalent. That perfect 2x pixel density increase from 1440p is incredibly sharp.
twhitbeck · 6 years ago
I heard this advice somewhere, and I invested nearly $1k in 2 4k monitors and required hardware to run them at 60hz. It was a terrible experience, and I ended up selling them at a loss and going back to 2 1920x1200 monitors (love 16:10).

Two main issues for me: I run Linux (4k support is simply _not_ there) and the input lag was very distracting. I tried a couple different 4k monitors, HP Z27 and LG 27UK850-W, and while the LG was slightly better, after a couple months I just couldn't bear it any longer.

I'm a full-time dev writing code mostly in Jetbrains IDEs. Hope this can spare somebody else the cost of trying 4k.

proverbialbunny · 6 years ago
In 2015 I had 2 4k60 monitors, and as long as I set up screen composition in the nvidia settings, everything was as smooth as silk and as sharp as a magazine in Linux, and still is.

In 2015 I was in CLion day in and day out and my gpu was only a laptop gpu with 2gb of vram, so it definitely maxed out my gpu then, but still was as smooth as butter. I had to worry about my computer heating up at the time.

Today I'm on a desktop with a 980. I'm sure it's inefficient, but doing anything in the desktop, like watching youtube in 4k60fps uses about 15% of the gpu according to nvidia-smi. With all my apps running, when I'm not training neural networks, my desktop + firefox takes between 1gb and 1.5gb of vram.

stargrazer · 6 years ago
That is unfortunate.

I run two HP Z32 4k monitors side by side in portrait mode. Running Debian Linux Buster. Connected to NUC8i7hvk. LXDE, sometimes KDE. Text is clear. Moving windows around is smooth as silk.

27" would be too small. 30" is about the right size for the resolution.

Plus I have five or six virtual desktops to task switch between various development projects.

parasense · 6 years ago
I run Linux as well; I went from 1920x1200 (the fabulous 16:10 ratio) to 2560x1440 (16:9) and would never go back down. I'm using a triple-display setup at 1440p, and they operate at 75Hz no problem (no input/output lag). Coding, no problem; multi-tasking, no problem; everything is just peachy. I suspect the jump up to 4K is symptomatic of crossing the boundary of acceptable image scale: apparently 1440p approaches the boundary, but 4K is well beyond it. My GPU is not very fabulous, and in fact is the limited GPU available in an Intel NUC (Hades Canyon), but the connection to the display is DisplayPort, and that I believe is the point the OP is trying to make with the opinion piece.

I find this conclusion troubling because DP is a few things: a cable standard (shielding, twists of pairs, etc.), a differential signaling protocol, and the features layered on top. A lot of people conflate one aspect with another, which quickly becomes problematic. For example, the DP protocol was incorporated into Thunderbolt 3 (the high-level stuff, mostly: the protocol, etc.), and Thunderbolt 3 cabling sufficiently meets the cable-standard parts (shielding, isolation, etc.).

I guess where I'm going with this is that HDMI is slightly more problematic here, especially in terms of the matrix of cable standards versus protocol standards, and the consumer buying the cable or understanding what protocol they need. HDMI made the mistake of introducing a kind of "high speed cable" in the HDMI 1.x protocol era, rather than simply jumping to HDMI 2.x; that is to say, NOT aligning major spec jumps to physical cable requirements, but instead to protocol features. It's probably not a fair comparison with DP, since DP sort of didn't have that issue, but future generations of DP cable do bring new physical cable requirements with the next major version bump (protocol topics are there too, but we can get to those later).

So, for example: DP is in Thunderbolt 3, Thunderbolt 4 is now being drafted, and Thunderbolt 4 combines with USB 4, which is capable of transporting HDMI 2.x spec protocols. Ugh. So it seems the era of dedicated physical video cable specs is coming to an end, except for high-end video products (e.g. 8K and beyond). In the general-purpose (4K and less) sense there will be one cable for both HDMI and DP (where DP is the default protocol).

All that said, I don't really see HDMI the protocol being a problem, but certainly HDMI cables and interfaces are a problem; perhaps not so big a problem considering most modern cables meet or exceed the requirements. But there are outliers, and therein lies the problem.
m463 · 6 years ago
If you're using it for coding, 4k can be wonderful, but requires desktop scaling.

If you aren't able to configure it, you just get microscopic text (unless you're using something like a 55" tv as your monitor)

izacus · 6 years ago
For machines that don't need to mix scaling this is pretty much a solved problem - I've run Windows at 125%, Linux/Cinnamon at 150% and OS X at ~150% and they've all performed just fine for coding tasks.

The only mess is if you have mixed displays - and even that is mostly still problem only on Linux.

proverbialbunny · 6 years ago
I actually don't use HiDPI on my 4k 27" monitor. Instead, in Accessibility Settings I turn on Large Text (Linux Mint), which is something like 1.5x or 2x font sizes. Icons grow too, so I can see them quite well, but whitespace is minimized, so I get a lot more screen real estate, and it would fool others: it looks like HiDPI.

The only problem with this approach is the mouse cursor stays tiny, and it doesn't have the OSX shake option to find it, though I rarely lose it so it isn't much of a problem.

TheOperator · 6 years ago
I will warn that 4K60 actually has hardware requirements. 4K Netflix requires 3gb dedicated/shared VRAM and really 4gb doesn't hurt. No modern GPUs that I know of will stay in P0 with enough 4K displays attached to them. 4x 4K is also a limit most of the time.

1440p monitors are usually better at the same price point in some way. Maybe refresh rates... Maybe image quality... Well specced 4k displays still cost a decent amount.

Still, I would say now is a decent time to leap into the ecosystem. The era of 1080p/1440p has already peaked.

sphix0r · 6 years ago
> I'm an early adopter of 4k60. If you write code and you're not on 4k, you don't know what you're missing. 4k is great.

4K adds a lot of screen real estate. Having a "normal" 27" 4k screen, I recently worked on an ultra-wide curved screen. It blew me away. I can very much recommend curved screens over regular 4k screens. All your applications can fit next to each other at eye height.

jiggawatts · 6 years ago
This is a very... erm... "early PC era" way of thinking.

Back in the days, many moons ago, both displays and software typically had a fixed DPI (96 for Windows) and so a larger resolution was basically the same thing as a larger display. The two were interchangeable.

In the photography and print world (and everywhere else) the resolution is just the "level of detail" or "sharpness", completely independent of the size.

With Windows 10, Windows Server 2016, and recent-ish OSX the display resolution is finally decoupled from the display size. This is especially true at nice even resolutions such as precisely double or triple the legacy 96 DPI (200% or 300% scaling).

I've been using 4K monitors for over a decade, basically since they've been available, and it always cracks me up to see some people run them at "100%" scaling with minuscule text. That's not the point. The point is that at 200% scaling text looks razor sharp, but is exactly the same size as it would be at 1920x1080. You can clearly distinguish fonts that look virtually identical at 1920x1080. It's amazing, you have to try it yourself.
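
Here's the 200% arithmetic spelled out (a toy sketch; "logical point" is my wording, not a term used above):

    # 200% scaling on a 4K panel: the workspace is 1080p-sized, but every
    # logical point is drawn with a 2x2 block of physical pixels.
    scale = 2
    physical_w, physical_h = 3840, 2160
    print((physical_w // scale, physical_h // scale))  # (1920, 1080): UI same size as on 1080p
    print(scale * scale)                               # 4 physical pixels per logical point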

Caveat: If you need (or nearly need) prescription glasses, 4K or higher resolutions may not make much of a difference for you. In this case, you're likely better off having a bigger screen and/or a very big screen further away from you.

Latty · 6 years ago
Those curved displays look nice, but they only compare well in real estate to a single 4k monitor. It'd be a step down from a double/triple 4k setup.
JeremyNT · 6 years ago
> I'm an early adopter of 4k60. If you write code and you're not on 4k, you don't know what you're missing. 4k is great.

Important caveat: the display needs to actually be big enough for this to matter. I've got the 4k Dell XPS 13, but I run at a lower resolution, since I simply cannot perceive any difference.

rayhendricks · 6 years ago
I have tried hdmi 2.0 and Displayport from a 32” lg 4K to an Ubuntu 18.04 box with a 1050ti, both seem to work in Linux and windows 10. Displayport to usbc also works at 4kp60 for my MacBook. This was tested with a cheapo monoprice cable and a LG cable.

Edit: code & text is 100% better on a 4K monitor with proper scaling.

jaytaylor · 6 years ago
The max run length for a display port cable is much, much shorter than HDMI.
mumblemumble · 6 years ago
I have a large 4K monitor when I'm at my desk in the office, a smaller 1920x440 at home, and just the (Retina) laptop screen when I'm elsewhere.

My sense is that the 4K monitor is a bit nicer than the 1920x440, but nowhere near enough nicer for me to have ever felt motivated to replace the 10 year old monitor I already have at home. The real difference is between 2 monitors vs just the one.

I also briefly had a 15" laptop with a 4K monitor, and UGH NO CRANK THAT RESOLUTION DOWN RIGHT NOW. My take-away: Ignore the marketing fluff that focuses overmuch on resolutions; the resolution is not an end in and of itself. Pick a screen size, and then pick a resolution that gives you an appropriate DPI for that size.

caymanjim · 6 years ago
Don't ever turn the resolution on a 4k monitor down, unless it's for performance reasons (e.g. gaming). Every OS has UI scaling built-in now. It works perfectly in Windows 10 and MacOS. It works pretty well in Linux, but given the mess that is Linux UI toolkits, not all applications will scale properly. Everything I use regularly looks fine, even in experimental non-standard scaling like 175%.

Running a 4k monitor at 1920x1080 looks like crap. Running a 4k monitor with 200% UI scaling gets you the same dimensions but glorious smooth fonts.

phonypc · 6 years ago
>1920x440

I'm so confused. Just missing a 1 and 1920x1440 is an actual monitor resolution I've never heard of? 1920x1080? 1920x1200? 2560x1440?

akvadrako · 6 years ago
An appropriate DPI is at least 300 at laptop distances, which is the point where you stop needing subpixel antialiasing and font hinting (which sacrifice letter shape for sharpness).

That means about 4K for 15 inches.
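
A quick check of that figure, assuming a 16:9 panel:

    # PPI of a 15", 16:9 laptop panel at 4K resolution.
    import math
    print(math.hypot(3840, 2160) / 15)   # ~293.7 PPI, roughly the 300 PPI target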

chx · 6 years ago
The real "fun" starts with HDR.

See, HDR on one hand requires ten bits of color per channel. So bandwidth wise 4K @ 60Hz 10 bit fits DisplayPort 1.2 but not even HDMI 2.0 has enough. See the calculator https://linustechtips.com/main/topic/729232-guide-to-display...

But! HDR is more than just bit depth: some video standards have the necessary extensions, and it can also be done from software, but then only compatible software will be HDR. Check https://superuser.com/a/1335410/41259

> The HDR mode will automatically activate if the monitor receives a compatible signal, but the device only relies on a software-based implementation. When active, the HDR representation adds a nice touch of extra color pop and a seemingly deeper contrast range, although the monitor’s limitations will come into play here.

And this extra has been added to HDMI 2.0b (compared to HDMI 2.0a), and for this reason it is said that HDMI 2.0b supports 4k @ 60Hz HDR when it can't, because it doesn't have enough bandwidth (well, it does for 4:2:0), while DisplayPort 1.2 is said to not support it when it does have enough bandwidth.

Oh joy.
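
To sanity-check those numbers, here's a rough back-of-the-envelope estimate. The ~7% blanking overhead (roughly CVT-R2 reduced blanking) and the usable-link-rate figures are my assumptions, not from the comment above, so treat it as a sketch rather than a spec calculator:

    # Rough uncompressed video bandwidth estimate (a sketch, not a spec tool).
    # Assumptions: RGB 4:4:4, ~7% blanking overhead, no DSC compression.
    def video_gbps(h, v, hz, bpc, blanking=1.07):
        return h * v * hz * blanking * (bpc * 3) / 1e9

    DP12_USABLE   = 17.28  # Gbit/s usable on DisplayPort 1.2 (HBR2 x4 lanes, 8b/10b)
    HDMI20_USABLE = 14.4   # Gbit/s usable on HDMI 2.0 (after 8b/10b coding)

    print(round(video_gbps(3840, 2160, 60, 8), 1))    # ~12.8 -> fits both
    print(round(video_gbps(3840, 2160, 60, 10), 1))   # ~16.0 -> fits DP 1.2, not HDMI 2.0
    print(round(video_gbps(6016, 3384, 60, 10), 1))   # ~39.2 -> why the XDR needs two DP 1.4 streams
    print(round(video_gbps(3840, 2160, 240, 10), 1))  # ~63.9 -> pushing toward DP 2.0's limits

The last two lines are the same arithmetic applied to the 6K and high-refresh cases discussed below.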

Latty · 6 years ago
Yeah, it's kinda sad how close we are to the specs and that they aren't getting much ahead. Higher resolutions just chew through that bandwidth quickly.

Apple had to really hack around to get the Pro Display XDR to do 6k at 10-bit at 60Hz. They are using USB-C w/Thunderbolt for that and actually pushing two DisplayPort 1.4 signals down that cable. Clearly kludgy at best. Even then, at 6k/10-bit/60Hz the other USB-C ports on the monitor can only work at USB 2.0 speeds because the link is so saturated.

6k/10-bit is a bit niche right now for sure, but I'd sure love to be able to daisy-chain my three 4k monitors which I can't do, and 120Hz is becoming very mainstream (although actually rendering 4k/120Hz is problematic on the graphics card side too).

chx · 6 years ago
I believe Apple arrived at the Pro Display resolution based on the Thunderbolt bandwidth! https://linustechtips.com/main/topic/729232-guide-to-display... it's 38.70 gbit/s, which is suspiciously close to the bus bandwidth limit of 40 gbit/s
ksec · 6 years ago
How do 120 or 240 or even 360Hz monitors work? Because at 4K, 10BPC, 240Hz is already ~70Gbps, pushing close to the DisplayPort 2.0 limit.

I know there is DSC compression, but seriously, who wants a compressed link, especially when it is not truly lossless?

phonypc · 6 years ago
AFAIK there are no consumer monitors that do 4k at >144Hz. 200Hz+ are all 1080p.
dangus · 6 years ago
This article does contain useful information especially since a lot of this is way less obvious than it should be to a non-technical end user.

My crappy Apple USB-C Multiport adapter simply refuses 60Hz 4K on an HDMI cable that works out of the box with Windows 10 on a GTX 1060. This is despite the fact that the product page says that this configuration with my particular Mac is supported (edit: it’s actually because I have an old adapter).

Then I use a cheap $8 USB-C to HDMI cable and it works fine on two MacBook Pro 15/16” computers (be careful here as a 2016 USB-C MacBook Pro 15” doesn’t support 4K 60Hz over HDMI but the identical-looking 2017 model and newer do).

Frustratingly, Apple doesn’t make a similar USB/power/video adapter for DisplayPort, just for HDMI. Nobody else seems to make one either unless you get into the world of expensive hubs.

For our older 2015 MacBook Pro, Thunderbolt 2/mini displayport to DisplayPort is the option.

What I’m getting at in a poorly organized fashion is that the world of single cable charge and display and device hub is definitely not here. There’s one display on the market that supports USB-C charging above 85W and it only works with newer Macs.

You can get basically the same display for $1000 less with an LG 27” 4K display and just use DisplayPort and HDMI with adapters. Saving yourself from plugging in a second thing isn’t worth $1000.

I can’t for the life of me figure out why the people who make displays with built-in power supplies for USB-C charging didn’t just default to the 100W maximum that the spec allows. What short-sighted design!

In any event, I think I don’t mind keeping that piece of hardware separate from my display, not just for reliability but for weight and bulk as well. I can easily hide a power brick under the table.

paranoidrobot · 6 years ago
The situation with HDMI is confusing enough without also introducing USB-C into the picture.

A lot of video cards and some monitors will negotiate a DisplayPort link over HDMI ports and HDMI cables.

For instance, my Intel NUC has only a single HDMI port, but plug a HDMI to DisplayPort cable in, and it'll negotiate a DisplayPort 1.2 connection to my Dell U2711 at 2560x1440@60hz. Plug a HDMI to HDMI cable in, and plug into the HDMI port on the same monitor - no bueno, it's HDMI only and we're stuck at 1920x1200.

Another monitor, I forget the brand, was happy to negotiate DisplayPort over a HDMI port.

Introducing USB-C adapters into the mix and some adapters appear to support USB-C Alternate Mode to carry DisplayPort but only have HDMI connectors, others won't.

Then we run into potential issues where the video card's outputs may not have been wired into the USB-C controller. Though this afaict is mostly applicable to desktops with discrete GPUs.

dangus · 6 years ago
That’s kind of really cool on a technical level while also being strange and confusing.
lispm · 6 years ago
> Apple USB-C Multiport adapter simply refuses 60Hz 4K

Don't know if it's the case for you: the older version of the adapter was limited to 30Hz @ 4K. Only the latest version supports 60Hz @ 4K.

I have both and unfortunately they look very similar. Only the fineprint gives a clue which is the newer version.

https://support.apple.com/en-us/HT207806

dangus · 6 years ago
That would make sense, I bought mine in 2016 when it was “discounted” to make us feel less bad about losing ports.
MrBuddyCasino · 6 years ago
> I can’t for the life of me figure out why the people who make displays with built in power supplies for USB-C charging didn’t just default it on the 100W maximum that the spec allows

I asked myself the same question, and I guess it is either one of the following:

a) 65W is what the reasonably priced chipsets available on the market support, and it would take a lot of costly auxiliary parts to support 100W

b) it's a heat management problem

brightball · 6 years ago
I'm not on a Mac anymore, but this sounds very similar to something that I experienced when switching over to Linux and trying to use the Apple adapters that I had for my displays.

I can't remember the exact terminology, but there are basically two types of adapters: active and passive. Passive adapters defer some of the work to software on the computer while active ones have everything needed built in.

All Apple adapters are passive and because of that when you try to use them with non-Apple computers that don't have the expected software/driver...they don't work.

It's been a while but I experienced this with mini-DisplayPort to DVI adapters. I don't know if it carries over to other types as well.

Kirby64 · 6 years ago
Another thing is whether or not a DisplayPort connector (or mini DisplayPort) supports 'dual mode' DisplayPort.

Many graphics cards have these, but it's VERY unclearly marked on most things.

See: https://en.wikipedia.org/wiki/DisplayPort#DisplayPort_dual-m...

If you have a DP++ port, then it can basically act as an HDMI port with a passive adapter. If not... well, tough luck.

mruszczyk · 6 years ago
It's interesting, at least the lightning to HDMI adapter from apple dynamically loads a bundled copy of iOS from the device itself. I'm unsure if the USB-C multi port adapter is similar. https://hackaday.com/2019/07/30/apple-lightning-video-adapto...
dangus · 6 years ago
Honestly that might be why the adapter is relatively cheap.

A lot of other solutions especially if you need single cable operation and >65W or especially 96W charging for the 16” MacBook Pro are active hubs that cost well over $100.

But those are all cheaper and more flexible and cross platform than buying the 27” UltraFine display. My eyes can’t tell the difference between the UltraFine and the 4K 27” LG display that cost me $290.

vardump · 6 years ago
> Frustratingly, Apple doesn’t make a similar USB/power/video adapter for DisplayPort...

I use just a cheap USB-C to DP cable for this. Works fine. Of course there are no USB ports and you can't charge with that, but it's not a huge issue since there are more ports.

dangus · 6 years ago
Yeah that’s what I ended up doing as well, although mine is a USB-C to HDMI adapter.

My display has two HDMI and one DisplayPort, so I’ve allocated one HDMI for USB-C Macs, one HDMI for the desktop Windows PC, and one DisplayPort for Mini DisplayPort devices (2015 MacBook Pro).

In my opinion, buying a two-port MacBook Pro of any kind is a mistake for this very reason, although I guess more adapters can solve that problem...

selectodude · 6 years ago
>be careful here as a 2016 USB-C MacBook Pro 15” doesn’t support 4K 60Hz over HDMI

Yes it does. I use it every day. Might be your adapter.

thewisenerd · 6 years ago
Also, in some cases, even with the right adapter, it matters which Type-C port you plug it into:

https://apple.stackexchange.com/a/354688

m463 · 6 years ago
A lot depends on the HDMI version (and in some cases HDCP)
jhoechtl · 6 years ago
The bigger part of your problems will go away if you do yourself a favor and refrain from the end-consumer-hostile decisions the company in Cupertino will take for you.
dangus · 6 years ago
Hmm well it’s not like every laptop or adapter that has HDMI involved is guaranteed to support HDMI 2.0, either. I’m not sure how this is an example of Apple doing something user-hostile. It’s an example of two similar industry standards existing and having different levels of capability at different times.

Plenty of non-Apple products don’t support 4K 60Hz and plenty of non-Apple laptops require some kind of adapter to connect to a full size HDMI port, like the current XPS 13.

In fact, Apple switching to USB-C was moving from a proprietary connector (MagSafe) to an industry standard. So I’m struggling to figure out how that’s user-hostile. It’s better than all the business laptops with proprietary dock ports or the proprietary surface connector.

I guess it sucks that I was an early adopter and I got the early adapter but on the other hand 4K displays weren’t anywhere near affordable in 2016.

Stratoscope · 6 years ago
For me the sweet spot is a triple monitor setup with all three monitors in the neighborhood of 200 DPI, and running at 200% scaling in either Windows or Linux (Cinnamon).

One monitor is my ThinkPad's 14" WQHD display. The other two are 24" 4K displays.

One of those displays is horizontal, immediately above the ThinkPad display. The other is vertical, to the left, with the bottom of the monitor close to my standing desk. (Of course it could be on the right if that suits you better.)

Because I'm old enough that my eyes can not adjust their focus like a younger person's eyes, I make sure that all three monitors are at the same distance from my eyes. And I have a pair of single vision prescription glasses adjusted for that distance.

Since I also use the ThinkPad on its own, that determines the focus distance: about 20 inches. The external monitors are also at that distance from my eyes. Each of the three monitors is tilted at the appropriate angle so that the plane of the monitor is perpendicular to the view from my eye position. In other words, the "normal" at the center of each monitor points directly to my eyes.

I can't use big monitors like a 32", regardless of the resolution. The focus distance changes too much between the center of the monitor and its edges. But 24" is small enough that the entire monitor is in focus with my single vision glasses.

Did I mention single vision glasses?!

Unless you are young enough that your eyes can easily refocus, do yourself a favor and get these. Not a "reading" prescription - that is typically 16", much too close for typical computer use. Bring your laptop to your optometrist and get single vision lenses for that distance. It will probably be about 20". Then when you use external monitors, make sure they are also at that same distance.

Do not under any circumstances use progressive lenses for computer work! I have seen far too many people tilt their head back and lower their gaze into the bottom part of their progressives. This is a recipe for neck and eye strain. You will be amazed at the improvement that a good pair of single vision lenses give you.

hbarka · 6 years ago
I agree wholeheartedly. My optometrist created an extra pair of single vision glasses that were backed off by about one diopter from my true distance prescription (YMMV) and it is much much more comfortable for computer work than reading glasses.
RHSeeger · 6 years ago
I actually use a pair of progressive glasses, with the top 60% set for a distance of <my monitor> and the bottom for reading. It works pretty well for me.

Of course, the most recent eye doctor I'm going to can't get the distance right for the computer glasses, so I can't use the new glasses without leaning in. I even have my old pair for reference and I explained it (I thought) well and showed the distance... and they've tried 3 times and can't get it right. Super frustrating.

nihonium · 6 years ago
To anyone who has a slightly older Mac and is thinking about getting a 4k screen: DON'T.

4k is not supported on many older models. Check your official specs. Most Macs have max 1440p@60hz output. 4k is only supported @ 30hz, which is no good for daily usage. And the main problem is, if you get a 4k monitor (to future-proof your setup) and try to use it at 1440p, everything will be blurry and pixels will shift and distort.

Just get a native 1440p monitor.

If you have a newer Mac, getting a 4k 27" monitor may still be a bad idea. Since 4k is too much for a 27" screen, you will need to use scaling in the Mac options, ideally set to "looks like 1440p". But this will cause your Mac to do 1.5x scaling and put a burden on your GPU and CPU: it will render everything doubled at 5k and then scale it down to 4k. If you're using a MacBook, your fans will never stop even at idle. This is even worse for performance than getting a 5k monitor and using it native 2x scaled, which is easy on the GPU.

One side note: there is no USB-C hub that offers 4k@60hz output; it's technically not possible. You have to get a separate hdmi or dp adapter, or an expensive Thunderbolt 3 dock. But there are some usb-c to hdmi or dp adapters which also offer Power Delivery.

I've already wasted money and time figuring this out, so you don't have to :)
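
For anyone wondering where the extra GPU work comes from, the scaled-mode math looks roughly like this (my understanding of how macOS handles "looks like 1440p"; the exact pipeline is not documented here):

    # macOS "looks like 1440p" on a 4K panel: render at 2x the logical size,
    # then downsample to the physical panel.
    logical = (2560, 1440)
    backing = (logical[0] * 2, logical[1] * 2)   # (5120, 2880): a 5K frame rendered off-screen
    panel = (3840, 2160)
    print(backing)
    print(backing[0] * backing[1] / (panel[0] * panel[1]))   # ~1.78x the panel's pixels, every frame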

proverbialbunny · 6 years ago
I have a 2015 MBP that runs 4k60 just fine, but I do admit the gpu gets nearly maxed out on the desktop running software like CLion, as the IDE is gpu accelerated. Anything pre 2015 is most likely a no go.

From a pragmatic standpoint you'll need a gpu that supports DisplayPort 1.2 or HDMI 2.0 or Thunderbolt / USB-C, and at least 1GB of vram, as many operating systems take up to roughly 900MB of vram to run a desktop at 4k. Firefox and Chrome can run fine on 100MB of vram (even Youtube at full 60fps at 100MB of vram is fine), but they really want around 500MB of vram to breathe, so 2GB is a good safe minimum for having a lot of windows open at 4k.

The 2015 MBP has 2GB of vram and supports display port 1.2.

saurik · 6 years ago
> One side note; there is no USB-C Hub that offer 4k@60hz output, technically not possible.

This is not correct: there is enough bandwidth for 4k@60hz, just not if you also want USB 3 speeds on the USB hub (which I have no need for: USB 2 is plenty fast enough). I am using a CalDigit USB-C hub with my 12" MacBook (which does not have Thunderbolt) with a special version of the firmware (you have to ask their customer support for a copy) that drops the USB ports down to USB 2 so I can connect to a 4k display at 60hz, and it works great.
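
A rough lane-budget sketch of why that trade-off exists; the HBR2 link rate is an assumption about what such a hub negotiates, and actual products vary:

    # USB-C DisplayPort alt mode: four high-speed lane pairs get split between
    # DP and USB 3. HBR2 carries ~4.32 Gbit/s of usable video per lane.
    HBR2_PER_LANE = 4.32    # Gbit/s after 8b/10b coding
    NEED_4K60_8BIT = 12.8   # approx. Gbit/s for 3840x2160 @ 60 Hz, 8 bpc

    print(4 * HBR2_PER_LANE >= NEED_4K60_8BIT)  # True:  all 4 lanes to DP -> 4K60, USB falls back to 2.0
    print(2 * HBR2_PER_LANE >= NEED_4K60_8BIT)  # False: 2 lanes DP + 2 lanes USB 3 -> no 4K60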

crooked-v · 6 years ago
> If you're using a Macbook, your fans will never stop even on idle.

I have a 2017 MacBook Pro that runs 4K-at-looks-like-1440p fine with no fan noise and without even turning on the dedicated GPU for normal web browser / code editor / document stuff.

nihonium · 6 years ago
MacBooks always turn on the dedicated GPU when connected to an external monitor. Maybe you're using it in clamshell mode?
mnm1 · 6 years ago
MacBook Pros since at least late 2013 support 4k @ 60hz. That's a seven year old laptop. I've been using 4k displays with mine for years with no issues, both at native and scaled resolutions without taxing the cpu/gpu. I would highly recommend it. Your info might apply to other models but definitely not the last seven years of mbpros.
closetohome · 6 years ago
I encountered most of these issues, with the additional one that some games would force themselves into 3840x2160 regardless of scaling, and then run terribly.

Eventually I just bought a ~2008 30" Cinema Display and have been incredibly happy with it.

nihonium · 6 years ago
30" Cinema Display is a joy. I used to have one but such a shame that it runs too hot. Living in the UK and working in an office with bad AC, it melted my face :) I let it go.

To anyone planning to use a 30" CinemaDisplay , you will probably need a special Mini DP to Double Dvi active powered adapter. They are not very common so they are a bit expensive. Search for: Tripp Lite Mini DisplayPort to DVI Adapter Cable with Dual-Link Active USB Power MDP to DVI-D, 6 in. (P137-06N-DVI-DL)

sgt · 6 years ago
HDMI and 4K/60Hz is problematic on other platforms than just Linux, in my experience. It just doesn't work well enough yet for most people and most computers/screen combos.

Using 4K and 30Hz is like using your desktop through a VNC session, it's absolutely horrendous.

trollied · 6 years ago
On a Mac the other way to sort this is with an egpu.

I have a 13” 2017 MBP & use a Razer Core X Chroma with an AMD Vega 64.

It makes an awesome 1 cable docking station - I have mouse, keyboard & external disks plugged into it, as well as the monitors and a Behringer external sound card (for midi and studio speakers).

Just 1 Thunderbolt cable into the MBP, which also provides power to it. Makes for a nice tidy desk too!

https://egpu.io has all the info you’d ever need on such setups (no affiliation BTW).

pram · 6 years ago
I have a similar setup but I found the USB ports on the Chroma to be ridiculously unreliable, and that seems to be very common. They work without issue for you?
m463 · 6 years ago
Related, I had an interesting experience with an older 4k monitor and ultrahd bluray.

I had a dell UP3214Q monitor, and I got the bright idea to get a bluray player. The model I bought happened to support 4k ultrahd, so I hooked it up.

It looked pretty good... except ... wait, it was downscaling ultrahd to 1080p over HDMI.

So I tried making sure I had the right HDMI cables. Still 1080p.

Long story short -- nobody will tell you this straight out -- ultrahd bluray players require HDCP 2.2 on the display to output 4k. Strangely, 1080p non-ultrahd discs upscaled to 4k just fine.

hollywood sucks.

I tried a U3219Q (HDMI 2.0 and HDCP 2.2) and everything worked. (except the monitor is not as high quality with a little less viewing angle and some LED bleed at the lower edges)

proverbialbunny · 6 years ago
For what it's worth, most 4k movies out right now have CGI rendered in 1080p and then upscaled. Also, most camera work, even if recorded in 4k, is not focused to 4k sharpness but for 1080, so it's often fuzzy. Then some shots are more focused than others, which is just annoying.

Avengers: End Game is the first movie to render for 4k and recorded with 4k in mind, and I am uncertain if any movies have come after it meeting that spec yet, so atm 4k bluray is sadly a gimmick anyways.

m463 · 6 years ago
I watched Gemini Man 4k/60hz/HDR and it was really weird to watch.

It felt more like reality tv than a cinematic movie, and I'm uncertain if it was the HDR or the 60hz.