qwertox · 2 years ago
This is a really well written article. If you never bothered reading about DisplayPort because you knew VGA, somewhat knew DVI, and figured HDMI was the culmination of it all, with DisplayPort just some further evolution, this article does a really good job of explaining how DP is very different and something new, something worth knowing about.

The core sentence that made me actually read the article was "DisplayPort sends its data in packets." The article does a good job of explaining what this means and how it differs from HDMI.

ben0x539 · 2 years ago
As a habitual comments skimmer, thanks for selling me on the article. :)
alfiedotwtf · 2 years ago
Do you know what sucks? DisplayLink. It's 2023, and the best we have for 3 monitors (MacBook Pro, and 2 external monitors) is basic screen scraping with software drivers if you want to use a single cable... otherwise you have to fall back to 2 cables (one for power + first monitor, and one HDMI).

The other reason DisplayLink sucks is performance (or lack thereof), which is something they don't tell you on the box. Video is like 16 FPS on my third monitor!

Vogtinator · 2 years ago
This is only because MacBooks do not support MST. On all of my non-MacBook laptops I can use multiple monitors over USB-C.
NavinF · 2 years ago
If you have high end monitors (4K 120Hz or 6K 60Hz), each monitor will use up all the bandwidth so using one cable for multiple monitors is a nonstarter.

IIRC there's a way around this limitation with DSC and thunderbolt docks, but it's not worth the effort.

zsz · 2 years ago
I've run into problems repeatedly when it comes to DP cables. The issue is always the same: lack of enforcement and no barrier to entry allow too many entrants, whose motives are left for consumers to discern, wasting time and money in the process. What I originally saw as a benefit of an open standard turned out, for consumers like myself, to be a point in favor of the closed, high-priced standard—which ended up costing me less in the process.

Something the original article doesn't mention regarding multi-stream (MST) is that it could be used in situations where a defined standard for a certain resolution/refresh rate didn't even exist, to still make it available. For example, the earliest 4K/60Hz monitors relied on DP 1.4's (or 1.2's? I don't remember) ability to address multiple displays to send two signals—each covering one half of the total screen area, i.e. 1920x2160/60Hz, for a combined 3840x2160/60Hz—to the same display, which then used them internally to drive two virtual screens stitched together seamlessly into 3840x2160/60Hz. At the time (around 2013 and for a while thereafter), the maximum supported in a single-stream configuration (or by the existing HDMI standard at the time) was 3840x2160/30Hz.
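For a sense of the numbers involved, a rough sketch (the blanking-overhead factor is an assumption, roughly CVT reduced blanking, and the figures are approximate):

```python
# Rough numbers behind the MST tiling trick (the 1.12 blanking-overhead factor
# is an assumption; exact figures depend on the mode used).

def payload_gbps(h_active, v_active, refresh_hz, bpp=24, blanking=1.12):
    """Approximate uncompressed video payload in Gbit/s."""
    return h_active * v_active * refresh_hz * bpp * blanking / 1e9

full_4k60 = payload_gbps(3840, 2160, 60)   # ~13.4 Gbit/s
half_tile = payload_gbps(1920, 2160, 60)   # ~6.7 Gbit/s per MST tile

# DP 1.2 (HBR2, 4 lanes): 4 x 5.4 Gbit/s raw, ~17.28 Gbit/s after 8b/10b coding.
dp12 = 4 * 5.4 * 8 / 10
# HDMI 1.4 topped out around a 340 MHz TMDS clock, i.e. ~8.16 Gbit/s of pixel
# data -- enough for 3840x2160/30Hz but not 60Hz.
hdmi14 = 340e6 * 24 / 1e9

print(f"4K60 payload   ~{full_4k60:.1f} Gbit/s")
print(f"per MST tile   ~{half_tile:.1f} Gbit/s")
print(f"DP 1.2 link    ~{dp12:.2f} Gbit/s")
print(f"HDMI 1.4 link  ~{hdmi14:.2f} Gbit/s")
```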

You'd think this would be a point in favor of DP—which is certainly what I assumed, at the time. Unfortunately it soon became obvious that because there was no enforcement of DP compatibility—of claiming to support up to a certain version of DP fully, in other words—this meant that most cable manufacturers felt no compunction about lying shamelessly, claiming to support e.g. the 1.2 or 1.4 version at the time (which implied supporting its MST and bandwidth capabilities fully), while doing nothing of the sort.

The lies did not stop there, by the way. If I could, I would post a photo here—one I would happily take this very moment—of such a DP cable, which claimed gold plating as one of its features. It didn't come especially cheap at the time, btw.; in fact it took three days' worth of effort, a lot of handwringing and plain luck in the end (not to mention money wasted on several dud cables, each claimed to be fully compliant, on top of the additional money required) to finally purchase a cable that actually was compliant. Some fine gold that was, with black spots on both sides of the yellowish anodized plug, where the metal had oxidized! Why? Because, as I came to realize, the high barrier to entry created by the HDMI group's high licensing fee also acts to keep away a bunch of unscrupulous manufacturers, which is purely a benefit to consumers!

In addition, I've never had problems with regular-size HDMI plugs—in particular, with removing them. I can't say the same when it comes to DP (especially full-sized), which frequently (by design?) has a button that needs to be pushed in to release a lock that holds the plug in place. The problem is, too many times it's very difficult, if not impossible, to push down this button. Worse yet is the ambiguity this creates: is the button fully depressed? Is it stuck? Am I about to rip out, or at least damage, the underlying hardware, or even just the cable itself? These are thoughts I've had nearly every time while trying to unplug a DP cable, while HDMI (at least the standard size) slides out smoothly, as nothing is put in place to hinder this. And it works well time after time, meaning there is no mechanical fatigue like with mini- and micro-USB.

I can appreciate that DP, when used internally (e.g. in laptops, where any shortcoming would reflect directly on the manufacturer of that laptop), or embedded within another standard, is a great idea, where its low barrier to entry /can be/ (as long as the savings are passed on, that is) a benefit to consumers. However, to claim the same in all applications is simply not supported by the real-life outcome of either implementation philosophy.

eimrine · 2 years ago
I hate the lock on DP cables, since I have managed to break 2 of them in situations where HDMI would just have slipped out. DVI also has a lock (2 screws) but for some reason those connectors never break; why DP has regressed here is a very big question for me.
thereddevil · 2 years ago
Thank you for writing this down. I’ve had very similar experiences with DP and been on the HDMI train since.
mixmastamyk · 2 years ago
Can be used over USB-C apparently, which should alleviate your physical complaints.
JuanPosadas · 2 years ago
> how DP is very different and something new

To be fair, if you think you know your video cables but don't know what display port is, you've been out of the loop for over a decade.

TazeTSchnitzel · 2 years ago
The amazing thing about DVI, and by extension HDMI, is that it's just VGA but digital, with all the timing and synchronisation complexity that implies. Recall that DVI can have both an analog (VGA/DVI-A) and digital (DVI-D) signal in the same cable. They aren't independent; they share some pins and have the same timing. You could have made a CRT monitor that used DVI-D just by adding a DAC, though I'm not sure if anyone ever did.

DisplayPort does away with all that legacy. I assume the hardware to implement it is much simpler and more reliable.

madars · 2 years ago
This also has side-channel implications, and is a reason why eDP is recommended in voting applications. https://www.eerstekamer.nl/nonav/overig/20160428/richtlijnen...

>DisplayPort uses a scrambler as part of its line encoding in order to flatten the Fourier spectrum of its emissions and suppress spectral peaks caused by particular image contents. This reduces the chances of any particular image content causing a problem spectral peak during SDIP-27 and EMI spectrum measurements. According to the standard, the scrambler reduces spectral peaks by about 7 dB. As a side effect, the scrambler also makes it far more difficult, probably even impractical, for an attacker to reconstruct any information about the displayed image from the DisplayPort emissions. [..] DisplayPort uses a small number of fixed bit rates, independent of the video mode used. Unlike with most other digital interfaces, video data is transmitted in data packets with header and padding bytes, and not continuously with a television-like timing. As a result, DisplayPort cables are not a common source of van-Eck-style video emanations and this again will make it very hard for an eavesdropper to synchronize to the transmitted data.
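For the curious, a minimal sketch of the kind of LFSR scrambling described above. The polynomial x^16 + x^5 + x^4 + x^3 + 1 and the 0xFFFF seed are the values commonly cited for the DP 1.x main link, but treat this as illustrative rather than spec-accurate:

```python
# Illustrative LFSR scrambler in the style the quote describes; the polynomial
# (x^16 + x^5 + x^4 + x^3 + 1) and 0xFFFF seed are commonly cited for DP 1.x,
# but this is a sketch, not a spec-accurate implementation.

def lfsr_stream(seed=0xFFFF):
    state = seed
    while True:
        # Feedback from taps 16, 5, 4, 3 (1-indexed bit positions).
        bit = ((state >> 15) ^ (state >> 4) ^ (state >> 3) ^ (state >> 2)) & 1
        state = ((state << 1) | bit) & 0xFFFF
        yield bit

def scramble(data: bytes) -> bytes:
    """XOR each data bit with the LFSR output, whitening repetitive content."""
    stream = lfsr_stream()
    out = bytearray()
    for byte in data:
        mask = 0
        for i in range(8):
            mask |= next(stream) << i
        out.append(byte ^ mask)
    return bytes(out)

# A flat image region (constant pixel values) becomes pseudo-random on the
# wire, which is what flattens the emission spectrum.
print(scramble(b"\x00" * 8).hex())
```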

Speaking of which: how does one force HDCP (say, in Linux)? That would transform this technology from a DRM nuisance (strippers easily available on AliExpress) into a Van Eck-countermeasure.

angry_octet · 2 years ago
HDCP can't really be stripped; if it's on, then something downstream is acting as an HDCP receiver. For v1.4 signals you can probably get something with the leaked key, but not for v2.2 as yet.

HDCP itself needs to be licensed in order to generate it, and that licence is only available to HDMI Adopter companies. However there is code to get some chipsets to turn it on: https://github.com/intel/hdcp

Grab it now before Intel deletes it completely.

HDCP is really a very ugly protocol designed just for anti-copying, I wouldn't build anything relying on it, and everything is harder with HDCP from an AV integration perspective. If you have long links that you want to secure, use something like SDVoE with encryption and authentication (bits are easily flipped in HDCP).

gsich · 2 years ago
Voting computers are not recommended anyways.
ddingus · 2 years ago
You mean election type voting?

Screen scraping is the least of the problems.

Voting machines are simply not trustworthy given these requirements for a free and fair election:

0. Freedom to vote or not

1. Anonymity in that no vote may be associated with the voter who cast it

(These two are most of why e-voting is not trustworthy and are what differentiates banking from voting.)

2. Transparency. The law, means, methods must all be known to the voting public. It is their election. In particular, seeing how a cast vote moves through the entire process and into the final tally is necessary.

3. Oversight. The first two of these foundational requirements necessary for free and fair public elections have an important ramification:

The record of votes cast must be human readable and said record must be used directly to realize the final tally of all votes cast.

There is a chain of trust between voter intent and the vote cast. When one casts a vote on physical media, that is a direct record of voter intent. The voter can be sure their intent adds to the tally because they can directly verify their votes cast.

Electronic votes are a vote by proxy. The direct expression of intent is not kept and exists as a bit of skin residue left on the machine interface. Put simply, we ask people to vote and then we discard their expression!

What gets used instead is whatever the machine thought the voter intent was!

Consider this: You have a machine in front of you with two buttons. One brings the world into harmony, the other casts it into war, grief and darkness.

You can:

A. Submit a paper form you fill out manually indicating your choice, or

B. Press one of the buttons and get a receipt for your records.

Say this machine has an indicator showing your choice and a third "approval" button you use when you feel good about the machine getting your choice right.

How do you, the voter, know the receipt and/or indicators match what the proxy machine recorded your choice as?

What happens when we have a dispute? We know the machine can't be trusted to match receipts. With the direct expression of intent lost, what do we bring to court?

We can and have brought paper ballots into court to resolve an election facing genuine ambiguity.

I submit to you we can't really be sure who wins an electronic election, unless we also associate your identity with the vote.

Say a bunch of people get to choose the fate of the world this way. What is your confidence level?

How do we know the machines tallied up the winning choices correctly?

What do the tests for that look like and how do they get around the forced trust problem we have with all electronic inputs today?

There is a lot more trouble to discuss. What I put here is the core trouble and computers have always had this problem too.

It just does not present any real difficulty, until elections are brought into the discussion.

Display trust issues are well above the core tech trust issues we face today.

pavlov · 2 years ago
I was guessing that the oddly beautiful 17” Apple CRT Studio Display [1] from 2000 might have been a CRT that used a digital signal cable because it had the short-lived Apple Display Connector, but apparently ADC carried analog too.

[1] https://everymac.com/monitors/apple/studio_cinema/specs/appl...

ShadowBanThis01 · 2 years ago
The ADC carried power, too, which is why Apple went with it.
nyanpasu64 · 2 years ago
One (extreeeemely pedantic) difference is that (I think) HDMI begins and ends its vsync pulses when the preceding hblank begins, whereas VGA begins and ends them slightly later when the preceding hsync pulse begins (I documented this at https://nyanpasu64.gitlab.io/blog/crt-modeline-cvt-interlaci...).

I also think that DP rouuuughly reflects CRT timings complete with active vs. blanking intervals, but doesn't actually have a fixed pixel clock (perhaps it does? I didn't quite figure out synchronous/asynchronous pixel transmission when reading the spec) and doesn't transmit hsync pulses once per scanline.

MisterTea · 2 years ago
> You could have made a CRT monitor that used DVI-D just by adding a DAC, though I'm not sure if anyone ever did.

From memory, IBM offered a CRT with DVI.

thereddaikon · 2 years ago
Plenty did but they were usually DVI-A or DVI-I
rasz · 2 years ago
>DisplayPort does away with all that legacy.

You wish. DP sends exactly the same bytes DVI does (blanking and all), just broken up into packets.

fanf2 · 2 years ago
The important difference is that (like VGA) DVI and HDMI still dedicate particular physical wires to separate red, green, and blue channels. DisplayPort does not shard individual pixels across multiple channels: all bits (all colours) of a pixel are sent on the same diff pair. If a DP link has multiple lanes, then the bits of successive pixels are distributed across the lanes.
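A toy illustration of that difference (the round-robin striping below is a simplification of how DP distributes pixel bytes across lanes, not a bit-exact model of either link layer):

```python
# Toy model only: DVI/HDMI dedicate a physical channel per colour component,
# while DP serializes whole pixels and stripes the bytes across its lanes.

pixels = [(10, 20, 30), (11, 21, 31), (12, 22, 32), (13, 23, 33)]  # (R, G, B)

# DVI/HDMI-style: one TMDS channel per colour, all pixels in order on each.
tmds_channels = {
    "ch0 (blue)":  [p[2] for p in pixels],
    "ch1 (green)": [p[1] for p in pixels],
    "ch2 (red)":   [p[0] for p in pixels],
}

# DP-style: flatten every pixel into one byte stream, then deal the bytes out
# across the available lanes like a deck of cards.
byte_stream = [c for p in pixels for c in p]
lanes = 4
dp_lanes = {f"lane{i}": byte_stream[i::lanes] for i in range(lanes)}

print(tmds_channels)
print(dp_lanes)
```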
tshaddox · 2 years ago
> However, I’d like to tell you that you probably should pay more attention to DisplayPort – it’s an interface powerful in a way that we haven’t seen before.

Lines like that really show just how long it can take for standards to get on the radar of mainstream tech culture. I remember hearing about and being excited about DisplayPort's move to packetized digital video in college in 2008, and seeing the first Macs with Mini DisplayPort later that year (or perhaps it was in 2009)!

I was actually under the impression that it has been well-known and commonplace for hobbyist and enthusiast PCs for well over 10 years, but I'm probably wrong about that!

NavinF · 2 years ago
> impression that it has been well-known and commonplace for hobbyist and enthusiast PCs for well over 10 years

It is. For a long time DP was the only standard that could do variable refresh rate. Even today all high end monitors have DP while the cheapest monitors only have HDMI.

fireflash38 · 2 years ago
Which is ironic considering the cost of the HDMI port is likely higher than the DP one due to licensing!
bick_nyers · 2 years ago
I just wish TV manufacturers would start putting Displayport into their TVs.

GPUs nowadays use 3x Displayport and 1x HDMI, which is quite the bottleneck if you want to max out your ports.

As I understand, HDMI <-> Displayport converter cables often do not have the high end features you might want, such as 4k/120Hz HDR, and/or Variable Refresh Rate. Perhaps this has improved in recent months.

MBCook · 2 years ago
Same. I live in the Mac world. When I took a job where I had to use PCs at work I was surprised to see they were mostly using HDMI with some people using older DVI equipment.

I had just assumed everyone had transitioned, as Macs had, years before.

Nope. Lots of people use/want HDMI to this day.

loup-vaillant · 2 years ago
> Lots of people use/want HDMI to this day.

I’d wager this is less about HDMI, and more about the fact that people want their old DP-less monitors to work.

hot_gril · 2 years ago
That, and a similar line might've been said about FireWire, which didn't really make it.
dylan604 · 2 years ago
Maybe I agree with that, but I also know that FireWire helped usher in the digital video era. It allowed the transition from tape-based acquisition when media cards were prohibitively expensive. Audio/video/deck control all down one single cable, straight from the camera to the computer, was what really kicked the prosumer market into being able to lean closer to pro than consumer. Now that media cards are actually affordable, that does seem like ancient history. I could see how you might think of FireWire as a failure if you're looking at it as a USB-type transition, but for the camera/video professions it served a very good purpose, even if short-lived.
sillywalk · 2 years ago
Apparently 1394b is used in some military applications, the F-35 for example.
tivert · 2 years ago
Are there any KVM switches that do Displayport well (i.e. where switching between inputs does not look like a display disconnect to the PC)?

I'm still using HDMI because I like to share my home multi-monitor setup between my personal machine and my work laptop, and the KVM switches are able to fool the PCs into thinking the monitors are always connected. Years ago I tried a DisplayPort switch, but it could not -- I assume because of the greater sophistication of the DisplayPort protocol.

xxpor · 2 years ago
The magic words you're looking for are "EDID emulation". The KVM will continue to send the EDID data from the monitor even after you've switched away, which will fix that issue.

It's relatively uncommon and not always implemented super well, but it's a requirement for any DP KVM to be not super annoying IMO.

There was one particular KVM brand that was supposed to do it well whose name is escaping me now :/. I was looking at buying one in ~ May 2020 for obvious reasons, but they were on super-backorder (also for obvious reasons), so I never got around to it. IIRC they were about $500 for a 4 input/2 output version, so not cheap.
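If you're curious what the KVM has to keep presenting, the EDID blob is easy to inspect on Linux; the connector name below ("card0-DP-1") is just an example, so list /sys/class/drm to find yours:

```python
# Read and minimally decode a monitor's EDID on Linux. The connector name
# ("card0-DP-1") is an example; list /sys/class/drm to find the right one.
from pathlib import Path

edid = Path("/sys/class/drm/card0-DP-1/edid").read_bytes()

assert edid[:8] == bytes.fromhex("00ffffffffffff00"), "not a valid EDID header"

# Manufacturer ID: three 5-bit letters packed into bytes 8-9.
mfg = int.from_bytes(edid[8:10], "big")
name = "".join(chr(((mfg >> shift) & 0x1F) + ord("A") - 1) for shift in (10, 5, 0))

print(f"{len(edid)} bytes of EDID, manufacturer {name}")
```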

db48x · 2 years ago
Adderview made some of the best KVMs, but I don’t know if they have a good DisplayPort model or not.
Fnoord · 2 years ago
BliKVM PCIe I bought (based on PiKVM) came with an EDID emulator.
jchw · 2 years ago
None of them are perfect, but I've heard good things about the DP switch from Level1techs. The thing is, all of them are a little tricky, but they mostly differ in how quirky they are, and I suspect the reason why people like the Level1techs DP switch is that they seem to at least try to alleviate some of the issues DP switches tend to get into.

https://store.level1techs.com/products/14-kvm-switch-dual-mo...

The startech one I have is alright... But Apple computers absolutely hate it and frequently refuse to display, and sometimes Windows gets stuck and USB devices stop working. Strangely enough... Linux doesn't ever have any problems with either display or input. A rare win, but fine by me.

pxc · 2 years ago
The Level1Techs KVM switches are rebranded Startech switches with 'dumber' firmwares whose dumbness affords better compatibility with niche DisplayPort features.

I have a bunch of them and I like them pretty well, but getting a bunch of computers all plugged in turns out to be a bit of a nightmare, especially when you need some long-ish cable runs or you are daisy-chaining devices (e.g., multiple KVM switches, adding USB hubs or Thunderbolt docks, etc.).

The Level1Techs KVM switches don't meet GP's criterion for hotplugging behavior, unfortunately. Switching between devices is just an unplug and replug for them.

Like you, I've found that macOS and Windows don't handle hotplugging monitors well, but Linux desktops (in my case, KDE Plasma) consistently do the right thing and don't require a repeater to lie to them about monitors always being plugged in.

FWIW, I don't get the 'Apple computers just refuse to work' issue with any of my L1T KVMs.

Steltek · 2 years ago
I had much more serious problems with a StarTech DP KVM and Macs. My Macbook would hang and crash-reboot. Both on the initial plug-in and on switching inputs.

Everything else seemed to handle it fine, with Linux being especially unfazed, as usual.

mvid · 2 years ago
https://store.level1techs.com/?category=Hardware

This is what I use. It appears to disconnect, but that doesn't seem to be an issue. My machines re-organize instantly.

jamestanderson · 2 years ago
I got their 10gbps displayport switch to use with switching a single monitor between a Windows desktop PC and an M1 MacBook Pro. I have a 4k@144hz monitor and can get the full framerate and resolution with this setup. I've never had any problems, would highly recommend.
formerly_proven · 2 years ago
It's gotten better in the last year or so with Windows 10 but it'll still sometimes just fall apart when the display configuration changes, which is something that just never happened for any reason with HDMI/DVI.
zerkten · 2 years ago
I was looking at this as an upgrade pick and don't have any re-arrangement with my TESmart (TES-HDK0402A1U-USBK). What monitor(s) do you have?
stephenr · 2 years ago
The new Dell 6K has kvm (and PiP) functionality across its inputs, and it does appear from my modest use of this feature so far, that it works as you would want (ie it still thinks the display is connected, even when not showing that input)
111111IIIIIII · 2 years ago
I would prefer that with 1 display but I have 2 Dell 6Ks and it's kind of annoying if I want to have them each on a different PC. (I use a usb switch to switch my peripherals between displays)
seanalltogether · 2 years ago
Can you explain why this is beneficial? I have a Mac laptop and PC desktop at home that I switch between depending on what I need to do. By triggering a disconnect, all my Mac windows on the main monitor zip back over to the laptop, so they're still reachable if I need to access them with the trackpad and integrated keyboard. When I switch the KVM back to the Mac, all those windows jump back to the main monitor.
pavon · 2 years ago
Flaky drivers. KVM induced unresponsiveness is pretty much the only reason I ever have to hardboot my computers.

Also, even if the drivers are solid, they take longer to renegotiate with a monitor that was removed and plugged back in compared to one they think was always there, which matters if you switch back and forth frequently.

Lastly, sometimes the OS doesn't put things back the way they were when you plug a monitor back in. If you have a laptop with a lower-resolution display than the external monitor, you'll often return to find all the windows shrunk to fit the laptop display. Not an issue if you run everything full-screen, but annoying if you tile windows.

bbatha · 2 years ago
I had the StarTech one the siblings have mentioned, but it wasn't very good and didn't do EDID emulation correctly. This CKL one [0] has been working really well, and it supports USB 3, which is a nice bonus so I can share my webcam. Though sometimes after waking up my MacBook forgets about my second monitor (I have an M1 connected to a Cable Matters Thunderbolt dock); my Windows machine, which has direct DP connections, doesn't have the same issue.

0: https://www.amazon.com/gp/product/B09STVW821/

nyanpasu64 · 2 years ago
Never mind KVM switches, I wish powering off my DP monitor while leaving wall AC power plugged in didn't appear as a display disconnect to the computer.
m463 · 2 years ago
I don't think I've had a KVM switch work well since VGA+PS/2

They all try to be too smart.

As a matter of fact, I usually use a monitor switch of some sort, then use mechanical USB switches - one for keyboard, one for mouse. That seems to be the only way to get mouse and keyboard to work well (basically just a hardware connection, no smarts)

smitty1110 · 2 years ago
Belkin makes some, up to duplex 4k@60hz, but holy mother of god are they expensive.
nine_k · 2 years ago
Look at how expensive the chips they're using are.

High-speed, high-bandwidth, low-delay interfaces are apparently hard.

nfriedly · 2 years ago
I'm annoyed at Nvidia for putting the current generation of HDMI on their recent GPUs, but leaving them with an outdated version of DisplayPort.

For a long time, my advice to anyone was to always choose DisplayPort whenever it was an option. But now that has to have the caveat of "if you have a new high-end GPU and a fancy high refresh rate monitor, HDMI might actually be better for your situation"

gjsman-1000 · 2 years ago
> I'm annoyed nvidia for putting the current generation of HDMI on their recent GPUs, but leaving them with an outdated version of DisplayPort.

That was due to unfortunate timing where HDMI had the specifications ready before DisplayPort did.

Sweepi · 2 years ago
AMD’s RX 7000 series supports DisplayPort 2.0 and was released 2 months later than Nvidia’s Ada. Afaik DP 2.0 was finished in 2019(!).
Night_Thastus · 2 years ago
I thought it was because their G-Sync modules had not been updated to support the new DP? That was what I heard.
ThatPlayer · 2 years ago
Yeah, I hate this too because I just want to have more video outputs. My Valve Index doesn't like to be hot-plugged, requiring a reboot. With 1440p 144Hz monitors, I just barely cannot run 2 of them (2x14Gbit) over a single DP 1.4 link (26Gbit) using MST. Windows will automatically drop colour down to 4:2:2 if I try it.

Not that DP2.0 MST hubs exist yet afaik, but when they do I'd have to get a new GPU. Which I guess is Nvidia's goal.
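A quick check of those numbers (assuming a CVT-RB-like mode, which is roughly where the ~14 Gbit figure comes from):

```python
# Quick check of the numbers above (assumes a CVT-RB-like 2560x1440 mode).
h_total, v_total = 2720, 1481                     # approximate mode totals
per_monitor = h_total * v_total * 144 * 24 / 1e9  # ~13.9 Gbit/s each
dp14 = 4 * 8.1 * 8 / 10                           # HBR3 after 8b/10b, ~25.9 Gbit/s
print(f"{2 * per_monitor:.1f} Gbit/s needed vs {dp14:.1f} Gbit/s available")
```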

bick_nyers · 2 years ago
I think it's all baked into NVIDIA's product strategy of trying to encourage enthusiasts to upgrade every generation. Yes, some enthusiasts will upgrade every generation no matter what strategy you implement, but I would say most enthusiasts' philosophy is to buy premium products and skip a generation or two. It used to be just raw computational performance that sold GPUs, but now all these additional layers and features (GSYNC, RTX, DLSS, frame interpolation, and even the ability to interface with a particular display, e.g. 8K/120Hz) are part of the product, and so can be reserved to "boost" a particular generation's desirability. I wouldn't be surprised if there aren't any "software"-level enhancements to the 5000 series, just their standard performance uplift, an increase in VRAM capacity (5090 topping out at 32GB), and the new DisplayPort standard, all to promote and focus on 8K/120Hz and 4K/240Hz gaming. They set the stage for it with the motion interpolation tech in the 4000 series.
bdavbdav · 2 years ago
I also noticed that many boards are 3x HDMI + 1x DP now. My previous card was 3x DP, 1x HDMI.
LatticeAnimal · 2 years ago
What card is 3xHDMI + 1xDP? I just checked and at least the reference rtx 4080 is still 3xDP 1xHDMI
111111IIIIIII · 2 years ago
And only 1 new-version HDMI port but 3 old-version DP ports. I use a dual-display setup but HDR only works on my displays with HDMI.
Strom · 2 years ago
The more expensive Gigabyte RTX 3080+ cards had 3x DP 1.4, 2x HDMI 2.1, 1x HDMI 2.0, which was great even if you could only use 4 of them at the same time due to Nvidia limitations. Unfortunately for the 40 series they stopped offering it and now they're the same as stock Nvidia with 3x DP 1.4, 1x HDMI 2.1.

Luckily ASUS still offers a bit more with their 4080+ cards - 2x HDMI 2.1, 3x DP 1.4. I personally depend on the 2x HDMI 2.1 to even be able to run my 4K 144Hz monitors at full speed.

crazygringo · 2 years ago
> Just like most digital interfaces nowadays, DisplayPort sends its data in packets. This might sound like a reasonable expectation, but none of the other popular video-carrying interfaces use packets in a traditional sense – VGA, DVI, HDMI and laptop panel LVDS all work with a stream of pixels at a certain clock rate.

Funny, a stream of data at a constant rate makes much more sense to me intuitively than packets, specifically for uncompressed video.

Are there any downsides to packetization, like increased latency or dropped frames or anything? Or not really, is it all upsides in being able to trivially combine multiple data streams or integrate easily into hubs?

wtallis · 2 years ago
> Funny, a stream of data at a constant rate makes much more sense to me intuitively than packets, specifically for uncompressed video.

Sure, until you start trying to design the transceivers and realize that supporting two or three fixed standard data rates is a lot simpler than supporting a continuously-variable clock speed. Every other high-speed digital interface operates at just a few discrete speeds: SATA/SAS, PCIe, Ethernet, USB.

The fact that DVI and HDMI were such a shallow digitization of VGA's racing-the-beam meant features like variable refresh rate (Gsync/Freesync) showed up far later than they should have. If we hadn't wasted a decade using CRT timings (and slight modifications thereof) to drive LCDs over digital links, it would have been more obvious that the link between GPU and display should be negotiated to the fastest data rate supported by both endpoints rather than the lowest data rate sufficient to deliver the pixels.
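A sketch of what the discrete-rate model looks like; the link rates are the standard DP 1.x ones, while the "lowest sufficient rate" policy shown is the traditional behaviour the comment argues against (a VRR-friendly design would train at the fastest mutually supported rate instead):

```python
# Illustrative only: a DP link trains at one of a few fixed per-lane rates
# rather than a mode-specific pixel clock. Effective payload is ~80% of the
# raw rate with 8b/10b coding (DP 1.x).
LINK_RATES_GBPS = {"RBR": 1.62, "HBR": 2.7, "HBR2": 5.4, "HBR3": 8.1}

def lowest_sufficient_rate(payload_gbps, lanes=4):
    """Traditional policy: slowest standard rate that still fits the payload."""
    for name, rate in sorted(LINK_RATES_GBPS.items(), key=lambda kv: kv[1]):
        if rate * lanes * 0.8 >= payload_gbps:
            return name
    raise ValueError("mode does not fit on this link")

# 2560x1440 @ 144 Hz, 24 bpp is roughly 14 Gbit/s of payload:
print(lowest_sufficient_rate(14))  # -> HBR2 with 4 lanes
```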

ansible · 2 years ago
> ... If we hadn't wasted a decade using CRT timings (and slight modifications thereof) to drive LCDs over digital links ...

Don't forget the decade or so (late 1990's to early 2000's) where we were driving LCDs over analog links (VGA connectors).

I had purchased a pair of Silicon Graphics 1600SW monitors back in the day, which required a custom Number Nine graphics card with an OpenLDI display interface. It was many years after those were introduced to the market before DVI finally started becoming commonplace on PC graphics cards.

Using the mass-market LCD monitors in the late 1990's was a frustrating affair, where you had to manually adjust the timing synchronization of the analog signal.

samtho · 2 years ago
> Funny, a stream of data at a constant rate makes much more sense to me intuitively than packets, specifically for uncompressed video.

There is a data-rate floor that the outputting device must meet. We've surpassed this (due to optimization at the silicon level, designing hardware whose sole job is to send bursts of data), and we end up running out of data in the buffer periodically because it's just so fast. Because analog is real-time, you can't squeeze much else into that data stream, but with digital we are afforded the luxury of packet switching: instead of the line sitting idle, we can pump even more down it.

> Are there any downsides to packetization, like increased latency or dropped frames or anything? Or not really, is it all upsides in being able to trivially combine multiple data streams or integrate easily into hubs?

If I recall correctly, the timing and data rates are all prearranged based on the reported capacity and abilities of the receiving device, and it won't even attempt to support multiple streams if the receiver is incapable of doing so or the established data channel cannot fully support the bandwidth required.

tverbeure · 2 years ago
Latency is limited by the amount of buffering present in the interface logic (on both sides). In the case of regular DP, there are just a few FIFOs and pipeline stages, so the latency is measured in nanoseconds.

When using display stream compression (DSC), there's a buffer of 1 line, and a few FIFOs to handle rate control. At the resolution for which DSC is used (say, 4K/144Hz), the time to transmit a single line is around 3us. So that's the maximum additional latency you can expect.
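That per-line figure is easy to sanity-check (ignoring blanking lines, so it's only approximate):

```python
# ~3 us per line at 4K/144Hz: one frame every 1/144 s, split over 2160 lines
# (blanking lines ignored, so this is only approximate).
refresh_hz, active_lines = 144, 2160
line_time_us = 1e6 / (refresh_hz * active_lines)
print(f"{line_time_us:.1f} us per line")  # ~3.2 us
```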

willis936 · 2 years ago
If you didn't packetize then the receiver would never recover line coding state when a single symbol is missed. The rest are incidental details.
muhammadusman · 2 years ago
This post was informative and I didn't realize just how different DisplayPort is from HDMI. Recently, I got a desktop dock that uses DisplayPort instead of HDMI to connect to my monitor. My monitor has 2 HDMI ports, 1 Type-C, and 1 DisplayPort. So far things have been fine but I did notice that the audio is choppy no matter what I do. I thought it was the dock but audio going from my computer > dock > my webcam's speaker works fine (all over usb-c). So unfortunately, it leads me to believe that the DisplayPort is causing this jittery audio.
smachiz · 2 years ago
It might be your dock.

Check to see whether it's USB or Thunderbolt. Thunderbolt docks are more expensive, but considerably more efficient and faster than USB (assuming your laptop/device supports Thunderbolt).

Thunderbolt docks are basically PCIe extension devices, whereas USB docks attach everything as USB, with all the common challenges USB has on systems (like dropped audio when the CPU is busy).

muhammadusman · 2 years ago
Thanks for the info. The dock is the CalDigit TS3 Plus dock and I’m using it with an LG monitor. Their page says it’s a Thunderbolt dock so I wonder if there’s anything else about this particular dock that could be causing this issue. Btw when the monitor was connected over HDMI, it was fine playing audio.

https://www.caldigit.com/ts3-plus/

qingcharles · 2 years ago
My screen only has HDMI and my desktop only has DP out, so I bought a $2 adapter from Temu. The audio surprisingly works fine, I thought that would be a total oversight.

Picture on the other hand is slightly janky, which seems to be a common issue for DP->HDMI convertors. If anyone knows a convertor that doesn't turd up the signal I'd love to know.

jbverschoor · 2 years ago
Same here with an lg monitor on dp
binkHN · 2 years ago
I didn’t realize Multi-Stream Transport (MST) requires OS support, and I was surprised to find out MacOS, with its great Thunderbolt support, does not support this. “Even” ChromeOS can do MST.
stephenr · 2 years ago
Technically macOS does support MST, but only to stitch streams together into a single display. It does not support daisy-chaining two displays.

Thankfully, every Mac for the last 7 years has Thunderbolt3 at least, so getting dual-4K-display from a single port/cable is still very doable, you just need a TB3 to dual DisplayPort or HDMI adapter.

bpye · 2 years ago
Well except for some of the Apple Silicon machines. The M1 (and maybe M2?) only have two video output blocks, of which one is already used for the internal display. It’s honestly the biggest complaint I have about my M1 MBP. Yes DisplayLink or whatever it’s called exists but the performance is bad.
bdavbdav · 2 years ago
Unfortunately, although it can work with certain Thunderbolt devices, many TB4 docks with the video path based on MST don't work, causing havoc in mixed environments.
nyanpasu64 · 2 years ago
Doesn't MST merely chop up the bandwidth of the lanes you have? So why would you update different parts of a single display using MST (each stream only getting part of the bandwidth of the link), rather than using the whole link to update the display from top to bottom at the same pixel depth and clock?
deergomoo · 2 years ago
You can daisy chain the Studio Display and Pro Display, and you could daisy chain the old Thunderbolt Display. Is that using some custom thing over Thunderbolt rather than MST?
bdavbdav · 2 years ago
Yep. Super annoying. I have a Dell WD22TB4 which works great to drive 3 monitors for everything, except my Mac.