Yeah I have no idea how normal non-tech people are supposed to make any sense of USB-C cables and device compatibility.
Even we tech people find it difficult. Small rant - I have the M1 Max MacBook, which comes with “USB4”. I remember at some point getting a SanDisk external “USB4” SSD which claimed 20 Gb/s; you would think it should all work together.
But no, apparently the SSD needed USB 3.2 Gen 2x2 support, which the Mac doesn’t implement, since it’s only an optional part of the USB4 spec…
One other funny thing - Apple’s page for “Identify the ports on your Mac” has 5 separate entries that are all the same USB-C shape. https://support.apple.com/en-us/109523
DisplayPort never caught on in the home entertainment market - have you ever seen a DisplayPort TV? And why would there be one, when everyone’s happy with HDMI?
DisplayPort, for most intents and purposes, has lost the format war. You buy a laptop - it’s HDMI on the side, not DisplayPort. You buy a monitor - you’ll never find a model with DisplayPort as its only port, but you will find ones with only HDMI.
Why do you assert that everyone is happy with HDMI? It has a lot of problems even before we get into the technical details. For example, it is so secretive and closed that the HDMI Forum won't even let AMD implement HDMI 2.1 in their Linux driver: https://arstechnica.com/gadgets/2024/02/hdmi-forum-to-amd-no...
> DisplayPort, for most intents and purposes, has lost the format war. You buy a laptop - it’s HDMI on the side, not DisplayPort.
The laptop's USB-C ports probably speak DisplayPort though. There was another USB-C alt mode which carried HDMI instead, but that spec was abandoned; DisplayPort is the de facto standard USB-C video protocol now.
I would argue the opposite - DisplayPort has won the computing format war, especially on laptops. Most laptops will have at least two USB-C ports with DisplayPort tunneling. Most business PCs have only DisplayPort connectivity, with the assumption that you'll buy a DisplayPort-to-HDMI adapter if you have an older monitor that only supports HDMI.
True, we will never see them on the TV, but on the computer it is all DisplayPort.
I'm seeing fewer and fewer HDMI ports on computer monitors, graphics cards, projectors, etc. Since IIRC they charge a royalty per port, this makes a lot of sense.
I buy a laptop - it's USB only. No HDMI, DisplayPort, VGA, headphone jack or anything else. When was the last time you bought one?
I buy a desktop graphics card. 1 HDMI - for compatibility, you know - and 3 DisplayPort. The HDMI is port number 2.
Last time I used a projector it was DisplayPort or VGA. I brought both kinds of adapters just in case.
My newest monitor has power input, DisplayPort and VGA. No HDMI.
Besides DisplayPort over USB-C that people have already mentioned, there's also eDP, which is basically the standard for connecting a laptop's screen to its GPU. You buy a laptop and, regardless of what external ports it has, DisplayPort is used for the internal display. And since most laptop users don't regularly connect to external displays, one can certainly say that the usage of DisplayPort here far eclipses HDMI.
DisplayPort in a way won. USB-C video is DisplayPort. And the HDMI port on your laptop is often just an internal adapter converting DisplayPort alt mode to HDMI.
DisplayPort "fast AUX" stream is more than capable of returning audio or even video data to the "source" and some videoconferencing rigs even implement this. But I don't see why the industry would be motivated to adopt and consolidate around that standard when HDMI's bidirectional features already exist and are widely supported.
CEC should be required by the spec. It's tantalizingly close to "just works" but marred by devices that ought to support it but don't, like my RTX 3060/Windows (not sure which of the two is to blame).
Another fun issue I have with it is that my Apple TV 4K will randomly (1 time in 3-4 hours?) tell the receiver to change input when I'm on another. I'm guessing there's some scheduled wake happening and it inadvertently treats it as a user interaction. Adds a nice element of difficulty to Elden Ring.
My favorite is when my receiver decides to send audio to my PROJECTOR instead of to the goddamned speakers that are attached to said receiver.
I heard tinny sound coming from somewhere behind me and was baffled when I approached the projector and heard sound coming from its tiny built-in speaker.
Is there any reason why most TVs still don't support USB-C DisplayPort and just support HDMI?
With the recent drama around HDMI vs. FOSS drivers on Linux, I'm curious why we haven't seen a bigger push from TV vendors to support USB-C DisplayPort.
Most of my monitors work with USB-C, and even consoles like the Steam Deck and a lot of high-end phones seem to support USB-C.
So... Are there any features of HDMI that USB-C DisplayPort doesn't support?
DisplayPort in native mode lacks some HDMI features such as Consumer Electronics Control (CEC) commands. The CEC bus allows linking multiple sources to a single display and controlling any of these devices from any remote; HDMI has featured it from its very first version, since multiple sources feeding one screen is the typical TV setup. DisplayPort only added the possibility of transmitting CEC commands over the AUX channel in version 1.3.
The other way round, DisplayPort's Multi-Stream Transport allows connecting multiple displays to a single computer source.
This reflects the fact that HDMI originated from consumer electronics companies whereas DisplayPort is owned by VESA, which started as an organization for computer standards.
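For anyone curious what CEC actually looks like on the wire: a message is just a header byte carrying the initiator and destination logical addresses, an opcode, and a few operand bytes. Below is a rough Python sketch (my own illustration, not from any particular library) of building the broadcast "Active Source" message a source device sends to make the TV or receiver switch to its input; the opcodes and logical addresses are the standard CEC ones, but treat the rest as a simplified model.

```python
# Rough sketch of how a CEC "Active Source" message is put together.
# Illustrative only -- real CEC traffic travels over the dedicated CEC wire
# in the HDMI cable (or DP AUX, per DP 1.3+), handled by source/sink hardware.

# CEC logical addresses (subset)
TV = 0x0
PLAYBACK_1 = 0x4
BROADCAST = 0xF

# CEC opcodes (subset)
IMAGE_VIEW_ON = 0x04   # "wake the TV"
ACTIVE_SOURCE = 0x82   # "I'm the active source, switch to my input"
STANDBY = 0x36         # "everyone go to sleep"

def cec_frame(initiator: int, destination: int, opcode: int, operands: bytes = b"") -> bytes:
    """Build a raw CEC frame: header byte (initiator<<4 | destination), opcode, operands."""
    header = (initiator << 4) | destination
    return bytes([header, opcode]) + operands

# A playback device announcing itself as the active source. The operand is
# its 16-bit HDMI physical address, e.g. 1.0.0.0 = input 1 on the TV.
frame = cec_frame(PLAYBACK_1, BROADCAST, ACTIVE_SOURCE, bytes([0x10, 0x00]))

print(frame.hex(" "))  # -> "4f 82 10 00"
```

It's also why a misbehaving box can hijack the picture: anything on the bus that broadcasts Active Source at the wrong moment yanks the TV or receiver over to its input, much like the Apple TV behaviour described above.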
DRM is pretty much mandatory in HDMI in a home entertainment setup, and is controlled by the media companies' cabal, which is important to vendors in that space (who are often members too).
Also, lowest-common-denominator HDMI in TV format has the HDCP decoder as its only complex part; you can theoretically drive a dumb panel with a few fixed-function ICs that aren't even related to HDMI. In the simplest, dumbest case you advertise the bare minimum supported display mode in the EDID, plop in a few buffers and DACs, and drive an analog TV off it.
Meanwhile even the simplest possible DisplayPort implementation still requires that your display handle packetized data sent over a PCIe-style PHY layer, with more complex data structures and setup information than "plop an I2C ROM chip here, a few DACs and some buffer logic there, and you've got HDMI without HDCP driving an analog CRT".
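To put the "I2C ROM chip" point in perspective: the EDID that the ROM serves up is a single 128-byte block with a fixed header, a packed vendor ID, and a trivial checksum. A minimal decoding sketch in Python (the field offsets are the standard EDID 1.3/1.4 ones; the helper itself is just for illustration):

```python
# Minimal sketch of decoding an EDID base block (128 bytes, usually read
# from an I2C EEPROM at address 0x50 over the DDC pins).

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def decode_edid(edid: bytes) -> dict:
    assert len(edid) >= 128, "base EDID block is 128 bytes"
    assert edid[:8] == EDID_HEADER, "bad EDID header"
    assert sum(edid[:128]) % 256 == 0, "bad EDID checksum"

    # Manufacturer ID: three 5-bit letters packed big-endian into bytes 8-9.
    raw = (edid[8] << 8) | edid[9]
    vendor = "".join(chr(((raw >> shift) & 0x1F) + ord("A") - 1) for shift in (10, 5, 0))

    return {
        "vendor": vendor,                        # e.g. "SAM", "DEL", "GSM"
        "product_code": int.from_bytes(edid[10:12], "little"),
        "edid_version": f"{edid[18]}.{edid[19]}",
        "digital_input": bool(edid[20] & 0x80),  # bit 7 of the video input byte
    }
```

A sink only needs to answer I2C reads with that blob; the DisplayPort equivalent means implementing the AUX channel and DPCD registers on top of a packetized high-speed main link.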
> Is there any reason why most TVs still don't support USB-C DisplayPort and just support HDMI?
Personally I’m convinced it’s mostly because the display manufacturers want to discourage the use of TVs as monitors, in order to protect their margins on monitors.
8K monitors should be sub-$1000 by now and standard for anyone working with screens. You can get that as a TV but not as a monitor. :(
I feel like it will be a long time before this is considered essential, if it ever is. An end-to-end 2.1 setup will get you 4K120 with e-ARC and numerous flavours of VRR and HDR (though I don't think those last two are necessarily tied to a specific HDMI version).
Heck, it's still rare to find a TV where all HDMI ports are 2.1 (for years it was only LG, not sure if it's changed this year).
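To put rough numbers on the 4K120 point above (back-of-the-envelope, active pixels only, ignoring blanking, audio and FEC overhead, so real link budgets need to be a fair bit higher):

```python
# Back-of-the-envelope video bandwidth, active pixels only.

def gbps(width: int, height: int, hz: int, bits_per_pixel: int = 24) -> float:
    return width * height * hz * bits_per_pixel / 1e9

print(f"4K120 8-bit RGB: {gbps(3840, 2160, 120):.1f} Gbps")  # ~23.9
print(f"8K60  8-bit RGB: {gbps(7680, 4320, 60):.1f} Gbps")   # ~47.8

# Approximate usable data rates of the links:
#   HDMI 2.0 TMDS        ~14.4 Gbps -> 4K120 needs chroma subsampling or DSC
#   HDMI 2.1 FRL 48G     ~42.6 Gbps -> 4K120 fits; 8K60 needs DSC or 4:2:0
#   DP 2.1   UHBR20 80G  ~77.4 Gbps -> 8K60 fits uncompressed
```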
Why aren’t they all HDMI 2.0, isn’t it backwards compatible? Or is it just to save them a $ while technically still being able to advertise it as HDMI 2.0?
It actually has little to nothing to do with the connector standard and more to do with the TV system itself. I know there is an HDMI/HDCP handshake, but it doesn't contribute much to the ~5 seconds it takes to switch channels.
It should also be legally mandated to have individual buttons for each input instead of having to navigate a silly software menu, for when the auto-detection invariably fails!!
> If confirmed, this would align the new HDMI standard with the latest DisplayPort 2.1 technologies, offering consumers expanded options for ultra-high-definition media and gaming experiences.
Why do we need new HDMI when there already is DisplayPort for this?
There’s no reason DisplayPort couldn’t do the same thing. The connector on the inside is basically identical.
That being said, although creative, it’s a pretty terrible idea. PCBs are not meant to be edge connectors, especially if they aren’t designed for it (i.e. using hard gold on the edge connector plating).
There’s an interesting middle-ground worth mentioning: a 2-in-1 connector https://www.mouser.com/ProductDetail/Rego-Electronics/845-00... that can take both HDMI and DisplayPort cables, which could be a neat solution for devices juggling both standards.
https://hackaday.com/2023/07/11/displayport-a-better-video-i...
There are a couple of video cameras and viewfinders (particularly from Blackmagic) that have emerged in the last couple of years using DP.
The HDMI versioning chaos did not help.
The worst is still USB 3.1 Gen 2.1 Hyper Giga Super Speed Mark II
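For reference, the mapping between the spec names, the retroactive renames, and the actual speeds, as best I can keep it straight (a quick Python table; treat it as approximate):

```python
# The USB naming mess, as best I can reconstruct it (treat as approximate).
# Each entry: current spec name -> (signalling rate, older/marketing names)
USB_NAMES = {
    "USB 3.2 Gen 1":   ("5 Gbps",  ["USB 3.0", "USB 3.1 Gen 1", "SuperSpeed USB"]),
    "USB 3.2 Gen 2":   ("10 Gbps", ["USB 3.1 Gen 2", "SuperSpeed USB 10Gbps"]),
    "USB 3.2 Gen 2x2": ("20 Gbps", ["SuperSpeed USB 20Gbps"]),  # the one the Mac above lacks
    "USB4 Gen 2x2":    ("20 Gbps", ["USB4 20Gbps"]),
    "USB4 Gen 3x2":    ("40 Gbps", ["USB4 40Gbps", "Thunderbolt 3/4-class link rate"]),
}

for spec, (rate, aliases) in USB_NAMES.items():
    print(f"{spec:18} {rate:8} aka {', '.join(aliases)}")
```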
HDMI is also, for various reasons, very tightly bound to the home entertainment ecosystem, because there the DRM is mandatory while it's optional in DisplayPort.
Meanwhile DisplayPort has effectively won the format war for computer displays, and it is what all the newer display connectors carry.
That depends on whether or not you consider single-link DVI-D as the "very first version" of HDMI.
- TV manufacturers are often (always?) members of the HDMI consortium, meaning they financially profit from each device that has an HDMI port.
- Manufacturers of devices with HDMI ports are heavily discouraged from also including competing ports like DP.
Anyway, HDMI is for TVs and DisplayPort is for monitors. They're both entrenched enough that it doesn't make much sense to try to cross over.
Cost. USB-C has much more overhead. For example, people expect they’ll be able to charge with it and have USB pass-through.
On a couple of 4K Samsung monitors I have, HDMI port 1 is HDMI 1.4 and HDMI port 2 is HDMI 2.0.
Couldn't be simpler. The HDMI version is on the port label. Something about the situation makes me chuckle.
Waiting for a monitor for 5 seconds (often without knowing what it is doing or which input it is looking at) is such a bad UX.
https://forum.kicad.info/t/hdmi-pcb-edge-connector-for-raspb...
Never tried that with a DisplayPort connector.
That's not a reason to keep HDMI. Obviously the real reason is that HDMI makes money for the people selling it.
Here’s a video https://www.youtube.com/watch?v=rZpHizpZSPQ showing a PC with this connector in action.
Maybe not a game-changer, but it’s an interesting idea to reduce port clutter without forcing a winner in the format war just yet.