masklinn · 3 years ago
Some of the entries seem incorrect: "USB 3.2" (USB 3.2 Gen 2x2) and "USB 4" (USB4 Gen 2x2) should have the same nominal data rate of 2500 MB/s; they're 2 lanes (x2) of 10 Gb/s each. Though they are apparently coded differently electrically, so they're distinct protocols.

The tables would benefit from mentioning the coding (8b/10b or 128b/132b), as IMO it's one of the most confusing bits when you see the effective data rates:

* USB 3.2 Gen 1x2 has a nominal data rate of 10 Gb/s (2 lanes at 5 Gb/s) with a raw throughput of 1 GB/s (effective data rates topping out around 900 MB/s)

* USB 3.2 Gen 2x1 has the same nominal data rate of 10 Gb/s (1 lane at 10 Gb/s) but a raw throughput of 1.2 GB/s (and effective data rates topping out around 1.1 GB/s)

The difference is that Gen 1x uses the "legacy" 8b/10b encoding, while Gen 2x uses the newer 128b/132b encoding, and thus has much lower overhead (around 3%, versus 20%).
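
To make the arithmetic concrete, a minimal sketch in Python (the helper name is made up, and real-world rates are lower still due to protocol overhead):

    def payload_rate_gbps(line_rate_gbps, data_bits, total_bits):
        # usable bit rate after line coding: data_bits out of every total_bits
        return line_rate_gbps * data_bits / total_bits

    gen1x2 = payload_rate_gbps(2 * 5, 8, 10)      # 2 lanes at 5 Gb/s, 8b/10b
    gen2x1 = payload_rate_gbps(1 * 10, 128, 132)  # 1 lane at 10 Gb/s, 128b/132b

    print(f"Gen 1x2: {gen1x2:.2f} Gb/s ~ {gen1x2 * 125:.0f} MB/s")  # 8.00 Gb/s ~ 1000 MB/s
    print(f"Gen 2x1: {gen2x1:.2f} Gb/s ~ {gen2x1 * 125:.0f} MB/s")  # 9.70 Gb/s ~ 1212 MB/s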

fabiensanglard · 3 years ago
Thank you for noticing these issues, I have updated the table.

I would be happy to improve it and add encoding. I am surprised by some of the summary entries on Wikipedia (https://en.wikipedia.org/wiki/USB4). Looks like USB4 "reverted" to 128b/132b. Is that accurate?

lxgr · 3 years ago
128b/132b is the more efficient coding. The closer the fraction is to 1, the less coding overhead it has, and 128/132 (about 0.97) is larger than 8/10 (0.8).
Ldorigo · 3 years ago
Fyi, the last two columns in table 2 are a bit confusing: footnote c says "real life sequential speed", but then the last column title is "real life", so it's unclear what the difference is.
CamperBob2 · 3 years ago
He goes off the rails earlier than that, by saying that USB 2.0 is "also known as" Hi-speed. HS is only one data rate supported by the USB 2.0 standard; it incorporates both full speed from the earlier standard and low speed, which isn't mentioned at all.
masklinn · 3 years ago
That's more of an approximation matching how, frankly, most people think of the specs: yes, USB 2.0 supersedes 1.1 entirely, but everyone will think of "full speed" and "low speed" as USB 1, which USB 2.0 supports for backwards compatibility (BC).

That's also why USB 3.1's and 3.2's rebranding of previous versions is so confusing and a pain in the ass to keep straight: USB 3.2 Gen 1x1 is USB 3.1 Gen 1 is USB 3.0 (ignoring the USB 2.0 BC).

dwaite · 3 years ago
Right, per his chart "Full Speed" should be known as USB 1.1 Full Speed and USB 2.0 Full Speed.
belter · 3 years ago
Also should be:

12 Mbps -> 1.43 MiB/s -> 1.5 MB/s

480 Mbps -> 57 MiB/s -> 60 MB/s

5000 Mbps (5 Gbps) -> 596 MiB/s -> 625 MB/s

10000 Mbps (10 Gbps) -> 1192 MiB/s -> 1250 MB/s

20000 Mbps (20 Gbps) -> 2384 MiB/s -> 2500 MB/s

40000 Mbps (40 Gbps) -> 4768 MiB/s -> 5000 MB/s
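
These follow from straight unit conversion; a quick sketch in Python (note it treats the marketed signaling rate as if it were all data, which the reply below disputes):

    for mbps in (12, 480, 5_000, 10_000, 20_000, 40_000):
        mb_s = mbps / 8                            # decimal megabytes per second
        mib_s = mbps * 1_000_000 / 8 / 1_048_576   # binary mebibytes per second
        print(f"{mbps} Mbps -> {mib_s:.2f} MiB/s -> {mb_s:g} MB/s")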

adrian_b · 3 years ago
No, some of your rates are wrong.

The so-called 5 Gb/s USB has a data rate of 4 Gb/s.

The marketing data rates for Ethernet are true, i.e. 1 Gb/s Ethernet has a 1 Gb/s data rate, but a 1.25 Gb/s encoded bit rate over the cable.

The marketing data rates for the first 2 generations of PCIe, for all 3 generations of SATA, and for USB 3.0 a.k.a. "Gen 1" of later standards, are false, being advertised as 25% larger than reality (because 8 data bits are encoded into 10 bits sent over the wire, and the extra bits carry no user data).

All these misleading marketing data rates were introduced by Intel, which did not follow the rules used in vendor-neutral standards like Ethernet.

So PCIe 1 is 2 Gb/s, PCIe 2 & USB 3.0 are 4 Gb/s and SATA 3 is 4.8 Gb/s.

So USB "5 Gbps" => 500 MB/s (not 625 MB/s), and after accounting for protocols like "USB Attached SCSI Protocol", the maximum speed that one can see for a USB SSD on a "5 Gbps" port is between 400 MB/s and 450 MB/s.

The same applies to a USB Type-C connection with 2 x 5 Gb/s links.

As other posters have already mentioned, USB 3.1 a.k.a. the "Gen 2" of later standards has introduced a more efficient encoding, so its speed is approximately 10 Gb/s.

The "10 Gbps" USB is not twice as fast as the "5 Gbps" USB; it is about 2.5 times as fast, and this is important to know.
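
A quick sketch of that correction in Python (standards and marketed rates as listed above; the 8b/10b factor is the whole story here):

    EIGHT_B_TEN_B = 8 / 10  # 8 data bits in every 10 line bits

    marketed_gbps = {
        "PCIe 1.x (per lane)": 2.5,
        "PCIe 2.x (per lane)": 5.0,
        "SATA 3": 6.0,
        "USB 3.0 / 'Gen 1'": 5.0,
    }

    for name, rate in marketed_gbps.items():
        data = rate * EIGHT_B_TEN_B
        print(f"{name}: marketed {rate} Gb/s, actual {data:.1f} Gb/s ({data * 125:.0f} MB/s)")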

Someone · 3 years ago
No USB On-The-Go (https://en.wikipedia.org/wiki/USB_On-The-Go) or Wireless USB (https://en.wikipedia.org/wiki/Wireless_USB)?

USB is a triumph of marketeers over engineers. All these things are called USB because USB sells (see also: Bluetooth).

IshKebab · 3 years ago
I don't know anything about Wireless USB, but USB OTG is called USB because it is USB. It's not some totally unrelated protocol.
jandrese · 3 years ago
I thought OTG was just changing where the host controller sits in the USB relationship? So you can have a device that acts as a client when hooked to a computer, or as a host when hooked to a thumb drive/webcam/etc.
ChrisRR · 3 years ago
Bluetooth Smart aka. Bluetooth Low Energy aka. Wibree aka. not actually bluetooth
kashunstva · 3 years ago
> May 05, 2025

The article is dated May 5, 2025. I've long been wondering about the future of USB.

vesinisa · 3 years ago
OP left [2025] out of the title.
WithinReason · 3 years ago
I could still fix it, but I fear the wrath of Dang
Beta-7 · 3 years ago
USB 4.2 (later renamed to USB 3.2 Gen 2 Mk. 1) comes with built-in time travel. They just keep adding features to the protocol and making it complicated.
Fatnino · 3 years ago
It's a form of SEO. Google promotes "fresh" content, so if it sees a date less than a year ago it often assumes the content is better. Normally you will see this abused by crappy content mills using a plugin that constantly updates the date on their garbage.

Putting a static date 3 years in the future seems like a quick and dirty hack to do the same thing.

notorandit · 3 years ago
Not to be read before: see article time stamp
IvanK_net · 3 years ago
Fun fact: USB 2.0 webcams have existed for over 10 years. USB 2.0 is 60 MB/s.

A pixel of an image is 3 bytes. A 1920x1080 FullHD image is 6.2 MB. At 30 frames per second, one second of FullHD video is 186 MB. How did they do that?

Answer: frames are transferred as JPEG files. Even a cheap $15 webcam is a tiny computer (with a CPU, RAM, etc.) which runs a JPEG encoder program.
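
A back-of-the-envelope check in Python (the ~53 MB/s effective bulk figure comes up further down the thread; 60 MB/s is just the signaling rate):

    USB2_EFFECTIVE_MB_S = 53  # approximate max bulk throughput, not the 60 MB/s signaling rate

    for name, w, h in (("480p", 640, 480), ("720p", 1280, 720), ("1080p", 1920, 1080)):
        mb_s = w * h * 3 * 30 / 1e6  # 24-bit RGB at 30 fps
        verdict = "fits" if mb_s <= USB2_EFFECTIVE_MB_S else "needs compression"
        print(f"{name}: {mb_s:.0f} MB/s -> {verdict}")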

pseudosavant · 3 years ago
Most webcams, especially 10 years ago, are not 1080p or even 60 fps. Many aren't even 720p. 1280 x 720 x 3 bytes x 30 fps = ~80 MB/s. 480p @ 30 fps = ~26 MB/s. That is how many webcams can get by without hardware JPEG/H.264 encoding.

4K @ 60 fps = ~1.4 GB/s. USB 3, even with 2 lanes, will have trouble with that.

kevin_thibedeau · 3 years ago
The cheap ones are using hardware JPEG encoders. The associated micro isn't powerful enough to do it in firmware alone.
masklinn · 3 years ago
Surprised they don't use a hardware video encoder. Is it because the well and efficiently supported formats are all MPEG, and thus have fairly high licensing costs on top of the hardware? Or because even efficient HVEs use more resources than webcams can afford? Or because inter-frame coding requires more storage, which (again) means higher costs, which (again) eats into the margin, which cheap webcam manufacturers consider not worth the investment?
grishka · 3 years ago
Hm. But then wouldn't it make more sense to just stream the raw sensor data, which is 1 byte per pixel (or up to 12 bits if you want to get fancy), and demosaic it on the host? Full HD at 30 fps would be 59.33 MB/s, barely but still fitting into that limit.

But then again, I think some webcams use H.264? I remember reading that somewhere.

monocasa · 3 years ago
The pixel count doesn't generally refer to the density of the Bayer pattern, which can be even denser. Generally a cluster of four Bayer subpixels (RG/GB) makes up one output pixel, but like most things in computing, the cognitive complexity is borderline fractal and this is a massive simplification.
masklinn · 3 years ago
> Full HD at 30 fps would be 59.33 MB/s, barely but still fitting into that limit.

It's not fitting into anything, I fear; best case, the effective bulk transfer rate of USB 2.0 is about 53 MB/s.

60 MB/s is the signaling rate, but that doesn't account for the framing or the packet overhead.
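
For reference, the ~53 MB/s figure falls out of the USB 2.0 bulk limits: at high speed, at most 13 packets of 512 bytes fit in each 125 µs microframe. A tiny sketch:

    packets_per_microframe = 13     # max bulk transactions per microframe (USB 2.0 spec)
    packet_bytes = 512              # high-speed bulk max packet size
    microframes_per_second = 8000   # 1 s / 125 µs

    max_bulk = packets_per_microframe * packet_bytes * microframes_per_second
    print(f"{max_bulk / 1e6:.2f} MB/s")  # 53.25 MB/s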

verall · 3 years ago
It would need a funny driver, and since that stuff is big parallel image processing it's easy in HW; but if someone has a netbook or cheap/old Celeron crap, it would peg their CPU to do the demosaic and color correction.
Dylan16807 · 3 years ago
> Full HD at 30 fps would be 59.33 MB/s, barely but still fitting into that limit.

That limit is too high even as a theoretical max.

You could do raw 720p.

BayAreaEscapee · 3 years ago
I don't know where you get "1 byte per pixel" from. Raw 4:2:2 video would be two bytes per pixel, and RGB would be three bytes per pixel with 8-bit color depth.
verall · 3 years ago
It needs a uC with some special hardware anyway to do the demosaic, or else it would require special drivers that would peg some people's crappy laptop CPUs.

Also, raw YUV 4:2:0 is 1.5 bytes per pixel, so that's doing half of the "compression" work for you.
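
Where the 1.5 bytes/pixel comes from (8-bit YUV 4:2:0 stores full-resolution luma plus two chroma planes subsampled 2x2), as a tiny sketch:

    luma = 1.0            # Y: one byte per pixel
    chroma = 2 * (1 / 4)  # U and V: one byte per 2x2 pixel block each
    print(luma + chroma)  # 1.5 bytes per pixel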

0xTJ · 3 years ago
Just how much do you have to hate consumers to come up with a scheme like this? Increment revisions as you add more features, add something to the end to say how fast it goes. The 3.2 renaming is idiotic.
willis936 · 3 years ago
USB 4 AKA USB 4 Gen2x2

USB 4 (opt) AKA USB 4 Gen3x2

They had a chance to fix their colossal fuckup and they decided not to.

ThreePinkApples · 3 years ago
In marketing and on cables they've chosen to use the terms USB4 20Gbps and USB4 40Gbps, so at least that's explicit. There are also official ways to mark cables as being 100W or 240W capable.
nolok · 3 years ago
Their issue was not the naming for consumers or tech users; their issue was "how do we allow any random laptop to claim the latest USB despite not actually supporting it".

It was super obvious with USB 3 and its sub-versions, and it gets even worse with 4.

paulmd · 3 years ago
Yes. The "IF" in "USB-IF" stands for implementers forum, it is a consortium of hardware companies who make devices. It's preferable to them if they can slap "USB 3.2 support!" on the box without having to redo their boards with a new, expensive component.

In other words, the incentives here are for USB-IF to promote customer confusion, not to reduce it, because that confusion can sell devices and push profit margins.

It's absolutely terrible that the EU is giving this group a legal monopoly on the ability to create and proliferate new standards. Their incentives fundamentally run against the consumer and they have repeatedly acted against the interests of the consumer. Unlike HDMI, there is no VESA to counterbalance them, it is USB or nothing, so you'll have to deal with these crappy standards going forward.

--

HDMI is doing something similar now too - "HDMI 2.1" is a completely hollow standard where every single feature and signaling mode added since HDMI 2.0 is completely optional. You can take HDMI 2.0 hardware and get it recertified as HDMI 2.1 without any changes - actually you must do this, since the HDMI Forum is not issuing HDMI 2.0 certifications any more, only HDMI 2.1 going forward; the new standard "supersedes" the old one entirely.

So - "HDMI 2.1" on the box doesn't mean 4K120 support, it doesn't mean VRR support, it doesn't mean HDR support. It could actually just literally be HDMI 2.0 hardware inside. You need to look for specific feature keywords if that is your goal.

https://arstechnica.com/gadgets/2021/12/the-hdmi-forum-follo...

https://www.youtube.com/watch?v=qo9Y7AMPn00

Kab1r · 3 years ago
USB versioning is such a clusterfuck.
411111111111111 · 3 years ago
There was a really short timeframe when I was really positive about USB, but that's long since passed.

They should've never allowed cables to provide only some capabilities and still get the branding. Having capability levels for connectors was fine IMO, but also accepting them for cables was bad, because you cannot really find out what a cable supports or where the issue originates if something goes wrong.

jsjohnst · 3 years ago
It’s why I always buy TB3 (or now TB4) cables rather than a cheaper USB-C to USB-C. Due to the strict requirements on TB cables, you can pretty much guarantee it’ll support any use case (alt modes, PD, etc). Sometimes overspending is worth the headache prevention.

can16358p · 3 years ago
So on the next versions of USB, the cable length will get shorter and shorter until the max gets to 5cm?

While I get the technical reasoning about high frequency/attenuation etc that limits cable length as speeds go higher, there are obviously some practical limits to how short cables can be.

How that would be solved, I don't know.

moffkalast · 3 years ago
Keep the same speeds, add more wires.

zamadatix · 3 years ago
I'm confused about what that section is supposed to represent. E.g. Apple has a 3 meter USB 4 Gen 3x2 (40 Gbps) cable, but the "cable" value for that section is listed as 0.8m. The only hit I'm getting in the USB 4 spec for "0.8" is on page 59, referring to maximum receiver insertion loss in dB for a Gen 3 connection including a 0.8m passive cable, but that in itself isn't a hard limitation on cable length.
CoastalCoder · 3 years ago
Not my area of expertise, but maybe some (unrealistic) options include using fiber optics for the data lines, or adding more data lines.
birktj · 3 years ago
There already exist some fiber-optic USB cables that come in lengths >50m and with support for USB 3.1, so it doesn't seem like a very unrealistic option.
dual_dingo · 3 years ago
I guess at some point optical will be the only way forward.

Having more data lines in a serial bus is interesting, as the whole reason to go from parallel lines (e.g. Centronics, ATA/SCSI or ISA/PCI buses) to serial (SATA/SAS, PCIe, USB) was that coordinating multiple data lines became impossible due to physical limitations (e.g. minimal differences in cable lengths started to matter).