nerdsniper
today at 10:49 AM
The cable can report what it "thinks" it is, and in fact, modern USB-C cables do this: they have "e-Marker chips" inside the plugs which communicate with whatever they're plugged into and enumerate their belief as to their capabilities. The thing is, manufacturers can program the e-Marker to spew lies, or a cable that supported 80Gbps when new can get slightly damaged after 6 months of use and then only reliably transmit 10Gbps.
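For a sense of what the e-Marker actually reports: it answers a Discover Identity request with a few 32-bit "VDO" words. Here's a rough sketch of decoding some fields of a Passive Cable VDO - bit positions paraphrased from the USB PD 3.x spec, so treat them as approximate and check the real spec before trusting them:
```python
# Rough sketch: decoding a few fields of a USB-PD Passive Cable VDO,
# the 32-bit word an e-Marker returns in response to Discover Identity.
# Bit positions paraphrased from the USB PD 3.x spec -- verify against
# the actual spec before relying on them.

SPEEDS = {
    0b000: "USB 2.0 only (480Mbps)",
    0b001: "USB 3.2 Gen1 (5Gbps)",
    0b010: "USB 3.2 Gen2 / USB4 Gen2 (10/20Gbps)",
    0b011: "USB4 Gen3 (40Gbps)",
    0b100: "USB4 Gen4 (80Gbps)",
}
CURRENTS = {0b01: "3A", 0b10: "5A"}

def decode_passive_cable_vdo(vdo: int) -> dict:
    return {
        "highest_speed":    SPEEDS.get(vdo & 0b111, "reserved"),
        "vbus_current":     CURRENTS.get((vdo >> 5) & 0b11, "reserved"),
        "max_vbus_voltage": f"{20 + 10 * ((vdo >> 9) & 0b11)}V",
    }

# Whatever the chip answers here is taken on faith -- nothing in the
# protocol verifies the copper can actually do what the VDO claims.
print(decode_passive_cable_vdo(0x1C0023))
```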
Power capacity is relatively easy to measure ad-hoc via the voltage drop from one end to the other... USB-PD controllers already do this and can even fine-tune the voltage so that if the device receiving (sinking) power needs 20V, they'll send 20.4V or 20.9V to compensate for the voltage drop and the device being charged gets 20V on its end.
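To put rough numbers on that (the 0.17 ohm round-trip resistance is a made-up but plausible figure for a ~2m cable; the 20mV granularity is the real PPS step size):
```python
# Back-of-the-envelope sketch of USB-PD voltage-drop compensation.
# Cable resistance is a hypothetical round-trip figure for a ~2m cable;
# 20mV is the actual PPS voltage adjustment granularity.

def source_voltage(v_sink_target: float, current_a: float,
                   r_cable_ohms: float, step_v: float = 0.020) -> float:
    """Smallest PPS setpoint that still delivers v_sink_target at the sink."""
    v_drop = current_a * r_cable_ohms    # V = I * R across the cable
    needed = v_sink_target + v_drop
    steps = -(-needed // step_v)         # round up to the next 20mV step
    return steps * step_v

# 20V at 5A through ~0.17 ohm of round-trip cable resistance:
v_out = source_voltage(20.0, 5.0, 0.17)
print(f"source sets {v_out:.2f}V to deliver 20V")   # ~20.86V
```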
But actual maximum data throughput is hard to know. The only way to really "know" how much data can flow through a cable is with an expensive oscilloscope or cable tester. 80Gbps cables run at a fundamental of ~13GHz, so at minimum you need a 26GHz scope (Nyquist–Shannon sampling theorem), or more practically a 52GHz scope. And it turns out it's really expensive to measure electrical signals 52 billion times per second. The necessary devices start at $15,000 on the very low end for a cable signal integrity tester [0] that only works up to 10Gbps USB 3.2 cables, and run past $270,000 for a proper 60GHz oscilloscope [1] that can handle 80Gbps USB4 cables.
On the high end, a signal integrity test setup can actually cost $1-2 million [2]: the base unit starts at $670,000, and then you spend additional money on hardware-accelerated analysis, specialized active probes, and the specific PAM-3 / USB4 compliance software packages.
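If you want to sanity-check those bandwidth numbers, the arithmetic is short (2 lanes and PAM-3 signaling per the USB4 v2 spec; the 2x/4x multipliers are the Nyquist floor and the usual engineering rule of thumb):
```python
# Why an 80Gbps cable needs a ~50GHz scope: USB4 v2 moves 80Gbps as
# 2 lanes x 40Gbps each, using PAM-3 signaling (3 levels per symbol).
import math

bits_per_lane = 40e9
bits_per_symbol = math.log2(3)                 # ideal PAM-3: ~1.585 bits/symbol
symbol_rate = bits_per_lane / bits_per_symbol  # ~25.2 GBaud (the spec rate is
                                               # 25.6 GBaud with coding overhead)
fundamental = symbol_rate / 2                  # ~12.6 GHz worst-case tone

print(f"symbol rate:   {symbol_rate / 1e9:.1f} GBaud")
print(f"fundamental:   {fundamental / 1e9:.1f} GHz")
print(f"Nyquist floor: {2 * fundamental / 1e9:.0f} GHz bandwidth")
print(f"comfortable:   {4 * fundamental / 1e9:.0f} GHz bandwidth")
```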
0: https://www.totalphase.com/products/advanced-cable-tester-v2...
1: https://www.edn.com/12-bit-oscilloscope-operates-up-to-65-gh...
2: https://www.eevblog.com/forum/testgear/uxr1104a-infiniium-ux...
alex43578
today at 10:56 AM
I get that to properly test a cable, you need that level of accuracy, but for home use, couldn’t you get away with a source and a receiver that are far cheaper?
If a USB4 device can output a USB4 stream and the receiver can check that stream for errors, isn’t that sufficient?
nerdsniper
today at 11:11 AM
At some point you end up testing the peripheral and/or host rather than the cable. For example, cables often state that they can handle up to 240W ... but no 240W USB-PD chip has ever gone into production -- you won't even find one at the hottest USB-PD trade shows[0] in China.
It could be reasonable for computers to be allowed to trigger a data throughput test, where the peripheral states "I support up to 40Gbps of receiving/sending" and then sends a simple pattern that can be generated on the fly. But a lot of devices can't receive/send data at that rate for long enough to perform a decent test - the storage, RAM, buffers, etc. get depleted or act as bottlenecks.
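That "simple pattern" already exists in the test-equipment world as PRBS sequences. A minimal sketch of the idea - though note a software generator like this tops out many orders of magnitude below 80Gbps, which is exactly the bottleneck problem:
```python
# Minimal sketch of BERT-style testing: generate a PRBS-31 bitstream
# (the x^31 + x^28 + 1 LFSR used by real bit-error-rate testers),
# push it through the link under test, and count mismatches.

def prbs31_bits(seed: int = 0x7FFFFFFF):
    """Yield the PRBS-31 sequence one bit at a time."""
    state = seed & 0x7FFFFFFF
    while True:
        bit = ((state >> 30) ^ (state >> 27)) & 1   # taps at 31 and 28
        state = ((state << 1) | bit) & 0x7FFFFFFF
        yield bit

def bit_error_count(received_bits, seed: int = 0x7FFFFFFF) -> int:
    """Compare received bits against the expected PRBS-31 sequence."""
    expected = prbs31_bits(seed)
    return sum(r != next(expected) for r in received_bits)

# Loopback sanity check: corrupt 3 bits out of 10,000 and count them.
gen = prbs31_bits()
sent = [next(gen) for _ in range(10_000)]
recv = sent.copy()
for i in (17, 4242, 9001):
    recv[i] ^= 1
print(bit_error_count(recv), "bit errors")   # -> 3
```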
If you know enough to accurately interpret the measurements you'd get from that, you know enough to write your own program to try to send 80Gbps from one computer to another and use DMA to process it in real time without hitting storage (something a lot of peripherals likely don't have the CPU to accomplish).
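Something shaped roughly like this - plain TCP sockets here, which will saturate a CPU core long before 80Gbps, which is exactly why the real thing needs DMA / kernel-bypass:
```python
# Crude throughput test over TCP, just to show the shape of the idea.
# Plain sockets saturate a CPU core long before 80Gbps -- hitting line
# rate for real needs DMA / kernel-bypass, not a Python loop.
import socket, time

CHUNK = b"\xA5" * (1 << 20)   # 1 MiB of a fixed pattern
DURATION = 5.0                # seconds to blast data

def sender(host: str, port: int) -> None:
    with socket.create_connection((host, port)) as s:
        sent, start = 0, time.monotonic()
        while time.monotonic() - start < DURATION:
            s.sendall(CHUNK)
            sent += len(CHUNK)
    print(f"sent {8 * sent / DURATION / 1e9:.2f} Gbps")

def receiver(port: int) -> None:
    with socket.create_server(("", port)) as srv:
        conn, _ = srv.accept()
        with conn:
            got, start = 0, time.monotonic()
            while (data := conn.recv(1 << 20)):
                got += len(data)
            elapsed = time.monotonic() - start
    print(f"received {8 * got / elapsed / 1e9:.2f} Gbps")

# Run receiver(5001) on one machine, sender("10.0.0.2", 5001) on the other.
```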
If you don't know enough to write those test applications, you probably don't know enough to interpret the results of a built-in test function, and the measurements would confuse and frustrate a lot of well-meaning, nerdy, but under-educated consumers who'd make assumptions about why they're not actually getting the rated speed.
Idk, my opinion doesn't go one way or the other here. Perhaps I myself don't quite know enough to be a good judge of that concept.
0: https://asiachargingexpo.com