Which CalDigit dock? I have the TS3 Plus. Using its Ethernet port and Thunderbolt 3 to my laptop, I'm getting the expected 1,000 Mbps connection on my home network. Do you have a different model or maybe a defective unit?
You need a tester like the FNIRSI FNB58 (not an affiliate link: https://www.amazon.com/gp/product/B0BJ253W31). This is just an example and not a personal recommendation, as I've just started looking into these myself.
They have fixed this: the modern spec has speed and power rating logos that (good) manufacturers can put on the cables. Just assume anything without a logo on it is USB 2 / 5V only, and buy some new cables you know the specs of.
The fact that cables have varying speed and power seems like a failure at launch. Who benefits from this? Manufacturers trying to save money on cables? Fuck that. This just means we'll use the cables that actually work everywhere to death and throw the rest out. What a waste.
Well, there are always going to be varying speeds and power, because needs change and tech improves; as the spec improves over time, the cables have to vary. Either you change the cables entirely (creating even more waste, since now you have to change even if you don't need the higher spec), or you have varied ones. Also, right now they can do higher speeds at shorter lengths but not longer ones, so if you had only one speed, you'd have to cap cables at a short maximum length, or pay for very expensive active cables for any long cable, even if you only need it for power.
Even if it were purely cost, I think we still benefit: the alternative is that cheaper devices use a different, cheaper cable spec, and you end up with many different cable types that you can't interchange at all. Sure, maybe I won't get max speed from one cable, but still being able to get half speed and/or power is better than a cable I can't use at all.
Honestly, I've just never got this criticism: the alternative is "have completely different cables", and that has literally all the same problems but worse, apart from "it can be hard to identify what this cable supports", which is solvable in much better ways than making them incompatible (as with the cable I'm looking at on my desk right now, which explicitly says 40Gbps/240W on it).
I grew up in the era of every mobile phone having a different charger, and it was always a nightmare when you were at a friend's house and they didn't have the one you need. I may not get max charging speed, but now everyone always has a charger I can use for any of my devices. Please, I don't want to go back.
Colored electrical tape or heat shrink labels at both ends of each cable with a simple coding system (P=Power delivery wattage, D=Data speed, T=Thunderbolt) solves this problem permanently.
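For instance, a throwaway sketch of what those labels could encode (this is just the ad-hoc convention above, not any standard):

    # P = Power Delivery wattage, D = data speed in Gbps, T = Thunderbolt
    def cable_label(pd_watts, data_gbps, thunderbolt=False):
        parts = [f"P{pd_watts}", f"D{data_gbps}"]
        if thunderbolt:
            parts.append("T")
        return " ".join(parts)

    print(cable_label(100, 10))        # P100 D10
    print(cable_label(240, 40, True))  # P240 D40 T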
The USB IF really should have specified some labeling for cables. The icons used on USB-A connectors are too complicated. What I think would work well is colored rings, with some for the different USB3 speeds, and some for the 100W and 240W charging.
Don’t be disingenuous. The branding guidelines are for manufacturers. Consumers just read the logo, which straightforwardly lists the speed and power capacity.
For example: USB 5Gbps is the same regardless of whether it's branded USB 3.0, 3.1, or 3.2 Gen 1. In fact, customers don't need to know that "3.0" number; they just need to know their ports support 5Gbps.
Thunderbolt cables have always been marked either on the ends or with a sticker wrapped around the cable. Everything else can be assumed to be a 2.0 cable at 60W/140W.
Not strictly related but I just bought a USB4 USB-C cable which is rated at 40 Gbps. I still can't really believe it. (I still remember saving to cassette tape)
I use one of these for two 4k monitors, sound, ethernet, mouse, webcam, AND charging. It's amazing having one cable to plug in when I work from my desk. Unfortunately requires one of those $400 docks though.
Work bought me a VisionTek hub. I wanted the 1 cable life - unfortunately, it only does monitors via DisplayLink, aka compressed & streamed to & from my desk. It's noticeably fuzzy.
So now it's 2 cables: 1 from the hub, 1 from the monitor. Both USB-C.
WTF guys?
My Apple monitor from 2009 just worked with 1 cable (no power, but still).
I just have the cable. I don't have a computer or a device that can transfer that quickly. Won't be long though! Actually, it looks like those 6K monitors @60Hz will fill the pipe.
Yes. I have the exact same story: around 3 completely screwed hubs till I got to a mostly decent working one. I hate Apple for the port shenanigans; the ports are so close together that sometimes plugging something into one port blocks the other. I can even block one port with my headphones cable, that's how close together they are. Truly an idiot designed this. Also, USB-C really feels more flimsy than USB-A, and the cables I insert are quite shaky already after 1 year of usage.
Same goes for the 3.5mm jack on phones. The freaking adapters are just an awful thing to use. They are just bad and they always break. The port gets so loose that after 3 months of use they just start falling out of it. There is no decent phone left with a 3.5mm jack, which is a really sad state of things... Unless you know one? Feel free to suggest.
Do you clean your USB-C ports? I used to get frustrated thinking the port was worn as cables would come out easily. I used a plastic dental pick to clean the port and it felt brand new afterward. Lint and fine debris gets in the port and then a cable insertion compacts that debris/lint. Over time this compaction layer builds up and cables no longer have enough depth to properly lock onto the port. It was shocking how much debris was compacted into the port.
Man that is a lot of computer to put into that product for as little money as possible.
I'm not trying to excuse these products, whose likely-bad firmware causes most of these issues, especially around Ethernet PHYs. But in my professional experience doing embedded device firmware, Ethernet PHYs are always the biggest pain, hands down. The firmware included with them has many, many ways to be configured wrong, and the defaults are typically random settings, not a reasonable configuration. Just getting the drivers running sometimes comes down to knowing a few tricks.
Anyway, it doesn't surprise me that many have trouble working right, especially when the article indicates they're all essentially running OEM firmware.
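For a taste of what "a few tricks" means, here's a minimal sketch of a clause-22 PHY bring-up check (Python pseudo-driver; the BMCR/BMSR offsets and bits are the standard IEEE 802.3 ones, while mdio_read/mdio_write stand in for hypothetical board-specific hooks):

    BMCR, BMSR = 0x00, 0x01
    BMCR_RESET, BMCR_AN_ENABLE, BMCR_AN_RESTART = 1 << 15, 1 << 12, 1 << 9
    BMSR_AN_COMPLETE, BMSR_LINK_UP = 1 << 5, 1 << 2

    def phy_bring_up(mdio_read, mdio_write, phy_addr):
        mdio_write(phy_addr, BMCR, BMCR_RESET)         # soft reset
        while mdio_read(phy_addr, BMCR) & BMCR_RESET:  # bit self-clears
            pass
        # enable and restart autonegotiation, then check the result
        mdio_write(phy_addr, BMCR, BMCR_AN_ENABLE | BMCR_AN_RESTART)
        status = mdio_read(phy_addr, BMSR)
        return bool(status & BMSR_AN_COMPLETE) and bool(status & BMSR_LINK_UP)

Real bring-up would poll BMSR with a timeout and then walk vendor-specific registers, which is exactly where the per-PHY weirdness lives.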
However, I can't help but feel a little bit cheated by companies just buying off-the-shelf products, slightly modifying the case layout, and then quadrupling the price because it's "from a reputable company".
LOL. Welcome to the world of OEM/ODM. As a conservative estimate I'd guess >95% of all consumer electronics is done this way. Even the big names like Apple, Dell, Lenovo, etc. do it.
However, if you are - according to Wikipedia - a two-billion-dollar company like Realtek, then I expect you to get your shit together. There are exactly zero excuses for Realtek to not have a driver release ready almost a year after Big Sur was announced. Zero.
Mac users are in the minority. It's worth noting that the RTL8153 is a native RNDIS device, which has its history on the Windows side, and Realtek has only started contributing drivers to Linux relatively recently.
FWIW I've had great luck with Realtek NICs, although I don't specifically recall using their USB NICs.
> I've had great luck with Realtek NICs, although I don't specifically recall using their USB NICs.
I envy you. Realtek NICs (especially over USB) are tantamount to trash in my mind after 2 decades of fighting their frequent failures: failures to initialize at all, driver crashes, piss-poor feature sets (or claimed features that don't work at all), and a myriad of other problems. Granted, they usually work in Windows, but I don't work in Windows (I live and work in Linux/BSD). It has become my personal policy to avoid/disable any Realtek NICs and replace them with something actually functional whenever possible.
Hopefully their work on linux-side drivers will change this given their proliferation.
To be honest I've yet to find a reliable USB based network interface regardless of chipset/brand/manufacturer, outside of the ones that do PCIe passthrough via USB4/Thunderbolt and those tend to be quite expensive (though they are starting to come down in price).
The problem with USB NICs is now you have two flaky failure points - the chipset driving the USB port and the chipset driving the USB network interface.
I had reliability issues using a Realtek 2.5 Gbps USB network interface. It kept locking up, or having latency issues, until I switched which USB port I plugged it into (one that used a different chipset); after that it was solid.
Realtek itself (questionable quality on a good day).
The implementation of Realtek by the ODM/OEM/SI into whatever part is being shipped. Given that Realtek is the de facto "budget" option for networking, this is often done as cheaply and shoddily as possible, even when the chip itself isn't crapware.
And the USB interface, as you point out. There's a whole rabbit hole there that I'm unfortunately all too familiar with when it comes to diagnosing and dealing with USB. PCIe passthrough via a USB4/TB combo chip isn't as reliable as using PCIe directly, but it's still better than a non-PCIe-passthrough USB interface.
They use the USB-C physical interface for their modules, but that doesn't mean they actually use the USB protocol on the backend. Not sure how they implement it, to be honest, but it's at least entirely possible, for example, to run DisplayPort only (with no USB involved at all) through a USB-C PHY (and DP isn't alone in being able to do that).
I suspect a lot of the flakiness is not the chip itself but the fact that, because it's cheap, the bottom-of-the-barrel manufacturers will always use it instead of the significantly more expensive alternatives, and then further cut corners with the board layout and design.
Ironically, the only problems I've had with NICs were on an Intel and a Broadcom.
> I suspect a lot of the flakiness is not the chip itself
Most certainly. That doesn't change the fact that Realtek being present is a huge red flag, even on a non-cheap device, and regardless of whether the fault lies with Realtek or with the OEM/ODM/SI that integrated them into the system in question. It basically screams "we phoned this part in". That's certainly not always true, but it's true enough that I refuse to use them (be it by disabling them or just opting for entirely different hardware so I can avoid that headache).
Broadcom is certainly better than Realtek, but it's still "replace at soonest possible convenience" tier as well. Intel is far, far more reliable in my experience (save for some of their combo Bluetooth/WiFi cards; their dedicated wired Ethernet cards have always been great for me, though the i210/i211 class of integrated bottom-tier ones can be hit and miss).
Is it always the case that these white label products are all equivalent? That is, is there still some input from the purchasing company on choice of components, quality control, etc, and does that make a difference to the product?
Unfortunately I can't say I'm surprised that the common thread was Realtek network chips. I've found their NICs to be fairly flaky over the years - the majority work, but a solid minority don't. In contrast, Intel NICs have been bulletproof for me and I seek them out whenever I have any choice in the matter.
Looks like he only bought cheaper things, so no wonder they all eventually died. My USB-C hub is an HP Thunderbolt dock. It's beefy as heck, lasted for years with no issues. It has a tiny fan inside, which I assume helps with the longevity. I hear good things about CalDigit docks too. Those also are very expensive.
Expensive doesn't equal great either. My ThinkPad Thunderbolt hub, which is not cheap by any standard, can't route HDMI without randomly restarting itself every few minutes. Connecting the same cable directly to my ThinkPad laptop works perfectly fine, which sort of defeats the whole point of buying a hub. I've sort of given up hope of getting a high-quality hub - it's a money sink.
Yep, that one. I guess they don't have everything, but the everything-hubs tend to come with a lot of confusing pitfalls. I use that one mostly on my Steam Deck.
I've also given up on USB hubs and I'm using a Thunderbolt 4 dock to get more IO out of my Mac Studio. It feels crazy to spend that much money, but it solved my problems.
Back at one of my previous employers we had a long internal briefing about why our latest device did not have USB-C when other solutions on the market by then had.
The connector is solid but my god have there been disasters because of USB-C.
1. Power distribution up to high wattages, not always with auto sensing.
2. Wires rated for different data transmission speeds.
3. USB standard data transfers and Thunderbolt over the same connector and wire but most accessories are not rated for Thunderbolt.
For a real eye diagram: a signal/PRBS generator and a high-end oscilloscope. Those are in the many-tens-of-thousands-of-USD range.
Debug tools for e.g. FPGA transceivers or some redriver chips can measure BER and show fake eye diagrams. Those are in the few-hundreds to few-thousands of USD range, but you may need a company to sign NDAs. Eg.:
Thanks. If this type of equipment were more accessible and cheaper, we'd see fewer USB problems. This should also be the USB consortium's responsibility, but I guess this way they make more money with compliance testing.
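For intuition, the measurement itself is conceptually simple: clock a known pseudo-random pattern through the link and count mismatches. A minimal sketch (Python; PRBS-7 with the standard x^7 + x^6 + 1 polynomial, and the "link" simulated by flipping one bit):

    from itertools import islice

    def prbs7(seed=0x7F):
        state = seed
        while True:
            newbit = ((state >> 6) ^ (state >> 5)) & 1  # taps at bits 7 and 6
            state = ((state << 1) | newbit) & 0x7F
            yield newbit

    sent = list(islice(prbs7(), 10_000))
    received = sent.copy()
    received[1234] ^= 1  # pretend the channel flipped one bit
    errors = sum(s != r for s, r in zip(sent, received))
    print(f"BER ~ {errors / len(sent):.2e}")  # 1.00e-04

The expensive part is doing this at 10-40 Gbps in hardware and recovering the analog eye, not the math.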
recently went through the process of picking a new hub and it took hours to locate one actually appropriate to my use case. Helpfully enough, chatgpt/claude were both good at locating ones with specific needs (x USB-C ports, y USB-A, no HDMI, no SD card readers), would probably have taken a lot longer without.
LLMs really are fantastic tools for shopping and product comparison. It's going to suck once the marketers manage to successfully do the equivalent of SEO on them
I’m so glad to hear other people have had issues with these stupid cheap docks. I’ve burned through three of them over the last few years. It seems that not routing laptop charging power helps, but doesn’t solve the issue. Stupid cheap products.
Years ago, I bought a hub similar to that Satechi one. It worked great until I unplugged my laptop. Then my home network would die. After some sleuthing, I realized the RJ45 interface repeated any packets received, creating a network loop and confusing my Ethernet switch. I contacted Satechi about this obvious defect. Their support team insisted this was by design and told me to unplug USB-C power from the hub whenever I disconnected my laptop, which … kinda defeats the point?
Was it actually repeating packets or was it sending out pause frames?
In my experience USB ethernet adapters send out pause frames which shit-tier switches replicate to all ports in direct contravention of the ethernet specifications.
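If you want to check whether your own dock is spewing them, a quick sketch with scapy (assumes Linux, root, and an interface name of eth0; 802.3x pause frames use EtherType 0x8808, sent to 01:80:c2:00:00:01):

    from scapy.all import sniff, Ether

    def is_pause(pkt):
        # MAC control frames; opcode 0x0001 in the payload is PAUSE
        return Ether in pkt and pkt[Ether].type == 0x8808

    frames = sniff(iface="eth0", lfilter=is_pause, timeout=30)
    print(f"saw {len(frames)} MAC control (pause) frames in 30s")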
Wow me too. This was very confusing and annoying as my wife's cheap dock would take down the network while we were both working from home. It took several annoying incidents before we connected the incidents to the dock.
I've been on a MacBook M1 Air for the last few years and wanted multiple screens, so I got a USB 3 hub (Dell D6000) which does DisplayLink. I had almost everything hooked in there, but still connected one screen directly via USB-C. DisplayLink is good for an M1, as you can add loads of screens if you want, but you can't watch streaming services on them, as macOS thinks your screen is being 'watched'.
I did want a thunderbolt hub but as far as I could tell at the time Thunderbolt and Displaylink just didn't come in the same package, so I was stuck with two cables.
Three years on, I picked up an M4 machine that can do two external screens natively. Great, I can reach my goal of only plugging one cable into my MacBook. But the Dell can't do that on a Mac because of something or other, meaning it would work as mirrored-only.
Time to grab an actual thunderbolt dock. I picked up an HP TB3 refurb (HP P5Q58AA) which was cheap (30 AUD) and seemed great on paper, only to find it needed a type of power adaptor I didn't have that put me back another 60 bucks, and when I got it all hooked up it didn't always work and couldn't support decent refresh rates on my monitors, with one always being stuck at 30Hz. There was a software update available, but for that you need a windows machine with a TB3 port, not something I have.
So then I grabbed a Kensington SD5750T, which was another 140 bucks, but I am pleased to report that it just works and supports dual 4k60 over thunderbolt/USB-C. There is no HDMI or Displayport on this thing, but my monitors both have USB-C in so... Unfortunately, now that I've read the article, I can also confirm it contains a Realtek 0x8153, and is an OEM'd 'goodway' dock.
I 'member hearing stories about an early USB C hub which contained a network chip that, when no computer was attached but it had power from the brick, would randomly barf invalid ARP packets, up to and including taking entire networks down. Anyone 'member details?
I'm convinced the only way to get a quality piece of hardware is to spend a couple hundred bucks on a Thunderbolt 4 or 5 hub. I got a TB4 hub from CalDigit and it works great. And no misbehaving network chip to be found either.
I just bought a monitor arm that has a USB dock at the base. The dock is actually just a USB extension cord that is enclosed by the base of the arm. Think of it like putting a $10 USB hub into a box and poking holes in it to stick wires through. Great teardown, makes you wonder what's truly inside these boxes.
In other news, I do think desk makers should start incorporating the USB dock inside the top board of a desk. People go through a lot of money and bullshit to keep their setup clean, especially those who swap computers.
> In other news, I do think desk makers should start incorporating the USB dock inside the top board of a desk. People go through a lot of money and bullshit to keep their setup clean, especially those who swap computers
Humanscale's docks on their monitor arms look integrated, but they're actually a really slick attachment that goes around the base nice and snug, so it just looks like one piece. It can be replaced if you take off the monitor; I really liked that. Humanscale is also extremely expensive as far as monitor arms go.
The selection isn't great yet, but you can get a USB charger or USB hub desk grommet (or one that does both).
Hopefully there will be a matching sized one (60mm) doing something useful, but if not you can just put a normal grommet cover in and do the stuff under the desk.
More accurate to say it’s a dock than a hub, but I’m using a Dell 2427DEB monitor[0] with my Dell work notebook and a second monitor daisy chained from it on the DP out port.
My work laptop has just a single USB-C cable plugged into the monitor for everything making it super trivial to plug it back in when I use it away from my desk (which I do regularly).
My personal desktop has a single DP and also a USB A to B cable. The monitor has KVM capability so I can super conveniently switch between them even with the two screens.
Cables are completely minimized with this set up, I’m very happy.
The only thing that’s unfortunate is that I occasionally work on a MacBook Pro 16” M4 and it can’t drive the second monitor over the same USB-C cable as Apple CBA to have support for DP MST on even their premium priced hardware. So I have to also plug in an HDMI cable to the second monitor.
Also unfortunate with the MacBook Pro is that macOS UI scaling doesn’t allow ratios like 125% meaning the UI elements aren’t quite at my preferred size. My Windows 11 handles this perfectly.
When I was tasked with acquiring desk setups for our whole company (small, new startup) back when everyone was using just whatever, I decided to go with a really similar setup: a Dell P2425HE monitor with integrated USB-C hub, and a Dell P2425H daisy-chained from it.
I'm really glad we went with that, so far they've been great (for software devs, no color sensitive stuff).
> macOS UI scaling doesn’t allow ratios like 125% meaning the UI elements aren’t quite at my preferred size
Try BetterDisplay, and enable "Flexible Scaling" (a per-display setting).
Bonus: enable "UI Scale Matching" and output will be the same physical size across all displays (as long as they report proper DPI, but I think you can hack that right there as well).
One of the things that I found most frustrating about USB-C hubs is how hard it is to find one that actually gives you multiple USB-C ports. I have several USB-C devices but most hubs just give you one USB-C port and a bunch of USB-A ports. At most it’s 2 USB-C ports but only with the hub that plugs into both USB-C ports on my MacBook Pro (so I’m never able to get more ports than I started with). The result is I end up having to keep swapping devices. For a connector that was supposed to be the "one universal port," it's weird that most hubs assume you only need one USB-C connection. Has anyone found a decent hub with multiple USB-C data outputs?
I'm in the same boat. It seems like the mindset for consumer-grade hubs is to provide support for as many old, legacy devices as possible, rather than a higher number of new devices.
Another problem is that USB-A ports are dirt cheap and simple to implement, so hub makers feel they're leaving free IO on the table by not sprinkling them on everything, whereas each "decent" USB-C port has enough complexity to make them think twice about adding it.
Nevertheless, there are a couple of options. Try searching for "USB-C only hub". You will get some results, but they are basically the identical product (same IO card), just with different housings. So you can pretty much count on these specs: 1 USB-C in for power, 3–4 USB-C out, 5 or 10Gbps each, Power Delivery at various wattages. No video support.
I have one of these on my desk right now, it's from the brand "Minisopuru", I get power and four USB-C "3.2 Gen 2" ports. It's fine. But like I said, it's no Thunderbolt, and no video support, so I have to "waste" the other port on my MacBook just for my external display.
There are also Thunderbolt / USB4 devices which will give you a bonkers amount of IO, including good TB / USB-C ports usually (plus some USB-A of course, as a spit in the face – so you'd need to ignore those). But these are not hubs, they are docks, which is a different product class entirely (big and heavy, more expensive, dedicated power supply).
Something I've been doing recently to salvage the USB-A ports I still begrudgingly encounter, while continuing to (force myself to) upgrade all my devices to type-C, are these: [0]. 1-to-1 USB-A male to USB-C female adapters. I just stick them in all USB-A ports I encounter, leave them there all the time, and move on with my life. It's a bit bulky and looks kinda stupid, but it basically gives me USB-C everywhere I need (including work-issued PCs and docking stations) for just a couple of bucks. For low-bandwidth devices like headphones, keyboard / mice / gamepads, or even my phone, it works perfectly fine.
[0] – https://www.amazon.com/UGREEN-Adapter-10Gbps-Converter-Samsu...
Belkin makes one https://a.co/d/8CchALB
I just bought one of these yesterday. It’s on the way so I can’t offer a review.
https://www.caldigit.com/thunderbolt-5-element-5-hub/
You can get them now. Thunderbolt and USB 4 hubs will often have multiple USB C ports and only need one plug. I have one that's more of a docking station:
https://www.aliexpress.com/item/1005008363288681.html
But you can also find them in smaller hub form.
Yes, I've bought a Chinese ("Acasis" brand) TB4 hub which has three TB4 downstream ports and a USB 3.x hub with three downstream 10 Gbps USB-C ports. There are also weird combos, like one downstream TB3 + three downstream USB-C 3.x. Still not great, but it's better than a single downstream port.
Can anyone tell me why I have several devices in my home that demand a certain USB-C cord in order to charge? They are mostly cheap Chinese devices that won't acknowledge a more expensive (e.g., Apple) USB-C cord plugged into them, even when plugged into the same transformer. They only charge with the cheap USB-C cord they came with. What gives?
Because the USB Consortium made a terrible mistake. Instead of speccing USB-PD power supplies to default to 5V <3A when there are no resistors in the port of the other device, the default is to do nothing. So in order to be in spec, you have to refuse to charge non-compliant ports. This means the compliant power supplies are worse, in a way. So you need to use a "dumb" USB-A power supply and a USB-A to C cable, which does default to 5V <3A no matter what. As for why some devices choose to use non-compliant ports - I assume it's extreme cheapness. They save a penny on a resistor.
It's not a terrible mistake. A terrible mistake would have been having such power available on ports that even a reasonable person might short out by innocently connecting a USB C cable between them.
A couple 5.1k resistors add about $0.00001 to the BOM cost. The terrible mistake is on the designers of devices who try to forego these.
It's really not the BOM cost that drives these decisions but the assembly cost of adding placements. Time on the assembly line is very valuable and doesn't have a simple / clean representation on a spreadsheet. It's dependent on the market and right now assembly line time is worth a lot.
That is exactly the reality. I work in a place where we build HW. The resistor costs almost nothing, but installing it, having it in stock, making sure the quality of yet another component is correct, possibly managing another vendor - it all costs. So much so that we assign a resistor a cost of a few cents (up to ten), even when the part itself costs so little that the software has trouble tracking it.
Except that connecting 5V to 5V does not cause a short circuit. No current will flow without a voltage difference. If there is a difference, the capacitors in one of the power supplies will charge up to the same voltage and then current stops flowing again.
That would be true if both sides were exactly 5.0V, but they're not. There's a 5% tolerance, from 4.75V to 5.25V, and in practice you will see chargers often run "5V" at 5.1V intentionally, to account for resistive loss in the charging cable. If you accidentally connect your "5V" device to the host's "5V" you may find that the host simply disables the port, which has happened to me more than once. So no, you can't just blindly connect them together.
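Back-of-the-envelope, with the tolerance numbers above and an assumed cable resistance:

    # Two "5V" rails at opposite ends of the 5% band, joined by a cable
    # with an assumed ~0.3 ohm round-trip resistance:
    v_high, v_low = 5.25, 4.75
    r_cable = 0.3
    print(f"{(v_high - v_low) / r_cable:.2f} A")  # 1.67 A of uncontrolled current

That current is limited only by the cable and whatever the two regulators tolerate, which is part of why sources are supposed to detect a sink before driving VBUS.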
Yes you can. They will not blow. Nothing will happen.
Thank god you're not responsible for designing these protocols.
No need to have a separate USB-A brick - simply have a USB-C brick plus C-to-A adapter. An adapter will force 5V down the port no matter what. But afaik you still need USB-A cable (or another adapter?), which kinda defeats the whole idea of having just one cable.
At this point I'm even surprised that compliant cables and chargers exist so the GP can have that problem.
But I believe the specs are that way to avoid problems with OTG devices. If both devices decide to just push a voltage into the cable at the same time, you risk spending a great deal of energy or even start a fire. That said, there are better ways to deal with this issue; those are just slightly more expensive.
I think the Apple USB-C charger I have is compliant and so is the cable. I actually use it to charge my Samsung phone primarily, but inadvertently discovered that it won't run a Raspberry Pi 4 at all. The $12 adapter that is sold for that purpose runs the Raspberry Pi 4 just fine. Apparently because it just supplies 5 volts all the time, no matter what the device says.
The Raspberry Pi 4 has a design error in its USB-C circuitry.
It does include a pull-down resistor, but wired incorrectly (compliant devices need two), which results in compliant chargers only correctly detecting it when using a “dumb” (i.e. containing no e-marker chip) USB-C-to-C cable. Your Apple cable probably has a marker (all their MacBook charging cables have one, for example).
Thanks for the explanation. I actually found out the USB-C plug can act as a USB device. At USB 2.0 speeds oddly enough. So I have all my Pi 4s configured now in that mode and I just power them through the 5 volt header, which seems simpler. Albeit less convenient.
I had to get this USB "power blocker" that only passes the data pins through, otherwise the Pi runs off the computer it is plugged into all the time
If it is all 5 volts, it will not do much, but perhaps that screwball PD stuff would get you in trouble. The OTG stuff just concerns who is the USB host: the OTG cable instructs a normally-client device to act as a USB host. The holy grail was to find that magical OTG cable that would let me charge the phone while it was acting as host. Hmmm... on reflection, that would be any dock, right?
And a rant, for free: holy smokes, I think OTG may be the second most braindead marketing-drivel sort of acronym to come out of tech, right behind WiFi (wireless... fidelity? what does that even mean?)
Maybe the USB-IF agrees, because OTG is no longer a thing with USB-C.
I guess this is only partially true, as I have an A-to-C charger cable from Huawei that works with everything except my Pixel 4A phone. And my Pixel 4A phone works with everything except that specific cable.
USB A-to-C cables are supposed to have an Rp pullup on CC1 and leave CC2 disconnected. Huawei made some A-to-C cables which (incorrectly, and spec-violatingly) have Rp pullups on both CC lines, which is how you signal you're a power-sourcing Debug Accessory.
Your Pixel 4A is entering debug accessory mode (DebugAccessory.SNK state in the USB-C port state machine); other devices probably don't support debug accessory mode and just shrug.
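Roughly, the sink-side decision looks like this (a simplified sketch of the spec's attach states; real hardware also distinguishes Rp current levels by CC voltage):

    def sink_attach_state(cc1_pulled_up, cc2_pulled_up):
        if cc1_pulled_up and cc2_pulled_up:
            return "DebugAccessory.SNK"  # both CCs high: the Huawei cable case
        if cc1_pulled_up or cc2_pulled_up:
            return "Attached.SNK"        # normal source; high CC = orientation
        return "Unattached.SNK"          # no source present

    print(sink_attach_state(True, False))  # spec-compliant A-to-C cable
    print(sink_attach_state(True, True))   # the out-of-spec cable above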
Maybe the cable is missing the CC pin resistors (all USB-A to C cables are supposed to have them to identify themselves as such), and maybe only the phone cares.
Could it be for safety reasons? Many devices with a USB-C port can be used underwater (smartphones).
Some badly designed USB-C devices don’t properly negotiate power supply, and as a result, only USB-A (since these always output 5V without any digital or electrical negotiation) or other non-compliant USB-C devices will actually charge them.
I’ve experienced this too and it’s not just no-names. I have a wireless gaming keyboard from SteelSeries, certainly a very legit brand. I lost the original USB-C cord. Tried every USB-C cord I could find, and they power the keyboard and charge it to exactly 1%, but no more.
Found plenty of people online with the same issue but no resolution.
Finally just paid the $25 to get the OEM SteelSeries replacement cable and it charges fully again. wtf… I guess the replacement cable was USB-A to C and I’ve only tried USB-C to C cables?
That's a big red flag. If their engineers won't even bother reading the USB-C documents, how can I trust them to do their job right?
I have a JBL speaker with the same issue: it can charge only with the included cable, no other.
They seem to be a popular brand, but can’t even get charging right. Ironically, the speaker doubles as a portable charger.
I have a USB-C JBL speaker (Flip 5) which charges alright with a USB-C to USB-C cable (and USB-C charger), but only in one direction.
So sometimes I have to plug it in, realize nothing is happening, unplug, flip the cable, and plug it in again for it to start charging.
Actually, in most situations with this problem it is possible to solder 2 additional resistors inside the offending USB-C device. I have done that on a flashlight and can confirm that it fixed the problem.
Adding SteelSeries to my never-buy list, along with Unicomp (my Unicomp literally died on me weeks after the 1-year warranty ended. Got told to buy another at full price; went to Ellipse instead at modelfkeyboards dot com for 4x the price and have never been happier).
I've had two Unicomps as my daily drivers since December 2011. No issues so far, other than having to use PS/2 adapters.
They are devices that don't do USB PD. Usually it is a USB-A to USB-C cord, and just provides 5V 500mA or higher.
It's not really PD. It's just that they aren't USB-C spec compliant at all. USB-C has the power pins at 0V by default, and you have to signal that there is a connected device to activate 5V, while USB-A has 5V hot all the time.
Since there aren't any active chips in these cables, an A-to-C cable happens to have 5V hot on the USB-C side, but this should not be relied on, as it isn't true for C-to-C.
Some are so not USB-C compliant and just "USB-A wires but with a USB-C plug" that they only charge in one orientation just like USB-A.
We can't have nice things.
PD is optional for USB-C devices, but these out of spec devices don’t even support the basic USB-C resistor-based identification scheme (which is mandatory).
In order to get anything from a USB-C power supply, a device needs to have 5.1kΩ resistors from the CC1 and CC2 pins of the USB-C port to ground. Devices that cheap out on these two resistors (which cost well under a cent each) will not get any power from the power supply.
I have purchased multiple devices like this over the years. In all cases, the device doesn't have whatever circuitry is required to make a USB-C PD charger send 5V down the line. Using a USB A-to-C cable works every time. Ironically, chaining a C-to-A adapter into an A-to-C cable then makes it work with a USB-C charger.
I've always ignored instructions that say to only use that product's USB cord (things like my kitchen scale and water flosser) and have never had an issue. Sounds like I've just gotten lucky though, based on your experience.
I was under the impression that the USB protocol just fell back to 5V 1A when power negotiation was unsure.
What kinds of devices?
USB-C is 0V by default and you have to signal to get anything at all. A lot of junky devices are non-compliant and aren't set up to signal for 5V, so they get 0V when plugged in with a C-to-C cable.
Ah that actually makes a lot more sense to start at 0. I appreciate the info.
How does it negotiate with a host-powered device if it's unpowered to begin with?
With resistors on the CC pins. In particular, there is a resistor value that indicates legacy USB charging; it's what the USB-A to USB-C adapters and cables use.
The manufacturers cheaped out by not including the right resistors.
I would also guess that some of these cases are designs that were adapted from previous USB mini- or micro-b configurations. Like an intern got the assignment, switched the connector, and wiped hands on pants, not realizing that an electrical change was required as well.
And if you spin the new board and it works with the A->C cable sitting on your desk, then what could possibly be different about plugging it into a C<->C cable, right?
> How does it negotiate with a host-powered device if it's unpowered to begin with?
Through a pair of resistors.
The unpowered device connects each of the two CC pins to the ground pin through a separate resistor of a specific value. The cable connects one of these CC pins to the corresponding pin on the other end of the cable (the second pin on each side is used to power circuitry within the cable itself, if it's a higher-spec cable). On the host side, each of the two CC pins is connected to the power supply through a separate resistor of another specific value. When you plug all this together, you have the host power supply connected to ground through a pair of resistors, which is a simple voltage divider. When the host detects the resulting voltage on one of the CC pins, it knows there's a device on the other end which is not providing power, and it can connect the main power pins of the connector to its power supply.
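Plugging in the spec's nominal values makes this concrete (a sketch; these Rp values are for a pull-up to 5V - the spec also allows 3.3V pull-ups and current sources - and Rd is the sink's 5.1k pull-down):

    RP = {"default USB": 56_000, "1.5 A": 22_000, "3.0 A": 10_000}
    RD = 5_100

    for advertised, rp in RP.items():
        v_cc = 5.0 * RD / (rp + RD)
        print(f"Rp {rp // 1000}k ({advertised}): CC sits at about {v_cc:.2f} V")
    # With no Rd at all, CC stays near the pull-up rail, the source decides
    # nothing is attached, and VBUS stays off - the cheap-device failure mode.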
This is a real issue.
I have 2 powerbanks that cannot be charged through their USB-C port when at 0%. The signaling circuitry simply doesn't work. No idea who designed this; it is silly beyond belief. I have to charge them with a normal 5V A-to-C cable for 30 seconds, then unplug, and then all the PD stuff starts working and I can fast-charge through the USB-C port again. I'm screwed without an A-to-C cable.
Holy shit. This explains why my Anbernic 406v is so weird. If I drain the battery too much, it won't let me charge with anything except a normal 5V USB A-to-C cable, and the USB-C cable that I use to charge it while it's on does nothing. It makes so much sense now.
This (and the GP) are because your device supports some sort of fast-charge USB-PD negotiation, but does not support what is known as "dead battery mode". Basically, dead battery mode enables those pull-down resistors by default (when no power is applied) so you can get 5V to the system, which eventually charges things up enough that the chip doing PD negotiation is powered. Usually this is done simply by having the negotiation chip default to its internal pull-down resistors when unpowered.
Thanks, that solution sounds decent. Hope more manufacturers do it, seems I got unlucky.
Using passive components.
> How does it negotiate with a host-powered device if it's unpowered to begin with?
The spec was written by a SW engineer, this explains some things. /s
USB-C hosts and power adapters are only allowed to provide 5V if they can sense a downstream device (either via a network of resistors or via explicit PD negotiation).
Out-of-spec USB-C devices sometimes skip that, and out-of-spec USB-C chargers often (somewhat dangerously) always supply 5V, so the two mistakes sort of cancel out.
They are likely not following the USB spec correctly. Things like pulling certain pins high or low or having a set resistance between certain pins or communications between the host and device will all affect what goes over the wire and whether the host or the device will accept this. Cables will also have some pins entirely unconnected.
Cheap, bad, shortcuts, etc. will result in an out of spec cable being necessary for an out of spec device to work correctly with an in or out of spec hub. It's terrifically frustrating but a fact of the world.
And this isn't just random no name knockoffs. The Raspberry Pi in certain versions didn't correctly follow the power spec. Both the Nintendo Switch and Switch 2 either incompletely, incorrectly, or intentionally abused the USB spec. The Lumen metabolism monitoring device doesn't follow the USB spec. This is one of those things where you want a bit of a walled garden to force users of a technology to adhere to certain rules. Especially when power and charging is involved which can cause fires.
> This is one of those things where you want a bit of a walled garden to force users of a technology to adhere to certain rules.
That’s what consumer protection laws with teeth and electric safety certifications like CE or UL are for, not walled gardens.
History has shown that relying on hardware DRM, like Apple did with Lightning, doesn’t prevent manufacturers from doing dangerous things, because they’ll find ways around it sooner rather than later.
The Nintendo Switch PD charges from every adapter and cable I have tried. However the Switch's power brick itself won't PD charge any other device.
I frequently use the Switch's power brick to charge my Thinkpad, as it's smaller and so easier to transport than the Thinkpad's original power brick.
The charger included with the original Nintendo Switch charges my MacBook Pro at 40 watts.
Unfortunately this forum is in German but it's really funny:
https://www.mikrocontroller.net/topic/458093
Basically the OP is asking what kind of resistors he needs so that he can get 5V out of USB-C.
The first response is "No, 5V is always present." (incorrect)
The second response by another poster is "5V is ALWAYS present at the USB port..."
Only the fourth person actually answers the question, and does it in a single sentence: "There needs to be 5k1 between CC and GND" (5k1 = a 5.1 kΩ resistor).
They violate the spec, that's all.
USB-PD hubs are very annoying. Devices with no battery on a hub (Raspberry Pi etc.) will just get hard-rebooted if anything else gets plugged into the same hub. I looked at a lot of hubs and they all behaved this way: they cut power to everything, then re-negotiate the power allowance each device gets from the shared wattage, every time anything new connects. I could not find a hub that honored existing power contracts and gave new devices the remainder. My guess is the average customer expects the newest plugged-in device to get max power (at the expense of everything else), or they return it to the store thinking it's broken.
I’m not sure if it’s a USB-PD hub but Jeff Geerling posted a video about a USB-C charger that doesn’t suffer from the renegotiation issue https://m.youtube.com/watch?v=dG2v4FHwJjE
Unfortunately there is no real solution to this that would work in the general case. With renegotiation, the power gets cut off, but most likely every device will get an allowance and all of them will still charge. With no renegotiation, a newly plugged-in device might not charge at all. Not sure what's worse.
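To make the tradeoff concrete, here's a toy sketch of the two strategies. The 100 W budget and per-device wattages are invented for illustration; real USB-PD firmware is far more involved than this.

```python
# Toy model of the two hub strategies described above.
BUDGET_W = 100  # total shared wattage (invented)

def renegotiate_all(requests):
    """Strategy 1: cut power to every port, then split the budget.
    Every already-connected device sees a power interruption, which
    hard-reboots anything without a battery (the Raspberry Pi case)."""
    share = BUDGET_W / len(requests)
    return {dev: min(want, share) for dev, want in requests.items()}

def honor_existing(contracts, new_dev, new_want):
    """Strategy 2: leave existing contracts untouched; the newcomer
    only gets the remaining headroom (possibly 0 W, i.e. no charge)."""
    headroom = BUDGET_W - sum(contracts.values())
    return {**contracts, new_dev: min(new_want, max(headroom, 0))}

existing = {"laptop": 65, "pi": 15}
print(renegotiate_all({**existing, "phone": 30}))
# laptop drops from 65 W to ~33 W, and the Pi reboots during the cut
print(honor_existing(existing, "phone", 30))
# nobody reboots, but the phone gets only 20 W of the 30 W it wanted
```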
There is a clean solution: A toggle switch. Why does all modern equipment take control away from the user?
Not the case anymore with more modern chargers, e.g. Anker GaN ones.
Even the new Anker Zolo 140W suffers from the same issue.
I had my own descent into madness this spring.
I slowly replaced my home network piece by piece trying to find the bottleneck that was causing my gigabit internet to top out at ~300kbps in my office on the other side of the house from the modem.
After replacing the Ethernet cable run from the second floor to the basement with fiber optic... And the switches in between... And seeing no improvement... I tried a different computer with a built-in ethernet port on the same cable, and pulled 920kbps.
The problem... Was my CalDigit Thunderbolt dock. I replaced it with an OWC one from Facebook Marketplace for cheap and it solved the problem... Roughly $500 in. I'm still angry I didn't check that earlier.
My network is 10 gigabit now though.
I think you mean 300Mbps and 920Mbps (M not K right?)
Yes! lol I'd have just used WiFi if I was only getting 320kbps wired lol
Which CalDigit dock? I have the TS3 Plus. Using its Ethernet port and Thunderbolt 3 to my laptop, I'm getting the expected 1,000 Mbps connection on my home network. Do you have a different model or maybe a defective unit?
The problem that I have is that I have a ton of USB cords in my drawer. I DON'T KNOW WHAT THEY ARE CAPABLE OF.
Is it a charge-only cable (no data)? Is it USB 3 at 5 Gbps? Is it 100-watt Power Delivery? Is it Thunderbolt 3/4/5?
Does the cable do 4K video, only 1080p, or no video at all?
You need a tester like the FNIRSI FNB58 (not an affiliate link: https://www.amazon.com/gp/product/B0BJ253W31). This is just an example and not a personal recommendation, as I've just started looking into these myself.
What I want is a tester that can show bit error rate.
They have fixed this: the modern spec has speed and power rating logos that (good) manufacturers can put on the cables. Just assume anything without a logo on it is USB 2.0/5V only, and buy some new cables whose specs you know.
The fact that cables have varying speed and power seems like a failure at launch. Who benefits from this? Manufacturers trying to save money on cables? Fuck that. This just means we'll use the cables that actually work everywhere to death and throw the rest out. What a waste.
Well, there are always going to be varying speeds and power, because needs change and tech improves; as the spec improves over time, the cables have to vary. Either you change the cables entirely (creating even more waste, since now you have to change even if you don't need the higher spec), or you have varied ones. Also, right now cables can do higher speeds at shorter lengths but not longer ones, so if you had only one speed you'd have to cap the maximum length, or pay for very expensive active cables for any long cable, even if you only need it for power.
Even if it were purely cost, I think we'd still benefit: the alternative is cheaper devices using a different, cheaper cable spec, and you end up with many different cable types that you can't use at all. Sure, maybe I won't get max speed from one cable, but still being able to get half speed and/or power is better than a cable I can't use at all.
Honestly, I've just never understood this criticism. The alternative is "have completely different cables", and that has literally all the same problems but worse, outside of "it can be hard to identify what this cable supports", which is solvable in much better ways than making them incompatible (as with the cable on my desk right now, which explicitly says 40Gbps/240W on it).
I grew up in the era of every mobile phone having a different charger, and it was always a nightmare when you were at a friend's house and they didn't have the one you need. I may not get max charging speed, but now everyone always has a charger I can use for any of my devices. Please, I don't want to go back.
Colored electrical tape or heat shrink labels at both ends of each cable with a simple coding system (P=Power delivery wattage, D=Data speed, T=Thunderbolt) solves this problem permanently.
The USB IF really should have specified some labeling for cables. The icons used on USB-A connectors are too complicated. What I think would work well is colored rings, with some for the different USB3 speeds, and some for the 100W and 240W charging.
They did! [0] The problem is that the vast majority of manufacturers have chosen to just completely ignore it.
[0]: https://www.usb.org/sites/default/files/usb_type-c_cable_log...
I think most of the cables are just not certified by the USB-IF, so they cannot use those logos.
Am I supposed to be an expert on international branding and trademark law just to determine what cable I need to use?
Don’t be disingenuous. The branding guidelines are for manufacturers. Consumers just read the logo, which straightforwardly lists the speed and power capacity.
But anyone can print up those labels and slap them on a cable. Or they could make ones that look really similar but aren't quite the same
Really, what we need as consumers are just 2 numbers: X Gbps | Y watts. I don’t need to know what freaking protocol it uses under the hood.
And that is what they are trying to do.
For example: USB at 5Gbps is the same whether it's called USB 3.0, 3.1 Gen 1, or 3.2 Gen 1. In fact, customers don't need to know that "3.x" number at all; they just need to know their ports support 5Gbps.
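For reference, the marketing-name soup maps to link speeds roughly like this, as I understand it (worth double-checking against usb.org; several of these are renames of the same thing):

```python
# Marketing name -> link speed in Gbps (several entries are renames
# of the same underlying spec; verify against usb.org before relying
# on this).
USB_SPEED_GBPS = {
    "USB 3.0":          5,
    "USB 3.1 Gen 1":    5,   # rename of 3.0
    "USB 3.2 Gen 1":    5,   # rename of the rename
    "USB 3.1 Gen 2":   10,
    "USB 3.2 Gen 2":   10,
    "USB 3.2 Gen 2x2": 20,
    "USB4 Gen 2x2":    20,
    "USB4 Gen 3x2":    40,
}
```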
Thunderbolt cables have always been marked either on the ends or with a sticker wrapped around the cable. Everything else can be assumed to be a 2.0 cable at 60W/140W.
I printed details on labels using a Brother label printer and then attached them to one end of each cable.
Details you found using a tester? I label some USB cables, but without a tester there is a limit to how much I know about them.
You can test them to a certain extent using a USB tester device like RYKEN RK-X3
Previous discussion (520 comments): https://news.ycombinator.com/item?id=30911598
Thanks!
USB-C hubs and my slow descent into madness (2021) - https://news.ycombinator.com/item?id=30911598 - April 2022 (513 comments)
Not strictly related but I just bought a USB4 USB-C cable which is rated at 40 Gbps. I still can't really believe it. (I still remember saving to cassette tape)
I use one of these for two 4k monitors, sound, ethernet, mouse, webcam, AND charging. It's amazing having one cable to plug in when I work from my desk. Unfortunately requires one of those $400 docks though.
Work bought me a VisionTek hub. I wanted the 1 cable life - unfortunately, it only does monitors via DisplayLink, aka compressed & streamed to & from my desk. It's noticeably fuzzy.
So now it's 2 cables: 1 from the hub, 1 from the monitor. Both USB-C.
WTF guys?
My Apple monitor from 2009 just worked with 1 cable (no power, but still).
Have you tried to benchmark it at all?
I just have the cable. I don't have a computer or a device that can transfer that quickly. Won't be long though! Actually, it looks like those 6K monitors @ 60Hz will fill the pipe.
Do you doubt that they work? You can demonstrate it to yourself pretty easily with 2 computers and iperf.
Yes, I have the exact same story: around 3 completely screwed hubs until I got to a mostly decent working one. I hate Apple for the port shenanigans: the ports are so close together that sometimes plugging something into one port blocks the other. I can even block one port with my headphone cable, that's how close together they are. Truly, an idiot designed this. Also, USB-C really feels more flimsy than USB-A; cables I insert are already quite shaky after 1 year of usage.
Same goes for the 3.5mm jack on phones. The freaking adapters are an ominous thing to use: they are just bad and they always break. The connector gets so loose that after 3 months of use they just start falling out. There is no decent phone left with a 3.5mm jack, which is a really sad state of things... Unless you know of one? Feel free to suggest.
Do you clean your USB-C ports? I used to get frustrated thinking the port was worn as cables would come out easily. I used a plastic dental pick to clean the port and it felt brand new afterward. Lint and fine debris gets in the port and then a cable insertion compacts that debris/lint. Over time this compaction layer builds up and cables no longer have enough depth to properly lock onto the port. It was shocking how much debris was compacted into the port.
> There is no decent phone left with a 3.5mm jack, which is a really sad state of things... Unless you know one? Feel free to suggest.
Sony Xperia phones traditionally have a 3.5mm jack and a normal screen (without holes and other nonsense for the camera).
Man, that is a lot of computer to put into that product for as little money as possible. I don't intend to excuse the products whose likely-bad firmware causes most of these issues, especially around Ethernet PHYs. But in my professional experience doing embedded device firmware, Ethernet PHYs are always the biggest pain, hands down. The vendor firmware that ships with them has many, many ways to be configured wrong, and the defaults are typically just random settings, not a reasonable configuration. Just getting the drivers running sometimes involves knowing a few tricks. Anyway, it doesn't surprise me that many of these hubs have trouble working right, especially when they're all essentially running OEM firmware.
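To give a flavor of the register-level fiddling involved: about the first thing any driver does is read the two IEEE 802.3 clause-22 ID registers to figure out which PHY it's even talking to. A minimal sketch, assuming a hypothetical `mdio_read(phy_addr, reg)` primitive standing in for whatever platform-specific MDIO access you actually have:

```python
# Sketch of clause-22 PHY identification. `mdio_read` is a
# hypothetical stand-in for your platform's MDIO access (memory-mapped
# controller, bit-banged GPIO, ...).
MII_PHYSID1 = 2   # standard MII register: PHY ID, high word
MII_PHYSID2 = 3   # standard MII register: PHY ID low word, model, rev

def identify_phy(mdio_read, phy_addr: int):
    id1 = mdio_read(phy_addr, MII_PHYSID1)
    id2 = mdio_read(phy_addr, MII_PHYSID2)
    phy_id = (id1 << 16) | id2      # 32-bit ID that drivers match on
    model = (id2 >> 4) & 0x3F       # vendor model number
    revision = id2 & 0x0F           # silicon revision
    return phy_id, model, revision

# One of the "tricks": a bus with no PHY at that address typically
# reads back all ones (0xFFFF), so checking for that first saves a
# lot of head-scratching.
```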
> I guess if you need to plug in an ancient beamer or something?
Slightly off-topic, but I wonder if "beamer" is a "real" German word or one they borrowed from English in a weird way.
The first time I heard it completely threw me for a loop, as in the UK, "beamer" is shorthand for a BMW car.
> I wonder if "beamer" is a "real" German word or one they borrowed from English in a weird way.
It’s both :) See also: “Handy” for mobile phone.
However, I can’t help but feel a little bit cheated by companies just buying off-the-shelf products, slightly modifying the case layout, and then quadrupling the price because it’s “from a reputable company”.
LOL. Welcome to the world of OEM/ODM. As a conservative estimate I'd guess >95% of all consumer electronics is done this way. Even the big names like Apple, Dell, Lenovo, etc. do it.
However, if you are - according to Wikipedia - a two-billion-dollar company like Realtek, then I expect you to get your shit together. There are exactly zero excuses for Realtek to not have a driver release ready almost a year after Big Sur was announced. Zero.
Mac users are in the minority. It's worth noting that the RTL8153 is a native RNDIS device, which has its history on the Windows side, and Realtek has only started contributing drivers to Linux relatively recently.
FWIW I've had great luck with Realtek NICs, although I don't specifically recall using their USB NICs.
> I've had great luck with Realtek NICs, although I don't specifically recall using their USB NICs.
I envy you. Realtek NICs (especially over USB) are tantamount to trash in my mind after 2 decades of fighting their frequent failures: from failure to initialize at all, to driver crashes, to piss-poor feature sets (or claiming features that don't work at all), and a myriad of other problems. Granted, they usually work in Windows, but I don't work in Windows (I live and work in Linux/BSD). It has become my personal policy to avoid/disable any Realtek NICs and replace them with something actually functional whenever possible.
Hopefully their work on linux-side drivers will change this given their proliferation.
To be honest, I've yet to find a reliable USB-based network interface regardless of chipset/brand/manufacturer, outside of the ones that do PCIe passthrough via USB4/Thunderbolt, and those tend to be quite expensive (though they are starting to come down in price).
The problem with USB NICs is now you have two flaky failure points - the chipset driving the USB port and the chipset driving the USB network interface.
I had reliability issues using a Realtek 2.5 Gbps USB network interface. It kept locking up, or having latency issues. Until I switched which USB port I plugged it into (one that used a different chipset), and after that it was solid.
3 actually:
Realtek itself (questionable quality on a good day).
The integration of the Realtek part by the ODM/OEM/SI into whatever product is being shipped; given that Realtek is the de facto "budget" option for networking, this is often done as cheaply and shoddily as possible, even if the chip itself isn't actually crapware.
And the USB interface, as you point out. There's a whole rabbit hole that I'm unfortunately all too familiar with when it comes to diagnosing and dealing with USB. PCIe passthrough via a USB4/TB combo chip isn't as reliable as just using PCIe directly, but it's still better than a non-PCIe-passthrough USB interface.
I wonder if the Frameworks, with their USB adapters for everything, struggle with that.
They use the USB-C physical interface for their modules, but that doesn't mean they actually use the USB protocol on the backend. Not sure how they implement it, to be honest, but it's at least entirely possible, for example, to run DisplayPort only (with no USB involved at all) through a USB-C PHY (and DP isn't alone in being able to do that).
I suspect a lot of the flakiness is not the chip itself but the fact that, because it's cheap, the bottom-of-the-barrel manufacturers will always use it instead of the significantly more expensive alternatives, and then further cut corners with the board layout and design.
Ironically, the only problems I've had with NICs were on an Intel and a Broadcom.
> I suspect a lot of the flakiness is not the chip itself
Most certainly. Doesn't change the fact that Realtek being present is a huge red flag, even if it's not a cheap device, regardless of whether it's Realtek's fault or the OEM/ODM/SI that integrated them into the system in question. It basically screams "we phoned this part in"; though that's certainly not always true, it's true enough that I refuse to use them (be it by disabling them or opting for entirely different hardware so I can avoid the headache).
Broadcom is certainly better than Realtek, but it's still "replace at soonest possible convenience" tier as well. Intel is far, far more reliable in my experience (save for some of their combo Bluetooth/WiFi cards); their dedicated wired Ethernet cards have always been great for me, though the i210/i211 class of integrated bottom-tier ones can be hit and miss.
Is it always the case that these white label products are all equivalent? That is, is there still some input from the purchasing company on choice of components, quality control, etc, and does that make a difference to the product?
> Mac users are in the minority.
Are they truly though, given that MacBooks were the first to drop all ports except USB-C, pushing people to look into buying hubs?
Unfortunately I can't say I'm surprised that the common thread was Realtek network chips. I've found their NICs to be fairly flaky over the years - the majority work, but a solid minority don't. In contrast, Intel NICs have been bulletproof for me and I seek them out whenever I have any choice in the matter.
Looks like he only bought cheaper things, so no wonder they all eventually died. My USB-C hub is an HP Thunderbolt dock. It's beefy as heck, lasted for years with no issues. It has a tiny fan inside, which I assume helps with the longevity. I hear good things about CalDigit docks too. Those also are very expensive.
Expensive doesn't equal great either. My ThinkPad Thunderbolt hub, which is not cheap by any standard, can't route HDMI without randomly restarting itself every few minutes. Connecting the same cable directly to my ThinkPad works perfectly fine, which sort of defeats the whole point of buying a hub. I've sort of given up hope of getting a high-quality hub; it's a money sink.
$70 for a hub is not cheap at all.
I’ve got a few of the Apple ones because every new job just gives you one, and they have always worked in every way in every device I’ve used them on.
Yeah they cost more but they actually work properly.
Apple ones? Like this? https://www.apple.com/shop/product/MW5M3AM/A/usb-c-digital-a...
Yep, that one. I guess they don't have everything, but the everything-hubs tend to come with a lot of confusing pitfalls. I use that one mostly with my Steam Deck.
I agree that if you buy cheap devices you shouldn't expect them to last. But the first device was almost $100, which I certainly wouldn't call cheap.
I’ve also given up on USB hubs and I’m using a Thunderbolt 4 dock to get more IO out of my Mac Studio. It feels crazy to spend that much $’s, but it solved my problems.
Any recommendations?
Back at one of my previous employers, we had a long internal briefing about why our latest device did not have USB-C when other solutions on the market by then did.
The connector is solid but my god have there been disasters because of USB-C.
1. Power distribution up to high wattages, not always with auto-sensing. 2. Wires rated for different data transmission speeds. 3. USB standard data transfers and Thunderbolt over the same connector and wire, but most accessories not rated for Thunderbolt.
Omg I love it and I hate it.
>but my god have there been disasters because of USB-C.
....like?
Slightly off-topic: would you see any case where Ethernet (PoE) is replaced with native USB-C?
What is the cheapest way of generating eye diagrams of USB cables?
And the cheapest way to measure bit error rates?
For a real eye diagram: a signal/PRBS generator and a high-end oscilloscope. Those are in the many-tens-of-thousands-of-USD range.
Debug tools for e.g. FPGA transceivers or some redriver chips can measure BER and show fake eye diagrams. Those are in the few-hundreds to few-thousands of USD range, but you may need a company to sign NDAs. E.g.:
https://www.intel.com/content/www/us/en/docs/programmable/68...
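To make "PRBS" concrete: the test patterns themselves are trivial to generate in software; what costs tens of thousands is driving them through the physical link at multi-Gbps and sampling the result. A toy PRBS-7 generator (the x^7 + x^6 + 1 pattern commonly used in transceiver test modes):

```python
import itertools

def prbs7():
    """Generate the PRBS-7 test pattern (polynomial x^7 + x^6 + 1)
    with a 7-bit Fibonacci LFSR. A BER tester pumps a pattern like
    this through the link and counts mismatches at the far end."""
    state = 0x7F                              # any nonzero 7-bit seed
    while True:
        bit = ((state >> 6) ^ (state >> 5)) & 1   # taps at x^7 and x^6
        state = ((state << 1) | bit) & 0x7F
        yield bit

# One full period of a maximal-length 7-bit sequence is 2**7 - 1 = 127
# bits and contains exactly 64 ones -- a quick sanity check:
sent = list(itertools.islice(prbs7(), 127))
assert sum(sent) == 64
```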
Thanks. If this type of equipment were more accessible and cheaper, we'd see fewer USB problems. This should also be the USB consortium's responsibility, but I guess this way they make more money with compliance testing.
Recently went through the process of picking a new hub, and it took hours to locate one actually appropriate to my use case. Helpfully enough, ChatGPT/Claude were both good at locating ones matching specific needs (x USB-C ports, y USB-A, no HDMI, no SD card readers); it would probably have taken a lot longer without them.
LLMs really are fantastic tools for shopping and product comparison. It's going to suck once the marketers manage to successfully do the equivalent of SEO on them
> the equivalent of SEO on them
It’s called GEO and it’s becoming a thing.
FWIW, you may have better luck searching for "LLMO".
I’m so glad to hear other people have had issues with these stupid cheap docks. I’ve burned through three of them over the last few years. It seems that not routing laptop charging power through them helps, but doesn’t solve the issue. Stupid cheap products.
I think I remember this (or a similar one).
The main gist was that almost every hub used the same board.
Years ago, I bought a hub similar to that Satechi one. It worked great until I unplugged my laptop. Then my home network would die. After some sleuthing, I realized the RJ45 interface repeated any packets received, creating a network loop and confusing my Ethernet switch. I contacted Satechi about this obvious defect. Their support team insisted this was by design and told me to unplug USB-C power from the hub whenever I disconnected my laptop, which … kinda defeats the point?
Was it actually repeating packets or was it sending out pause frames?
In my experience, USB Ethernet adapters send out pause frames, which shit-tier switches replicate to all ports in direct contravention of the Ethernet specifications.
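For reference, an 802.3x pause frame is tiny and easy to recognize. A sketch of building one (purely illustrative; the key detail is the destination MAC, a reserved bridge-filtered multicast address that compliant switches must consume at the ingress port and never forward):

```python
import struct

def build_pause_frame(src_mac: bytes, quanta: int) -> bytes:
    """Build an IEEE 802.3x PAUSE frame (illustrative sketch).

    dst 01:80:C2:00:00:01 is a reserved link-local multicast address:
    a compliant switch consumes it at the ingress port and must NOT
    flood it to other ports -- which is exactly what the shit-tier
    switches above get wrong."""
    dst = bytes.fromhex("0180c2000001")
    ethertype = struct.pack("!H", 0x8808)   # MAC Control
    opcode = struct.pack("!H", 0x0001)      # PAUSE
    pause_time = struct.pack("!H", quanta)  # in 512-bit-time quanta
    frame = dst + src_mac + ethertype + opcode + pause_time
    return frame.ljust(60, b"\x00")         # pad to minimum frame size

frame = build_pause_frame(bytes.fromhex("020000000001"), quanta=0xFFFF)
```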
Wow, me too. This was very confusing and annoying, as my wife's cheap dock would take down the network while we were both working from home. It took several annoying incidents before we connected them to the dock.
I just bought my third dock/hub thing.
I've been on a MacBook Air M1 for the last few years and wanted multiple screens, so I got a USB 3 hub (Dell D6000) which does DisplayLink. I had almost everything hooked in there, but still connected one screen directly via USB-C. DisplayLink is good for an M1, as you can add loads of screens if you want, but you can't watch streaming services on them because macOS thinks your screen is being 'watched'.
I did want a thunderbolt hub but as far as I could tell at the time Thunderbolt and Displaylink just didn't come in the same package, so I was stuck with two cables.
Three years on, I picked up an M4 machine that can do two external screens natively, great, I can reach my goal of only plugging one cable into my macbook. But the Dell can't do that on a Mac because of something or other meaning it would work as mirrored-only.
Time to grab an actual Thunderbolt dock. I picked up an HP TB3 refurb (HP P5Q58AA) which was cheap (30 AUD) and seemed great on paper, only to find it needed a type of power adaptor I didn't have, which put me back another 60 bucks. And when I got it all hooked up, it didn't always work and couldn't support decent refresh rates on my monitors, with one always being stuck at 30Hz. There was a software update available, but for that you need a Windows machine with a TB3 port, not something I have.
So then I grabbed a Kensington SD5750T, which was another 140 bucks, but I am pleased to report that it just works and supports dual 4k60 over thunderbolt/USB-C. There is no HDMI or Displayport on this thing, but my monitors both have USB-C in so... Unfortunately, now that I've read the article, I can also confirm it contains a Realtek 0x8153, and is an OEM'd 'goodway' dock.
Just as well I'm happy with wireless networking!
this is why it's important to have an RJ45 port on your laptop...
(although... it looks like it's a Realtek with the r8169 driver)
I 'member hearing stories about an early USB C hub which contained a network chip that, when no computer was attached but it had power from the brick, would randomly barf invalid ARP packets, up to and including taking entire networks down. Anyone 'member details?
I'm convinced the only way to get a quality piece of hardware is to spend a couple hundred bucks on a Thunderbolt 4 or 5 hub. I got a TB4 hub from CalDigit and it works great. And no misbehaving network chip to be found either.
I just bought a monitor arm that has a USB dock at the base. The dock is actually just a USB extension cord that is enclosed by the base of the arm. Think of it like putting a $10 USB hub into a box and poking holes in it to stick wires through. Great teardown, makes you wonder what's truly inside these boxes.
In other news, I do think desk makers should start incorporating the USB dock inside the top board of a desk. People go through a lot of money and bullshit to keep their setup clean, especially those who swap computers.
I guess if the devices were reliable enough they could be added to floor boxes. But ultimately putting docks in monitors seems the cleanest way.
> In other news, I do think desk makers should start incorporating the USB dock inside the top board of a desk. People go through a lot of money and bullshit to keep their setup clean, especially those who swap computers
Will be obsolete in 5 years every time
With Humanscale, the docks on the monitor arms look integrated, but they're actually a really slick attachment that goes around the base nice and snug, so it just looks like one piece. It can be replaced if you take off the monitor; I really liked that. Humanscale is also extremely expensive as far as monitor arms go.
The selection isn't great yet, but you can get a USB charger or USB hub desk grommet (or one that does both).
Hopefully there will be a matching sized one (60mm) doing something useful, but if not you can just put a normal grommet cover in and do the stuff under the desk.
I'm using these things a bunch.
More accurate to say it’s a dock than a hub, but I’m using a Dell 2427DEB monitor[0] with my Dell work notebook and a second monitor daisy chained from it on the DP out port.
My work laptop has just a single USB-C cable plugged into the monitor for everything making it super trivial to plug it back in when I use it away from my desk (which I do regularly).
My personal desktop has a single DP and also a USB A to B cable. The monitor has KVM capability so I can super conveniently switch between them even with the two screens.
Cables are completely minimized with this set up, I’m very happy.
The only thing that’s unfortunate is that I occasionally work on a MacBook Pro 16” M4, and it can’t drive the second monitor over the same USB-C cable, as Apple CBA to support DP MST on even their premium-priced hardware. So I have to also plug in an HDMI cable to the second monitor.
Also unfortunate with the MacBook Pro is that macOS UI scaling doesn’t allow ratios like 125% meaning the UI elements aren’t quite at my preferred size. My Windows 11 handles this perfectly.
[0] https://www.dell.com/en-us/shop/dell-pro-27-plus-video-confe...
When I was tasked with acquiring desk setups for our whole company (small, new startup) back when everyone was using just whatever I decided to go with a really similar setup, Dell P2425HE monitor with integrated USB-C hub, and a Dell P2425H daisy chained from it.
I'm really glad we went with that, so far they've been great (for software devs, no color sensitive stuff).
> macOS UI scaling doesn’t allow ratios like 125% meaning the UI elements aren’t quite at my preferred size
Try BetterDisplay, and enable "Flexible Scaling" (a per-display setting).
Bonus: enable "UI Scale Matching" and output will be the same physical size across all displays (as long as they report proper DPI, but I think you can hack that as well right there).