Dieuetmondroit 3 hours ago

I feel like I need to point out that this tool does not do, in any way, what the title claims. Parsing the output of the system profiler tool will not tell you whether your cable is "good", which in practice tends to mean that it supports the protocol the user cares about at that moment. Some examples: if you connect a Thunderbolt-only cable to a USB4-only device, this approach will give you no information about why things are not working. If you connect a USB 2-only cable to a DisplayPort Alternate Mode display, this approach will not tell you anything about why your display is not turning on.

Izkata a day ago

It's more complicated than "this cable is good/bad". I had a suspicion about one of my cables for months, but just last week I confirmed it with a device that shows the volts/amps/watts/total kWh passing through it: I have a USB-C cable with an orientation. Plugged in one way it transfers about 5x more power than the other way, and in the lower-power orientation it's low enough that the earbud case I use it with won't charge.

  • brailsafe 6 hours ago

    My Pixel 7 seems to have died completely out of the blue while charging two days ago, using a USB-C cable I thought might be getting a little flaky (connected to my Mac, I'd occasionally get repeated disconnects). I wonder if something along these lines could be the culprit.

    I picked it up to find it had shut itself off, and now won't accept any charge, wireless or wired from any combination of power sources and cables. No signs of life at all.

  • wiradikusuma a day ago

    Could you elaborate on "orientation"?

    Let's say for C-to-C, are you talking about swapping the head/tail? Or simply connecting at a different angle (180 degrees)?

    • rcxdude a day ago

      Probably a 180-degree rotation of the plug (on either end). It commonly happens if one of the contacts or conductors for USB-PD signalling is not working correctly. (Because the pinout is designed to work either way around, the conductors used for signalling swap roles depending on the orientation.)

      • Izkata 18 hours ago

        Yep, 180 degree rotation.

  • giancarlostoro 17 hours ago

    That's so weird, did you wind up coloring one end or something? I still wish we would add color to USB-C cables like USB 3 has, to emphasize features and expected uses. USB-C was a much-needed change from USB 3 and 2 in terms of being reversible and superior, but every manufacturer implements the cables differently, and it's confusing and hard to figure out which cable is best for what.

    • opan 5 hours ago

      Some cables print "10Gbps" or similar near the connector.

  • lostlogin a day ago

    The audio community love this sort of thing and will pay top dollar for unidirectional cables. Reproducible data proving the claims could be worth millions.

    • ChrisGreenHeur a day ago

      Well, if you listen to audio you would not want the audio to accidentally get confused and head back to where it came from halfway down the cable, right?

      • wombatpm 3 hours ago

        Audio people lost me when they complained about tape hiss being an issue with Digital Audio Tape. They then moved on to gold-plated terminals and left-twisted vs. right-twisted pair wires inside multi-conductor cables.

    • kstrauser 16 hours ago

      “This cut signal reflections, yielding brighter hi-hats without the brassiness of two-directional cabling. Bass was particularly clear and rumbly without the muddiness we heard from Monoprice cords.”

  • duttish a day ago

    Wait what. I thought half the point of usb c was to not rely on orientation.

    Is there any way to check this other than experiment?

    My "solution" so far has been to not buy cheap cables and just hope I get quality in return.

    • lmm a day ago

      > I thought half the point of usb c was to not rely on orientation.

      > Is there any way to check this other than experiment?

      Well sure, a standards-compliant cable will work in either orientation, but it's always possible for some but not all of the pins or wires to break.

    • pja a day ago

      I believe USB C cables actually do have an orientation - it's just that the negotiation both ends do usually makes that orientation irrelevant.

      Maybe the negotiation can fail & the plugged in orientation is then the only one that works?

      • estimator7292 17 hours ago

        USB-C only has an "intrinsic" orientation because we call one set of pins "1" and the other "2". Electrically there should be no difference.

        • lxgr 14 hours ago

          No, there really is an intrinsic orientation, at least once a cable is plugged in.

          The receptacles are symmetric, but a full connection is not. The cable only connects CC through end-to-end on one of A5 or B5, but not both, which lets the DFP detect which of A5 or B5 should be CC. The one not used for CC is then used to power the e-marker in the cable, if any.

          This is also true for legacy adapters; for example, for C-to-A USB 3 adapters, the host needs to know which of the two high-speed pairs to use (as USB 3 over A/B connectors only supports one lane, while C supports two).
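The asymmetry described above can be sketched in a few lines. This is a hedged illustration, not real driver code: the pin names A5/B5 come from the comment above, and the booleans stand in for actually sensing the sink's Rd pull-down on each CC line.

```python
# Sketch of DFP orientation detection: the cable wires CC end-to-end on
# only one of A5/B5, so the source sees the sink's Rd pull-down on exactly
# one pin. The other pin is repurposed as VCONN to power the cable's
# e-marker, if present.

def detect_orientation(rd_on_a5: bool, rd_on_b5: bool) -> tuple[str, str]:
    """Return (cc_pin, vconn_pin), or raise if no valid orientation is seen."""
    if rd_on_a5 and not rd_on_b5:
        return ("A5", "B5")
    if rd_on_b5 and not rd_on_a5:
        return ("B5", "A5")
    raise ValueError("no valid orientation detected")

print(detect_orientation(rd_on_a5=True, rd_on_b5=False))  # ('A5', 'B5')
```

A cable that only passes CC through correctly in one orientation would always hit the error branch when flipped, which matches the "works one way only" symptom in this thread.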

    • tennysont a day ago

      I think that I have a specific cable-device-orientation that is broken. Meaning, I think a particular USB C cable won't charge my phone if it's plugged in 'backwards'.

      I always assumed that USB C cables use different pins depending on orientation, and that some pins on the cable wore down.

      Maybe that's what happened here?

      • consp a day ago

        My guess would be that they used a one-sided PCB to terminate the cable and only connected half the wires. Some sockets internally link the power and ground pins, so it works both ways, but you get no resistor network and thus only standard 5 V, which gives you 500 mA max (at best). With the resistors connected by the cable it's about 900 mA to 3 A, which is probably what happens when plugged in "correctly". Or some other magic happens on one side of the PCB to fool the charger into pushing the full 3 A.
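For reference, the resistor network in question maps to advertised current roughly like this. The Rp values are the approximate USB Type-C spec values for a 5 V pull-up; the function itself is just an illustrative lookup, not anything a real charger runs.

```python
# USB-C sources advertise current capability with a pull-up (Rp) on CC.
# With no CC connection at all (the broken-orientation case), nothing is
# advertised, and a compliant source won't even enable VBUS.
RP_TO_CURRENT = {
    56_000: "default USB power (500 mA on USB 2.0)",
    22_000: "1.5 A",
    10_000: "3.0 A",
}

def advertised_current(rp_ohms: int):
    """Map an Rp pull-up value to the advertised current level, if any."""
    return RP_TO_CURRENT.get(rp_ohms)

for rp in (56_000, 22_000, 10_000):
    print(f"Rp={rp} ohm -> {advertised_current(rp)}")
```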

        • lxgr a day ago

          Shouldn't a compliant USB-C DFP refuse to supply Vbus without the resistor network, though, meaning there should be no charging at all? (Not that all DFPs necessarily do the correct thing, of course.)

          • SAI_Peregrinus 17 hours ago

            Correct, which is probably why it won't even charge their earbuds in the broken orientation.

      • Waterluvian a day ago

        I think a more distressing thought is that it’s quite possible that your cable won’t charge your phone if it’s plugged in forwards.

      • numpad0 a day ago

        It's CC2/VCONN that's used for the e-marker. That pin may be terminated inside the cable and used to power the e-marker chip. It can also be used for orientation sensing, I think.

    • cyberax a day ago

      It happens. More often than not it's physical damage or a manufacturing defect affecting one of the contacts and/or wires.

  • atoav a day ago

    It is not unheard of to have single damaged lines/connector-joints within a cable. The question is whether your cable was designed that way or whether it was damaged in a way that made it do this.

    • Perz1val a day ago

      It won't be a damaged wire, since there's only one set of those; it's the plug lacking connectors or having them not connected.

  • dist-epoch a day ago

    I can confirm, I have a USB-C cable with the same problem, charging speed depends on the orientation of the USB-C connector, which is hilarious.

    It was not a cheap cable, it was a medium-priced one with good reviews from a known brand.

mnw21cam 21 hours ago

No, I don't get it. Firstly, the normal system command output is not hard to read; secondly, this output doesn't list any of the capabilities of the cables, just the devices at the ends of them. Perhaps showing an example of the output when the device is plugged in through the wrong cable would have helped. Does the tool produce a warning similar to the system popup, i.e. "this cable and device are mismatched"?

  • lxgr 13 hours ago

    As far as I understand, the idea is to determine whether the cable is the bottleneck by comparing a hardcoded list of theoretical device capabilities with the actually observed connection speed as reported by the OS.

    It would be nice to just compare with the device's reported maximum capability, but I'm not sure whether macOS exposes that in any API.

bapak a day ago

Fun fact: this information is already available in the System Information app on your Mac.

Hardware -> USB

I also use the app to check what wattage my cables are when charging my MacBook (Hardware -> Power)

  • lxgr 13 hours ago

    This only shows you the minimum of what the cable and adapter support together, though. I believe this is a fundamental limitation of the protocol; the source won't tell you about voltage/current combinations not supportable by the cable.

  • angulardragon03 18 hours ago

    Yes, system_profiler is just a terminal version of System Information.

BrandoElFollito a day ago

I was looking for a USB cable tester, where I would plug in both ends of my cable and it would test it (power, data, ...).

There are plenty for Ethernet, but no such thing for USB. Was I looking with the wrong keywords, or does such a device not exist?

Note: I have a dongle that measures the power when inserted between the laptop and the charger, this is not what I am looking for

  • lxgr a day ago

    The reason is probably that anything faster than USB 2.0 (480 Mbit/s) or supporting power over 3 A (60 W) needs to have an active e-marker, and to read that, you'll need something slightly more complex than a connection tester.

    That said, these things do seem to exist at this point, as sibling comments have pointed out.

    As an aside, it's a real shame devices with USB-C ports don't offer this out of the box. They need to be able to read the marker anyway for regular operation!

    • _rs 5 hours ago

      Right? It would be great if I could plug a USB-C cable into 2 ports on my Mac and it could figure out what the cable is capable of

  • moray a day ago

    On AliExpress, search for "DT3 Data Cable Detection Board Type-C"; they're very cheap. I got the one below and it seems to work fine for what I needed.

    https://fr.aliexpress.com/item/1005007509475055.html

    Edit: This will test whether the cable is functioning properly. It will show the connections and indicate whether the cable supports only power or also data transfer. However, it won’t provide information about the USB-C cable type or its speed capabilities.

  • 9029 21 hours ago

    I have been planning to get either Witrn K2 or Power-Z KM003C. If just cable testing is enough, the Treedix one is probably good.

    Related: If you are looking for cables, this guy has tested a bunch (mainly for charging capabilities) https://www.allthingsoneplace.com/usb-cables-1

    • amelius 20 hours ago

      Can these instruments measure bit error rates?

    • amelius 19 hours ago

      I'd expected to see at least characteristic impedance in that table.

      And some metrics on internal reflections.

  • mrheosuper a day ago

    What do you mean by "testing" it: reading hardcoded data from the e-marker chip, or really testing it?

    The latter would require a multi-thousand-dollar machine.

    • tom_alexander 20 hours ago

      I'm curious as to why it is so expensive. Admittedly I know very little about electronics, and naturally the validation testing that a cable manufacturer does is going to be more thorough, but for consumer-grade testing couldn't we just have an FPGA or microcontroller scream the Fibonacci sequence in one end and listen for it on the other? Sort of like memtest, but ramping up speed until the transmission becomes garbled.
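The loopback idea above can be sketched in a handful of lines. The "channel" here is simulated with a single flipped bit; actually driving the real high-speed pairs at 20-120 Gbps is exactly the part that needs the expensive hardware.

```python
import random

def bit_errors(sent: bytes, received: bytes) -> int:
    """Count differing bits between two equal-length byte strings."""
    return sum(bin(a ^ b).count("1") for a, b in zip(sent, received))

rng = random.Random(42)
pattern = bytes(rng.randrange(256) for _ in range(1024))

# Simulate a lossy cable by flipping a single bit in transit.
received = bytearray(pattern)
received[100] ^= 0x08

print(f"bit errors: {bit_errors(pattern, bytes(received))} "
      f"over {len(pattern) * 8} bits")
# bit errors: 1 over 8192 bits
```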

      • klausa 20 hours ago

        120Gbps is _really_ fast.

      • ssl-3 4 hours ago

        Sure. If it's easy, then it's easy.

        For a "regular" USB-C cable that supports USB 2.0 speeds (and is rated for 60W, and therefore lacks an internal e-marker chip), there are just 5 wires inside: two for data, two for power, and one for CC. There's nothing particularly complex about testing those wires for end-to-end continuity (like a cheapo network cable tester does).

        A charging-only cable requires only 3 wires.

        But fancier cables bring fancier functions. Do you want to test whether the cable supports USB 3? With one lane, or two? USB4? What about the extra bits supporting alt modes like DisplayPort and MHL, and the bag of chips that is Thunderbolt -- does that all need testing too? (And no, that earlier 120Gbps figure isn't a lie.)

        And power? We're able to put up to -- what -- 240W through some of these cables, right? That's a beefy bit of heat to dissipate, and those cables come with smarts inside of them that need to be negotiated with.

        I agree that even at the extremes, it's still somewhere within the realm of some appropriate FPGA bits or maybe a custom ASIC, careful circuit layout, a big resistor, and a power supply. And with enough clones from the clone factories beating each other up on pricing, it might only cost a few hundred dollars to buy.

        So then what? You test the fancy USB-C ThunderBolt cable with the expensive tester, and pack it up for a trip for an important demo -- completely assured of its present performance. And when you get there, it doesn't work anyway.

        But the demo must proceed.

        So you find a backup cable somewhere (hopefully you thought to bring one yourself, because everyone around you is going to be confused about whatever it is that makes your "phone charger" such a unique and special snowflake that the ones they're trying to hand to you cannot ever be made to work), plug that backup in like anyone else would even if they'd never heard the term "cable tester," and carry on.

        The tester, meanwhile? It's back at home, where it hasn't really done anything but cost money and provide some assurances that turned out to be false.

        So the market is limited, the clone factories will thus never ramp up, and the tester no longer hypothetically costs only hundreds of dollars. It's right back up into the multiple-$k range like the pricing for other low-volume boutique test gear is.

        (I still want one anyway, but I've got more practical things to spend money on...like a second cable to use for when the first one inevitably starts acting shitty.)

Someone 3 days ago

> The script parses macOS’s system_profiler SPUSBHostDataType2 command, which produces a dense, hard-to-scan raw output

I couldn't find the source (the link in the article points to a GitHub repo of a user's home directory; I hope for them it doesn't contain secrets), but on my system, system_profiler -json produces JSON output. From that text, it doesn't seem they used that.
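For what it's worth, that JSON output can be walked generically without pinning the whole schema. The sample below is illustrative only; the exact nesting varies by macOS version, though the `_name`/`_items` convention is what system_profiler emits on recent systems, as far as I can tell.

```python
import json

# Illustrative sample of `system_profiler SPUSBHostDataType -json` output;
# treat this shape as an assumption rather than a schema.
SAMPLE = """
{
  "SPUSBHostDataType": [
    {"_name": "USB 3.1 Bus",
     "_items": [
        {"_name": "USB3.1 Hub",
         "_items": [{"_name": "Flash Drive"}]}
     ]}
  ]
}
"""

def device_names(node, depth=0):
    """Yield indented device names from the nested _items tree."""
    if isinstance(node, dict):
        if "_name" in node:
            yield "  " * depth + node["_name"]
        yield from device_names(node.get("_items", []), depth + 1)
    elif isinstance(node, list):
        for item in node:
            yield from device_names(item, depth)

tree = json.loads(SAMPLE)
for line in device_names(tree["SPUSBHostDataType"]):
    print(line)
```

That gets you a readable device tree in a dozen lines, with no string-scraping of the human-readable output.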

  • sorcercode 3 days ago

    Internally it uses the same root command, btw. In fact this recently changed for Tahoe (as the article mentions).

    It started out as a shell script but switched to a Go binary (which is what is linked).

    • procaryote a day ago

      I hope this doesn't become a trend. Moving it to go means you need to compile it before you run it, or blindly run an uninspected binary from some random guy

      It's not like the performance of this could have motivated it

      • lxgr a day ago

        I'll take the minimal hassle of having to compile a go program over a complex shell script that only the author understands (if that) any day.

        Performance isn't everything; readability and maintainability matter too.

        • timeon 21 hours ago

          > Performance isn't everything; readability and maintainability matter too.

          Is that case for this vibe-coded thing? https://news.ycombinator.com/item?id=45513562

          • lxgr 15 hours ago

            No idea, I haven't had a look at this code in particular.

            I'm just saying that I've seen several "small tools that could have been shell scripts" in Go or another more structured language and never wished they were shell scripts instead.

      • tensor 6 hours ago

        I mean, you shouldn't blindly run a shell script any more than a binary anyway. And if you're reading the code, I'd rather read Go than bash any day. That said, yes, there is an extra compilation step.

  • JdeBP a day ago

    Correct. But you didn't see that the source was one level up in the directory tree from the untrustworthy binary blob?

    * https://github.com/kaushikgopal/dotfiles/blob/master/bin/usb...

    Presumably there is a sensible way to do this in Go by calling an API and getting the original machine-readable data, rather than shelling out to run an entire sub-process for a command-line tool and parsing its human-readable (even JSON) output. Especially as it turns out that the command-line tool itself runs another command-line tool in turn. StackExchange hints at looking to see what API the reporter tool under /System/Library/SystemProfiler actually queries.

    • Someone a day ago

      > But you didn't see that the source was one level up in the directory tree from the untrustworthy binary blob?

      No, silly me. Shortly searched for a src directory, but of course, should have searched for a bin directory, as that’s where vibe coding stores sources /s.

NelsonMinar a day ago

lsusb will get you this info in Linux, but I like the idea of a little wrapper tool to make the output easier to parse.

480 vs. 5000 Mbps is a pernicious problem. It's very easy to plug in a USB drive and have it look like it works fine and is reasonably fast. Right until you try to copy a large file to it and wonder why it is only copying 50 MBytes/second.

It doesn't help that the world is awash in crappy charging A-to-C cables. I finally just threw them all away.
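The ~50 MB/s figure is consistent with a USB 2.0 link once protocol overhead is accounted for. A quick sanity check; the 15% overhead is a rough assumption, not a measured value:

```python
def max_throughput_mb_s(link_mbps: float, overhead: float = 0.15) -> float:
    """Rough usable MB/s for a given link rate, assuming ~15% protocol overhead."""
    return link_mbps / 8 * (1 - overhead)

for mbps in (480, 5000, 10000):
    print(f"{mbps} Mbps -> ~{max_throughput_mb_s(mbps):.0f} MB/s usable")
```

So a drive stuck at 480 Mbps tops out around 50 MB/s, while the same drive on a 5 Gbps link should manage an order of magnitude more.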

  • lxgr a day ago

    I remember hearing it’s even possible to plug in a USB-A plug too slowly, making the legacy pins make contact first, which results in a 480 Mbps connection – despite the cable, the host, and the device all supporting superspeed!

    • lloeki a day ago

      Can confirm, was victim of this.

      Couldn't figure out why my 5-disk USB enclosure was so ungodly slow. Quickly I saw that it was capping suspiciously close to some ~40MB/s constant, so 480Mbps.

      lsusb -v confirmed. As it happened I did some maintenance and had to unplug the whole bay.

      Since the port was nearly tucked against a wall I had to find the port by touch and insert somewhat slowly in steps (brush finger/cable tip to find port, insert tip at an angle, set straight, push in) but once in place it was easy to unplug and insert fast...

      This was driving me "vanilla ice cream breaks car" nuts...

    • andrewmcwatters 15 hours ago

      Destroy the whole standard. That's literally insane.

      • lxgr 12 hours ago

        That's the price of strong backwards compatibility. Otherwise, you wouldn't be able to use a USB 3 (superspeed) device on a USB 3 host port with a USB 2 cable at all.

        And if you hate this, you should probably never look into these (illegal by the spec, but practically apparently often functional) splitters that separate the USB 2 and 3 path of a USB 3 capable A port so that you can run two devices on them without a hub ;)

_carbyau_ 5 hours ago

Is there a reason we can't plug a USB-C cable with BOTH ends into the same computer and then get a full diagnostic on just the cable itself?

  • fnord77 4 hours ago

    I was able to charge my macbook from itself using that technique.

Tepix a day ago

Why does it mention USB 3.2 (i.e. 20 Gbps) at all if it's for Macs? I thought Macs only support 10 Gbps and 40 Gbps, but nothing in between?

(which is inconvenient because USB 3.2 Gen 2x2 20 Gbps external SSD cases are much cheaper than USB 4 cases for now).

Also, he is calling a binary a script, which I find suspicious. This task looks like it should have been a script.

  • gattr a day ago

    On a somewhat related note, I like the IO shield of my new MSI motherboard - the USB ports are tersely labeled "5G", "10G", "40G" (and a few lingering "USB 2.0").

  • cozzyd 16 hours ago

    One of my pet peeves is when people call a binary a script

madethemcry a day ago

Content-wise a nice idea, but I also like the conclusion about how AI made this possible in the first place. The author mentions this motivation explicitly. AI is undoubtedly perfect for utilities and small (even company-internal) tools for personal use, where maintainability is secondary and you can ditch the tool or rebuild it quickly.

> Two years ago, I wouldn’t have bothered with the rewrite, let alone creating the script in the first place. The friction was too high. Now, small utility scripts like this are almost free to build.

> That’s the real story. Not the script, but how AI changes the calculus of what’s worth our time.

  • sersi a day ago

    I've found that to be very true. For bigger projects, I've had rather mixed results from ai but for small utility scripts, it's perfect.

    But like the author, I've found that it's usually better to have the LLM output Python, Go, or Rust than use bash, so I've often had to ask it to rewrite early on. Now I just skip bash directly.

  • atonse 15 hours ago

    Came here to say exactly this.

    All the naysayers are missing the tons of small wins happening every single day from people using AI to write code, wins that weren't possible before.

    I mentioned in a thread a few weeks ago that we maintain a small Elixir-Rust library, and I have never coded Rust in my life. Sure, it's about 20 lines of Rust, mostly mapping to the underlying Rust lib, but so far I've used Claude to maintain it (fixing deprecations, performing upgrades, etc).

    This simply wasn't possible before.

simianparrot a day ago

This is a vibe coding Trojan horse article.

> That’s the real story. Not the script, but how AI changes the calculus of what’s worth our time.

Looking at the github source code, I can instantly tell. It's also full of gotchas.

  • Cthulhu_ a day ago

    Ugh. I appreciate the tool and I suppose I can appreciate AI for making the barrier to entry for writing such a tool lower. I just don't like AI, and I will continue with my current software development practices and standards for production-grade code - code coverage, linting, manual code reviews, things like that.

    At the same time though I'm at a point in my career where I'm cynical and thinking it really doesn't matter because whatever I build today will be gone in 5-10 years anyway (front-end mainly).

    • ChrisGreenHeur a day ago

      Is it worth it for everything? If you need a bash script that takes some input and produces some output, does it matter if it's from an AI? It has to get through code review, and the person who made it has to read through it before code review so they don't look like an ass.

      • alex_duf a day ago

        Yeah, recently I needed a script to ingest individual JSON files into a SQLite DB. I could have spent half the day writing it, or ask an AI to write it and spend 10 minutes checking that the data in the DB is correct.

        There are plenty of non critical aspects that can be drastically accelerated, but also plenty of places where I know I don't want to use today's models to do the work.

        • cozzyd 16 hours ago

          I worked with a contractor (for a contractor) who had AI write a script to update a repository (essentially doing a git pull). But for some strange reason it was using the GitHub API instead of git. The best part is that if the token wasn't set up properly, it overwrote every file (including itself) with 404s.

          Ingesting JSON files into SQLite should only take half a day if you're doing it in C or Fortran for some reason (maybe there is a good reason). In a high-level language it shouldn't take much more than 10 minutes in most cases, I would think?

          • alex_duf 15 hours ago

            regarding how long the ingestion should take to implement, I'm going to say: it depends!

            It depends on how complex the models are, because now you need to parse your models before inserting them, which means the tables need to be in the right format. And then you need your loops; for each file you might have to insert anywhere between 5 and 20 nested entities. And then you either have to use an ORM or write each SQL query by hand.

            All of which I could do obviously, and isn't rocket science, just time consuming.

            • cozzyd 15 hours ago

              Sure, if the JSON is very complicated it makes sense that it could take a lot longer (but then I wouldn't really trust the AI to do it either...)

  • raincole a day ago

    The author literally says this is vibe-coded. You even quoted it. How the hell is this "Trojan horse"? Did the Greeks have a warning sign saying "soldiers inside" on their wooden horse?

    • amelius 19 hours ago

      No, but it lacked the product safety leaflet.

    • simianparrot 21 hours ago

      Because it’s not in the title, and I personally prefer up-front warnings when generative “AI” is used in any context, whether it’s image slop or code slop

  • wcrossbow a day ago

    I'm not a go developer and this kind of thing is far from my area of expertise. Do you mind giving some examples?

    As far as I can tell from skimming the code, and as I said, without knowledge of Go or the domain, the "shape" of the code isn't bad. If I got any vibes (:)) from it, it was the lack of error handling and the over-reliance on exact string matching. Generally speaking, it looks quite fragile.

    FWIW I don't think the conclusion is wrong. With limited knowledge he managed to build a useful program for himself to solve a problem he had. Without AI tools that wouldn't have happened.

    • alias_neo a day ago

      There's a lot about it that isn't great. It treats Go like a scripting language; it has no structure (1000+ lines in a single file); nothing is documented; the models are flat, with no methods; it hard-codes lots of strings; even the flags are string comparisons instead of using the proper tool; regexes are compiled and used inline; device support is limited to some pre-configured, hard-coded strings; and some assumptions are made about storage device speeds based on the device name: nvme=fast, hdd=slow, etc.

      On the whole, it might work for now, but it'll need recompiling for new devices, and is a mess to maintain if any of the structure of the data changes.

      If a junior in my team asked me to review this, they'd be starting again; if anyone above junior PRd it, they'd be fired.

    • nottorp 21 hours ago

      > Generally speaking, it looks quite fragile

      I have a USB-to-SATA adapter plugged in and it's labeled as [Problem].

bediger4000 3 days ago

> Two years ago, I wouldn’t have bothered with the rewrite, let alone creating the script in the first place. The friction was too high. Now, small utility scripts like this are almost free to build.

This aligns with the hypothesis that, if vibe coding works, we should see lots and lots of "personalized" or single-purpose software. This particular project is one example. Are there a ton more out there?

  • emilburzo a day ago

    +1 here. With the latest Chrome Manifest V3 shenanigans, the Pushbullet extension stopped working and the devs said they have no interest in pursuing that (understandable).

    I always wanted a dedicated binary anyway, so 1 hour later I got: https://github.com/emilburzo/pushbulleter (10 minutes vibe coding with Claude, 50 minutes reviewing code/small changes, adding CI and so on). And that's just one where I put in the effort of making it open source, as others might benefit, nevermind the many small scripts/tools that I needed just for myself.

    So I share the author's sentiments, before I would have considered the "startup cost" too high in an ever busy day to even attempt it. Now after 80% of what I wanted was done for me, the fine tuning didn't feel like much effort.

  • kayge 15 hours ago

    Yep! Nothing worth sharing/publishing from me, but quite a few mini projects that are specific to my position at a small non-tech company I work for. For example we send data to a client on a regular basis, and they send back an automated report with any data issues (missing fields, invalid entries, etc) in a human-unfriendly XML format. So I one-shotted a helper script to parse that data and append additional information from our environment to make it super easy for my coworkers to find and fix the data issues.

  • mvdwoord a day ago

    Definitely.... I just bought a new NAS and after moving stuff over, and downloading some new movies and series, "Vibe coding" a handful of scripts which check completeness of episodes against some database, or the difference between the filesystem and what plex recognized, is super helpful. I noticed one movie which was obviously compressed from 16:9 to 4:3, and two minutes later, I had a script which can check my entire collection for PAR/DAR oddities and provides a way to correct them using ffmpeg.

    These are all things I could do myself but the trade off typically is not worth it. I would spend too much time learning details and messing about getting it to work smoothly. Now it is just a prompt or two away.

  • pjmlp a day ago

    I see it differently: no need to assign learning tasks to juniors when they can now be outsourced to the computer instead.

    This is currently the vibe in consulting: possible ways to reduce headcount, pun intended.

  • AceJohnny2 a day ago

    I have a bunch at work, yes. Can't publish them.

    Just an hour ago I "made" one in 2 minutes to iterate through some files, extract metadata, and convert to CSV.

    I'm convinced that hypothesis is true. The activation energy (with a subscription to one of the big 3, in the current pre-enshittification phase) is approximately 0.

    Edit: I also wouldn't even want to publish these one-off, AI-generated scripts, because for one they're for specific niches, and for two they're AI generated so, even though they fulfilled their purpose, I don't really stand behind them.

    • mrguyorama 13 hours ago

      >Just an hour ago I "made" one in 2 minutes to iterate through some files, extract metadata, and convert to CSV.

      Okay, but lots of us have been cranking out one-off Python scripts for processing things for decades. It's literally one of the main ways people learned Python in the 2000s.

      What "activation energy" was there before? Open a text file, write a couple lines, run.

      Sometimes I do it just from the interactive shell!

      Like, it's not even worth it to prompt an AI for these things, because it's quicker to just do it.

      A significant amount of my workflow right now is a python script that takes a CSV, pumps it into a JSON document, and hits a couple endpoints with it, and graphs some stats.

      All the non-specific stuff the AI could possibly help with are single lines or function calls.

      The hardest part was teasing out Python's awful semantics around some typing stuff. Why Python is unwilling to parse an int out of "2.7" I don't know, but I wouldn't even have known to prompt an AI for that requirement, so no way it could have gotten it right.

      It's like ten minutes to build a tool like this even without AI. Why weren't you before? Most scientists I know build these kind of microscripts all the time.
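For the record, the int-parsing gripe above is that Python's int() only accepts integer literals, so a decimal string raises; going through float first is the usual workaround:

```python
# int() parses only integer literals, so a decimal string raises:
try:
    int("2.7")
except ValueError as e:
    print(e)  # invalid literal for int() with base 10: '2.7'

# The usual workaround is to go through float first (note: truncation):
print(int(float("2.7")))  # 2
```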

      • thorncorona 6 hours ago

        Because even though I could learn some random library, I don't really care to. I can do the architecture; I don't care to spend half an hour deeply understanding how the arguments to some API work.

        Example: I rebuilt my homelab in a weekend last week with claude.

        I set up terraform / ansible / docker for everything, and this was possible because I let claude handle all the arguments / details. I never bothered before because I thought it was tedious.

    • dotancohen a day ago

      Who's the third? I'm assuming OpenAI and Anthropic are 1 and 2.

      • AceJohnny2 a day ago

        Yeah, Anthropic & OpenAI for two, the third being Google. I hear Gemini's gotten quite good.

  • INTPenis a day ago

    Absolutely. I can come home from a long day of video meetings, where normally I'd just want to wind down. But instead I spend some time instructing an AI how to make a quality of life improvement for myself.

  • nurettin a day ago

    For me, claude churns out like 10-15 python scripts a day. Some of these could be called utilities. It helps with debugging program outputs, quick statistical calculations, stuff I would use excel for. Yesterday it noticed a discrepancy that led to us finding a bug.

    So yes there is a ton but why bother publishing and maintaining them now that anyone can produce them? Your project is not special or worthwhile anymore.

  • ThrowawayTestr a day ago

    I've used chatgpt to make custom userscripts for neopets but I've never published them.

citizenpaul a day ago

Cross compiling is not unique to golang. It does make it pretty easy though.

  • consp a day ago

    Why cross compile if it's made specifically for macos?

    • procaryote a day ago

      Why compile it when it could be a bash script?

      • dotancohen a day ago

        Why a bash script when it could have been a one-liner?

larodi 21 hours ago

> Two years ago, I wouldn’t have bothered with the rewrite, let alone creating the script in the first place. The friction was too high. Now, small utility scripts like this are almost free to build.

adding to the theory that soon we're gonna prefer to write code rather than download ready-made code, because the friction is super low

  • usrusr 19 hours ago

    Arguably a one-off written by the cloud would still be downloaded to the place where it eventually runs.

eikenberry 6 hours ago

> I was punching through my email actively as Claude was chugging on the side.

I wonder how much writing these scripts cost. Were they done in Claude's free tier, pro, or higher? How much of their allotted usage did it require?

I wish more people would include the resources needed for these tasks. It would really help evaluate where the industry is in terms of accessibility: how much of it is reserved for those with sufficient money, and how that scales.

  • wingworks 3 hours ago

    You could do this with the free tier. It's fairly generous.

coldtea a day ago

> Go also has the unique ability to compile a cross-platform binary, which I can run on any machine.

Huh? Is this true? I know Go makes cross-compiling trivial - I've tried it in the past, it's totally painless - but is it also able to make a "cross platform binary" (singular)?

How would that work? Some kind of magic bytes combined with a wrapper file with binaries for multiple architectures?
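
To the question above: Go cross-compiles per target (one binary per GOOS/GOARCH pair, set via environment variables), not one universal binary. A hedged sketch that just assembles the per-target build commands; the target list and package name are illustrative:

```python
# Build one `go build` command per target platform. Each invocation
# produces a separate binary; there is no single "cross-platform" output.
targets = [("linux", "amd64"), ("darwin", "arm64"), ("windows", "amd64")]

def build_commands(pkg: str) -> list[str]:
    cmds = []
    for goos, goarch in targets:
        # Windows binaries conventionally get a .exe suffix.
        out = f"{pkg}-{goos}-{goarch}" + (".exe" if goos == "windows" else "")
        cmds.append(f"GOOS={goos} GOARCH={goarch} go build -o {out} .")
    return cmds

for cmd in build_commands("usbtool"):
    print(cmd)
```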

basepurpose a day ago

i don't understand why competent people need to mention that they vibe coded something.

  • sand500 a day ago

    It's a disclaimer that they have no idea what their code does.

  • jasode a day ago

    It's just because vibe coding is still "new" and various people have mixed results with it. This means that anecdotes today of either success or failure still carry some "signal".

    It will take some time (maybe more than a decade) for vibe coding to be "old" and consistently correct enough where it's no longer mentioned.

    Same thing happened 30 years ago with "The Information Superhighway" or "the internet". Back then, people really did say things like, "I got tomorrow's weather forecast of rain from the internet."

    Why would they even need to mention the "internet" at all?!? Because it was the new thing back then and the speaker was making a point that they didn't get the weather info from the newspaper or tv news. It took some time for everybody to just say, "it's going to rain tomorrow" with no mentions of internet or smartphones.

    • DonHopkins 6 hours ago

      It's like how in space, they just call the "space bar" the "bar".

  • stefanfisk a day ago

    I wouldn’t be surprised if it’s actually a plus in the eyes of possible new employers these days.

    • basepurpose a day ago

      vibe coding in my understanding means losing/confusing the mental model of your codebase: you don't know what is what and what is where. i haven't found a term for "competently coding with ai as the interface".

      • stefanfisk a day ago

        I agree. But management types bedazzled by AI probably see these kids as the future leaders of our profession.

  • sharkjacobs a day ago

    I mean, they seem to address that pretty directly in the post

    > Two years ago, I wouldn’t have bothered with the rewrite, let alone creating the script in the first place. The friction was too high. Now, small utility scripts like this are almost free to build.

    > That’s the real story. Not the script, but how AI changes the calculus of what’s worth our time.

  • raincole a day ago

    "My static blog templating system is based on programming language X" is the stereotypical HN post. In theory the choice of programming language doesn't matter. But HNers like to mention it in the title anyway.

  • dist-epoch a day ago

    For the same reason competent people need to mention that X utility was (re)written in Rust.

    • timeon 21 hours ago

      I would guess the reason there is the opposite. Like code that even a newcomer can safely edit.

      But in general you are right. The article was for developers so mentioning the tool/language/etc. is relevant.

cratermoon 7 hours ago

"yes, vibe coded. Shamelessly, I might add"

I wouldn't trust this as source code until after a careful audit. No way I'm going to trust a vibe-coded executable.

pmlnr a day ago

Imagine if we printed the capabilities on the cables, like we used to.

  • jeroenhd a day ago

    Capabilities are printed on the side of ethernet cables and the text printed on the cable rarely seems related to the actual capabilities of the ethernet plug. Some cat5e cables are rated for 1000mbps but will happily run 5000mbps or 2500mbps (because those standards came after the text on the cable was last updated), other "cat6" cables are scams and struggle achieving gigabit speeds without packet loss.

    Plus, it doesn't really matter if you put "e-marker 100W USB3 2x2 20gbps" on a cable when half those features depend on compatibility from both sides of the plug (notably, supposedly high-end devices not supporting 2x2 mode or DP Alt mode or charging/drawing more than 60W of power).

    • Dylan16807 a day ago

      USB cables push the boundaries of signal integrity hard enough that unless it's a 1 foot passive cable you're not really going to get any surprise speed boosts.

      And when they upped the max voltage they didn't do it for preexisting cables, no matter what the design was.

      > those features depend on compatibility from both sides of the plug

      That's easy to understand. Cable supports (or doesn't support) device, it can't give new abilities to the device. It doesn't make labeling less valuable.

  • mrheosuper a day ago

    We used to what? Back in the day there were countless cables with no printing. Sometimes the only way to know if they were 3.0 or not was checking whether they had a blue connector.

  • threatripper a day ago

    That would only confuse potential buyers. You have to design everyday products for non-technical people.

    • withinboredom a day ago

      Not only that, it doesn’t stop unscrupulous manufacturers from just printing whatever they want.

    • Dylan16807 a day ago

      How could a max speed rating possibly be worse than a blank plug end?

thefz a day ago

Vibe coded, no thanks.

self_awareness a day ago

Vibe coding. Producing code without considering how we should approach the problem. Without thinking about where exactly the problem is. This is like Electron, all over again.

Of course I don't have any problems with the author writing the tool, because everyone should write what the heck they want and how they want it. But seeing it gets popular tells me that people have no idea what's going on.

  • basepurpose a day ago

    if the author knows what they're doing and at least understands the model of the code, i don't understand the reason for mentioning that it was vibe coded. maybe declaring something vibe coded removes part of the responsibility nowadays?

    • self_awareness a day ago

      Someone once told me that their mistake wasn’t theirs, but rather it was ChatGPT being wrong.

      I think you have a good point about why people say it was vibe coded.

      It might also be because they want to join the trend -- without mentioning vibe coding, I don't think this tool would ever reach #1 on Hacker News.

      • drcongo a day ago

        HN guidelines say one shouldn't question whether another commenter has read TFA, so I won't do that. But TFA explains exactly why it was vibe coded, and exactly why they're mentioning that it was vibe coded, which is that that was the central point of TFA.

  • mrheosuper a day ago

    And why should they care what's going on ?

    Do you care about the binary code inside your application, or what exactly happens, at the silicon level, when you write "printf("Hello World")"?

    • self_awareness 17 hours ago

      Yes.

      I verify dynamic linking, ensure no superfluous dylibs are required. I verify ABI requirements and ensure a specific version of glibc is needed to run the executable. I double-check if the functions I care about are inlined. I consider if I use stable public or unstable private API.

      But I don't mean that the author doesn't know what's going on in his snippet of code. I'm sure he knows what's going on there.

      I mean that upvoters have no idea what's going on, by boosting vibe coding. People who upvote this are the reason for the global decline in software quality in the near future.

      • mrheosuper 13 hours ago

        All your stuff is still pretty high level compared to the bare metal inside the CPU. Do you know which register the compiler decided to use to store this variable, or whether the CPU will take this execution branch or not?

        It's all abstraction; we all have to not know some lower-level layer to do our jobs, so please stop gatekeeping it.

        • self_awareness 12 hours ago

          What's your point? That we shouldn't care about anything at all because there is 1 thing we truly shouldn't care about?

          That we shouldn't care about spending $1 for a sandwich therefore managing home budget is pointless?

          • mrheosuper 5 hours ago

            My point is, you should care about what you work with, and it's perfectly fine to not know the lower details.

            Different people will care about different layers.

rmunn a day ago

Please update the title to mention that this is MacOS only; I got excited to try this out, but I only have laptops running Linux and Windows.

  • sorcercode a day ago

    yeah sorry about that. I don't have access to a Linux/Windows machine.

    if I got a hold of the output and commands run, would gladly modify it.

    • jasonjayr a day ago

      > lsusb -v

      On Linux that produces a lot of info similar to the macos screenshots, but with values and labels specific to the Linux USB stack.

      I wonder if AI could map the linux lsusb output to a form your tool can use...
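
      As a rough starting point for that mapping, a hedged sketch of parsing the one-line-per-device summary format that plain `lsusb` (without `-v`) prints, e.g. "Bus 001 Device 002: ID 8087:0024 Intel Corp. ..."; the record fields chosen here are just for illustration:

      ```python
      import re

      # Matches lsusb summary lines: "Bus BBB Device DDD: ID vvvv:pppp Name"
      LSUSB_LINE = re.compile(
          r"Bus (?P<bus>\d{3}) Device (?P<dev>\d{3}): "
          r"ID (?P<vid>[0-9a-f]{4}):(?P<pid>[0-9a-f]{4})\s*(?P<name>.*)"
      )

      def parse_lsusb(output: str) -> list[dict]:
          """Turn lsusb summary output into a list of device records."""
          devices = []
          for line in output.splitlines():
              m = LSUSB_LINE.match(line.strip())
              if m:
                  devices.append(m.groupdict())
          return devices

      sample = "Bus 001 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub"
      print(parse_lsusb(sample))
      ```

      The per-device detail that `lsusb -v` adds (speeds, descriptors) would need further parsing on top of this.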

    • lanyard-textile a day ago

      Is it really vibe coding if you’re testing it on the target machine? ;)

      • adastra22 a day ago

        Yes, I think? “Vibe coding” is more about whether you are reading/reviewing the generated code, or just going with it, afaik.

    • kenperkins a day ago

      fwiw, it would take 10 minutes to download a linux docker image and build it in go to test. The harder part is getting the information from a different API on Linux.

      • RKearney a day ago

        This post is 12 minutes old. Have you finished yet?

      • skissane 21 hours ago

        A Linux Docker image, probably doesn’t have any USB devices exposed to it-well, it depends on exactly how you run it, but e.g. if you use Docker Desktop for Mac, its embedded Linux VM doesn’t have any USB passthrough support. This is the kind of thing where a physical Linux host (laptop/desktop/NUC/RPi/etc) is much more straightforward than running Linux in a VM (or as a K8S pod in a datacenter somewhere)

      • febusravenga a day ago

        ... and orders of magnitude more time to properly expose USB devices to some arcane VM not in your control

  • tomhow a day ago

    We've updated the title now, thanks.

  • swyx a day ago

    what do you mean, all developers only use macs!

    (/s)

jedbrooke 7 hours ago

I feel like we kind of got monkey’s paw’ed on USB-C. I remember during the 2000s-2010s people were drowning in a sea of disparate and incompatible connectors for video, audio, data, power, etc. and were longing for “One Port To Rule Them All” that could do everything in one cable. We kind of got that with USB-C, except now, you see a USB-C cable/port and you have no idea if it supports data only, data + charging, what speeds of data/charging, does it support video? maybe it does, maybe it doesn’t. at least it can plug in both ways… most of the time

  • op00to 5 hours ago

    I bought the coolest, fattest USB-C cables, and I failed to read the description closely enough to notice they only support USB 2 speeds! They work fine for the specific use I have for them, but I wish I could use ‘em for everything!

  • bluedino 7 hours ago

    I was just thinking the other day, if the connectors had been USB-C from the start.

    No Type-A, no Type-B, no Mini, no Micro...

    • op00to 5 hours ago

      insert futuristic city picture with flying cars here

self_awareness a day ago

Guys, please, don't upvote this. If this topic beats the "Physics Nobel Prize 2025" post, I will lose my faith in HN.