palmotea 2 days ago

This has a ton of holes:

> Z-Day + 15Yrs

> The “Internet” no longer exists as a single fabric. The privileged fall back to private peering or Sat links.

If you can't make CPUs and you can't keep the internet up, where are you going to get the equipment for enough "private peering or Sat links" for the privileged?

> Z-Day + 30Yrs

> Long-term storage has shifted completely to optical media. Only vintage compute survives at the consumer level.

You need CPUs to build optical media drives! If you can't build CPUs you're not using optical media in 30 years.

> The large node sizes of old hardware make them extremely resistant to electromigration, Motorola 68000s have modeled gate wear beyond 10k years! Gameboys, Macintosh SEs, Commodore 64s resist the no new silicon future the best.

Some quick Googling shows the first IC was created in 1960 and the 68000 was released in 1979. That's 19 years. The first transistor was created in 1947; that's a 32-year span to the 68k. If people have the capacity and need to jump through hoops to keep old computers running to maintain a semblance of current-day technology, they're definitely f-ing going to have been able to repeat all the R&D to build a 68k CPU in 30 years (and that's assuming you've destroyed all the literature and mind-wiped everyone with any knowledge of semiconductor manufacturing).

  • lauriewired 2 days ago

    > If you can't make CPUs and you can't keep the internet up, where are you going to get the equipment for enough "private peering or Sat links" for the privileged?

    Storage. You only need a few hundred working systems to keep a backbone alive. Electromigration doesn't kill transistors if they are off and in a closet.

    > You need CPUs to build optical media drives! If you can't build CPUs you're not using optical media in 30 years.

    You don’t need to make new drives; there are already millions of DVD/Bluray devices available. The small microcontrollers on optical drives are on wide node sizes, which also make them more resilient to degradation.

    > they're definitely f-ing going to have been able to repeat all the R&D to build a 68k CPU in 30 years (and that's assuming you've destroyed all the literature and mind-wiped everyone with any knowledge of semiconductor manufacturing).

    If you read the post, the scenario clearly states “no further silicon designs ever get manufactured”. It’s a thought experiment, nothing more.

    • kadoban a day ago

      > If you read the post, the scenario clearly states “no further silicon designs ever get manufactured”. It’s a thought experiment, nothing more.

      This kind of just breaks the thought experiment, because without the "why?" of this being vaguely answered, it makes no sense. How do you game out a thought experiment that starts with an assumption that humanity just randomly stops being humanity in this one particular way? What other weird assumptions are we meant to make?

      • esseph a day ago

        If you don't like the rules of the game, you don't have to play it.

        • bee_rider a day ago

          But, this is as if people said “well, I can’t carry the soccer ball in my hands, so I’ll carry it with my elbows instead.”

          • esseph 14 hours ago

            It's not that complicated, you just literally choose to not participate in the thought experiment and you move on with your life.

    • pointlessone a day ago

      OK, no silicon. But we might be just fine after all. Just yesterday we had a story about bismuth transistors that are better in every way than silicon ones. Maybe a tad more expensive. There are plenty of other semiconductors out there too. We'll have to adjust manufacturing but it will probably be just one upgrade-cycle skip. Even with a complete mind wipe it's still not that bad if only silicon is out.

  • 3eb7988a1663 2 days ago

    Surely knowing something is possible would speed up the process. Transistors had to go from this neat lab idea to find more and more incremental use cases. Eventually snowballing into modern chips. If you know from the beginning that computers are a neat idea, surely that would warrant more focused R&D.

  • hilbert42 a day ago

    There's a lot we could still do.

    Let's assume we go back to the pre-transistor era—1946 and earlier, the world then was a very different place but it was still very modern.

    It's too involved to list in detail but just take a look at what was achieved during WWII. The organization and manufacturing were truly phenomenal. Aircraft production alone during the War was over 800,000 aircraft; manufacturing at that rate has never been equalled since, same with ships.

    We developed a huge amount of new tech during the War, including the remarkably complex atomic bomb and much, much more.

    And we did all this without the transistor, integrated circuit, CPUs, internet and even smartphones!

    Now consider the planning and organizational difficulties of D-Day—probably the most complex and logistically difficult undertaking ever—without the aid of modern communications, the internet and smartphones, etc.—all of which depend on CPUs. Right, that happened too, and it was a total success.

    I wonder how a generation brought up during the post-silicon era would cope if all that were no longer available. It could happen if we had another Carrington Event or one that's even bigger (which has occurred), or say with nuclear EMP events.

    WWII Aircraft production https://en.m.wikipedia.org/wiki/World_War_II_aircraft_produc...

    WWII Military production: https://en.m.wikipedia.org/wiki/Military_production_during_W...

CuriousRose 2 days ago

If humans forgot how to make new CPUs, it might finally be the incentive we need to make more efficient software. No more relying on faster chips to bail out lazy coding and make apps run lean. Picture programmers sweating over every byte like it's 1980 again.

  • burnt-resistor 2 days ago

    Probably not. Devices would run out within a generation.

    It ain't ever going to happen because people can write these things called books. Computer organization and architecture books already exist, and there are many tens of thousands of copies of them. What should be captured in modern computer organization books are the applied-science aspects of the history until now and the tricks that made Apple's ARM series so excellent. The other thing is that TSMC needs to document its fab process engineering. Without the capture of niche, essential knowledge, they become strategic single points of failure. Leadership and logic dictate not allowing this kind of vulnerability to fester too deeply or too long.

    • saulpw a day ago

      The essential tacit knowledge can't be captured in books. It has to be learned by experience, participating in (and/or developing) a functioning organization that's creating the technology.

  • immibis a day ago

    Programmers haven't been able to rely on CPUs getting faster for the last decade. Speeds used to double every 1.5 years or so. Now they increase 50% per core and double the number of cores... every 10 years. GPU performance has increased at a faster pace, but ultimately also stagnated, except for the addition of tensor cores.
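
    A rough back-of-the-envelope comparison of those two growth regimes, sketched in Python using the figures from this comment (not measured data):

      # Sketch only: compound growth implied by the rates described above.
      years = 10

      # Old regime: single-thread speed doubles every ~1.5 years.
      old_regime = 2 ** (years / 1.5)            # ~101x over a decade

      # Newer regime (per the comment): +50% per core, 2x cores per ~decade.
      new_single_thread = 1.5                    # ~1.5x over a decade
      new_total_throughput = 1.5 * 2             # ~3x total throughput

      print(f"old: ~{old_regime:.0f}x; new: ~{new_single_thread}x single-thread, "
            f"~{new_total_throughput}x total throughput")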

  • djmips 2 days ago

    That's already happening

  • geysersam a day ago

    Ah, the good old days again, what a beautiful vision. Decadence and laziness begone! Good luck running your bloated CI pipelines and test suites on megahertz hardware! /s

PaulKeeble 2 days ago

There is a bit of an issue that almost all the know-how exists within a couple of private companies, and if the industry took a downturn, such as from a crash of an AI bubble causing a many-year lull, giant companies could fail and take that knowledge and scale with them. Some other business would presumably buy the facilities and hire the people, but maybe not. It's one of the problems with so much of science and engineering happening privately: we can't replicate the results easily.

  • bsder 2 days ago

    This isn't unique to semiconductors.

    If you turn off any manufacturing line, your company forgets really quickly how to make what that line made. GE discovered this when they tried to restart a water heater line in Appliance Park.

    • to11mtm 2 days ago

      Heck, the US had this problem when they needed to renew/refurbish nuclear weapons due to more or less 'forgetting' how to make Fogbank.

      • AlotOfReading 2 days ago

        FOGBANK was a little more complicated. The process that was written down just didn't work as expected. That was partially lost institutional knowledge that had never been recorded, but the original manufacturers also didn't fully understand their own process: it had contaminants that improved the final product which they were unaware of. When production was restarted, the material didn't come out right until that was investigated.

    • silisili 2 days ago

      Yup.

      Remington apparently has no idea what bluing formula they used on their original 1911s.

      Colt lost the ability to handfit revolvers.

    • bitwize a day ago

      We as a global civilization are close to forgetting how to make CRTs. There's like one company that still makes them, but only for military or major industrial applications (fighter jet HUDs and the like), at call-for-pricing prices. The major manufacturers, like Sony, all shut down their production lines, never to be restarted, because the knowledge of how to make them dissipated with those production lines. If you're an enthusiast who wants to experience retro video games as they appeared back in the day, your only option is to scavenge an old TV from somewhere.

      • wmf 10 hours ago

        Also since Sony shut down Walkman/Discman production it's now impossible to make good portable tape or CD players. The ones made now are huge and low quality.

      • LargoLasskhyfv a day ago

        Why would a HUD need a CRT? Those are old designs; maybe to replace failed ones in old systems, but that's not how they are made today.

mahirsaid 2 days ago

It would be a great tragedy if that ever became a reality in the near future. The bigger question is: what if you forgot how to make the machines that make the CPUs? That is the bigger challenge to overcome in this crisis. Only one company specializes in this field, which gives big companies like TSMC their ability to manufacture great CPUs. The trick is to create the machine that makes them and go from there. 10nm - 2nm capabilities.

  • RetroTechie a day ago

    In case more advanced nodes were to disappear, even a semiconductor factory capable of producing e.g. 200-500nm structures would already be very useful.

    Remember the first 32-bit CPUs were manufactured on >1um processes. Never mind 8-bitters or KB-sized memory chips.

    Also note that IC design, assembly programming & more can be (and have been) done by hand. Having any kind of compute, no matter how slow by today's standards (a couple MHz), helps a lot. Same for basic applications like text processing, spreadsheets, small databases, software development, etc.

  • LargoLasskhyfv a day ago

    There is more to lithography than ASML catering to TSMC, Samsung, Intel.

    There could be https://global.canon/en/technology/nil-2023.html &

    https://spectrum.ieee.org/nanoimprint-lithography

    There also is https://www.searchforthenext.com &

    https://semefab.com/

    claiming to not need that ASML high-end stuff at all, to be competitive.

    As reported around 2019 amongst many others like here:

    https://eepower.com/new-industry-products/search-for-the-nex...

    Maybe it's vaporware, because I'm unaware of anything 'big' produced there. Maybe it's only lack of funding, lack of trust because it's non-standard and 'unproven', inertia? Who knows?

    And finally the forgotten minimal.fab by Yokogawa https://www.yokogawa.com/industries/semiconductor/minimal-fa...

    https://www.minimalfab.com/en/ with no outrageous claims about structure size equivalence, but way faster turn-around times for prototyping, and none of the usually necessary investment in all that clean-room tech.

    And not to forget the push and incentive China got as 'development help' to be independent :-)

    I'm sure they're up to many interesting things in the near future.

nxobject 19 hours ago

The author’s a little bit too optimistic about the longevity of old consumer market computers: having collected vintage compact Macs, you become keenly aware of all of the possible points of failure like CRT displays, storage devices, and even fragile plastics. We may have to go back to much more analog forms of I/O: typewriter teletype with decreasing levels of logic integration, random access DECtape-style magnetic tape, etc.

trollbridge 2 days ago

I’m a little puzzled how “forgot how to make CPUs” also included “forgot how to make the mechanical part of hard drives, how to make flash memory, and how to make other chips”. I guess I don’t think of a 74xx series chip as a “CPU”?

  • geor9e 2 days ago

    I read it as: we have millions of hard drives and flash drives with a dead controller chip, so we harvest their other parts as spares. We still know how to make the spare parts from scratch, but we have so many for free.

vardump 2 days ago

We're toast should we ever lose ability to make CPUs.

Perhaps there should be more research into how to make small runs of chips cheaply and with simple inputs. That'd also be useful if we manage to colonize other planets.

  • Legend2440 2 days ago

    Be more concerned about whatever nuclear war or social breakdown led to that point. Massive industrial manufacturing systems don’t shut down for nothing.

    • vardump 2 days ago

      It could also happen as natural decay over centuries. There's no guarantee we'll get more advanced over time.

      • spencerflem 2 days ago

        That would be a pity, but I don't see why we'd be toast.

    • kimixa 2 days ago

      To have zero effort in attempting to reproduce even 70s-era silicon technology for 30 years implies some real bad stuff, if the entire chain has been knocked out to that level I doubt "silicon chip fabrication" would really be a worry for anyone during that time.

  • spencerflem 2 days ago

    We as in civilization? We made it at least a few thousand years without it.

    Or do you mean the circumstances that would lead to this (nuclear war perhaps) would make us toast

    • throw0101d 2 days ago

      > We as in civilization? We made it at least a few thousand years without it.

      Civilization is a continuity of discrete points of time.

      We were able to enter (so-called) Dark Ages where things were forgotten (e.g., concrete) and still continue because things were often not very 'advanced': with the decline of Rome there were other stores of knowledge, and with the Black Death society wasn't much beyond blacksmithing and so was able to keep those basic skills.

      But we're beyond that.

      First off, modern society is highly dependent on low-cost energy, and this was kicked off by the Industrial Revolution and easily accessible coal. Coal is much depleted (often needing deeper mines). The next phase was oil, and many of the easy deposits have been used up (it used to bubble up out of the ground in the US).

      So depending on how bad any collapse is, getting things back up without easily accessible fossil fuels may be more of a challenge.

      • anthk 10 hours ago

        >wasn't much beyond blacksmithing and so was able to keep those basic skills.

        That's an Anglo-Saxon black legend. How do you think boats and trebuchets were made? Navigation in the ocean without trig? Astrolabes? Yeah, sure. Year 600 wasn't the same as 1200.

        Read about Alfonso X. https://en.m.wikipedia.org/wiki/Alfonso_X_of_Castile

    • AlienRobot 2 days ago

      I'm not sure we can actually support 8 billion people's food production and distribution logistics without CPUs anymore.

      • spencerflem 2 days ago

        Whatever makes us forget CPUs will make there be less than 8 billion people I'm sure.

      • Tabular-Iceberg a day ago

        You don’t think we could do it with human computers?

        As long as the algorithmic complexity of food logistics is O(n) or better with respect to population size, I guess.

      • squigz 2 days ago

        No, but that's hardly the same as suggesting humanity would die off. We'd adapt, just at a much smaller scale.

    • digitalsushi a day ago

      I just learned about the Haber process. This guy, Fritz Haber, realized we could suck nitrogen out of the literal air and make soil fertilizer with it. The population is like 4 times higher than it would be without it.

      Scary how high up this tightrope is.

    • vardump 2 days ago

      The civilization as it is.

      • spencerflem 2 days ago

        I mean, the chips they're talking about didn't exist until like 40 years ago; I think we could manage.

        But tbh I don't see it as at all likely short of something like nuclear war, which would be the much bigger problem.

        • vardump 2 days ago

          Do we really still have the society wide institutional knowledge to do things how they were done 50 years ago? I wouldn't be so sure.

          • squigz 2 days ago

            Mate, 50 years wasn't that long ago. We had computers and everything else we have now. We still did a lot of things fundamentally the same way. Everything was just slower and smaller (in scale; not physically)

            I think you also should realize much of the world continues on without bleeding-edge technology - homes are still built, crops are still harvested, and the world goes on.

          • spencerflem 2 days ago

            I don't think this is likely, but say we, the whole world, goes back to using telephones and writing paperwork on paper.

            I don't think it'd be the end of life as we know it.

            • alabastervlog 2 days ago

              The important chips aren’t the ones on our desks and in our hands. I think all that shit’s of dubious value to begin with.

              It’s the ones in factories, power systems, and transportation equipment, among other things.

        • conductr 2 days ago

          It would be chaos at first, but if it's at all physically survivable our species likely will survive; then we're only a few years away from a "humans exist, but most of their knowledge has been lost" state. Only a few more years after that before no human alive has ever interacted with a CPU-using device, and then the whole notion of a CPU kind of disappears before too long.

          We've had this happen before of course. There's a ton of things ancient civilizations were doing that we are clueless about. So clueless, that one of the leading theories is that they must have been aided by aliens.

          • anthk 20 hours ago

            Clueless? Bullshit and Amerigo-centrist crap from the far right. Non-white (non-WASP) people were doing trigonometry far earlier than the Germanic ones, who were living in huts. Just have a look at Greece and Rome. Also, the Chinese knew basic algebra too.

            • conductr 16 hours ago

              Perhaps you need to re-read my comment. I'm saying they were exceptionally knowledgeable about things in general (STEM stuff). And, I'm saying WE are clueless as to how they knew a lot of things, they probably knew a lot more than we have clues to know about - thus, clueless

              I'm talking about people all over the globe, separated by time, I don't know what your deal is with acting like I'm a white person poo'ing on POC - or how any of the racial/nationality/etc stuff you wrote factors in at all. You're obviously easily triggered and/or need to work on reading comprehension

              • anthk 10 hours ago

                Bear in mind the wacko alien thesis is being held by envious right-wing nuts, as if the Egyptians were dumb or something. They are butthurt about the Mediterranean, non-WASP origins of civilization, because their Aryan discourse crumbles down.

                It always was a matter of commerce and knowledge sharing between distinct races and tribes.

    • NoMoreNicksLeft a day ago

      >We made it at least a few thousand years without it.

      We did that during a period of peculiar circumstances that won't ever be replicated. Relatively large, distributed population with many different ecological environments that we were already pre-adapted to. A far smaller single-point-failure population that can't just go out and hunt for its food among the vast wildlife might have it pretty rough if industrial civilization were to falter.

  • kimixa 2 days ago

    Eh, there's plenty of small fabs globally that do smaller run "nowhere near cutting edge" (180nm or so) runs - you can make a pretty decent processor on that sort of tech.

    Would be a pretty solid intermediate step to bootstrap automation and expansion in the cases where the supply of the "best" fabs is removed (like in a disaster, or the framework to support that level of manufacturing isn't available, such as your colony example)

    • anthk 20 hours ago

      Forth Deck and CPUs with TTL's from magic-1.org.

asciimov 2 days ago

> … no further silicon designs ever get manufactured

The problem wouldn't be missing CPUs but infrastructure. Power would be the big one: generators, substations, those sorts of things. Then manufacturing; a lot of chips go there. Then there is all of healthcare.

Lots of important chips everywhere that aren’t CPUs.

  • M95D a day ago

    We had power for a very long time before silicon.

    Generators are just big coils of copper. Substations too. Solar won't work without silicon, but anything with a spinning coil of copper would. Voltmeters would need replacing with the old analog versions and humans would need to manually push switches to keep power levels constant, just like in the '50s.

    • nuc1e0n a day ago

      And a spring together with an electro-magnet can be made into a relay. They're big and slow of course, but they do the same thing as a transistor. If you can make metal into a wire you can make them. In the 1940s computers were electro-mechanical.

0xTJ 2 days ago

A fun read, but I do find it a bit odd that in 30 years the author doesn't think that we would have reverse-engineered making CPUs, or at least gotten as far as the mid-70s in terms of CPU production capabilities.

Also, the 10k years lifespan for MC68000 processors seems suspect. As far as I can see, the 10,000 figure is a general statement on the modelled failure of ICs from the 60s and 70s, not in particular for the MC68000 (which is at the tail end of that period). There are also plenty of ICs (some MOS (the company, not the transistor structure) chips come to mind) with known-poor lifespans (though that doesn't reflect on the MC68000).
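
For context, "modeled" electromigration lifetimes like that typically come from Black's equation, MTTF = A * J^(-n) * exp(Ea / (k*T)): the wide interconnects on old, large nodes carry lower current density J, which stretches the lifetime dramatically. A purely illustrative sketch, where the constants and the 10x density ratio are assumptions rather than data on the MC68000:

  import math

  # Illustrative only: relative electromigration MTTF via Black's equation,
  # MTTF = A * J**(-n) * exp(Ea / (k*T)). Constants are invented; the point
  # is how strongly current density (i.e. feature size) matters.
  K_BOLTZMANN_EV = 8.617e-5  # eV/K

  def mttf(current_density, activation_energy_ev=0.7, n=2.0, temp_k=350.0, a=1.0):
      return a * current_density ** (-n) * math.exp(activation_energy_ev / (K_BOLTZMANN_EV * temp_k))

  # Assume (hypothetically) a modern narrow interconnect sees ~10x the current
  # density of a 1979-era one at the same operating current.
  print(mttf(1.0) / mttf(10.0))  # ~100x longer life for the wide interconnect (n=2)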

  • therealpygon 2 days ago

    Agreed. It is a whole lot easier to recreate something you know is possible than to create something you don’t know is possible.

roxolotl 2 days ago

So taking this as the thought experiment it is, what I'm struck by is that seemingly most things will completely deteriorate in the first 10-15 years. Is that accurate? Would switches mostly fail by the 10-year mark if not replaced? I've been looking at buying a switch for my house; should I expect it to not last more than 10 years? I have a 10-year-old TV; should I expect it to start failing soon?

  • __d 2 days ago

    My experience with retro computers is that things start to fail from around the 10-15 year mark, yes. Some things are still good after 30 years, maybe more, but .. capacitors leak, resistors go out of spec, etc, and that means voltages drift, and soon enough you burn something out.

    You can replace known likely culprits preemptively, assuming you can get parts. But dendritic growths aren’t yet a problem for most old stuff because the feature sizes are still large enough. No one really knows what the lifetime of modern 5/4/3nm chips is going to be.

  • protocolture 2 days ago

    There's a hardware law that hardware past its half-life often lives for an excessively long time.

    Really depends on brand and purpose but consumer hardware switches do die pretty frequently.

    But if you bought something like a C2960 fanless switch I would expect it to outlive me.

  • floating-io 2 days ago

    I have a 10+ year old Cisco 2960G and a pair of 10+ year old Dell R620's in my homelab, still humming happily along.

    So, no.

mikewarot a day ago

We'd still be able to make relays, and that's enough to do computing. If not that, then mechanical computer systems could be constructed to process data.

There's enough information on machine tools and the working of iron to make all the tooling and machinery required to start an assembly line somewhere.

After all, there was an assembly workshop turning out the Antikythera mechanism; there was a user guide on it. Obviously it wasn't the only one produced at the time.

  • delian66 a day ago

    > Obviously it wasn't the only one produced at the time.

    It is not obvious at all to me. Where are the others like it found?

datadrivenangel 2 days ago

It would be a bad decade, but someone would figure out how to get older microcontroller class chip production going pretty fast because $$$

myth_drannon 2 days ago

We would go back to the 6502; it would be fine. Just more time spent optimizing the code.

  • Suppafly 2 days ago

    The stuff people write for old consoles and computers is pretty amazing. Computers definitely evolved faster than they needed to for the general public. All of these industries were built around taking advantage of Moore's Law instead of getting the most bang out of existing limitations.

  • to11mtm 2 days ago

    A 6502 can easily power a robot to bend metal and other objects. You can bootstrap everything else from there.

    • trollbridge 2 days ago

      So can a machine built entirely from discrete components. A VAX 11/780 had no microprocessor at all. Just a CPU built out of components.

      • nxobject 18 hours ago

        Not quite - the VAX-11/780 relied on what was essentially a LSI PDP-11/03 as a general purpose glue processor (the “console interface board”) for the CPU initialization sequence, console and floppy I/O, and other assorted tasks.

    • vardump 17 hours ago

      We could call it Bender.

0x000xca0xfe 19 hours ago

Honestly, probably not much would happen.

My daily driver laptop is a 2012 Thinkpad I literally pulled out of a scrap heap at my local university, but it refuses to die. Moore's law has slowed enough that old hardware is slow but still perfectly capable of running 99% of existing software.

Already existing machines would give us at least one or two decades to restart manufacturing from zero and that is more than enough time to avoid existential problems.

And most computers are under-utilized. The average gaming PC is powerful enough to run the infrastructure for a bunch of small companies if put to work as a server.

FrankWilhoit 2 days ago

The larger point is that we are going to forget a lot of things.

waynesonfire 2 days ago

We would have to git-revert linux 486 support.

cadamsdotcom 2 days ago

It's a bit like trying to censor an LLM: to delete such an interconnected piece of information as "everything about making CPUs", you have to alter the LLM so significantly that you lobotomize it.

CPUs exist at the center of such a deeply connected mesh of so many other technologies, that the knowledge could be recreated (if needed) from looking at the surrounding tech. All the compiled code out there as sequences of instructions; all the documentation of what instructions do, of pipelining, all the lithography guides and die shots on rando blogs.. info in books still sitting on shelves in public libraries.. I mean come on.

Each to their own!

andsoitis a day ago

what do you have to believe to be true in order for humanity to forget how to make CPUs?

johnea 2 days ago

Wouldn't it be better if the world totally forgot the twitverse?

charcircuit 2 days ago

Even if humanity forgot, most of the process is automated, so it shouldn't be too hard to figure out how to keep a factory running.

  • M95D a day ago

    It would be extremely difficult to keep the factory from destroying itself.

    I work in a medical lab. The company bought a new automated coagulation analyzer. The old machine was shut down (proper shut-down procedure) and kept in storage in case it was needed. They should have replaced the wash fluid with water. This procedure isn't documented because nobody expects that kind of machine to just sit unused for a long time. After a few months we needed to start it again (can't remember why; I think there was a delivery problem with the reagents for the new analyzer). We couldn't. The washing fluid had dried, and the detergents and other chemicals it contained solidified inside the valves, just like what happens with an inkjet printer left unused. They were all stuck. Repair would have been too expensive and it was scrapped.

    I saw this happen with a haematology analyzer too. It was kept as a backup but wasn't needed for a few months. I was able to resurrect that one after several hours of repeated washing.

    An electrolyte analyzer is even worse. Keep it turned off for only a few hours and the electrodes will need to be replaced.

    I don't think any other advanced industrial machine is any different. They can't just be left unused for a while and then be expected to work. It's even more problematic if the shut-down procedure isn't done right (which can mean going beyond the documented requirements) or isn't done at all.

    • Tabular-Iceberg a day ago

      That’s what killed my suspension of disbelief watching Idiocracy. Most of the automation would have broken down in less than a day.

daft_pink 2 days ago

thankfully the way capitalism works, we would quickly reinvent them and remake them and the companies that did so would make a decent profit.

generally the true problems in life aren’t forgetting how to manufacture products that are the key to human life.

spencerflem 2 days ago

This doesn't make a ton of sense to me. In what situation would everyone lose the ability to make any CPU, worldwide, and not have a much, much bigger problem than how to run AWS?

  • MostlyStable 2 days ago

    This reads to me as mostly just an interesting way to teach about expected hardware lifetime assuming we were trying as hard as possible to keep things going. There is an entire genre of speculative SF that posits one major change and tries to think through the repercussions of that change. Often, the change itself is not very sensible, but it's also not the point.

    • spencerflem 2 days ago

      I do think its interesting that digital records may not survive any sort of truly world war.

      Its so easy to think of them as lasting forever

      • stevenwoo 2 days ago

        We don't have enough history with digital data yet, but it seems extremely doubtful anything digital would survive as long as the Herculaneum scrolls that were buried in mud (and were on the front page last week). That's longer than almost any civilization has continuously existed (the only exception being ancient Egypt?), but maybe humans will turn it around in the near future and obliterate that record.

  • protocolture 2 days ago

    My guess:

    Something would need to happen to stop / prevent production for about 30 - 60 years.

    That's roughly equivalent to the Saturn V engine and codename FOGBANK, which are the two examples of technologies that had to be reverse-engineered after the fact.

    Hypothetically we might choose to stop making new ones if demand dried up significantly.

    • bell-cot 2 days ago

      Short of asteroid Dino-Doom v2.0 hitting the earth, how could CPU demand fall so low that we don't make any new ones?

      • protocolture 2 days ago

        Some kind of demand reduction.

        It could be the case that we finally hit a solid wall in CPU progress, cloud providers demand something they don't have to replace every few years, and the result is some kind of everlasting-gobstopper CPU.

        Then as failures fall off, so does demand, and then follows production.

        A pretty large drop in global population might see the same result. Labor needs to be apportioned to basic needs before manufacturing.

        • grues-dinner a day ago

          The whole of human civilization would need to be completely moribund before every manufacturer stops designing and making new weird industrial/embedded CPUs for little niches.

          And because they go into things like dishwashers and cars (and missiles) and stuff that dies for other reasons than chip failure, you always need some supply of them.

          Though I guess if we end all wars and make stuff so good that you literally never need a new widget ever and all industry just stops, then I suppose there is such a thing as a perfect design.

  • M95D a day ago

    Unabomber 2? How many people know how to make a CPU?

  • Animats 2 days ago

    A war in Taiwan?

    • bgnn 2 days ago

      Well, we still have enough know-how outside Taiwan on everything needed to produce any semiconductor. A bigger world war is most likely what it would take to bring the supply chain to a halt. Even then, nobody magically forgets these things.

      • TimorousBestie 2 days ago

        I kinda doubt it. The theoretical knowledge is there, but there’s a huge gulf between that and all the practical knowledge/trade secrets held by TSMC.

        Another view on this topic is https://gwern.net/slowing-moores-law

        • alabastervlog 2 days ago

          The stuff that really matters is mostly on microcontrollers.

          The few industries that push computing out of need would suffer. Certain kinds of research, 3D modeling.

          But most of what we use computers for in offices and our day-to-day should work about as well on slightly beefed up (say, dual or quad CPU) typical early ‘90s gear.

          We’re using 30 years of hardware advancements to run JavaScript instead of doing new, helpful stuff. 30 years of hardware development to let businesses save a little on software development while pushing a significant multiple larger than that cost onto users.

          • voidspark 20 hours ago

            > slightly beefed up (say, dual or quad CPU) typical early ‘90s gear.

            Early 90's Intel was the 486 at 33 MHz. It barely had enough performance to run the TCP/IP stack at a few hundred KB/sec, using all of the CPU just for that task. I think you forgot how slow it was. The Pentium II is where it starts to get reasonably modern, in the late 90's. The Pentium Pro (1995) was their first with multiprocessor support. It was moving so fast back then that early/mid/late 90's was like comparing decades apart at today's pace of improvement.

            • alabastervlog 18 hours ago

              166MHz pentium with 128MB (not a typo, kids!) of memory felt luxuriously snappy and spacious, including with tabbed web browsing in Phoenix/Firebird/Firefox… running BeOS or QNX-Photon. Not so much under Linux or Windows.

              Not so far removed from a multi-CPU Pentium at 90 or 100MHz, from the very early Pentium days.

              I guess what I had in mind was first-gen Pentiums. They’re solidly in the first half of the ‘90s but “early 90s” does cover a broader period, and yeah, 486 wouldn’t quite cut it. They’re the oldest machines I can recall multitasking very comfortably on… given the right software.

              • voidspark 18 hours ago

                128 MB was almost unheard of back then.

                Pentium 66 - 1993

                Pentium 90 - 1994

                Pentium 166 - 1996

                More than doubled the performance in 3 years. Two orders of magnitude from 1990 - 2000.

                There was no multi-CPU Pentium. Not until the Pentium Pro in 1995.

        • rcxdude 2 days ago

          If you wanna make something that's competitive with the latest and greatest, sure. But there's literally thousands of fabs that can make _a_ CPU, and hundreds that can make something that is usable in a PC, even if not very fast. There's a huge span of semiconductor fabrication beyond the bleeding edge of digital logic.

          • AnotherGoodName 2 days ago

            One thing the post above did have, though, was a mention that the high end would quickly be worth its weight in gold.

            The Nvidia DGX B200 is already selling for half a million. The nearest non-TSMC-produced competitor doesn't come close. Imagine no more supply!

            • bgnn 17 hours ago

              Aren't they worth more than their weight in gold already?

              • rcxdude 16 hours ago

                Depends if you just count the die or the rest of the rack. Processed silicon wafers will tend to beat that easily, but so does low-volume short-turnaround machined aluminium.

        • bgnn 2 days ago

          I might be biased, being an insider in the semiconductor industry, but I think the gulf isn't that huge. Virtually everything is known down to, what, 28nm or so. That's still a fairly good process and pretty impossible to forget.

        • voidspark 2 days ago

          TSMC factories use ASML hardware (designed and built in the Netherlands), that actually produces the chips.

          https://www.asml.com/en

          TSMC is running a successful business but they're not the only customers of ASML.

        • chasil 2 days ago

          Intel still knows 14nm quite well, and would likely sell access to the line if asked.

          If Taiwan ceased to exist, that would put us a decade back.

          • bgnn 2 days ago

            Samsung is just a tiny bit behind TSMC.

            The gap isn't a decade, more like 12-18 months.

            Also, TSMC has 5nm production in the US. There are actual people with know how of this process in the US.

            • voidspark 2 days ago

              All of their photolithography equipment is manufactured in the Netherlands by ASML

              https://www.asml.com/en

              Other companies are using the same equipment (Samsung and Intel) but TSMC has deeper expertise and got more out of the same equipment so far.

        • squigz 2 days ago

          There are other semiconductor manufacturers, right? Certainly it would be catastrophic to the industry, and would likely set progress back a while, but it would hardly be insurmountable. This discussion also assumes TSMC wouldn't sell/trade knowledge/processes, or they'd not be stolen, which wouldn't be crazy, given the hypothetical war in the region

      • NoMoreNicksLeft 2 days ago

        Not magically, they forget naturally. No one human knows the whole sequence, from start to finish, no one can really write it down (or shoot a how-to video). Distributed, institutional knowledge is extraordinarily brittle.

        • bgnn 2 days ago

          I agree, forgetting happens naturally. For example, it would be pretty difficult to produce vacuum tubes now. But I doubt this is applicable to CMOS technologies. Most of the steps down to FinFETs (TSMC 16nm) are rather well known. Yes, we don't know the exact recipes of TSMC, Samsung or Intel, but it's not like alien technology. I read technical papers from all these fabs regularly and it's more open than what people would expect. Of course they keep their secrets too, for the cutting edge. There's so much know-how out there that it's quite probable we could get there again in a short time if TSMC vanished from earth all of a sudden.

          • insaneirish 2 days ago

            > For example, it would be pretty difficult to produce vacuum tubes now.

            Vacuum tubes are still made. They’re used extensively in instrument amplification.

            But I think this bolsters your point!

            • grues-dinner a day ago

              There's a guy who makes them in his garage. They're not really conceptually hard to make as such; they're just fiddly, delicate, labour-intensive and mostly replaced by astoundingly cheaper and often better (outside of a few niches) solid-state options.

              If there were some kind of interdiction on silicon (an evil genie or some kind of Butlerian Jihad perhaps?), the market would remember and/or rediscover thermionic emission and throw money/postapocalyptic bartered goods at glassblowers pretty sharpish.

              If that status continued, I'm sure we'd see developments in that space in terms of miniaturisation, robustness, efficiency, performance, etc., that would seem as improbable to us as a modern CPU would seem to someone in the no-silicon timeline. You may never get to "most of a teraflops in your pocket, runs all day on 4000mAh and costs three figures" but you could still do a meaningfully large amount of computation with valves.

              • NoMoreNicksLeft a day ago

                >There's a guy who makes them in his garage.

                Savant-tier, obsessive, dedicates his life to it "guy" does it in his garage over a period of how many years, and has succeeded to what point yet? Has he managed even a single little 8-bit or even 4-bit cpu? I'm cheering that guy on, you know, but he's hardly cranking out the next-gen GPUs.

                >the market would remember

                Markets don't remember squat. The market might try to re-discover, but this shit's path dependent. Re-discovery isn't guaranteed, and it's even less likely when a civilization that is desperate to have previously-manufacturable technology can't afford to dump trillions of dollars of research into it because it's also a poor civilization due to its inability to manufacture these things.

                • grues-dinner a day ago

                  You don't need trillions of dollars to start making tubes again. And it wouldn't be that one guy doing it for funsies, would it? If the question was "can one hobbyist bootstrap everything on his own" then I would agree. Maybe you completely lose even the insight that a small electric current can be used to switch or modulate a larger one. But if you're also losing mid-high-school physics knowledge, that's a different issue.

                  As I said, you probably won't ever get to where we are now with the technology, but then again probably 99.999% of computing power is wasted on gimmicks and inefficiency. Probably more these days. You could certainly run a vaguely modern society on only electromechanical and thermionic gear: you have power switching with things like thyratrons, obviously radios, and there were computers made that way, such as the Harwell WITCH in 1952.

                  Maybe you don't get 4K AI video generation or petabyte-scale advertising analytics but you could have quite a lot.

                  • fragmede a day ago

                    Looking at the Ryzen 7 9800X running at 5.2 GHz, if you chopped off 99.999% of that, you'd get a 52 kHz CPU, with 6.6 megaflops vs the original 6.6 gigaflops.

                    For reference, the original 4004 Intel CPU from 1971 ran at 740 kHz, so 52 kHz isn't even enough computing to do a secure TLS web connection without an excessively long wait. The 4004 did not do floating point, however, and it wouldn't be until between the 486 (1989) and the Pentium (1993) that we see 5-10 MFLOPS of performance.

                    • vardump 17 hours ago

                      > 6.6 megaflops vs the original 6.6 gigaflops.

                      Hmm... I think 9800X should be able to do at least 32 FLOPS per cycle per core. So 1.3 TFLOPS is the ceiling for the CPU. 1/100000 leaves you... 12 MFLOPS.

                      Then there's the iGPU for even more FLOPS.
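
                      A quick sketch of that back-of-the-envelope math (assuming 8 cores at 5.2 GHz and 32 FLOPS per cycle per core; rough figures, not a measured spec):

                        peak = 8 * 5.2e9 * 32        # cores x clock x FLOPS/cycle ~= 1.33e12 (1.3 TFLOPS)
                        print(peak * 1e-5 / 1e6)     # keep 0.001%: roughly 13 MFLOPS left
                        print(5.2e9 * 1e-5 / 1e3)    # same cut on the clock: 52 kHz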

                    • grues-dinner 17 hours ago

                      99.999 may be an ass-pull of a figure, but I was thinking more in terms of having whole datacentres screaming along doing crypto, billions of cat videos, Big Data on "this guy bought a dishwasher, give him more dishwasher adverts", spinning up a whole virtual server to compile a million line codebase on every change, and AI services for pictures of a chipmunk wearing sunglasses. There's a good chunk of computation that we as a society could just go without. I know of embedded systems that run at hundreds of MHz and could replaced by no CPU at all and still fulfill the main task to some extent. Because early models indeed used no CPU. Many fewer functions, but they still fundamentally worked.

                      Many things we now take for granted would indeed be impossible. I suppose the good news is that in some electropunk timeline where everyone had to use tubes, your TLS connection might not be practical, but the NSA datacentre would be even less practical. On the other hand, there'd be huge pressure on efficiency in code and hardware use. Just before transistorisation, amazing things were done with tubes or electromechanically, and if that had been at the forefront of research for the last 70 years, who knows what the state of the art would look like. Strowger switches would look like Duplo.

                      Probably there would still be a lot of physical paperwork, though!

                      • fragmede 13 hours ago

                        > 99.999 may be an ass-pull of a figure,

                        Comparisons to old technology is just something I do for fun, don't read too much into it. :)

                        Fun fact: a USB-C to HDMI dongle has more computing power than the computer that took us to the moon.

                        As far as the NSA being even less practical, they're among the few who have the staff that could eke every last cycle of performance out of what remained. Maybe the Utah datacenter wouldn't work, but Room 641A long predates that.

          • NoMoreNicksLeft 2 days ago

            It's not just about secrets... it's about how many techniques and processes simply aren't documented. There's no need (someone knows how, and is in the business of training new hires), no capacity (they're not exactly idle), and no perception that any of this is important (things have kept working so far).

            Could they eventually replicate a CMOS technology? No one doubts this, but the latest litho process took how many years to develop, and only one company makes those machines anywhere in the world? Nearly microscopic molten tin droplets being flattened mid-air so that they can radiate a particular wavelength of UV?

            That's not something they'll have up and running again in 6 months, and if it were lost, regression to other technologies would be difficult or impossible too. We might have to start from scratch, so to speak.

    • spencerflem 2 days ago

      Chips are made elsewhere too. At worst, we (civilization) would lose the cutting edge if that happened.

      It would be a sad thing but not as sad as everything else that would happen in a war.

      • PlunderBunny 2 days ago

        We would go back to brick phones, with gears stuck on the side. Awesome!

Vilian 2 days ago

Hopefully they forget about JavaScript too; that would be a good thing to forget.