This has a ton of holes:<p>> Z-Day + 15Yrs<p>> The “Internet” no longer exists as a single fabric. The privileged fall back to private peering or Sat links.<p>If you can't make CPUs and you can't keep the internet up, where are you going to get the equipment for enough "private peering or Sat links" for the privileged?<p>> Z-Day + 30Yrs<p>> Long-term storage has shifted completely to optical media. Only vintage compute survives at the consumer level.<p>You need CPUs to build optical media drives! If you can't build CPUs, you're not using optical media in 30 years.<p>> The large node sizes of old hardware make them extremely resistant to electromigration, Motorola 68000s have modeled gate wear beyond 10k years! Gameboys, Macintosh SEs, Commodore 64s resist the no new silicon future the best.<p>Some quick Googling shows the first IC was created in 1960 and the 68000 was released in 1979. That's 19 years. The first <i>transistor</i> was created in 1947, which is a 32-year span to the 68k. If people have the capacity and need to jump through hoops to keep old computers running to maintain a semblance of current-day technology, they're definitely f-ing going to be able to repeat <i>all</i> the R&D to build a 68k CPU in 30 years (and that's assuming you've destroyed <i>all</i> the literature and mind-wiped everyone with any knowledge of semiconductor manufacturing).
If humans forgot how to make new CPUs, it might finally be the incentive we need to make more efficient software. No more relying on faster chips to bail out lazy coding; apps would have to run lean. Picture programmers sweating over every byte like it's 1980 again.
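To make that concrete, here's a toy sketch (assuming CPython on a 64-bit machine; the exact numbers will vary) of how much headroom there is just from thinking about data layout:

    import sys
    from array import array

    n = 1_000_000
    boxed = list(range(n))         # a million ints, each a full Python object behind a pointer
    packed = array('i', range(n))  # the same values as packed 4-byte machine integers

    # The list figure below counts only the pointer array; the boxed int
    # objects it points at add roughly another 28 bytes apiece on top.
    print(sys.getsizeof(boxed))    # ~8 MB of pointers alone
    print(sys.getsizeof(packed))   # ~4 MB total

Same data, several times less memory, and that's before anyone even switches languages.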
There is a bit of an issue that almost all the know-how exists within a couple of private companies, and if the industry took a downturn (say, a crash of an AI bubble causing a multi-year lull), giant companies could fail and take that knowledge and scale with them. Some other business would presumably buy the facilities and hire the people, but maybe not. It's one of the problems with so much of science and engineering happening privately: we can't easily replicate the results.
There would be a great tragedy if that ever became a reality in the near future. The bigger question is: what if you forgot how to make the machines that make the CPUs? That is the bigger challenge to overcome in this crisis. Only one company specializes in this field and gives big companies like TSMC their ability to manufacture great CPUs. The trick is to create the machine that makes them and go from there. 10nm - 2nm capabilities.
The author’s a little bit too optimistic about the longevity of old consumer-market computers: having collected vintage compact Macs, you become keenly aware of all of the possible points of failure, like CRT displays, storage devices, and even fragile plastics. We may have to go back to much more analog forms of I/O: typewriter-style teletypes with decreasing levels of logic integration, random-access DECtape-style magnetic tape, etc.
I’m a little puzzled how “forgot how to make CPUs” also included “forgot how to make the mechanical parts of hard drives, how to make flash memory, and how to make other chips”. I guess I don’t think of a 74xx series chip as a “CPU”?
We're toast should we ever lose the ability to make CPUs.<p>Perhaps there should be more research into how to make small runs of chips cheaply and with simple inputs. That'd also be useful if we manage to colonize other planets.
> … no further silicon designs ever get manufactured<p>The problem wouldn’t be missing CPUs but infrastructure. Power would be the big one: generators, substations, those sorts of things. Then manufacturing, where a lot of chips go. Then there is all of healthcare.<p>Lots of important chips everywhere that aren’t CPUs.
A fun read, but I do find it a bit odd that in 30 years the author doesn't think we would have reverse-engineered making CPUs, or at least gotten as far as the mid-70s in terms of CPU production capabilities.<p>Also, the 10k-year lifespan for MC68000 processors seems suspect. As far as I can see, the 10,000-year figure is a general statement on the modelled failure of ICs from the 60s and 70s, not one specific to the MC68000 (which is at the tail end of that period). There are also plenty of ICs with known-poor lifespans (some MOS (the company, not the transistor structure) chips come to mind), though that doesn't reflect on the MC68000.
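For context, modelled lifetimes like that usually come from Black's equation for electromigration, which (roughly) puts the median time to failure at

    \mathrm{MTTF} = A \, J^{-n} \, \exp\!\left(\frac{E_a}{kT}\right)

where J is current density, E_a an activation energy, T temperature, and n is typically around 2. Old large-node parts run at comparatively low current densities, which is where the eye-popping modelled figures come from; it's a statement about the transistors wearing out, not about everything else that can fail first.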
So, taking this as the thought experiment it is, what I'm struck by is that seemingly most things would completely deteriorate in the first 10-15 years. Is that accurate? Would switches mostly fail by the 10-year mark if not replaced? I've been looking at buying a switch for my house; should I expect it not to last more than 10 years? I have a 10-year-old TV; should I expect it to start failing soon?
We'd still be able to make relays, and that's enough to do computing. If not that, then mechanical computer systems could be constructed to process data.<p>There's enough information on machine tools and the working of iron to make all the tooling and machinery required to start an assembly line somewhere.<p>After all, there was an assembly workshop turning out the Antikythera mechanism, and there was a user guide on it. Obviously it wasn't the only one produced at the time.
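As a toy sketch of that claim (Python standing in for relay wiring; every gate below is something a handful of relays can implement):

    # Relay-implementable primitives: a relay can switch AND/OR/NOT combinations.
    def AND(a, b): return a & b
    def OR(a, b):  return a | b
    def NOT(a):    return 1 - a

    def XOR(a, b):
        # XOR composed purely from relay-style gates
        return OR(AND(a, NOT(b)), AND(NOT(a), b))

    def full_adder(a, b, cin):
        # One bit of binary addition: sum bit and carry-out
        s = XOR(XOR(a, b), cin)
        cout = OR(AND(a, b), AND(cin, XOR(a, b)))
        return s, cout

    def add(x, y, width=8):
        # Ripple-carry adder: chain the one-bit adder across the word
        carry, result = 0, 0
        for i in range(width):
            s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
            result |= s << i
        return result

    print(add(42, 27))  # 69

Slow and clunky, but the Zuse Z3 was built from roughly 2,600 relays, so the path from this to a working programmable machine is well trodden.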
Honestly, probably not much would happen.<p>My daily driver laptop is a 2012 Thinkpad I literally pulled out of a scrap heap at my local university, but it refuses to die. Moore's law has slowed enough that old hardware is slow but still perfectly capable of running 99% of existing software.<p>Already-existing machines would give us at least one or two decades to restart manufacturing from zero, and that is more than enough time to avoid existential problems.<p>And most computers are under-utilized. The average gaming PC is powerful enough to run the infrastructure for a bunch of small companies if put to work as a server.
It’s a bit like trying to censor an LLM: to delete a piece of information as interconnected as “everything about making CPUs”, you have to alter the LLM so significantly that you lobotomize it.<p>CPUs exist at the center of such a deeply connected mesh of other technologies that the knowledge could be recreated (if needed) from looking at the surrounding tech. All the compiled code out there as sequences of instructions; all the documentation of what instructions do, of pipelining; all the lithography guides and die shots on rando blogs; info in books still sitting on shelves in public libraries... I mean, come on.<p>Each to their own!
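Case in point: a few stray bytes of surviving machine code plus the publicly documented ISA manuals already give back the instruction stream. A toy illustration (using the third-party capstone disassembler; the bytes are just a hand-picked example):

    from capstone import Cs, CS_ARCH_X86, CS_MODE_64

    # A few raw x86-64 bytes as they might be found in any surviving binary:
    # mov rax, rdi / add rax, rsi / ret
    code = bytes.fromhex("4889f8" "4801f0" "c3")

    md = Cs(CS_ARCH_X86, CS_MODE_64)
    for insn in md.disasm(code, 0x1000):
        print(f"{insn.address:#x}  {insn.mnemonic} {insn.op_str}")

Every binary sitting on a disused drive is, in effect, a worked example of what the instruction set has to do.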
Thankfully, the way capitalism works, we would quickly reinvent and remake them, and the companies that did so would make a decent profit.<p>Generally, the true problems in life aren't forgetting how to manufacture products that are key to human life.
This doesn't make a ton of sense to me. In what situation would everyone lose the ability to make any CPU, worldwide, and we don't have a much, much bigger problem than how to run AWS?