Counterpoint: lots of software is relatively simple at its core, so perhaps we don't need nearly as many employed developers as we have today. Alternatively, we have far more developers today than before, so perhaps companies are only firing to re-hire at lower salaries.<p>Regarding the first hypothesis: for example, one person can make a basic social media site in a weekend. It'll be missing important things the big social media platforms have: 1) features (some of them small but difficult, like live video), 2) scalability, 3) reliability and security, and 4) non-technical aspects (promotion, moderation, legal, etc.). But 1) is optional; 2) is reduced if you use a managed service like AWS and throw enough compute at it, in which case perhaps you only need a few sysadmins; 3) is reduced to essentials (e.g. backups) if you accept frequent outages and leaks (immoral, but those things don't seem to impact revenue much); and 4) is neither reducible nor optional, but doesn't require developers.<p>I remember when the big tech companies of today were (at least advertised as) run by only a few developers. They were much smaller, but still global and handling $millions in revenue. Then they hired more developers, presumably to add more features and improve existing ones, to increase profit and avoid being out-competed. And I do believe those developers built features and improvements that generated more revenue than their salaries cost and kept the companies ahead of the competition. But at this point, would <i>more</i> developers generate enough additional features and improvements to offset their cost, and are they necessary to stay ahead of the competition? Moreover, if a company were to fire most of its developers, keeping just enough to maintain the existing systems, and direct resources elsewhere (e.g. marketing), would it make more profit and compete better?<p>Relatedly, everyone knows there are lots of products with needless complexity and "bullshit jobs". Exactly <i>how</i> much of that complexity is needless and how many of those jobs are useless is up for debate, and it may be less than we think, but it may not.<p>I'm confident the LLMs that exist today can't replace developers, and I wouldn't be surprised if they don't even "augment" them, i.e. if fewer developers plus LLMs can't maintain the same productivity. But perhaps many programmers are being fired because many programmers just aren't necessary, and AI is just a placebo.<p>Regarding the second hypothesis: at the same time, there are many more developers today than there were 10-20 years ago. That means even if most programmers <i>are</i> necessary, companies may be firing them to re-hire later at lower salaries. Despite the long explanation above, this may be the more likely scenario. Again, AI is just an excuse here, maybe not even an intentional one: companies fire developers because they <i>believe</i> AI can improve things, it doesn't, but then they're able to re-hire cheaper anyway.<p>(Granted, even if one or both of the above hypotheses are true, I don't think it's hopeless for software developers: I believe many developers will have to find other work, but it will be interesting work, perhaps even involving programming, just not the kind you learned in college, and at minimum involving reasoning, some of which you learn from development. The reason being that, while both are important to some extent, I believe "smart work" is generally far more important than "hard work".
Especially today, it seems most of society's problems aren't caused by a lack of resources, but by 1) the logistics of distributing the resources we have, and 2) problems that have nothing to do with resources at all, but with mental health (cultural disagreements, employer/employee disputes, social media toxicity, loneliness). Especially 2). Similarly to how people moved from manual labor to technical work, I think people will move away from technical work; not back to manual labor, but to something else, perhaps something social.)