I'd like to talk about the claim that high-skill work is less likely to be automated than low-skill work. The author says:

> On almost all points, Baugues misses the mark. First, there is a qualitative difference between an auto worker (unskilled manual work) and a software developer (skilled knowledge work) that made automating the former inevitable.

As a rebuttal, I would point to some passages from Martin Ford's book on automation, The Lights in the Tunnel:

> A common misconception about automation is the idea that it will primarily impact low paying jobs that require few skills or training. To illustrate that this is not necessarily the case, consider two very different occupations: a radiologist and a housekeeper....

> In fact, we can reasonably say that software jobs (or knowledge worker jobs) are typically high paying jobs. This creates a very strong incentive for businesses to offshore and, when possible, automate these jobs....

> As a result, we can expect that, in the future, automation will fall heavily on knowledge workers and in particular on highly paid workers....

In general, I think knowledge jobs are at greater risk of automation than manual labor jobs. Robots are expensive, and they don't scale as well as software. It's no coincidence that the largest software companies dwarf the largest robot companies. Because software scales, it becomes ridiculously cheap once you deploy it to millions or billions of people.

I agree with your overall point that creativity and design are hard to automate. However, software jobs are being automated and will continue to be. What matters in the end is whether automation increases or decreases the overall demand for development work, i.e., whether software automation is a complement to labor or a substitute for it. So far, it's been a complement: software productivity has skyrocketed over the last couple of decades while developer employment has kept growing. This trend seems likely to continue, but who knows what the future holds 100 years from now.

P.S. Another industry that evolved similarly is agriculture. At first, tools and draft animals made farmers more productive, and farm employment rose. Eventually, though, productivity rose so high that farmers began to saturate demand. Today, only about 2% of Americans are farmers, even though today's farmers are the most productive in history. So what's the moral here? What can software learn from farming? I think the moral is that the quantity of labor employed is a function of both the SUPPLY of labor and the DEMAND for labor, and you can't ignore either one when you predict the future.

P.P.S. You probably realize that the weakness of the agriculture comparison is that the hunger for food is far more easily sated than the hunger for software. The first has a biological limit, whereas the second is ostensibly unlimited. Therefore, you might reason, higher productivity in software is less likely to saturate demand and reduce employment. However, I would caution that some software demands might be easily saturated. Consider a company that wants its own website and app. If websites and apps become 100x cheaper (as a result of automation/higher productivity), would businesses demand 100x as many? Probably not. And that's the crux: whether total spending on developers rises or falls comes down to whether demand grows faster or slower than the price drops (see the sketch at the end of this comment).

P.P.P.S. Not all manual labor is created equal. Repetitive, indoor labor is much easier to automate than non-repetitive or outdoor labor.
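To put some illustrative numbers on the P.P.S., here's a minimal sketch of the elasticity arithmetic. Everything in it (the $100k baseline price, the specific elasticity values, the constant-elasticity demand curve) is an assumption I've made up for illustration, not data from anywhere:

    # Toy model: constant-elasticity demand. All numbers are invented for
    # illustration. "elasticity" is the assumed price elasticity of demand.
    def total_spending(base_price, base_qty, price_drop, elasticity):
        # When price falls by a factor of `price_drop`, quantity demanded
        # grows by a factor of price_drop ** elasticity.
        new_price = base_price / price_drop
        new_qty = base_qty * price_drop ** elasticity
        return new_price * new_qty

    base = total_spending(100_000, 1, 1, 1.0)         # 100,000 (baseline)
    elastic = total_spending(100_000, 1, 100, 1.5)    # 1,000,000: 100x cheaper,
                                                      # ~1000x quantity, UP 10x
    saturated = total_spending(100_000, 1, 100, 0.3)  # ~3,981: 100x cheaper,
                                                      # only ~4x quantity, DOWN ~25x

The crossover is elasticity = 1: if a 100x price drop pulls in more than 100x the quantity, total spending on development (a rough proxy for developer employment) goes up; if demand saturates, as in the one-website-per-company example, it goes down.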