I’m just in the middle of re-reading Simon Wardley’s collected blog posts on Wardley Mapping (via the softcover book).<p>My recency bias aside, it’s uncanny how much his concepts infuse this Economist article.<p>Almost every paragraph echoes a concept from Wardley’s writings: diffusion versus evolution, inertia, co-evolution of practices and capabilities, capital flows, initial innovation versus refinement of an idea, and, with hindsight, eventual ubiquity.<p>I greatly enjoyed the serendipity of this article appearing alongside my holiday reading.<p>One point missing from the article is the increased speed of diffusion via modern communication, and the relatively evolved state of compute and the other underlying infrastructure AI requires.<p>One could map the user needs of farms and farmers against those of today’s knowledge enterprises, alongside the underlying infrastructure required to deploy tractors and AI, and draw some conclusions.
I can't read the entire article but I can guess its meaning. My grandfather (born in 1901) used to go out to western Canada each fall for the harvest. Tractors then became affordable and fewer people were needed, until it was mainly just tractors and a few people. Generative AI is sort of a tractor that still needs people, but not as many. The former farm workers had to find other work, as will the people AI displaces. The tool is only as good as the information available to it, and how that mess is filtered and made into something valid.<p>A few years ago in college we discussed AI, and the instructor showed a video about AI, or rather AGI. The point was that superintelligent AI is a far-off problem; the nearer-term problem for workers is AGI that's just an average Joe. You could have a million instances of Joe AGI working 24/7/365 doing call center work, programming, reservations, any job not requiring physical contact.
Really interesting to read about the history of the tractor, but the relevance to AI is a short, almost throwaway paragraph at the end of the article.<p>One massive difference between tractors and AI, not addressed in TFA, is that unlike tractors, using AI doesn't impose a massive upfront cost on adopters. Many of the major software platforms are rolling out AI without even being asked to. People using Photoshop, MS Word, etc., suddenly find AI-based functionality appearing in their next product release. This is radically faster than the decades it took for tractors to achieve dominance.
The tractor, like the horse, is <i>capital equipment</i>. Also hardware: material stuff. While the horse may to some degree represent a variable cost, the variable cost of a tractor amounts to basically fuel.<p>AI and LLMs, on the other hand, are <i>operating equipment</i> and a fully variable cost. Also software, and immaterial.<p>The only real similarities between a tractor and an LLM are that both were at some point new inventions, and that both make some tasks easier to do at scale. That's all, seen from an economic perspective. In all other respects they are almost incommensurate.<p>Why is it that people think the mere notion of "AI" means you can just argue nonsense and forget all previous knowledge? I don't get it.<p>I think economist.com needs to take an Economics 101 lesson.
That this article could easily have been written by the same AI it claims won’t have fast, profound impacts on the world is all the evidence I need that the article is wrong.