><i>We’re moving to a future where AI can conduct arbitrary information-processing tasks based on natural-language instructions from any reasonably intelligent human who understands the problem they’re trying to solve.</i><p>AKA programming. But compared to my current efforts to instruct my compiler with very specific instructions (nothing like ambiguous natural language), I find that although I consider myself reasonably intelligent, it can take me a long time to fully understand any non-trivial problem I'm trying to solve, and during that time AI "help" often sends me on a wild goose chase (because the "I" in AI is still very much absent).<p>Overall, AI may help me more than it hinders, but I think the claimed "future" end of the software industry might be a long way off. Current AI gives me the impression that the industry growth needed to fix AI errors could be as large as the industry shrinkage due to AI assistance.