Speaking as someone who has been responsible for "turning the lights back on" to fix problems with "fully automated," "lights-out" factory lines, much of this paper still rings true forty years on, if nothing else as a check against our engineering hubris. It remains tremendously difficult to entirely quash the long tail of things that can go wrong in a factory.

That said, many of the contentions raised here really have been substantially resolved by increased computing efficiency and ubiquitous connectivity. The touted expert human operator's ability to see and understand processes from a high level, informed by years of observing (and hearing, and "feeling") machine behavior, has truly been eclipsed by an advanced machine's capacity to collect increasingly granular snapshots of its complete operating state (the temperatures, vibrations, positions, and other sensations of its various organs and elements) every few milliseconds, hold on to that data indefinitely, and correlate and interpret it in ever-expanding radii of causation.

The best human operators (of any technology) not only respond to problems; they anticipate and prevent or plan around them. Massive data, advanced physics-based simulations, and "digital twinning" of manufacturing equipment afford pre-emptive testing of virtually limitless scenarios.

Not only can you simulate throwing a wrench in the works; you can simulate the effect of the wrench entering the works at every possible angle!

It's not infallible, and it will for a long time still require a human in the loop at some level, but as the author rightly put it near the end of the paper:

"It would be rash to claim it as an irony that the aim of aiding human limited capacity has pushed computing to the limit of its capacity, as technology has a way of catching up with such remarks."