I've had a couple of conversations about whether AI can truly take over programming jobs (notice I deliberately did not say Software Engineering), and I tend to refute this by saying that if that were the case (still a maybe), we should be seeing some AI programming languages springing up, no? My hypothesis is that these would be more efficient, memory safe, performant, etc. (maybe even more "readable", whatever that means for an AI programming language).
Surprisingly enough, it transpires that the actually difficult part of programming isn't the languages.

It is learning to express what it is you want, clearly and unambiguously enough that there is as little room for error as possible; and even before then, learning to understand for yourself, inside your own head, what it is you want in sufficient detail that you can verbalise it, and maybe reason through how it might be achieved and what the possible outcomes, whether good or bad, might be.

Tooling can mainly help here by getting out of our way, so that we are free to focus on thinking about the actually important parts of what we want to happen without distracting details of how specifically it might be implemented, such as the memory safety you mention.

However, with all those details gone, the actually hard problem of bringing your creation from your head into the outside world still remains. It predates computers, and is still present even if you remove computers from the equation entirely: it is why, e.g., lawyers as a profession exist. It arises not from the technology or area of activity, but from the nature of our own minds.
What, you’re saying Rust is an AI programming language?

I don’t understand why AIs wouldn’t use the languages humans already use to program.

Or do you mean that if an AI designed a programming language, it would be better than anything humans have come up with? Why? I’d expect it would be more like some unholy mashup of C, JavaScript and Python.