Yawn. And currencies are dead because of crypto. And factories are dead because we all have 3D printers on our desks.<p>If you've had hands-on experience coding with ChatGPT (or 3D printing), you'll know it has huge blind spots and limitations. It's impressive, but it's a long way from fizzbuzz and fibonacci to comprehending and iteratively improving a large codebase, running all the related tools, operating and testing a GUI as a user would, etc.<p>Excited to see what happens, but I wish people would delve into the details a bit more instead of making these crazy extrapolations based on first impressions.
> When most code is machine-generated, how much you like a language simply doesn’t matter. Other factors will be more important when choosing what to use. Performance, tooling, and knowledge of how to operate it at scale will all be more important than the language itself.<p>But the popularity of the language <i>does</i> matter. "Write <i>function</i> in Python with numpy" is going to work way better than "Write <i>function</i> in SYCL" because there is orders of magnitude more example code to train on, and LLM usage is only going to widen that gap.
Just as it is almost impossible for a human to write secure code in C, the same will be true for ChatGPT: it thinks more like a human than a machine and was essentially trained to make mistakes the way a human would. The LLM isn't some kind of god; and, frankly, <i>if it were</i>, that would be a Terminator-level problem to deal with, not some kind of panacea.
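To make that concrete, here is the kind of textbook C bug that fills decades of tutorials and real-world code, next to the safer spelling. It's a hand-written sketch to illustrate the point, not output from any particular model, and the greet/greet_safe names are just mine:<p>

    #include <stdio.h>
    #include <string.h>

    /* Illustrative only. Copies a caller-supplied name into a
       fixed-size stack buffer. strcpy() does no bounds checking,
       so any name of 16+ bytes overflows buf -- the classic
       human mistake the training data is full of. */
    void greet(const char *name) {
        char buf[16];
        strcpy(buf, name);              /* overflows if strlen(name) >= 16 */
        printf("Hello, %s\n", buf);
    }

    /* The safer spelling truncates instead of overflowing. */
    void greet_safe(const char *name) {
        char buf[16];
        snprintf(buf, sizeof buf, "%s", name);
        printf("Hello, %s\n", buf);
    }

    int main(void) {
        greet_safe("world");
        greet_safe("a deliberately over-long name that would have smashed the stack");
        return 0;
    }

A model that has mostly seen the first version has no particular reason to produce the second unless you already know to ask for it.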
Regardless of who or what writes it, the readability and ease of understanding of code doesn't become irrelevant. If 50% of code is being generated by something else, it seems like it becomes even more important.
The writers show they have no credibility; both what they say and what they specifically don't say are incredibly stupid. It's very malign and manipulative.