Here is my take: I think we will go really hard on this idea of using AI to generate source code. And the reason isn't that I believe the quality will increase exponentially in the next 10 years.

The reason is that it will allow companies to increase their output. Think of it like CISC vs. RISC processors: on one hand we have experienced developers creating really good software, while on the other hand we have inexperienced developers using AI to create mostly shitty code.

The former is what you want, but it is expensive and really hard to find. The latter is cheap and easy to find. And that is why I think a lot of companies will come to the conclusion that hiring a lot of inexperienced developers makes financial sense.

Of course this is not sustainable in the long run, but I think people will need 5-10 years to really understand that. And when this finally happens we will probably see a new "Agile manifesto" to help guide us in this new technology landscape.

And probably the worst part is that until we mature our understanding of good practices for using AI to write code, we will create some of the worst code we will ever see. Picture this: fast inverse square root[0] (see the snippet below), but with bad performance, misleading comments, and some subtle bugs that are really hard to debug. And don't forget about the tests. We will have LOADS of tests, but most of them will be meaningless, and they will make it almost impossible to do any meaningful refactoring without throwing tens of thousands of lines of code away.

Do you think I'm being overly pessimistic? What are your predictions?

[0] - https://en.wikipedia.org/wiki/Fast_inverse_square_root
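For reference, here is the snippet [0] points to, in a lightly modernized form (memcpy instead of the original's pointer-cast type punning, and a fixed-width integer type); the comments are my own paraphrase, not the originals:

    #include <stdint.h>
    #include <string.h>

    /* Quake III's fast inverse square root: approximates 1/sqrt(x). */
    float q_rsqrt(float number)
    {
        const float threehalfs = 1.5F;
        float x2 = number * 0.5F;
        float y  = number;
        int32_t i;

        memcpy(&i, &y, sizeof i);            /* reinterpret the float's bits as an integer */
        i = 0x5f3759df - (i >> 1);           /* the magic constant yields a rough first guess */
        memcpy(&y, &i, sizeof y);            /* back to float */
        y = y * (threehalfs - (x2 * y * y)); /* one Newton-Raphson step to refine the guess */
        return y;
    }

The point being: this code is famously fast and famously unreadable, and the worry above is that AI-generated code will give us the unreadability without the speed.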
> Think of it like CISC vs. RISC processors: on one hand we have experienced developers creating really good software, while on the other hand we have inexperienced developers using AI to create mostly shitty code.

I'm not sure I get your analogy here, unless one side of it should be reversed, i.e. CISC has been hanging on for the last 30 years only because of all the shitty, non-portable code written for it, and layer upon layer of hardware hacks.
The cost of a line of code will continue to decline, which means we'll see a revolution in software in the coming years. It also means that coders, meaning those who only write code as assigned, will see their jobs in jeopardy if they don't continue to improve their skills. Learning new languages will not cut it. They need to look at the employment environment and adjust as needed.
My take:

Any prediction about technology 10 years into the future will be proven incorrect within one to two years of the date of the prediction, thus making such predictions not a good use of one's time or electrons.