The potential shown by LLMs recently is just mind blowing. Right now they can only code basic stuff, but if the rate of progress continues as it is, I won't be shocked if they can handle genuinely complex coding tasks in the near future.<p>It might take some time for companies to trust and adopt such agentic solutions, but once the agents are packaged as stable, scalable products with tremendous cost cutting, they will spread like wildfire.<p>Given this, are programmers' jobs in jeopardy?
Some programmers will lose their jobs to LLMs. Those who lack confidence and give up because they saw a rigged Devin demo, for example. And those who can't actually deliver value now, because CoPilot can copy/paste from StackOverflow better.<p>If you think of your job as just writing more code then you might get pushed aside. If you think of your job as solving problems and adding business value, working with other people and thinking things through, I expect you will survive the LLM hype train.<p>Right now companies pay a small fraction of the true cost of LLMs. We're getting that first hit free to get us hooked. Once companies have to pay actual costs LLMs might not present an attractive alternative. And the performance of LLMs has started to plateau -- no reason to think the "rate of progress" will continue. And many reasons to think the rate and the progress get over-hyped and exaggerated.<p>I've survived several no-code and outsourcing panics in the business already. Do you think the managers who can't reliably reply to an email or sum a column in Excel will just turn into "prompt engineers" cranking out working code?
No.<p>To be fair there are many, so very many, people employed to write software who cannot actually write software. Many of these people are likely already replaceable by LLMs if all they do is modify static templates and copy/paste programming patterns.<p>For developers who actually write original logic and solve real problems, your jobs are safe. LLMs are only as good as what they are fed.
<p><pre><code> I won't be shocked if they can handle genuinely complex coding tasks in the near future.
</code></pre>
What about something that could automatically convert code written in one language into another? <a href="https://en.wikipedia.org/wiki/Source-to-source_compiler" rel="nofollow">https://en.wikipedia.org/wiki/Source-to-source_compiler</a><p>Or an abstraction that permits non-specialists to actually interact directly with computers? <a href="https://en.wikipedia.org/wiki/General-purpose_programming_language" rel="nofollow">https://en.wikipedia.org/wiki/General-purpose_programming_la...</a><p>Or maybe I can simply describe my problem visually, and let the computer generate code? <a href="https://en.wikipedia.org/wiki/Visual_programming_language" rel="nofollow">https://en.wikipedia.org/wiki/Visual_programming_language</a><p>The point is, this isn't the first time the job of the programmer has been automated, and it won't be the last. Software engineering is the discipline of abstractions - we are good at automating ourselves. The arrival of high-quality LLMs that generate code is merely a new wave of automation.<p>Which is gonna be better, the layman who wants to generate code with AI, or the senior engineer who wants to generate code with AI? The former may be cheaper, but that's been true since the dawn of computing, and not just when LLMs became available for the masses.
The European Union has just made producers of software liable for bugs, as has long been the case for other products, e.g. electronics.<p>This means it would be quite risky for companies to rely on only a few junior developers who can only copy &amp; paste from ChatGPT.<p>But LLMs are useful for initial boilerplate generation if one is careful to vet the result (in particular, error handling is often incomplete or missing, because the models are trained on Internet tutorials rather than robust production software).
Calculators can do math, per the CEO of Texas Instruments. (h/t @naval)<p>More seriously, I don't believe the current progress will expand beyond fields where there are clearly defined rewards (e.g. the narrowly defined task of producing some outputs given well-defined inputs).
I think you have it backwards.<p>Companies trust AI and are adopting it now.<p>Though it might take some time for AI to be developed that can live up to those expectations.
<i>The potential shown by Outsourcing recently is just mind blowing....<p>It might take some time for companies to trust and adopt outsourcing teams, but once the teams are brought up to speed with tremendous cost cutting, they will spread like wildfire.</i>