Not in the slightest.<p>Generating a snippet of code, or even entire blocks, is not a good metric for judging the quality of LLMs.<p>Rather, we should pay attention to how good an LLM is at altering, editing, and maintaining code. In practice, it's quite poor. And not poor in the way that a new developer is poor, but in weird, uncertain, and unpredictable ways, or in ways where it fixes something in one prompt only to break it in the next.<p>Engineers spend little time writing code compared to the amount of time they spend thinking, designing, and building systems for that code to operate in. LLMs are terrible at systems.<p>"Systems" implies layers, and layers imply constraints, and constraints imply specificity, and specificity is not a strong suit of an LLM.<p>Most code is not written once and never touched again. Instead, it passes through many layers (automated and manual) to reach some arbitrary level of correctness across many vectors, in many contexts, some digital and some not. LLMs are passable at solving problems of one, maybe two, layers of complexity, as long as all the context is digitized first.<p>Lastly, not all problems have a technical solution. In fact, creating a technical solution for a non-technical problem just so we can apply an LLM to it will only add layers of unnecessary abstraction that lock out "normal" people from participating in the solution.<p>And we may not like to admit it, but we engineers often have jobs we like, with pay that's pretty good, only because there are a whole lot of non-programmers out there solving the non-technical problems that we don't want to do.
Yeah the writing is on the wall. I'm a senior AI programmer for an AAA games studio that you probably know (I worked on a very famous RTS game). I have reduced the code I write by about 90% since ChatGPT 4 was released. My colleagues have also reduced their coding time similarly. This technology is going to remove all the toil and any need to hire/communicate with junior devs.
I imagine it will be the same for other seniors in the programming community. An industry that's just seniors and LLMs within 4 years is likely, if not sooner, 2 years tbh, and if that slowly transitions to just LLMs and a small team of code reviewers at each site, that would be ideal. Programmers in my area (AI) have a lot of domain knowledge and specialization; we just use code for implementation.<p>So we don't care if LLMs replace coders by auto-generating code. All the better for us. The people who are trying to support families by offering to write CRUD or maintain codebases for the world are doomed... they are going to end up as the homeless guy on the street holding a sign saying "will code html for food"
Not too worried. The direction I get is nowhere near enough to create anything that works with an LLM. I get told “make X”, and any questions I have are ignored or can’t be answered by the person making the request. I end up making hundreds of design choices along the way that an LLM isn’t going to make. It’s just going to spit out a garbage result based on the garbage input. I don’t ever see an LLM having enough information about the internal workings of my company or its unspoken wants and needs, or feeling the need to recognize and compensate for that astounding lack of information to avoid an angry boss.
I’m not at all worried about LLMs replacing even low-level engineers. We’ve been using Copilot and ChatGPT for a while at work now, and the most useful applications we’ve found are more analogous to a compiler than a developer. For example, we’ve had good luck using it to help port a bunch of code between large APIs. The process still involves a lot of human work to fix bugs in its output, but a project we estimated at 6 months took three weeks.<p>On the other hand, as someone whose role gives me visibility into the way senior leaders at the company think about what AI will be able to do, I’m absolutely terrified that they’re going to detonate the company by massively over-investing in AI before it’s proven and by forcing everyone to distort their roadmaps around some truly unhinged claims about what AI is going to do in the future.<p>CEOs and senior corporate leaders don’t understand what this technology is and have always dreamed of a world where they didn’t need engineers (or anyone else who actually knows how to make stuff) but instead could turn the whole company into a big “Done” button that just pops out real versions of their buzzword-filled fever dreams. This makes them the worst possible rubes for some of the AI over-promising, and eager to make up their own!<p>Between this and the really crazy over-valuations we’re already seeing in companies like Nvidia, I’m seeing the risk of a truly catastrophic 2000- or 2008-style economic crash rising rapidly and am starting to prepare myself for that scenario in the next 2-5 years.
The less experienced and less mature you are as a software engineer, the more fearful you are.<p>You can literally estimate the proficiency and experience level here in the comments.<p>People who have seen some serious sh*t probably won't even bother answering the question.
Not one bit. Writing code isn't my job, it is just the tool I use to do my job. The actual job is to provide systems that solve problems. Even if a new tool comes along that can write the code, it does not change my job.
Not at all. I do worry, though, that people will not try as hard to learn the fundamentals. I’m certainly somewhat guilty of this, having come up in the Search era; I fear it’ll be easier to succumb to it in the AI era.
While I agree that LLMs can be useful for coders, and productivity can be increased, coding blindly based on ChatGPT code without understanding concepts could result in very serious bugs.
How can you be sure that the code doesn't have vulnerabilities, for example, or that it doesn't solve things in a non-idiomatic way, if you just copy-paste code without understanding it?<p>As long as an understanding of the domain and the problem is required, a skilled person is required between the codebase and the LLM.<p>Even if we get to a point in the future where a programmer can be replaced by an LLM, there will still be businesses that want someone to use the LLMs to create software.
I don't think that's gonna happen... unless I don't keep learning. Let me explain:<p>I started my career over a decade ago. I knew PHP and MySQL only. Then I moved onto other companies, learnt other stacks (frontend, node, postgres, cloud, go, python, datadog, ci/cd, docker, k8s, aws, etc.). Nowadays it's not hard for me to find a new job. But what would have happened if I stayed with PHP and MySQL and never learnt anything else? I would be jobless.<p>So, I'm not afraid of LLMs replacing me. I think they will open the door to more jobs in IT, actually.<p>I just need to keep learning (and AI has nothing to do with this).
I just tried to build a C program for an Arduino to display some text on a screen.<p>The output needed a lot of work to get it working. That’s not to say that it couldn’t have been coerced into being right through prompts, but it’s a reminder that LLMs are not thinking. You still need someone who understands not only what is wanted, but why it looks the way it does.<p>I suspect that most of this stuff will end up like syntactic sugar: something that makes our work easier, but doesn’t fundamentally replace us.
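For concreteness, "display some text on a screen" means something on the order of the sketch below. This is a minimal hand-written version, assuming a common 128x64 SSD1306 OLED over I2C and the Adafruit_SSD1306 library; the specific display, I2C address, and wiring here are illustrative assumptions, not a definitive implementation:<pre><code>#include <Wire.h>
#include <Adafruit_GFX.h>
#include <Adafruit_SSD1306.h>

// 128x64 I2C OLED, no dedicated reset pin (-1).
Adafruit_SSD1306 display(128, 64, &Wire, -1);

void setup() {
  // 0x3C is the usual I2C address for these modules.
  if (!display.begin(SSD1306_SWITCHCAPVCC, 0x3C)) {
    for (;;);  // Halt if the display isn't found.
  }
  display.clearDisplay();
  display.setTextSize(1);
  display.setTextColor(SSD1306_WHITE);
  display.setCursor(0, 0);
  display.println(F("Hello from Arduino"));
  display.display();  // Push the buffer to the panel.
}

void loop() {}
</code></pre>Even a toy like this is full of hardware-specific details (the I2C address, the init flags, remembering to flush the buffer), which is exactly the kind of context a model guesses at rather than knows.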
Not too much. I mostly do security work, and a lot of it can be automated, but I still don't believe LLMs have all the ideas. The same goes for system design and picking the right parts, even if the parts are ready-made or can be produced by an LLM.<p>In the end we'll need humans as a final sanity check, or for the overall design, for a good while. Or just to decide the right requirements.
Pair programming with AI is already great. What I fear is it weeding out experts, with people deploying AI versions of themselves as offshoots. That seems far away, given the depth of the business logic and the limited data available in those niches.
Labor replacement has been happening throughout human history, but I am not worried, not because I don't believe in LLMs' capabilities in 10-15 years, but because I believe I can always find some area that LLMs are not good at.
I am.<p>I think with enough scale and more organized and structured use of them, they (or another soon-to-come AI breakthrough) will outdo a human mind.
You are naive if you think you have 10-15 years. GPT-5 will most likely be out by the end of the year. It will be significantly better than GPT-4. I expect it will replace millions of people - not explicitly - companies will gradually have fewer people doing more work, resulting in increasing layoffs and decreasing hiring. This has already started happening: I spoke to several startup founders recently who use GPT-4 instead of hiring marketing people.<p>GPT-5 will be significantly better at coding, to the point where it might no longer make any sense to hire junior developers.<p>And this is just GPT-5, this year. Next year there will be GPT-6, or an equivalent from Google or Anthropic, and at that point I fully expect a lot of people everywhere to get the boot. Sometime next year I expect these powerful models will start effectively controlling robots, and that will start the automation of a lot of physical work.<p>So, to summarize, you have at best 2 years left as a software engineer. After that we can hope there will be some new types of professions that we could pivot to, but I’m struggling to think what people could possibly do better than GPT-6, so I’m not optimistic. I’d love for someone to provide a convincing argument why there would be any delay to the timeline I outlined above.<p>p.s. I just looked at the other 20 responses in this thread, and it seems that every single one is based on current (GPT-4) LLM capabilities. Do people seriously not see any progress happening in the near future? Why? I’m utterly baffled by this.
I'm more worried about the web becoming unusable due to AI/ML spam. Already there is a huge amount of LLM-generated bullshit content on the web, and it will only get worse. And then other LLMs might train on that BS.
Not worried. LLMs are just another tool that you can use to be more productive. How often and to what extent you will use LLMs differ from person to person. I think LLMs can help you but not replace you.
Very worried. The BLS edited their stats, projecting the number of software engineers will decrease by 11% in the next ten years:<p><a href="https://www.bls.gov/ooh/computer-and-information-technology/computer-programmers.htm" rel="nofollow">https://www.bls.gov/ooh/computer-and-information-technology/...</a><p>This is a pretty drastic change from their earlier prediction (often cited all over the internet) that the number of software engineers will increase by more than 30% in the next 10 years. It's not the full replacement of software engineers I'm worried about, so much as the steep reduction in the number of jobs and the labor/wage pressure that will make this job pay a fraction of what it's paying now, and make everyone's livelihoods more precarious in the next 10 to 15 years.<p>Karpathy already stated in 2017 that "Gradient Descent writes better code than you", when he wrote about "Software 2.0" as feeding data to neural networks:
<a href="https://karpathy.medium.com/software-2-0-a64152b37c35" rel="nofollow">https://karpathy.medium.com/software-2-0-a64152b37c35</a>
Nvidia's CEO, Jensen Huang, seemed to confirm that point this week by urging parents not to encourage their kids to learn to code.<p>Today, this YT video by a dev named Will Iverson about how software engineering jobs are not coming back made me really anxious and got me worrying about backup career plans in case I need to transition in my late thirties / early forties. (That sounds sooo hard... I'm a recently laid-off mid-level full stack engineer of seven years, but I wonder if it would be better to transition now while I'm younger. Why wait 10 to 15 years to become increasingly obsolete or more stressed about being laid off? How can I support a family like that? Or make any plans for the future that might impact other people I'm responsible for?)
<a href="https://www.youtube.com/watch?v=6JX5ZO19hiE&t=3s" rel="nofollow">https://www.youtube.com/watch?v=6JX5ZO19hiE&t=3s</a><p>I don't think the industry will ever really be the same again. But I'm sure a lot of us will adapt. Some of us won't, and will probably have to switch careers. I always thought I could at least make it to retirement in this profession, by continually learning a few new skills each year as new tech frameworks emerge but the fundamentals stay the same -- now I'm not so sure.<p>If you think I'm wrong, can you please help me not be anxious? Older devs, how have you managed to ride out all the changes in the industry over the last few decades? Does this wave of AI innovations feel different than earlier boom-bust cycles like the DotCom Bubble, or more of the same?<p>What advice would you give to junior or mid-level software engineers, or college grads trying to break into the industry right now, who have been failing completely at getting a foot in the door in the last 12 months, when they would have been considered good hires just two or three years before?