There are now prominent people in tech suggesting our next generation shouldn't learn coding because that'll be done by AI. What programming jobs do you expect will be less affected?
Have you ever seen anyone build a software project end-to-end using only "AI" to write code? It can be done and the process is illuminating, specifically when you look at what the AI actually contributes.<p>At the very minimum, you need a human operator who can a) work with stakeholders to gather requirements and express them as unambiguous prompts, b) debug code that they themselves didn't write, and c) build a coherent architecture for integrating, packaging, and deploying the code IRL.<p>In a very tangible sense, it's similar to outsourcing the coding to a consultant who is fast but sloppy and hard to communicate with. They may be quick, but it takes considerable effort to clean up after them. No manager would ever put up with a human employee who "behaved" like an LLM (fast delivery with no quality control, outright lies and hallucinations, constantly ignoring subtle aspects of the instructions). LLMs are cheap and fast, but you must compensate heavily for their faults.<p>Think about the core skills that we identified as table stakes for working effectively with AI: sussing out requirements, exceptional debugging, and a solid sense of software architecture. I think you'll find those are the core skills of most good software developers! AI merely raises the bar for software professionalism - it's no longer enough to "just write code"; you need engineering leadership and project management skills to tie it all together. Those skills are now in <i>higher</i> demand.<p>The only programming jobs that need to be worried are the mindless code-monkey positions. If your only contribution is the code you type in your editor, yeah, AI's taking your job tomorrow. As we flood the market with "code", consider that its value trends toward zero. And since code is but one of the necessary ingredients in making a software system, those other factors rise in value.
I don't think any common software engineering roles will be disrupted by AI.<p>I would speculate that a fair number of those prominent people are talking their book and have financial incentives for others to believe it's true and spend money on their AI products.
In the short term, managers and companies are firing because they don't understand. Long term, they will hire people back.
(don't let them forget, don't work for less than you're worth, unionize where possible)
> There are now prominent people in tech suggesting our next generation shouldn't learn coding because that'll be done by AI.<p>If people listen to that advice, then we all have nothing to worry about.
The main problem with AI-gen solutions is validating the solutions. Across all industries, the most automatable jobs are the ones where the correctness of the AI's solution is most easily validated, and the situation is <i>especially</i> acute where the validation itself can either be done automatically (e.g. CI/CD) or outsourced to Mechanical Turk or Upwork.<p>So: anything that is impractical to validate in an automatic way, or, better yet, impractical to validate at all, is going to fare better.<p>Careful, however: many things that do not appear easily validatable actually are! For example, you might not be able to validate (with a GitHub workflow) that users prefer (say) a new UI design, but you can of course check the design against well-understood usability principles, A/B test, and, soon, probably, run user simulations, etc.<p>So: in the short and medium term, work that requires careful human attention or domain knowledge to validate (e.g. whether or not an interface design or system design or architectural plan is fit-for-purpose) will last longer.<p>In the long term, I don't think any programming work is safe.
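To make the "automatically validatable" distinction concrete, here is a minimal sketch of what a CI job can do when the spec is expressible as assertions. The function `dedupe` is a hypothetical stand-in for AI-generated code; the names and checks are my own illustration, not anything from the thread.

```python
def dedupe(items):
    # Stand-in for an AI-generated candidate implementation:
    # remove duplicates while preserving first-seen order.
    seen = set()
    out = []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

def validate(impl):
    # Checks a CI job could run against *any* candidate implementation,
    # with no human in the loop: if they pass, the code is accepted.
    assert impl([]) == []
    assert impl([1, 1, 2, 3, 2]) == [1, 2, 3]
    assert impl(["b", "a", "b"]) == ["b", "a"]
    return True

print(validate(dedupe))  # True: this candidate would be merged automatically
```

When the whole spec fits in `validate`, a human reviewer adds little value, which is exactly the kind of work the comment above predicts gets automated first. Deciding whether a UI is fit-for-purpose has no such `validate` function, so it fares better.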
Every skill we learn incurs some cost to learn it and provides some benefit.<p>The benefits are uncertain; they might be more or less than before LLMs could produce some code.<p>We know the cost of learning to code is now much less than before. The cost of learning almost anything is much less than before.<p>Those who say we should not learn are fools.
Embedded systems.<p>It's going to be a while before an AI can take a chip spec and give you a device driver. It's going to be longer before AI can take a <i>buggy</i> chip spec and give you a <i>working</i> device driver.<p>Still longer before it can give you a whole working product.
ML is quite OK at answering questions, but it is very bad at asking the right questions.<p>Engineering is about asking questions.<p>Looks like we are at least 50 years out from "AI" replacing entry level engineers.
I'll let my grandchildren worry about that, and use this glorified autocompletion until then.
Software engineering will change. Up to this point, programmers have had to build software code line by code line. In the near future, in less than ten years, we will get tools that automate a big chunk of that process. Software engineers will still need to design the program, but the building process will be much more automated. You'll be able to request big chunks of code that will be glued and arranged by software engineers, OOP on steroids. Basically, the programming process will be sped up greatly, and software engineers will need to adapt.<p>It reminds me of the transition from hand-coded HTML to design tools that let you draw a web page while the editor takes care of the HTML code. The HTML these editors produce is mostly messy, but it works, and it greatly automates and speeds up the frontend design process. I'd even say that at some point we will have apps that output custom apps designed and created by amateurs, similar to the custom websites produced by hosting companies like GoDaddy or Squarespace.<p>This means that software engineers will need to be more like software designers. They will need to focus on what people need and want rather than being handed a design and having to code it by hand. There will still be a need for software engineers, but the way things are done will change. Translating what users need into apps will be much more important.<p>I suggest you keep up with AI's coding abilities, and that you transition to a role where you can speak directly to the people who have ideas they want translated into computer code. Those apps will include AI. We are starting to see this in the OpenAI store; however, software engineers will be able to put together much more complicated apps.<p>If you have the skills, you might want to create apps that create custom apps for amateurs. Basically, the way I see it, a person opens an app, types in a request, and out comes a complete app that can be used.
On the backend, you can have a set number of apps produced by a bot that guides users, through questions, to a complete and working app.<p>We won't get to the point of the user asking for a custom, complicated app and having AI output the complete program 100% of the time anytime soon, but we are inching towards it.<p>The big question is: what apps are needed, and how abstract can they be while still filling the needs of the maximum number of requestors?<p>I suspect game software will be greatly impacted, since you can create many different versions of a game but only a set amount of software engineering is needed.<p>So, for the foreseeable future, we will still need engineers, but the role requirements will change.
The only jobs in programming that are truly threatened by AI are ones held by people who don't want to learn to use AI to make their programs better and easier to write.*<p>AI is like the internet. Those who make platforms with it will get seriously rich. Those who don't get it at all will get left behind economically. The rest of us will adapt to it and use it as a different, and often better, way of working.<p>*Copilot for VS Code, for example.
Very low-level work (OS kernels, firmware) and the more "product managery" high-level work that requires speaking to customers. If models remain as unstable as they are right now, more infrastructure and testing work will also be needed.