Been messing around with ChatGPT & I am sincerely amazed! I am wondering what the future will look like a couple of years from now when these LLMs improve. Also, what fields do you think will be safe from the AIs?
No, this will just lead to more software being developed, with more demand for devs. Jevons paradox, perhaps.<p>It's like any tool: it multiplies the effort of a designer. Previously we had shovels, so we needed a lot of healthy guys to shovel dirt. Now we have earth movers, and we use a more trained person to operate them. But it also means a lot more projects become viable, so more people will end up getting sucked into them.<p>Personally I'm looking forward to being able to specify software without having to deal with minor issues like forgetting what to import, off-by-ones, and that kind of thing. I can spend more time thinking about the requirements.
My uncle started coding at a young age with punch card programming. He says that a developer was at least a hundred times more productive at the end of his career than at the beginning. And still, it did not result in the death of the programmer.<p>We are a long way from AIs that can operate independently. Until then, AI will just be a productivity tool. Maybe we will all become a hundred times more productive, and it will still not be the end of the programmer.<p>The real threat I see is that the AI will leave the boring part to us: the code review of the AI's work. If I see that happening, I will plan for an early retirement.
How well does the AI do on stuff that you would not find implemented online or in its training data?<p>Another important question is about scale: making a 10-line application a thousand times is not the same as making one 10,000-line application. How good will the AI be at that?<p>Right now it is the same as "googling the answer", just faster. Which is impressive, but far from career-ending.<p>I think another important aspect is that a lot of SW development is building upon already existing code bases. One benchmark for an AI would definitely be its ability to take in an entirely novel code base and make edits to it. I doubt that any current language model, no matter how well trained, can achieve that. Its tendency to <i>invent</i> things by itself seems to make it destined for failure.<p>>Also, what fields do you think will be safe from the AIs?<p>Any industry which is more than a bit hesitant about change. E.g. aerospace and defense.
I doubt it in the near term, because frankly, the people who ask for or design features frequently aren't very good or always logical. Someone who could prompt or specify the real business need would end up being an engineer/developer of some sort.<p>Mindless stuff like "add a column with birthdate in it" could be automated, though.<p>I say this because look at some of the garbage software produced by real humans/contractors who actually have reasoning capabilities: it's still buggy and terrible.
Healthcare will probably be safe for the foreseeable future (not a long period these days). This is because (1) current AI models need an <i>enormous</i> training set relative to humans, and (2) medical information is treated with relatively high levels of confidentiality, making it difficult to assemble a training set big enough for current AI to get really good.<p>There will be specific exceptions for specific medical tasks, though I can't really forecast which or how many. Being able to triage arrivals at A&E based on their symptoms is different from having a database of known pharmaceutical interactions, and both are very different from being able to tell which mole is cancerous and which is benign.<p>But overall, I expect the domains with limited or absent training sets to be the ones that go longest before being automated, and I think medicine has a lot of specific tasks in that category.
Most of the code I write is highly focused on the APIs and concepts specific to my own company. Anything general enough to be amenable to ChatGPT I feel I could usually get from a third-party library or StackOverflow. Maybe it will replace StackOverflow? But if it does, where will it get the next generation of answers from?<p>I do see lots of positives and use cases for more narrowly defined tools that help the programmer and make him/her more productive and powerful.<p>For example, I've been playing with a new terminal app (Warp) that lets you type plain English at the prompt. It then translates it to the proper bash command using GPT-3. It's brilliant, it works, and it doesn't put me out of a job. It just makes me more productive and more able to focus on the problems specific to my company.
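For a rough sense of how a tool like that might work under the hood, here is a minimal sketch against OpenAI's completion API. The model name, prompt format, and example output are my own guesses, not Warp's actual implementation:

    # Minimal sketch: translate plain English to a bash command with an LLM.
    # Model name and prompt format are illustrative guesses, not Warp's internals.
    import openai

    openai.api_key = "sk-..."  # your API key here

    def english_to_bash(request: str) -> str:
        prompt = (
            "Translate the following request into a single bash command.\n"
            f"Request: {request}\n"
            "Command:"
        )
        response = openai.Completion.create(
            model="text-davinci-003",
            prompt=prompt,
            max_tokens=64,
            temperature=0,   # deterministic output for commands
            stop=["\n"],     # stop after one command line
        )
        return response.choices[0].text.strip()

    print(english_to_bash("show the five largest files in this directory"))
    # might print something like: du -ah . | sort -rh | head -n 5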
No, because as good as it is at boilerplate, the domain-specific stuff eventually becomes extremely important.<p>And without coding skills, that boilerplate will break or fail to get stitched together, too.<p>I absolutely see it making some parts of the development process vastly more efficient, though.
I think it will come for the low hanging fruit first. Think of all the simplistic code that gets outsourced to India: banking applications, general mobile apps, internal time trackers, internal resource management tools, etc. It is going to reduce the costs of creating/maintaining those run-of-the-mill business applications.<p>As you get more specialized, there is less example code, so the AIs do worse. As you move to the cutting edge, there is almost no example code.<p>So things that have been done thousands of times will get eaten by the AI coders first. Then it will move up the value chain; how fast isn't clear.
A lot of the top comments seem to be saying no. I'll go ahead and make the argument that, yes, this technology will eventually pretty much end most programmers' jobs. I'm not talking about tomorrow. Even if the technology were perfect right now (which it is not), it takes time for organizations to adapt to change. I am also not talking about something as far away as 20 years in the future, either. I think in as little as five years we will see a very measurable decrease in paid programmers due to automation. In the short term many of them may be shunted over to code review and QA workflows, but the idea that the programming industry is somehow special, and will be the exception to the rule that increasing automation in a field decreases the number of people employed in that field, seems a bit groundless to me.<p>Someone used the example of heavy equipment and shoveling dirt. Yes, there are still people employed in the business of moving around dirt. But the percentage of the population so employed definitely went down with the invention of things like backhoes. Take it from a guy who was actually employed at a job that involved moving around a lot of dirt: I used to work for a water utility company. Hearing the old-timers talk, you realized that the work crews used to be a whole lot bigger. Backhoes existed, but they were very expensive and weren't yet ubiquitous. The backhoes would be reserved for the most complex or deep dig sites. Otherwise it was shovel time, and the crews had to be bigger. Now every dig crew has a backhoe; the standard dig crew is four guys, but you can totally get a dig done with just three if one of them called out.<p>That being said, people do still get paid to move dirt around. You still need somebody to operate the machinery. But if a magical backhoe were released tomorrow that could reliably and safely dig on its own, it would without a doubt be adopted as soon as the organization had the money for it. Digging is a huge source of liability; the monetary incentive would be quite high.<p>Now earth-moving is one industry. Let's talk for a minute about another: ice cutting. The ice trade was a huge industry in the 19th century. At its peak it employed 90,000 people in the United States alone. It was ended almost overnight by the invention of the refrigerator. Hardly anybody even remembers that it was a thing. The world went on without a thought. That is another possible outcome of automation. And as scary as it is, I'd still rather have my refrigerator.<p>So maybe the programming industry survives and just employs fewer people, or maybe it goes the way of the ice trade, and in a century or two it will just be a footnote of history that people used to have to manually program computers.
It's going to be a tool to help developers get things done faster for a while. Stuff that has been done a million times and has lots of StackOverflow posts will be easy for AI to "get". Asking it to develop an application to very specific requirements (specific data queries, specific UI designs, interacting with internal company services, etc.) is not going to work so well, and programmers are still needed. Personally I would love it if I could just feed in a bunch of specs and requirements and have a program spit out for me to review. I wonder if we could train an AI to take a bunch of test cases and a description and have it generate the code that passes the tests (i.e., TDD); a sketch of what that loop might look like is below.<p>Could AI reduce the need for programmers? Probably. I could see some "low code / no code" services built around AI. They could make their own programming language and/or training data in a specific domain to be more easily digestible by an LLM.<p>I think the situation for programming is pretty similar to the AI art situation, tho arguably programming is the harder problem.
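To make the TDD idea concrete, here is a hypothetical sketch of that loop: hand the model a description plus the test suite, run the tests against whatever it produces, and feed failures back. `generate_code` is a stand-in for whatever model API you'd use; nothing here reflects a real product:

    # Hypothetical test-driven generation loop; generate_code() is a stub
    # standing in for an LLM call, not a real API.
    import subprocess

    def generate_code(description: str, tests: str, feedback: str = "") -> str:
        """Ask the model for an implementation of `description`. Stubbed out."""
        raise NotImplementedError("plug in your LLM call here")

    def generate_until_green(description: str, tests: str, max_attempts: int = 5) -> str:
        feedback = ""
        for _ in range(max_attempts):
            code = generate_code(description, tests, feedback)
            with open("impl.py", "w") as f:      # tests are assumed to import impl
                f.write(code)
            with open("test_impl.py", "w") as f:
                f.write(tests)
            result = subprocess.run(
                ["python", "-m", "pytest", "test_impl.py"],
                capture_output=True, text=True,
            )
            if result.returncode == 0:
                return code                      # all tests pass
            feedback = result.stdout             # feed failures back to the model
        raise RuntimeError("no passing implementation within attempt budget")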
In general, AI tends to be overhyped and to underdeliver when faced with the real world, and I think AI in this field is no exception. It will handle the simple stuff better and better, but levelling up to the complex parts of SE (like turning complex requirements into functionally correct code) is probably very far away indeed.
I ran across another post here where someone shared their experience with cooking-recipe articles that were plain wrong but presented like well-established recipes.<p>If AI-assisted code is like this, then no, because it would waste more time than it saves.<p>My biggest concern is that the competitive nature of engineering and pressure from the business side of software will push the industry to adopt it, and instead of 3 zero-day vulnerabilities there will be 500. Failures being opportunities, nation states will see AI-assisted code and figure out how to use it to their benefit, while you and I get a 15-second shout-out at the next all-hands meeting for spinning up the app the VP of fantasyland dreamed up.
As long as neural networks are not sentient, developers should be safe, because most of our work is not actually writing code (even if it's not CRUD but actual R&D, it's usually simple and built on top of simple concepts) but understanding what the owner/customer/boss/etc. wants, planning and adapting the architecture, and generally "seeing the future" to make something that will be expandable, upgradable and maintainable.<p>GPT can't really do that (for now), and in the far future, when AI is advanced enough, we will probably be the ones writing prompts and moderating the outputs.
> Is AI-Assisted Coding the Start of the Death of Software Development?<p>Would you feel safe if your airbag system were written by some statistical model? Or worse, if every layer in the stack below the airbag system were written by some statistical model?<p>Having an almost-correct implementation written by some "AI" at the highest level sounds reasonable at first thought, but now imagine having almost-correct implementations at every level. On average it should work, but it won't. Software isn't about getting things right on average. It is about getting things exactly right.
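To put rough numbers on "almost correct at all levels": if each layer of the stack is independently, say, 99% likely to be correct, the chance the whole stack is correct decays as 0.99^n. The 99% figure and the independence assumption are both made up for illustration; the compounding is the point:

    # Illustration only: made-up per-layer correctness, independence assumed.
    per_layer = 0.99  # probability a single layer is exactly right
    for layers in (1, 5, 10, 20):
        print(layers, round(per_layer ** layers, 3))
    # prints: 1 0.99, 5 0.951, 10 0.904, 20 0.818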
I think there are two important rates to consider here:<p>1) Peter Thiel's 10x improvement concept from Zero to One.<p>2) The exponentially increasing rate of AI research and improvement.<p>Additionally, a third point is relevant:<p>3) Max Tegmark's idea that the best AI agents will have been generated by AI.<p>I think we're seeing an 'intelligence' explosion begin to rustle from its sleep. It's difficult to predict when it's coming, but I don't think SEs are the ones with the safest jobs. The inflection point will be a ChatGPT that costs $10k/yr to run, as that's basically a 10x improvement over the cost of devs.
Yes, and it should have happened a long time ago. You need to see AIs as frameworks, and we will go from there.<p>No job will be safe, and that's a good thing. People value themselves too highly, and I bet from experience all of us know at least one guy who could have been easily replaced with a shell script.<p>That is not to say that person is not able to do other things or is impaired in any way, buuut imagine how many jobs in general could be replaced by very small shell scripts. Coding ain't gonna be any different.
I don't think it will die, but much of the current typical activity of programmers and other jobs will be gradually switched over to AIs over the next 5-10 years or so.<p>A "computer" used to literally be a person operating a machine or doing arithmetic in a bank.<p>"Programmer" will start to mean something more like a person who knows how to interface with and connect up different AIs.<p>Many jobs will be mostly replaced by AI. There will be a lot of new jobs, too, especially for people who integrate with AI via brain-computer interfaces.
It's kind of just an IDE on steroids. You'll probably still need someone to make sure the code <i>actually</i> does what it is supposed to. Also, interfacing with weird APIs or writing drivers for physical hardware, where you need the datasheet to know what to do, is probably still far from being outsourced.<p>I'm more worried about everything becoming even more of a buggy mess than it already is. I wonder how good something like this can get at predicting weird edge cases.
Any job an assistant could do for you wasn't a very interesting job to begin with. It's like any job some codegen could do for you before: a chore.<p>If you can type a comment and the (current-gen) AI assistant spits out a working implementation, it's almost a certainty that you could have typed it as a Google question and copied the StackOverflow answer. It doesn't invent anything novel. At least not yet.
Well, it depends. If the AI can be used to create an open-source model of the AI, then the value of the AI will plummet faster than the value of software developers who know how to use it.<p>Software is highly profitable because of its very low capital intensity. A proprietary productivity-boosting AI would increase the capital intensity of software development, which is bad for workers, aka software engineers.
No. Software does not scale linearly, i.e. 10 LOC -> 100 LOC -> 1000 LOC -> 10K LOC -> 100K LOC.<p>I fail to see AI moving beyond 1000 LOC: ten 100-LOC programs are far simpler than one 1000-LOC program (10*100 LOC << 1000 LOC); see the toy calculation below.<p>Also, the AI is statistical; it does not have any notion of the real world. It might know how to write a function, but it does not know why you need the function.
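Here is the toy calculation behind 10*100 LOC << 1000 LOC: if complexity tracks the number of potential interactions between lines (a quadratic model I'm assuming purely for illustration, not an established metric), one big program dwarfs ten small ones:

    # Toy model: complexity ~ pairwise interactions ~ n*(n-1)/2.
    # The quadratic assumption is illustrative, not an established metric.
    def interactions(loc: int) -> int:
        return loc * (loc - 1) // 2

    print(10 * interactions(100))  # ten 100-LOC programs:  49,500
    print(interactions(1000))      # one 1000-LOC program: 499,500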
> What fields do you think will be safe from the AIs?<p>A new arm of government will develop to regulate AI, a need Elon Musk has been crowing about for some time. There's talk of writing specs for Wario bots along a gradient of danger levels.
It depends on whether you mean software development as in building software, or "Software Development" as in the bloated and ridiculous industry that has evolved around building software.
I think it's the end of all the jobs.<p>The job of most programmers is to understand a problem and to create a few KB of text per day. On this front GPT is already a million times more productive. Maybe it doesn't understand all the subtleties yet, but I don't see a reason why it can't.<p>And if you look at most offices, the job of the people there is also to shuffle a few KB of text per day: emails to customers or suppliers, a bit of documentation, etc.<p>Add a humanoid robot for the manual tasks and one or two more years of AI improvement, and those jobs go too.
Maybe. It surely has the potential to eliminate the "hero" team setup (1 senior + n juniors doing all the boring stuff), which means people won't get junior jobs and seniors will thin out eventually. So yes, maybe indeed the death of software development.<p>Or it could transform swdev into an AI-centric industry where people take care of everything the AI cannot.