After reading about this I decided to try my hand at using ChatGPT. I figured, okay, let's see if it can recreate some code that took me a few hours at work to figure out. I asked it very precisely for what I needed, and my mind was blown when it produced code that looked similar to what I had written at work. And I was like, well, that's that then, we're all out of a job. But then I tried to run the code, and it didn't work. I looked more closely and the code had a lot of flaws. Even after manually fixing those, it still didn't work. Then, using my knowledge of how to actually solve the problem, I rewrote about 40% of the code and got it to perform the action needed.

I think all ChatGPT is doing is grabbing a lot of different answers off the interwebz and squishing them together, hoping the result answers your question. But in a lot of cases it only kind of looks like what you want. If you look at images generated by AI, it's the same issue: they sort of look like what you want, but there are flaws, like faces that don't look quite human, or fingers that are just squishy appendages barely resembling actual fingers. I mean, the tech is getting better; it's impressive, and uncanny.

But I think we're pretty far from having these things write themselves; they need quite a lot of human intervention to be useful. Still, it's very impressive and something that could potentially get you closer to an answer. But no more so than spending a little time googling or learning the skill yourself. And if you learn the skill, you're better off, because then you can do it right yourself, IMHO.

Also, anytime someone gets a fully working program out of this thing, the saying "Even a broken clock is right twice a day" comes to mind.