HN doesn't really qualify as a "community." It's a site full of mostly tech-related or tech-adjacent posts and comments, viewed and added to by people from all over the world, of all ages, experience levels, interests, and motivations. So it doesn't make much sense to look for a consensus or an overall opinion, or to ask whether "HN is in denial," because HN doesn't stand for any coherent group of people or set of opinions.

I have seen HN opinions about what we're calling AI this year range from fear of the robot overlords, to wide-eyed acceptance of the hype and reverence for Sam Altman, to wait-and-see, to skepticism and outright dismissal of the whole thing as another grift run by VCs. Opinions range all over the place, as does the reasoning offered to support them.

No one knows what will happen, other than the obvious: lots of money will get poured into AI, and the technology will get deployed (almost certainly prematurely, because of FOMO) and pushed on us whether we want it or not. That's already happening. Whether the money and the hype will actually accomplish the stated goals of either AGI or vastly improved productivity, who knows. So far we don't even have full self-driving, or LLMs that don't just make things up.

Some people, like Sam Altman, seem to think AGI, or at least productive capabilities superior to humans', comes down to scaling LLMs up. Other people think LLMs will plateau, limited both by how they work and by increasingly polluted, "photocopy of a photocopy" training data as future models train on LLM-generated content of questionable quality. Follow the money: it very likely points to who stands to profit from LLM hype and adoption.

I've worked in IT as a developer and system admin for over four decades, so I remember plenty of previous hype cycles, including the second "AI winter" (the first AI winter happened when I was still in high school).
And I've survived multiple apparent threats to my career, mainly offshore outsourcing, but also things like no-code/low-code: trends that stoked a lot of worry but either fizzled out or caused more of a ripple than a tsunami. I agree with the people who have pointed out, on HN and elsewhere, that writing software requires a lot more than learning a programming language and cobbling together snippets gleaned online, though a lot of people get jobs in the industry with those skills and little else (or used to).

Some specific fields look more vulnerable than others. News and "content production" have already taken a hit. I wouldn't tell my kids to go into transcription, translation, paralegal work, or even law at this point. Maybe avoid rote medical jobs like looking for anomalies in lab and imaging results. VFX looks vulnerable, but that field got computerized a long time ago. For everyone amazed at Sora's videos (which I admit seem impressive), consider that we've had Pixar-style movies and photorealistic video games for quite a while without anything we might call AI. We could see a significant hollowing out of some industries, much like what happened to auto manufacturing, and manufacturing in general, due to both automation and offshoring.

Because I write code for a living, I like to think that won't get taken over by LLMs in my lifetime. I've looked at the current crop of tools and I'm not worried, but junior developers might consider the impact of those tools on their prospects. I will retire when I can't buy a keyboard that doesn't have a Copilot key on it.