I can't keep up with the advancements in technology and it's taking my sanity away. I remember when GPT-3 was released in 2020 and it was all over the news. I remember how "stupid" it was and how we mocked it; last year we got GPT-4, which blew everyone away, and this year we are getting GPT-5. Technology is moving forward way too fast for my brain to keep up, and all these LLMs are getting so good that almost nothing seems worth doing anymore.

Learning takes time. Perfecting a skill takes time. I would have to spend a year learning the Unity game engine and another year making a simple video game. Maybe I want to try selling it and make a living from it. Why bother when AI can already give me code snippets and detailed instructions, and in a year or two it will be able to produce a complete product? I can't compete with that; there are already tens of thousands of video games for sale on Steam alone right now.

Maybe I want to learn digital painting, 3D modelling, and animation to show my work to others. Why spend years learning all that when text-to-3D is already so good you can plug and play with very little work? Unity 6 is introducing an impressive text-to-3D animation feature. That will make animators and modellers obsolete overnight.

Maybe I want to write a novel and have people read it, but why bother when Gemini can already do it better than I ever could? It's already producing and publishing both fiction and non-fiction in an automated fashion, and it's getting harder and harder to know whether something was written by a human or AI. You can't compete with all the spam on Amazon, and even if you could, how could you ever prove to someone that you are the sole author and haven't used AI in your work?

The same goes for everything else that is not a strictly physical activity. There is already so much spam generated by people using AI products that I wonder how anything will have value in the future. I watched a YouTube short the other day and had no idea the video was AI-generated (it was a podcast) until I read it in the comments. Just a year ago I could spot these fakes in a split second.

AI can do it fast, it can do it well, and it can do it cheap. Everything I ever wanted to learn or do now seems like a waste of time. I am becoming apathetic. I still have the same job I had when GPT-3 came out, still live in the same property, and drive the same car. It seems like not much has changed in my personal life, but the techno-world we are currently living through is changing so fast that I might not know how to navigate it in the very near future. So how do I cope with that? How do I adjust? What is my responsibility as a member of society, in terms of intellectual participation, if all the intellectual work is being outsourced to AI? Can I even be useful anymore? What is my role from now on? What do the great minds of today think about that?
Firstly, remember that the AI companies have something that can do interesting product demos, but that is it. If I had a dog that could tell me what was happening in the world but got it wrong 5% of the time, my response would be "Oh shit, a talking dog!" and then I would keep watching the news as normal. This is not going to get better: the energy demands are enormous, the supply of untapped data sets to feed into the models is rapidly running out, and the one company I would have thought had the best data set, Google, has an AI that tells people to cook spaghetti in gasoline for flavor.

Secondly, I want you to think about who is most excited for AI "art". Without wishing to disparage them too unfairly, they are the sort of people who could have a meal prepared by a Michelin-star restaurant or by the custard machine from Teletubbies and would evaluate both by portion size.

No one is excited to go see an AI-generated movie. No one is looking forward to the weekend so they can curl up on the couch and read an AI-generated novel. There is no demand for AI art on the consumer side; it is entirely on the production side, from bean counters who would have had Michelangelo carve the Statue of David from styrofoam to save money.

The future is not in AI. If anything, the oceans of AI slop will destroy the recommendation algorithms, so we'll be back to good old-fashioned human curation.

Honestly, I would suggest talking to a therapist about these feelings; you cannot look at every aspect of your life through a lens of effort/reward mechanics.
> AI can do [intellectual work] well

You are living in a radically different world than I am. AI-generated content is noticeably shitty in almost every sphere. Improvement will probably come, but it's pretty damn far from acceptable (let alone good) right now. Literally every time I've tried to use it for coding problems, it has hallucinated a non-existent function/option or given me something that _looks_ correct but has a fundamental flaw, and AI prose has a distinctive and jarring tone.

That said, taking your premises as given, you have two choices. You can "uplevel" - not in terms of skill, but in terms of "where in the stack" you operate. Instead of creating thought-work outputs, be the one who wrangles and directs the AI tools (and catches their errors and false assumptions, of which there will be many). In effect, make yourself the tech lead of a bunch of developers, where the developers happen to be AIs instead of people.

Or you can opt out of the technological space and do something that can still only be done by an embodied human - in-person services, or the creation, upkeep, and maintenance of physical objects or systems.

(This approaches the question from the perspective of "how do I remain valuable and employable enough to earn a living wage?" rather than "how do I continue to find fulfillment and meaning in my life?" - the latter hasn't changed. There are plenty of people who are better than me at the things I find enjoyable, and my enjoyment of them wouldn't change if AI also surpassed me. See /u/quasse's comment for a better presentation of this idea.)
Someone commented in yesterday's thread about people's most humbling experiences, and it stuck with me:

> I will never be a world champion at anything, so I might as well play for the love of the sport.

The place to start is your own mental state, if making art, playing music, or making a game seems devalued to you because something else (AI) can do it better than you.
> I'm getting overwhelmed by AI. How to cope?

Ignore it.

The world may or may not be going to hell in a handbasket, but the being-overwhelmed part is on you.

Is it really, immediately interfering with your ability to support yourself, and I mean immediately: have you been laid off? Is something else preventing you from making the money to pay your rent? If not, follow the "Ignore it" advice above.

If you really are looking for a long-term strategy to avoid being economically sidelined, then consider becoming a plumber or an electrician. This answer is in your intro: physical trades won't be replaced by software, and they are in demand. Consider becoming a heat pump installer.

Or maybe this is all a troll? Written by AI? Who knows, so why worry?
I think it's okay to do tasks that others (including AI) can do better and faster. For me personally, making art is as much about the process as the end product.

I like to draw digital art, paint, and make digital music. I find these activities therapeutic and a fun way to pass the time. Can AI do better? Sure. Do I care? No. I make the art for myself and the people I share it with.

Another thing that helps is getting offline a bit. Seriously. ChatGPT can't go for a hike, bake a cake, make love, or enjoy people laughing at your joke.

I'm in a data mining class, and it feels a little dated because of LLMs (though we do discuss them), but I have found the class useful for understanding the principles of data and knowledge, even though we could basically plug everything into an LLM and get an answer.
I have a bit of a spiritual, but also actually a super practical take.

Imo this whole situation finally liberates us from chasing things for personal gain, for results, driven by selfish ego. Instead it allows us to focus on what we really enjoy or find meaning in _doing_, for the sake of the process, not the outcome. Which is basically the definition of selfless action (karma-yoga) and is one of several paths to elevated consciousness.

Can confirm from my personal experience (of being a shitty but joyful musician, among other things) that this is indeed a great way of dealing with this reality. The key is not to compare yourself to AI or other people, but to just keep playing your own games, the ones that carry personal meaning and satisfaction.
AIs are tools, similar to hammers and such. Their function is to help us achieve a goal. Mostly, we don't grab a hammer and then think about where we can use it. We start with a problem and then ask what tools we need to fix it. A hammer might be part of the solution.

We have many problems in real life where AI will help. Our job is to find impactful problems whose solutions will help the human race, and then use AI to help solve them. Your job is to understand and focus on one area of life, figure out where help is needed, and then use the available tools to solve problems. AI might be part of the solution, but there are countless other tools that will help too. In short, don't let AI dictate your actions. Instead, use (tell) AI where it needs to help.

To start, here are areas of human life that will never change, as long as there are humans: human needs such as food, water, sex, companionship, peacekeeping, the ability to communicate, and so on. Focus on solutions in any of these areas and you'll be rewarded.
There seem to me to be really two bifurcating paths ahead. Maybe 2.5.

In both of them, AI is Not My Problem.

In A, we get a technological singularity, and AI is beneficial and benevolent; life is radically improved. In A.5, we don't quite get to that technological singularity immediately, but we get vast improvements in domain-specific systems from individually optimized models. Life is still significantly improved, so much so that the edges of post-scarcity start to appear. This is the only one with a maybe caveat, but if anything it'd be leveling disparities.

In B, we get a technological singularity (or close-ish) but fail to solve the alignment problem. The Terminator movies are so far from how badly we'd lose that it won't matter.
We have always lived in this world, though: you were already competing with millions of other people (human workers). You can't possibly compete with millions. So why even try? AI just adds even more competition, but the equation hasn't changed.