As a regular person, I would suggest that one approach is to not over-intellectualize it, however many extreme future scenarios the promoter types throw at whatever screen you happen to be looking at.

Having thought about it a little, some thoughts below. If something helps, great. If not, I tried. What do you think of the following?

1. Big fat trends. Technology goes through recurring "hype waves" of one thing or another. Companies jump on board. Old work gets redone using the latest fad, with some improvements and some detriments as a result, but overall "progress" marches on. So you get crops of companies doing the same basic busy-work, which still requires integrations. It's a cash cow. Companies at the top may be vertically integrated, making the shovels; lots of people buy and use those shovels to dig for the gold. The more things change, the more they stay the same, though. This may just keep going the way it has. In some ways, big AI looks like a jobs program to keep people busy, so don't worry too much about it.

2. Limitations. Without being overly specific, today's ML systems can do a lot and will almost surely do more tomorrow. But these systems still have huge problems, obvious to some, that currently, and as far as this observer can see into the future, will require a human in the loop for them to actually be as effective as purported. I guess this gets closer to the "politics" that you, the OP, give a valid reason (the pointlessness of it) for staying away from, perhaps wisely. But there's something to be said for addressing the limitations of machines trained on a pile of imperfect data. At some point it does seem to come down to ideas, and ideas can be inherently political. I think technology progresses, society gets complacent, and then society has to catch up before people can really make use of the advances. Some people have talked about this at length, I think. As a chef turned web guy, you could pivot into this adjacent, more human-centric area if it ever becomes necessary.

To elaborate on (2) broadly: consider what ML branded as AI is supposedly able to unlock. Then look at the websites of five or six top AI or AI-adjacent companies, including maybe their blogs or hiring pages, and see JUST HOW LACKING in true intelligence those companies really are. (If you can't see it, that's okay.) The point is that the people who develop the AI are not AT ALL fully utilizing the larger field, or the true capabilities and applications, of what they sell. If we're going to be lofty about it, we are obligated to explore here too, IMO. I don't think it's that such companies and their employees are too close to the problem; I think the problem space really is that large and difficult. This is not meant as a detraction so much as a sign of how much opportunity is out there, and how much careful, sustained effort meaningful FORWARD progress will take.

3. Stay focused on what is actually real and in front of you. What do customers want to pay for? Offer that. In tech this often seems to require picking up new skills along the way, although old skills often remain applicable too. I think that will keep going for a long time.

4. There could, in theory, be a burgeoning arts and architecture scene if we no longer have to manufacture so many widgets ourselves. We haven't really seen this play out much yet, maybe a little (not talking about NFTs).
What could throw this off track is an asymmetry between art funders (and the art-interested) and art makers. I don't have the answer here; maybe UBI is part of it. But I think this is an area where businesses could invent creative solutions. We shouldn't be defunding music class to push AI and enviro-tech; there has to be a better way. What is it?