It's not just that there are people talking about existential risk but that the most prominent are talking about it within a flawed framework.<p>The "longtermists" aren't so much concerned that <i>we</i> die as about an imagined glorious future where our descendants build self-replicating probes and fill the galaxy with simulated humans living inside Dyson spheres, Dyson swarms, something like that.<p>As preposterous as that sounds (there are at least as many steps from here to there as there are in the Drake Equation; do we even know we'd approve of those "people"?; can you make rational decisions about the future without a "discount rate" that extinguishes the weight of the infinitely far future?; ...), they make a case based on Pascal's Wager: even if there is only a one-in-a-billion chance of this future coming true, if there are 1000 trillion trillion beings in it, their welfare greatly exceeds the welfare of us all (see the toy arithmetic below). (PRO TIP: there's a reason you can't add or multiply the utility functions of different beings in reputable game theory, economics, philosophy, ...)<p>It's really a cult, and it has as many front groups ("effective altruism", Aella's sex parties, "morewrong") as the Third International, Scientology, or the LaRouche organizations. Like Scientology, they think you should be worrying about what <i>might</i> happen 50 million years from now, or what the E-Meter said happened 76,412,981 years, 54 days, 7 hours and 35 minutes ago, instead of what is going on right now. They'll tell you what logical fallacy I'm committing if I compare them to People's Temple, Heaven's Gate, and Aum Shinrikyo, and they might be right, but few people thought those apocalyptic groups would follow through to their logical conclusion before they did.<p>And oddly... they couldn't care less about climate change.
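<p>To be concrete about the arithmetic that wager relies on, here's a toy sketch. The probability, the future head-count, and the present population are illustrative stand-ins taken from the comment above, not anyone's published estimates:

  # Toy "Pascal's Wager for the galaxy" expected-value comparison.
  # All three numbers are illustrative assumptions, not real estimates.
  p_future = 1e-9            # one-in-a-billion chance the galactic future happens
  future_beings = 1e27       # "1000 trillion trillion" simulated descendants
  present_beings = 8e9       # roughly everyone alive today

  expected_future_beings = p_future * future_beings   # = 1e18
  # 1e18 expected future beings dwarfs 8e9 present ones, so the argument
  # concludes that the far future outweighs any present-day concern.
  print(expected_future_beings > present_beings)      # True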