In the long term, we're all dead.<p>Call me selfish, but I personally don't worry about the future of humanity in thousands of years. I'm sure most people don't either.<p>A kid born in the year 10,000 is a stranger to me. Sure, people can feel concerned about his fate, just as people can feel concerned about a stranger in a foreign country, but most people just don't. We're all selfish bastards.
Baffled by Ord's defence of discount rates. The transcript reads like a primer on the pure theory of discounting.<p>In practice it's a cargo cult ritual that leads to the conclusion that we should give within epsilon of zero fucks about anything that happens N years from now for sufficiently large N (which you get to choose as the author of an analysis).<p>That's partly because in exponential discounting there's no mathematical difference between the role played by the 'pure rate of time preference' and the 'good' discount rate. It's always just 3.5%, for no particular reason other than that's what's in the back of the Treasury handbook, written before 2008 during the Great Moderation.<p>Oh, and is 3.5% a reasonable number? Sounds kinda like inflation/mortgage rates, right? But if you look at CPI, $100 in 1817 is equivalent to roughly $2k in 2017. Would I rather spend this 'same' amount of money in 1817 or 2017? It's a no-brainer as long as I prefer penicillin and smartphones to legal opioids and vintage mustache waxing. This is <i>really</i> hard stuff to model, which is a clue that <i>it doesn't get done</i>.<p>If Ord and Wiblin could pick one easy fight to change government policy making and promote longtermism, it would be savaging the use of discounting in long-term policy decisions rather than beating around the bush.
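To make that "epsilon of zero fucks" point concrete, here's a toy sketch (in Python, using the 3.5% rate from the Treasury handbook mentioned above) of how an exponential discount factor collapses long-horizon value. This is an illustration of the standard formula, not anyone's actual policy model:

```python
# Exponential discounting: a benefit worth 1.0 today is weighted
# by 1 / (1 + r)^n if it arrives n years from now.
def discount_factor(r: float, n: int) -> float:
    return 1.0 / (1.0 + r) ** n

r = 0.035  # the 3.5% rate discussed above
for n in (10, 100, 300, 1000):
    print(f"{n:>5} years: {discount_factor(r, n):.2e}")
```

At 3.5%, a benefit 100 years out is scaled by about 0.03, 300 years out by about 3e-5, and 1,000 years out by about 1e-15. So for large enough N, the distant future rounds to zero no matter what's at stake, which is exactly the complaint.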
This is my philosophy on the future. Feel free to adopt or reject it as you see fit.<p>Trying to reason about the future on cosmic scales is beyond our current and any foreseeable capabilities, so let's discount that right away; we don't quite know if the universe will end in heat death (though we have some hints). We don't know if it is possible to create or move to other universes, or even whether such a concept makes sense. As long as such things haven't been ruled out we can assume they're true: if life survives to the end of this universe it will have accumulated enough knowledge and technology to create or move to a new one.<p>Back in the present, every species on earth is doomed. Whether by an asteroid strike or the sun turning into a red giant, eventually this planet will become uninhabitable.<p>It is not known whether intelligent life is a common result of evolution or whether we're a fluke. We do know it takes a very long time if it happens at all. It is possible we are the only highly intelligent species that will ever appear on this planet.<p>The primary responsibility of humanity, then, is to balance resource consumption with the creation of an interplanetary society first, then eventually an interstellar one. We must protect the earth's resources and preserve as many species as we can reasonably preserve... but we must also preserve modern technology and pursue the establishment of permanent self-sufficient colonies on other worlds in the solar system. Once we do so we must take and spread as many species from earth as we can. In the medium term (the next million years) this isn't so important, but once we can achieve some form of interstellar movement, spreading life to other systems is key to ensuring it isn't wiped out (or reset back to single-cell organisms).<p>We are the only natural process capable of preserving life, and so far as we are aware ours may be the only instance of life (or intelligent life) in the universe.
What future iterations of life will make of such an opportunity we can only guess, but if we don't do it, in all likelihood no one else ever will.<p>If no intelligent life survives, then our existence is pointless and will be erased when the sun's red-giant phase scours from the face of the earth every last trace that any living thing ever existed.
In part I think this is why Musk is such a larger-than-life figure. His whole spiel about making humanity a multi-planet species makes for good PR, but as far as I can tell it also genuinely matters to him & provides drive.
It matters... but not more than anything else. What matters even more is that we retain our humanity.<p>Notice that this idea is only one step removed from "the long-term future of the <i>race</i> matters more than anything else". Think about what horrors have been done in the service of <i>that</i> idea. Let us be very careful, then, that we commit no horrors in the service of <i>this</i> idea. ("Nothing matters more than X" easily becomes "therefore it's OK - even good - to trample on people in the service of X".)
Man, this is a sad thread.<p>Humanity is the most incredible thing in what we know of the universe. Stars and gas clouds sure are beautiful too, but unless you are very religious, those are simply an artifact of statistics; there was no intention there.<p>Without humans, the universe is empty.
"Why the long-term future of humans matters more than anything else"<p>Because humans care about humans more than anything else <i>and that's fully sufficient</i>. Let's not kid ourselves, <i>we</i> care about us, most everybody else is kinda meh. Symbiotic species like us too, I guess.
Because otherwise everything up to this point would've been pointless. Do we really need an article explaining why the survival of our species is important?
> Robert Wiblin: Are there some approaches that you think are just obviously too broad?<p>> Toby Ord: Good question. I think just saying, “Okay, what about improving science?” my guess is that because this begets technology, which begets some of the risk, it’s unlikely that just pushing on that is a plausible thing to particularly help.<p>> ... They probably wouldn’t even know about asteroid risk, or super volcano risk, or various other natural risks.<p>I think talking about these kinds of risks is a mistake. They're not the biggest thing on our plate, and people will always be able to dismiss them as too distant and too small.<p>~~~~~<p>> Robert Wiblin: We’ve talked a lot about reducing risks to the future. What about thinking about the opposite of that, which is extremely large upsides? Are there any practical ways that people might go about not so much preventing extinction or something horrible, but also trying to create something that’s much more positive than what we have reason to hope for?<p>> Toby Ord: It’s a good question. I’m not sure.<p>If you want a specific technology, work on electric transportation. Fossil fuel transportation in agriculture allowed us to scale humanity up to 7.5+ billion people, but it created a time bomb: as soon as it's too pricey to move food around, we can no longer create it at scale, and there's no <i>going back</i> to an agrarian pastoral society without killing almost all the humans. Over the long term, we have a moral imperative to advance ourselves beyond fossil fuels for transportation. This is what I believe to be the most moral technology we could work on.[1]<p>"Humanity has a bug. We think of the future too little, and too often we think technology propels itself, that the future will simply unfold automatically. Or worse, many seem to have suffered a loss of faith in any real vision of the future. 
I wish this was not so."<p>[1] <a href="https://medium.com/@simon.sarris/the-moral-technology-6413ca8449c9" rel="nofollow">https://medium.com/@simon.sarris/the-moral-technology-6413ca...</a>
What is paramount is to support the force which brought humanity here, not humanity itself. We are an instantiation of a process, not the process. To best support that force we as a planet of people need to realize we are not the end-all-be-all, and our species' time will come and go. When our successor supersedes humanity we should celebrate that beautiful expression of a yet more well-adapted carrier of the torch. Do not fight progress; take this more supra-anthropocentric perspective, embrace it, celebrate it.
It should be "Why the long-term future of the global ecosystem matters more than anything else". Not humanity. Humans cause nothing but destruction and suffering on an unprecedented scale. 80000hours and the EA community are not better than any other anthropocentric interest group. Seriously what's with this obsession about humanity? How shortsighted are people?
I have trouble finding examples in history where we made the "right" decisions for the future of humanity when we wanted to. Think religious oppression, communism, etc. Why would today be different?
There is no long term future if there is no short and medium term future. It's very hard to create a better tomorrow if something is broken today.
God, it annoys me to see so many people spouting Rick & Morty-tier nihilistic verdicts on life and it being "pointless". Seriously, go outside and read some more books, and synthesize your experiences and observations.<p>Even without going into rigorous philosophical discussion, if you still confidently claim that all of it is fatally worthless even after having seen as much of what there is to life, I think you seriously lack perception.<p>It's just shallow pessimism, not a thoughtful outlook on conscious existence.
Nobody cares about humanity 600 generations from now. We're only truly concerned about our immediate family. I'd sacrifice my life for my son, but I could give a shit about my great great great great great great great great great great grandson.<p>I would say 90% of our environmental problems stem from this one fact. I mean we all drive cars and are unwilling to make the necessary sacrifice to stop.
> <i>a sense of wonder about the universe we find ourselves in</i><p>If we have a sense of wonder about the universe or our planet, then what we should really do is rid it of us, because all we have ever done is ruin it, make it hideous, inflict pain and destruction on it.<p>I have kids. My behavior is not consistent with my beliefs. In that I'm very human.