Yes, salient risks seem more threatening than they actually are, but for a similar reason, invisible risks are more threatening than they seem. That is why the future does not look as bright as this article claims.

In the last year or so I have become interested in the study of existential risks -- low-probability events that could extinguish humanity as we know it. Things like catastrophic nuclear war, runaway biotechnology or nanotechnology, runaway artificial intelligence, supervolcano eruptions, and asteroid impacts. Few people research these things, despite the huge potential downside of not researching them, because these risks aren't the kind our amygdala responds to.

If you're interested in this stuff, there's much work to be done -- check out the Lifeboat Foundation, the Singularity Institute, and the Future of Humanity Institute.