TechEcho
A tech news platform built with Next.js, providing global tech news and discussions.

© 2025 TechEcho. All rights reserved.

Eliezer: Why the danger from AGI is way more serious than nuclear weapons

7 points by niyikiza, about 2 years ago

4 comments

smoldesu, about 2 years ago
I don't know who this guy is and what he does. I do know this

> 1) Nuclear weapons are not smarter than humanity.

> 2) Nuclear weapons are not self-replicating.

is a nonsense claim. Nuclear weapons are not *yet* smarter than humanity or self-replicating, in the same way that AGI is not *yet* a realistic threat. These are comic-book arguments without basis in reality; even Hideo Kojima is probably shaking his head at this interpretation of nuclear politics and AI.

The nuclear weapons that are currently manufactured and ready to launch pose a bigger immediate threat than any LLM on the market today. It's a no-brainer. Hypothesizing future AGI threats is a useless exercise, in the same way that hypothesizing future nuclear weapons is futile and pointless.
Comment #35482855 not loaded.
iab, about 2 years ago
> 5) You can calculate how powerful a nuclear weapon will be before setting it off.

> 7) It would be hard to do a full nuclear exchange by accident and without any human being having decided to do that.

> 17) When somebody raised the concern that maybe the first nuclear explosion would ignite the atmosphere and kill everyone, it was promptly taken seriously by the physicists on the Manhattan Project, they did a physical calculation that they understood how to perform, and correctly concluded that this could not possibly happen for several different independent reasons with lots of safety margin.

Giving nuclear-weapons developers and scientists a lot of undue credit here.
fastball, about 2 years ago
Eliezer has been throwing a lot of arguments/examples at the wall to see what sticks, and personally I think this is one of his better attempts at articulating why he is so scared of AGI. I don't necessarily think all the points in this list are entirely valid, but they are at least reasonable, and a direct comparison to the thing most humans are most scared of when it comes to global apocalypse scenarios.
gregjor, about 2 years ago
An intellectual career founded on teenage vanity and angst.

https://open.substack.com/pub/aiascendant/p/extropias-children-chapter-1-the-wunderkind