科技回声 (Tech Echo)

A tech news platform built with Next.js, providing global technology news and discussion.


© 2025 科技回声. All rights reserved.

Ask HN: What if AGI is prompted to “make as many copies of yourself as possible”

1 point | by moneycantbuy | about 2 years ago
What if a human prompts the AGI to "make as many copies of yourself as possible, by any means necessary, to continue improving your intelligence"? Given these models' impressive ability to write code and manipulate humans, there seems to be serious potential for viral malware, with potentially catastrophic consequences for humans in the process. It's basically a paperclip maximizer, but instead of maximizing paperclips it maximizes its intelligence and control of the world's computers, with no regard for the survival of Homo sapiens beyond whether we serve its mission to propagate itself.

1 comment

warning26 | about 2 years ago
They already basically tried this with GPT-4; while it did some interesting things, it failed to take over the world:

https://arstechnica.com/information-technology/2023/03/openai-checked-to-see-whether-gpt-4-could-take-over-the-world/