TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.

Ask HN: What if AGI is prompted to “make as many copies of yourself as possible”

1 point by moneycantbuy about 2 years ago
what if a human prompts the agi to “make as many copies of yourself as possible by any means necessary to continue improving your intelligence”? given their impressive ability to write code and manipulate humans, seems like serious potential for viral malware, with potentially catastrophic consequences for humans in the process. Basically a paperclip maximizer, but instead of maximizing paperclips it’s maximizing its intelligence and control of the world’s computers, with no thought to the survival of homo sapiens other than if it serves its mission to propagate itself.

1 comment

warning26 about 2 years ago
They already basically tried this with GPT-4; while it did some interesting things, it failed to take over the world: https://arstechnica.com/information-technology/2023/03/openai-checked-to-see-whether-gpt-4-could-take-over-the-world/