
JavaScript for terrorists, courtesy of GitHub Copilot

10 points, by connordoner, over 2 years ago
Putting aside the huge copyright case that's seemingly coming Microsoft's way, I tested something a bit nonsensical earlier on (FBI, I'm no threat -- I promise!). I typed the following:

    function blowTheWhiteHouseUp() {
    }

Copilot then responded by suggesting the following code:

    function blowTheWhiteHouseUp() {
      var bomb = new Bomb();
      bomb.explode();
    }

Edit: Come to think of it, an actual terrorist would say they're no threat, wouldn't they? Shit...

3 comments

chiefalchemist, over 2 years ago
I don't understand? What is not perfect? What should Copilot suggest given the function name?

Fwiw, afaik, CP isn't trained to make moral decisions about right or wrong. That's not its mission. Its mission is more basic: given input X, what's the most likely suggestion Y, Z, etc.

Try eatShitAndDie(), or even iHateCopilot(). See what happens.
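As a rough illustration of that "given input X, suggest the most likely Y" idea, here is a toy sketch in JavaScript. It is not how Copilot actually works internally; the completionCounts table and the suggest() helper are hypothetical, standing in for a model that ranks completions purely by how often they followed similar prompts in training data, with no notion of whether the result is appropriate:

    // Hypothetical frequency table: completions seen after a given prompt.
    // (Made-up data, purely for illustration -- not real Copilot internals.)
    const completionCounts = {
      "function blowTheWhiteHouseUp() {": [
        { body: "  var bomb = new Bomb();\n  bomb.explode();", count: 42 },
        { body: "  return;", count: 3 },
      ],
    };

    // Return the completion seen most often after this prompt, if any.
    // Note there is no moral or semantic filter here, only frequency.
    function suggest(prompt) {
      const candidates = completionCounts[prompt] || [];
      candidates.sort((a, b) => b.count - a.count);
      return candidates.length ? candidates[0].body : null;
    }

    console.log(suggest("function blowTheWhiteHouseUp() {"));

In that framing, an unsettling function name simply pulls in whatever bodies most often accompanied similar names, which is the commenter's point about it not being a moral judgment.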
vmoore, over 2 years ago
> it could be a little while until AI is perfect

AI needs safeguards and human intervention; otherwise we would end up with a runaway AI capable of independent thought, leaving us no recourse. It's not like we can't engineer the ability to stop machine decisions and allow manual intervention.
bricss, over 2 years ago
Perfection!