TE
TechEcho
Home24h TopNewestBestAskShowJobs
GitHubTwitter
Home

TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.

GitHubTwitter

Home

HomeNewestBestAskShowJobs

Resources

HackerNews APIOriginal HackerNewsNext.js

© 2025 TechEcho. All rights reserved.

Automated, black-box method for jailbreaking GPT-4, Claude-2, Llama

23 pointsby soroushjpover 1 year ago

3 comments

YetAnotherNickover 1 year ago
Jailbreaking(I hate the term but anyways) GPT is not that hard with API. You just need to write say few migaligned output from GPT. Their API doesn't check if the GPT response in history actually came from GPT.
评论 #38183621 未加载
leobgover 1 year ago
Links to this: <a href="https:&#x2F;&#x2F;arxiv.org&#x2F;abs&#x2F;2311.03348" rel="nofollow noreferrer">https:&#x2F;&#x2F;arxiv.org&#x2F;abs&#x2F;2311.03348</a>
sunshadowover 1 year ago
I don&#x27;t understand why people still spend time on jailbreaks of the proprietary models, while they can easily use uncensored open-source models these days. I feel like its kind of waste of time.
评论 #38185168 未加载
评论 #38185106 未加载