科技回声 (Tech Echo)
"Do not hallucinate": Testers find prompts meant to keep Apple AI on the rails
2 points | by chha | 7 months ago | 1 comment
zahlman | 7 months ago
Doesn't the concept of AI "hallucination" arise from observing previous LLMs? Would the training data for current LLMs include anything that would let them build a model of what such hallucinations entail?