TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.


© 2025 TechEcho. All rights reserved.

Ask HN: Making LLMs ask a clarifying question

2 points by ismdubey, almost 2 years ago
A major problem we are facing is that the model (GPT-4) is not aware that there are multiple answers present for the query in the context. It tries to pick one possible answer and expands on that. Ideally, it should detect the ambiguity and ask a clarifying question.

I tried to make this an "instruction" in the prompt, but that doesn't seem to nudge the model towards identifying ambiguity. I am not trying to improve the retrieval to make the context less ambiguous. Has anyone found a workaround?
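One pattern that sometimes works better than burying the instruction in a long prompt is splitting the task into two explicit calls: first an ambiguity check with a constrained output format, then the actual answer. Below is a minimal sketch of that idea; the function names (`build_clarify_messages`, `needs_clarification`) and the `CLARIFY:`/`UNAMBIGUOUS` protocol are hypothetical, not anything the poster describes.

```python
# Two-stage sketch: a dedicated ambiguity-check call whose reply is easy
# to route on, instead of one "detect ambiguity" line inside a big prompt.

AMBIGUITY_CHECK = (
    "The context below may support several different answers to the question. "
    "List every distinct candidate answer you can find in the context. "
    "If there is more than one, reply with exactly one clarifying question "
    "prefixed by 'CLARIFY:'. Otherwise reply 'UNAMBIGUOUS'."
)

def build_clarify_messages(context: str, question: str) -> list[dict]:
    """Build the chat-completion message list for the ambiguity-check step."""
    return [
        {"role": "system", "content": AMBIGUITY_CHECK},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ]

def needs_clarification(reply: str) -> bool:
    """Route on the check's reply: relay a clarifying question, or answer."""
    return reply.strip().startswith("CLARIFY:")
```

The point of the constrained reply format is that the calling code, not the model, decides what happens next, so the "ask a clarifying question" behavior no longer depends on the model volunteering it.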

4 comments

linuxBlogger01, almost 2 years ago
I always add a line to my prompts, something to the effect of "Ask as many clarifying questions as need be before answering."
(Comment #36673545 not loaded.)
tlb, almost 2 years ago
My own experience is that when it misinterprets an ambiguity, it's immediately obvious to me what was ambiguous in my question and I can add clarification. Asking for a clarification always adds an extra step, compared to giving me a likely answer, which only adds an extra step when it's wrong.
version_five, almost 2 years ago
This is a problem I've seen too, but I haven't dug into it. Could you try specifically asking what extra information the LLM would want in order to give the best answer (or, e.g., what information is missing), to push it to answer that question specifically?
(Comment #36673783 not loaded.)
instahelo, almost 2 years ago
A few days back I chatted with the Stanford Alpaca folks. The response was that there is no clear solution as of now. All the transformer models are designed to answer questions in a given context.