Ask HN: Why can't ChatGPT solve Wordle-type questions?

5 points by standeven about 1 year ago
My daughter likes to give me substitution cyphers to solve, and sometimes it’s just a single word. For example, “cormorant”, but substituted so it appears as “avtfvtpwz”. If I ask ChatGPT to list every 9-letter word with the second and fifth characters the same, the third and sixth characters the same, and all other characters unique to each other, it cannot get it right. It hallucinates and tells me all sorts of other words as fitting the criteria when they don’t, at all.

I can ask it to paraphrase the rules and it totally understands, it just can’t get close to the right answer. Same with other AI chat models that I’ve tried. Any idea why this seemingly simple question is a limitation?
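Checking a candidate against that constraint is simple, deterministic string work. A minimal Python sketch, under one reading of “all other characters unique” (distinct from each other and from the two repeated letters):

    def matches_pattern(word: str) -> bool:
        # 9 letters, 2nd == 5th, 3rd == 6th, everything else distinct
        w = word.lower()
        if len(w) != 9 or w[1] != w[4] or w[2] != w[5]:
            return False
        # one copy of each repeated letter plus the remaining singles
        rest = [w[0], w[1], w[2], w[3], w[6], w[7], w[8]]
        return len(set(rest)) == len(rest)

    print(matches_pattern("cormorant"))   # True
    print(matches_pattern("avtfvtpwz"))   # True: the substituted form has the same shape
    print(matches_pattern("celebrate"))   # False: 2nd and 5th letters differ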

6 comments

enasterosophes about 1 year ago
LLMs work by converting your question into a list of numbers and projecting that list, like a shadow, into a high-dimensional space which was constructed through training on other lists of numbers. Where the projection lands gives a new list of numbers, which are then translated back into words.

Because of the way the model (i.e. the projection surface) was constructed, the strings returned look plausible. However, you're still just seeing the number-back-to-language translation of a vector which was guessed by statistical inference.
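A heavily simplified toy sketch of that numbers-in, numbers-out view; the vocabulary, dimensions, and random matrices below are invented for illustration and stand in for a trained model, so nothing here reflects a real LLM's internals:

    import numpy as np

    rng = np.random.default_rng(0)
    vocab = ["the", "cat", "sat", "on", "mat", "dog"]    # toy vocabulary
    token_id = {w: i for i, w in enumerate(vocab)}       # words -> numbers

    d_model = 8
    embed = rng.normal(size=(len(vocab), d_model))       # numbers -> vectors
    project = rng.normal(size=(d_model, d_model))        # the "projection surface" (would be learned)
    unembed = embed.T                                    # vectors -> scores over the vocabulary

    def next_word(prompt_words):
        ids = [token_id[w] for w in prompt_words]        # question -> list of numbers
        hidden = embed[ids].mean(axis=0) @ project       # project into the high-dimensional space
        scores = hidden @ unembed                        # where it lands, scored against every word
        return vocab[int(np.argmax(scores))]             # numbers -> back to a word

    print(next_word(["the", "cat", "sat"]))              # plausible-looking, statistically guessed output
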
sk11001 about 1 year ago
Tokenization. The tokens ChatGPT uses are longer than a single character. You're asking it to play the piano wearing oven mitts.
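You can see those oven mitts directly with the third-party tiktoken package; the cl100k_base encoding below is in the same family as the ones OpenAI's chat models use, though the exact split is illustrative and may differ from the deployed tokenizer:

    import tiktoken  # pip install tiktoken

    enc = tiktoken.get_encoding("cl100k_base")
    tokens = enc.encode("cormorant")
    print(tokens)                               # a few integer IDs, not nine letters
    print([enc.decode([t]) for t in tokens])    # multi-character chunks, so the model never "sees" c-o-r-m-o-r-a-n-t
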
thiago_fm about 1 year ago
It's because it generates token by token based on probabilities, and it has no reasoning capabilities. Some AI experts like to call it reasoning for benchmarking purposes, but it isn't like what we humans do.

LLMs typically struggle with tasks about the words themselves, or with basic counting; some AI companies like OpenAI use hacks so it doesn't fail in a miserable way.
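A toy sketch of that token-by-token sampling loop, with a made-up next-token probability table standing in for the model:

    import random

    # invented conditional probabilities: P(next token | current token)
    next_token_probs = {
        "<start>": {"the": 0.6, "a": 0.4},
        "the": {"cat": 0.5, "dog": 0.5},
        "a": {"cat": 0.5, "dog": 0.5},
        "cat": {"sat": 0.7, "<end>": 0.3},
        "dog": {"sat": 0.7, "<end>": 0.3},
        "sat": {"<end>": 1.0},
    }

    def generate():
        token, output = "<start>", []
        while True:
            dist = next_token_probs[token]
            token = random.choices(list(dist), weights=list(dist.values()))[0]  # weighted draw, no reasoning
            if token == "<end>":
                return " ".join(output)
            output.append(token)

    print(generate())   # e.g. "the cat sat": each step is just a probability lookup
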
stop50 about 1 year ago
LLMs encode words as numbers which don't have any relationship with the letters of the word.
codegladiator about 1 year ago
You are not "asking" ChatGPT and there is no answer, and this is the basis for all the confusion.

You are transforming text using a text transformer. You have input text and output text.

You are asking why this output text is not what you expected. That is because this particular transformer has the weights it has.
beardyw about 1 year ago
What you are trying to solve is trivial if you have a suitable list of words. A solver would be easy to create and not need anything near as complex as AI. There are Scrabble solvers on the web which may work for you. Not everything is AI.
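A minimal sketch of such a solver, as a plain-Python illustration: canonicalise the letter-repetition pattern of the ciphertext and scan a word list. The /usr/share/dict/words path is an assumption; any newline-separated word list will do:

    def pattern(word: str) -> tuple:
        # canonical shape of a word: "cormorant" and "avtfvtpwz" map to the same tuple
        first_seen = {}
        return tuple(first_seen.setdefault(ch, len(first_seen)) for ch in word.lower())

    def solve(ciphertext: str, wordlist_path: str = "/usr/share/dict/words"):
        target = pattern(ciphertext)
        words = (line.strip() for line in open(wordlist_path))
        return [w for w in words if len(w) == len(ciphertext) and pattern(w) == target]

    print(solve("avtfvtpwz"))   # includes "cormorant" if the word list contains it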