
TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.


Ask HN: Can You Prompt an LLM to Solve a Simple Puzzle Clearly?

1 point by jeeybee, about 1 month ago

1 comment

jeeybee, about 1 month ago
I've observed something interesting: my toddler can quickly and somewhat effortlessly solve a simple physical puzzle, yet when I try to prompt large language models (LLMs) to guide me clearly through the solution, they struggle.

Even with clear photographs and detailed instructions, LLMs tend to give general advice or indirect methods rather than precise, actionable steps that clearly demonstrate correctness.

Have you tried something similar? Have you successfully "prompt-engineered" an LLM into giving clear, precise, step-by-step solutions for physical puzzles? If yes, what's your approach?