TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.

© 2025 TechEcho. All rights reserved.

Ask HN: LLM Debugging Stories?

1 point by cryptoz, 8 months ago
I just came across one of the most confusing error messages I may have ever seen. I was running some LLM-generated JS code in a VM in Node that uses esprima and escodegen to modify some other JS code. The idea is, my user might have some JS they want to automatically modify, so I ask an LLM to modify the JS by writing some JS to modify the JS (using esprima, escodegen, etc.).

But I got this message in one of my demo projects:

"Line 61: Unexpected token ..."

I would see it in my app logs, printed to a file from my Flask app that runs Docker containers and Node to modify user files. Every time I tried to modify this JS I got this error, and I couldn't figure it out. It is also un-googleable, since Google simply strips out the ... and leaves you with just results for "unexpected token".

But of course the LLM itself can know and help with this, right? Of course it can. In ChatGPT, o1-mini tells me that the unexpected token is, in fact, literally "...". I was using esprima without full ES6 support, so the spread/rest operator was not recognized. Of course, it was the same LLM that generated the code containing the ... in the first place!

I'm told to use Babel rather than esprima for this, or maybe there is a newer esprima than the one I am using that has better support. For now, at least I figured out what the error message means, and that I don't support full ES6, at least not yet.

Have you been flummoxed by an error message that Google couldn't solve, but an LLM could? (Not because search is impossible in principle, but because current Google can't do it.)

Or maybe you're encountering new error messages since working with new code that LLMs are generating for you?

Hoping for some good modern debugging stories in the LLM era.

no comments