TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.


© 2025 TechEcho. All rights reserved.

Ask HN: How will LLMs work on an entire codebase?

1 point by gettodachoppa, about 1 year ago
Like many of you, I use ChatGPT for specific questions, completing a function from comments, etc. But I'm reading that LLMs will soon become actual developers.

How can that be? Let's forget about quality, hallucinations, etc. The largest context window from an accessible/affordable LLM is 32k (Mixtral or GPT-4). That's barely enough for a TODO app, let alone a real project. The smallest project I work on, a desktop app, has 60k LOC / 6M characters / 1.5M tokens.

So what changes are coming that would allow an LLM to modify an existing codebase, e.g. to change a feature and write its tests, without having to spoonfeed it the perfect context the way we do now in ChatGPT?
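(Back-of-envelope math behind those figures, assuming the common rough rule of ~4 characters per token; the exact ratio depends on the tokenizer:)

```python
# Rough sanity check of the 60k LOC / 6M chars / 1.5M tokens figures,
# assuming ~4 characters per token (exact ratio depends on the tokenizer).
chars = 6_000_000                 # ~6M characters of source
tokens = chars // 4               # ~1.5M tokens
windows = tokens / 32_000         # 32k context window (Mixtral / GPT-4)
print(tokens, round(windows, 1))  # 1500000 46.9
```

So the whole codebase would need roughly 47 full context windows, which is why it can't just be pasted in.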

1 comment

hiddencost, about 1 year ago
Your question is posed as a hypothetical, but the problem is already solved...

Add a dependency graph of different agents and tools. Use summarization (either selecting subsections or rewriting). Give it a scratch space. Use RAG.

Why would it need to load the whole codebase into memory? We can build very complex architectures on top of this task that mix LLMs with software.

https://arxiv.org/abs/2402.09171

This isn't hypothetical; all of these already exist.
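(To make the RAG point concrete, here's a minimal sketch of the retrieval step: chunk the codebase, score each chunk against the task, and send only the top matches to the LLM. The keyword-overlap scorer is a toy stand-in for a real embedding model, and the file names and contents are made up for illustration:)

```python
import re

def chunk(text: str, size: int = 400) -> list[str]:
    """Split source text into fixed-size character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def tokenize(text: str) -> set[str]:
    """Lowercase word set; a real system would use embeddings instead."""
    return set(re.findall(r"\w+", text.lower()))

def score(query: str, doc: str) -> int:
    """Toy relevance score: number of words shared between query and chunk."""
    return len(tokenize(query) & tokenize(doc))

def retrieve(query: str, chunks: list[str], k: int = 3) -> list[str]:
    """Return the k chunks most relevant to the query."""
    return sorted(chunks, key=lambda c: score(query, c), reverse=True)[:k]

# Hypothetical two-file "codebase":
codebase = {
    "auth.py": "def login(user, password): ...  # verify credentials",
    "billing.py": "def charge(card, amount): ...  # process payment",
}
chunks = [c for src in codebase.values() for c in chunk(src)]
top = retrieve("fix the login password check", chunks, k=1)
prompt = "Relevant code:\n" + "\n".join(top) + "\n\nTask: fix the login password check"
```

The point is that the prompt only ever contains the retrieved slice, so the total codebase size no longer has to fit in the context window.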