
Ask HN: LLM for “talking to” a code repo? (like chatpdf.com, but for a codebase)

1 point by bhollan about 2 years ago

Is there a way (yet?) to add a chat layer to a codebase? Basically a layer that would take its best guess at the ERD and functions after reading the code, then be able to articulate about it and respond to questions?

One of my biggest struggles is "getting to know" a codebase when I first walk up to it. I'm trying my best to get better, but I have to balance that with actually producing deliverables. It would help radically if a teammate (or better yet, the author!) could spend a few hours with me answering questions, but who has that kind of time! So an LLM wading through a codebase and then being able to sufficiently answer my questions about how modules relate, "what's [thisFunction] for?", "where's the file that [doesThing]?", and the like would be absolutely something I would pay for.

Anything out there already? Is this an unmet market need, now that we know the tech is up to the challenge?

1 comment

sinuhe69 about 2 years ago
That is a basic deployment scenario for the OpenAI GPT API: you have a repository of text, GPT works over it and then answers your questions. Just take a look at the OpenAI API page (and share your experience here :) )
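A minimal sketch of the pattern the comment describes, assuming the current OpenAI Python client and an embedding-based retrieval step: chunk the repo's source files, embed the chunks, retrieve the most relevant ones for a question, and let a chat model answer with them as context. The model names, file extensions, and the "./my-repo" path are illustrative assumptions, not anything prescribed in the thread.

```python
# Hypothetical "chat with a repo" sketch: embed code chunks, retrieve by
# cosine similarity, and answer questions with the retrieved chunks as context.
from pathlib import Path

import numpy as np
from openai import OpenAI  # pip install openai numpy

client = OpenAI()  # reads OPENAI_API_KEY from the environment

EMBED_MODEL = "text-embedding-3-small"  # assumed model choices
CHAT_MODEL = "gpt-4o-mini"


def load_chunks(repo_dir, exts=(".py", ".js", ".ts"), max_chars=2000):
    """Split source files into fixed-size text chunks tagged with their path."""
    chunks = []
    for path in Path(repo_dir).rglob("*"):
        if path.is_file() and path.suffix in exts:
            text = path.read_text(errors="ignore")
            for i in range(0, len(text), max_chars):
                chunks.append((str(path), text[i : i + max_chars]))
    return chunks


def embed(texts):
    """Return one embedding vector per input string."""
    resp = client.embeddings.create(model=EMBED_MODEL, input=texts)
    return np.array([d.embedding for d in resp.data])


def ask(question, chunks, chunk_vecs, top_k=5):
    """Retrieve the top_k most similar chunks and answer using them as context."""
    q_vec = embed([question])[0]
    sims = chunk_vecs @ q_vec / (
        np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(q_vec)
    )
    best = np.argsort(sims)[::-1][:top_k]
    context = "\n\n".join(f"# {chunks[i][0]}\n{chunks[i][1]}" for i in best)
    resp = client.chat.completions.create(
        model=CHAT_MODEL,
        messages=[
            {
                "role": "system",
                "content": "Answer questions about this codebase using only the provided excerpts.",
            },
            {"role": "user", "content": f"{context}\n\nQuestion: {question}"},
        ],
    )
    return resp.choices[0].message.content


if __name__ == "__main__":
    chunks = load_chunks("./my-repo")           # hypothetical local checkout
    chunk_vecs = embed([c[1] for c in chunks])  # embed once, reuse per question
    print(ask("What is thisFunction for?", chunks, chunk_vecs))
```

This is only the retrieval half of what the question asks for; inferring an ERD or a module graph would take extra passes over the code, but the same retrieve-then-ask loop covers the "what's [thisFunction] for?" style of question.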