
Ask HN: How do you load your code base as context window in ChatGPT?

2 points by alfonsodev 4 months ago
I've read that o1 has a 200k-token context window limit [1]. My code base is about 177k tokens, so I could generate a prompt with the code2prompt [2] tool and load my code base as a prompt, but:

- It doesn't allow me to attach text files when o1 is selected.

- Pasting the whole prompt into the text area freezes the browser for a while, and once it's done I can't submit it because the send button is disabled.

- When creating a project I can attach files, but then I can't select the o1 model.

I'm on the fence about buying a Pro subscription, if only I could use o1 with my code base as the context window.

When they say 200k tokens, do they mean through the API? But then I'd incur extra costs, which seems odd since I'm already paying for the subscription, and it's not an automation use case.

I would appreciate it if anyone could share their experience working with a large context window for the specific use case of a code base.

Thanks!

- [1] https://platform.openai.com/docs/models#o1

- [2] https://github.com/mufeedvh/code2prompt
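
For the API route the post asks about, here is a minimal sketch, assuming a code2prompt dump saved as codebase.md, the official `openai` Python SDK, and the `o1` chat model; note that API usage is billed per token, separately from a ChatGPT subscription:

```python
# Minimal sketch (assumptions: a code2prompt dump saved as codebase.md,
# the official `openai` Python SDK, and the `o1` chat model; API calls are
# billed per token, separately from a ChatGPT subscription).
import tiktoken
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

with open("codebase.md", encoding="utf-8") as f:
    codebase = f.read()

# Rough token count to confirm the prompt fits under the ~200k context window.
encoding = tiktoken.get_encoding("o200k_base")
print(f"prompt is ~{len(encoding.encode(codebase))} tokens")

response = client.chat.completions.create(
    model="o1",
    messages=[{
        "role": "user",
        "content": f"Here is my code base:\n\n{codebase}\n\n"
                   "Summarize the overall architecture and main modules.",
    }],
)
print(response.choices[0].message.content)
```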

2 comments

cloudking 4 months ago
For whole-codebase prompting, you'll have a much better time with https://www.cursor.com/

You can use OpenAI, Anthropic, Google, etc. as the LLM provider; o1 is supported.

https://docs.cursor.com/chat/codebase
dsrtslnd23 4 months ago
I use o1 with https://aider.chat
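
Aider also exposes a scripting interface alongside its CLI; a rough sketch of that route, assuming the documented Coder.create entry point, hypothetical file paths, and that the "o1" model name is supported by the installed aider/litellm versions:

```python
# Rough sketch of aider's scripting interface (assumes the documented
# Coder.create entry point, hypothetical file paths, and that "o1" is a
# valid model name for the installed version; OPENAI_API_KEY must be set).
from aider.coders import Coder
from aider.models import Model

model = Model("o1")                        # reasoning model used for edits
fnames = ["src/main.py", "src/utils.py"]   # files added to the chat (examples)

coder = Coder.create(main_model=model, fnames=fnames)
coder.run("Explain how these modules fit together, then suggest refactors.")
```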