科技回声 (Tech Echo)

A tech news platform built with Next.js, offering global tech news and discussion.

© 2025 科技回声. All rights reserved.

Ask HN: Cheapest way to use LLM coding assistance?

3 points by iamlucaswolf about 1 year ago
I would like to toy around with LLM assistance (Chat/Copilot) for my coding side projects. Given that I will only use this sporadically (20 hrs per week max, with variation between weeks), I was wondering if there is a cheaper way to access these tools than paying for an OpenAI/GitHub Copilot subscription.

Has someone here done the math on self-hosting in the cloud, running locally, using the APIs directly, or just paying for the subscriptions?

Some additional details on my specific situation (but I'm also interested in general considerations):

- I'm living in Germany (26 ct/kWh, no access to Claude)
- I already have access to a box with 64 GB RAM, though no GPU
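The "doing the math" the post asks about can be sketched as a back-of-envelope comparison. All of the rates below (power draw, token throughput, API and subscription prices) are illustrative assumptions, not quoted prices; only the 26 ct/kWh figure comes from the post.

```python
# Rough monthly cost comparison for ~20 hrs/week of LLM-assisted coding.
# Every rate except the 0.26 EUR/kWh electricity price is an assumption.

HOURS_PER_MONTH = 20 * 4  # ~20 hrs/week, 4 weeks

def local_cost(watts=150, eur_per_kwh=0.26):
    """Electricity to run a local box (assumed ~150 W under load) during sessions."""
    return watts / 1000 * HOURS_PER_MONTH * eur_per_kwh

def api_cost(tokens_per_hour=50_000, eur_per_million_tokens=1.0):
    """Pay-per-token API usage at an assumed blended rate."""
    return tokens_per_hour * HOURS_PER_MONTH / 1_000_000 * eur_per_million_tokens

SUBSCRIPTION = 10.0  # flat-rate plan, assumed ~10 EUR/month

print(f"local electricity: ~{local_cost():.2f} EUR/month")
print(f"API usage:         ~{api_cost():.2f} EUR/month")
print(f"subscription:      ~{SUBSCRIPTION:.2f} EUR/month")
```

Under these assumptions all three options land in the same single-digit EUR/month range, so for sporadic use the deciding factors are quality and convenience rather than raw cost; the CPU-only 64 GB box mainly limits which local models run at usable speed.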

2 comments

b-mmxx about 1 year ago
Cody was mentioned already. Continue [0] is another tool you could give a shot; it can load a local LLM, etc.

[0] https://github.com/continuedev/continue
AtomicOrbital about 1 year ago
Check out the Cody extension (https://github.com/sourcegraph/cody), available for various editors such as VS Code.