TechEcho


A tech news platform built with Next.js, providing global tech news and discussions.


Ask HN: Cheapest way to use LLM coding assistance?

3 points | by iamlucaswolf | about 1 year ago
I would like to toy around with LLM assistance (Chat/Copilot) for my coding side projects. Given that I will only use this sporadically (20 hrs per week max, with variation between weeks), I was wondering if there is a cheaper way to access these tools than paying for an OpenAI/GitHub Copilot subscription.

Has someone here done the math on self-hosting in the cloud / running locally / using the APIs directly / just paying for the subscriptions?

Some additional details on my specific situation (but I'm also interested in general considerations):

- I'm living in Germany (26 ct/kWh, no access to Claude)
- I already have access to a box with 64 GB RAM, though no GPU
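For anyone who wants to actually do the math, here is a rough back-of-envelope sketch. All the prices and usage figures below (CPU power draw, token throughput, per-token and subscription rates) are illustrative assumptions, not current quotes — plug in your own numbers:

```python
# Back-of-envelope monthly cost comparison for ~20 h/week of LLM-assisted coding.
# Every rate below is an assumption for illustration, not a real price quote.

HOURS_PER_WEEK = 20
WEEKS_PER_MONTH = 4.33

# Option 1: run a local model on the 64 GB CPU-only box.
# Assume ~150 W of extra draw under load; German electricity at 0.26 EUR/kWh.
local_kwh = 0.150 * HOURS_PER_WEEK * WEEKS_PER_MONTH
local_cost = local_kwh * 0.26

# Option 2: pay-as-you-go API.
# Assume ~50k tokens/hour of active use at a blended ~0.50 EUR per 1M tokens.
api_tokens = 50_000 * HOURS_PER_WEEK * WEEKS_PER_MONTH
api_cost = (api_tokens / 1_000_000) * 0.50

# Option 3: flat subscription (assume ~10/month, roughly Copilot's tier then).
sub_cost = 10.0

print(f"local electricity: ~{local_cost:.2f} EUR/month")
print(f"API usage:         ~{api_cost:.2f} EUR/month")
print(f"subscription:      ~{sub_cost:.2f} /month")
```

Under these assumptions the electricity for local CPU inference and light API usage both come in well under a flat subscription; the real trade-off is that a 64 GB CPU-only box limits you to smaller quantized models at modest speeds.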

2 comments

b-mmxx | about 1 year ago
Cody was mentioned already. Continue [0] is another tool that you could give a shot; it can load a local LLM, etc.

[0] https://github.com/continuedev/continue
AtomicOrbital | about 1 year ago
Check out the Cody extension (https://github.com/sourcegraph/cody), available for various editors such as VS Code.