I would like to toy around with LLM assistance (Chat/Copilot) for my coding side projects. Given that I will only use this sporadically (20hrs per week max, with variation between weeks), I was wondering if there was a cheaper way to access these tools than paying for an OpenAI/GitHub Copilot subscription.<p>Has someone here done the math on self-hosting on the cloud/running locally/using the APIs directly/just paying for the subscriptions?<p>Some additional details on my specific situation (but also interested in general considerations):<p>- I’m living in Germany (26ct/kWh, no access to Claude)
- I already have access to a box with 64GB RAM (no GPU, though)
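For the "done the math" part, here is a rough back-of-the-envelope sketch. The 26ct/kWh rate and ~20h/week come from the question; the ~$10/month Copilot price, the rough EUR/USD parity, and the 150W CPU-only power draw are assumptions you would want to replace with your own numbers:

```python
# Rough monthly cost comparison (illustrative assumptions, not real quotes):
# - subscription: assumed ~10 EUR/month flat (roughly Copilot individual pricing)
# - local CPU inference: box assumed to draw ~150 W under load
# - electricity: 26 ct/kWh (from the question), ~20 h/week of usage

HOURS_PER_MONTH = 20 * 4.33           # ~20 h/week, ~4.33 weeks/month
ELECTRICITY_EUR_PER_KWH = 0.26

def local_cost(watts: float) -> float:
    """Electricity-only cost of running the box for a month of sessions."""
    kwh = watts / 1000 * HOURS_PER_MONTH
    return kwh * ELECTRICITY_EUR_PER_KWH

subscription = 10.0                    # assumed flat rate
local = local_cost(150)                # assumed 150 W CPU-only draw

print(f"subscription: ~{subscription:.2f} EUR/month")
print(f"local (electricity only): ~{local:.2f} EUR/month")
```

At these assumed numbers the electricity cost is only a few euros a month, so the real trade-off is hardware (CPU-only inference is slow) and model quality, not power.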
Cody was mentioned already. Continue [0] is another tool you could give a shot; among other things, it can use locally hosted LLMs.<p>[0] <a href="https://github.com/continuedev/continue">https://github.com/continuedev/continue</a>
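To give an idea of what "use locally hosted LLMs" looks like in practice: many local runners (e.g. llama.cpp's server, Ollama) can expose an OpenAI-compatible chat endpoint that editor tools point at. The URL, port, and model name below are assumptions for illustration; adjust them to whatever your local server actually serves:

```python
import json
import urllib.request

# Hypothetical local endpoint; port and route depend on your runner's setup.
LOCAL_URL = "http://localhost:8080/v1/chat/completions"

def build_request(prompt: str, model: str = "local-model") -> urllib.request.Request:
    """Build an OpenAI-style chat completion request aimed at a local server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,  # low temperature tends to suit code suggestions
    }
    return urllib.request.Request(
        LOCAL_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

if __name__ == "__main__":
    # Only works if a local OpenAI-compatible server is actually running.
    req = build_request("Write a Python function that reverses a string.")
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

Since it is just HTTP against localhost, nothing leaves your machine and there is no per-token bill; the cost is the electricity and the patience for CPU-only inference.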
Check out the Cody extension (<a href="https://github.com/sourcegraph/cody">https://github.com/sourcegraph/cody</a>), available for various editors such as VS Code.