Build a quick Local code intelligence using Ollama with Rust

17 points by tinco, 9 months ago

1 comment

joshka, 9 months ago

Do you have token counts / cost breakdown for using groq models for this?

Something I'd really love to see as an open source library maintainer is something of an amalgam of:

- current source
- git commit history plus historical source
- github issues, PRs, discussions
- forum posts / discord discussions
- website docs, docs.rs docs

And to be able to use all that to work on support requests / code gen / feature implementation / spec generation etc.
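Not from the linked article, but as a rough illustration of the kind of pipeline the comment describes: a minimal Rust sketch that concatenates a few locally available context sources and sends them to a local Ollama model over its HTTP generate endpoint. It assumes an Ollama server on the default port (11434), an already-pulled model named "qwen2.5-coder", and placeholder file paths for the aggregated context; it depends on the reqwest (blocking, json) and serde_json crates.

```rust
use std::fs;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Gather whatever project context is available locally. These paths are
    // placeholders for the kinds of sources the comment lists (source, git
    // history, issues, docs); a real tool would pull them from the repo and
    // the GitHub API.
    let current_source = fs::read_to_string("src/lib.rs").unwrap_or_default();
    let commit_log = fs::read_to_string("commit_log.txt").unwrap_or_default();
    let issue_dump = fs::read_to_string("issues.md").unwrap_or_default();

    let context = format!(
        "## Current source\n{current_source}\n\n\
         ## Commit history\n{commit_log}\n\n\
         ## Issues and discussions\n{issue_dump}"
    );

    let prompt = format!(
        "You are helping maintain an open source library. Using the context \
         below, draft a reply to this support request: \
         'How do I configure custom widgets?'\n\n{context}"
    );

    // Ollama's generate endpoint: POST /api/generate with streaming disabled
    // returns a single JSON object whose "response" field holds the full text.
    let client = reqwest::blocking::Client::new();
    let body = serde_json::json!({
        "model": "qwen2.5-coder", // assumed model name; use whatever is pulled locally
        "prompt": prompt,
        "stream": false
    });

    let resp: serde_json::Value = client
        .post("http://localhost:11434/api/generate")
        .json(&body)
        .send()?
        .json()?;

    println!("{}", resp["response"].as_str().unwrap_or(""));
    Ok(())
}
```

Running this against a local model sidesteps the per-token cost question entirely, at the price of smaller context windows than hosted providers like Groq typically offer, so the amalgamated context would likely need chunking or retrieval rather than being sent wholesale.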