TechEcho
A tech news platform built with Next.js, providing global tech news and discussions.


Show HN: Fixthisbug.de – AI Code Fixes with Local LLM for Privacy

2 points by DeflectedFlux, 6 months ago
For fun (and learning) I've let an AI build FixThisBug.de, an AI-powered code-fixing tool that differs from typical AI coding assistants: it uses a local Ollama instance with the qwen2.5 model.

Key Features:

- AI-built: built entirely with AI, over several iterations via v0 and Cursor.
- Local Processing: uses the qwen2.5 model through Ollama for code analysis and fixes
- Dual Language: full support for English and German interfaces
- "Free Tier": limited daily fixes for casual users
- "Pro Access": unlimited fixes for authenticated users (just for fun, it's free :) )
- Privacy Compliant: GDPR-compliant & hosted in Germany

Tech Stack:

- Frontend: Next.js 15.0, React 19
- Backend: Supabase + Next.js API Routes
- AI: local Ollama instance running qwen2.5
- Auth: Supabase Auth with hCaptcha
- Infrastructure: German servers for EU compliance

To add a little *extra* to the project: unlike GitHub Copilot or similar tools, FixThisBug.de processes everything locally on the server. No external AI service is used.

I'm particularly interested in feedback on:

1. The local LLM integration approach
2. Performance optimizations for Ollama
3. Your user experience using the service
4. Privacy considerations

Would love to hear your thoughts and suggestions!
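For readers curious what the "Next.js API route in front of a local Ollama instance" setup roughly looks like: below is a minimal sketch. The `/api/generate` endpoint and the `model`/`prompt`/`stream` fields follow Ollama's public REST API; the route path (`app/api/fix/route.ts`), the `buildFixRequest` helper, and the prompt wording are illustrative assumptions, not the site's actual code.

```typescript
// Pure helper: builds the request body for Ollama's /api/generate endpoint.
// Hypothetical; shown only to illustrate the shape of a local-LLM call.
export function buildFixRequest(code: string) {
  return {
    model: "qwen2.5",
    prompt:
      "Fix the bugs in the following code and return only the corrected code:\n\n" +
      code,
    stream: false, // ask Ollama for a single JSON response instead of a stream
  };
}

// Hypothetical Next.js App Router handler (e.g. app/api/fix/route.ts).
// Everything stays on the server: the only network hop is to localhost.
export async function POST(req: Request): Promise<Response> {
  const { code } = await req.json();

  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildFixRequest(code)),
  });

  // Ollama's non-streaming reply includes the generated text in `response`.
  const data = await res.json();
  return Response.json({ fixed: data.response });
}
```

Because the fetch targets `localhost:11434` (Ollama's default port), no code ever leaves the host, which is the privacy property the post is claiming.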

no comments
