
科技回声 (Tech Echo)

A tech news platform built with Next.js, providing global tech news and discussion.


© 2025 科技回声. All rights reserved.

Show HN: Fixthisbug.de – AI Code Fixes with Local LLM for Privacy

2 points · by DeflectedFlux · 7 months ago
For fun (and learning) I've let an AI build FixThisBug.de, an AI-powered code-fixing tool. It differs from typical AI coding assistants in that it uses a local Ollama instance running the qwen2.5 model.

Key features:

- AI-built: built entirely with AI, over several iterations via v0 and Cursor.
- Local processing: uses the qwen2.5 model through Ollama for code analysis and fixes.
- Dual language: full support for English and German interfaces.
- "Free tier": limited daily fixes for casual users.
- "Pro access": unlimited fixes for authenticated users (just for fun, it's free :) ).
- Privacy compliant: GDPR-compliant and hosted in Germany.

Tech stack:

- Frontend: Next.js 15.0, React 19
- Backend: Supabase + Next.js API routes
- AI: local Ollama instance running qwen2.5
- Auth: Supabase Auth with hCaptcha
- Infrastructure: German servers for EU compliance

To add a little *extra* to the project: unlike GitHub Copilot or similar tools, FixThisBug.de processes everything locally on the server. No external AI service is used.

I'm particularly interested in feedback on:

1. The local LLM integration approach
2. Performance optimizations for Ollama
3. Your user experience using the service
4. Privacy considerations

Would love to hear your thoughts and suggestions!
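Since the post asks for feedback on the local-LLM integration approach, here is a minimal sketch of how a Next.js backend might talk to a local Ollama instance. Ollama's `/api/generate` endpoint on port 11434 is its real REST API; the prompt wording, function names, and route placement are assumptions for illustration, not the site's actual implementation.

```typescript
// Sketch: forwarding user code to a local Ollama instance so nothing
// leaves the server. Function names and prompt text are illustrative.

interface OllamaGenerateRequest {
  model: string;
  prompt: string;
  stream: boolean;
}

// Build the JSON payload for Ollama's /api/generate endpoint.
export function buildFixRequest(code: string): OllamaGenerateRequest {
  return {
    model: "qwen2.5",
    prompt: `Fix the bugs in this code and return only the corrected code:\n\n${code}`,
    stream: false, // wait for the full completion instead of streaming tokens
  };
}

// Inside a Next.js API route (e.g. app/api/fix/route.ts), the payload
// would be POSTed to the local instance on the default Ollama port:
export async function fixCode(code: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildFixRequest(code)),
  });
  const data = await res.json();
  return data.response; // Ollama returns the completion in the `response` field
}
```

Keeping `stream: false` simplifies the route handler at the cost of latency; switching to Ollama's streaming mode would let the UI show the fix as it is generated.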

No comments yet.