Ask HN: CPU-based LLM for Markdown editor

2 points by thangalin about 1 year ago
I've developed a free, open-source, cross-platform desktop Markdown editor and would like to integrate an LLM into its ancillary podman container (that's used for typesetting). The editor can be used for writing novels and technical documents. A few areas where the integration may help:

* Grammar check
* Idea generation
* Rephrasing passages
* Translating short snippets

Requirements:

* Free, open source, MIT/Apache/BSD/etc. license
* Local only (no cloud)
* GPT-4 quality (or very close)
* Fallback to CPU, if GPU unavailable (or CPU-based)
* Trained on public works (no copyright material)
* Container-friendly (i.e., works with podman)
* No Python, if at all possible (not a deal-breaker)

Is there something that could be integrated at this time, or would you suggest waiting until the end of the year to see what shakes out?

1 comment

throwaway888abc · about 1 year ago
Ollama: https://ollama.com/

or llamafile: https://github.com/Mozilla-Ocho/llamafile

* There are some tools/projects already integrating it that way.
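
Building on the Ollama suggestion, here is a minimal sketch of how an editor could talk to a locally running Ollama server from Java, with no Python involved. The model name "llama3", the rephrasing prompt, and the example sentence are illustrative assumptions; the /api/generate endpoint, the default port 11434, and the request shape follow Ollama's documented REST API. Since Ollama falls back to CPU when no GPU is present and ships an official container image, the same HTTP call should work whether the server runs on the host or inside the podman container used for typesetting.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Minimal sketch: send a rephrasing request to a local Ollama server.
// Assumes Ollama is listening on its default port (11434) and that a
// model such as "llama3" has already been pulled; both are illustrative.
public class RephraseWithOllama {
    public static void main(String[] args) throws Exception {
        String passage = "The protagonist walked slowly towards the door.";

        // Ollama's /api/generate endpoint takes a JSON body with the model
        // name, the prompt, and a "stream" flag; with stream=false it returns
        // one JSON object whose "response" field holds the generated text.
        String body = """
                {"model": "llama3",
                 "prompt": "Rephrase the following sentence: %s",
                 "stream": false}
                """.formatted(passage);

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:11434/api/generate"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // Raw JSON reply; a real integration would parse out the "response" field.
        System.out.println(response.body());
    }
}
```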