I've developed a free, open-source, cross-platform desktop Markdown editor and would like to integrate an LLM into its ancillary podman container (the one used for typesetting). The editor can be used for writing novels and technical documents. A few areas where the integration may help:

* Grammar checking
* Idea generation
* Rephrasing passages
* Translating short snippets

Requirements:

* Free, open source, MIT/Apache/BSD/etc. license
* Local only (no cloud)
* GPT-4 quality (or very close)
* Falls back to CPU if a GPU is unavailable (or is CPU-only)
* Trained on public works (no copyrighted material)
* Container-friendly (i.e., works with podman)
* No Python, if at all possible (not a deal-breaker)

Is there something that could be integrated at this time, or would you suggest waiting until the end of the year to see what shakes out?
Ollama
<a href="https://ollama.com/">https://ollama.com/</a><p>or llamafile
<a href="https://github.com/Mozilla-Ocho/llamafile">https://github.com/Mozilla-Ocho/llamafile</a><p>*there are some tools/projects already integrating it that way