Chat TUI for Ollama

41 points by lijunhao 12 months ago

3 comments

halJordan 12 months ago
Ollama is getting some crazy vendor lock-in. Ollama has an OpenAI-compatible API, llama.cpp has an OpenAI-compatible API, and the various local LLM proxies all support the OpenAI API. But people insist on tying their products to the Ollama API.

Hopefully, going forward, if you implement the full Ollama API you will at least also implement some subset of the OpenAI API, so the non-Ollama tooling will work with the cool projects.
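For context, a minimal sketch of the distinction the commenter is drawing, assuming a local Ollama instance on its default port (11434); the model name and message are illustrative, not anything the project above ships:

```python
import requests

OLLAMA_HOST = "http://localhost:11434"  # default Ollama port; adjust if yours differs
MODEL = "llama3"                        # illustrative model name
messages = [{"role": "user", "content": "Hello"}]

# Ollama-native chat endpoint: only Ollama-specific clients speak this.
native = requests.post(
    f"{OLLAMA_HOST}/api/chat",
    json={"model": MODEL, "messages": messages, "stream": False},
)
print(native.json()["message"]["content"])

# OpenAI-compatible endpoint: any tool that speaks the OpenAI API can reuse it
# by pointing its base URL at the local server instead of api.openai.com.
compat = requests.post(
    f"{OLLAMA_HOST}/v1/chat/completions",
    json={"model": MODEL, "messages": messages},
)
print(compat.json()["choices"][0]["message"]["content"])
```

The point being: a client written only against /api/chat is tied to Ollama, while one written against the OpenAI-style endpoint can also talk to llama.cpp's server or any local proxy that exposes the same interface.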
thih9 12 months ago
> no need to run servers

> In order to use oterm you will need to have the Ollama server running

These mutually exclusive statements in the readme were confusing to me.
guestbest 12 months ago
Are there any TUIs that have multiple or split windows, like emacs or vi?
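(For anyone wondering what a split layout looks like in this kind of tool: a minimal sketch assuming the Textual framework commonly used for Python TUIs; the pane contents are placeholders, not features of the project above.)

```python
from textual.app import App, ComposeResult
from textual.containers import Horizontal
from textual.widgets import Static


class SplitDemo(App):
    """Two panes side by side, emacs/vi split-window style."""

    CSS = """
    Static { border: solid green; width: 1fr; height: 1fr; }
    """

    def compose(self) -> ComposeResult:
        # A Horizontal container lays its children out left-to-right,
        # giving two equal-width panes.
        with Horizontal():
            yield Static("chat pane")
            yield Static("session/model pane")


if __name__ == "__main__":
    SplitDemo().run()
```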