
Show HN: Light like the Terminal – Meet GTK LLM Chat Front End

35 points by icarito, 28 days ago
Author here. I wanted to keep my conversation with #Gemini about code handy while discussing something creative with #ChatGPT and using #DeepSeek in another window. I think it's a waste to have Electron apps, so I wanted to chat with LLMs on my own terms. When I discovered the llm CLI tool I really wanted convenient and pretty-looking access to my conversations, so I wrote gtk-llm-chat, a plugin for llm that provides an applet and a simple window for interacting with LLM models.

Make sure you've configured llm first (https://llm.datasette.io/en/stable/).

I'd love to get feedback, PRs and, who knows, perhaps a coffee! https://buymeacoffee.com/icarito
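For readers new to the llm tool, the setup the author refers to looks roughly like this. This is a hedged sketch, not from the post: the plugin/package name `gtk-llm-chat` and the `openai` key name are assumptions; check the linked llm docs and the project's README for the exact commands.

```shell
# Hypothetical setup sketch -- package and key names are assumptions.
pip install llm              # install the llm CLI (llm.datasette.io)
llm keys set openai          # store an API key; prompts interactively
llm install gtk-llm-chat     # llm plugins are installed via `llm install`
llm models list              # confirm which models are now available
```

Once llm itself can talk to a model, the plugin's applet and chat window should use the same configuration.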

3 comments

guessmyname, 28 days ago
It’d be better if it was written in C or at least Vala. With Python, you have to wait a couple hundred milliseconds for the interpreter to start, which makes it feel less native than it can be. That said, the latency of the LLM responses is higher than the UI, so I guess the slowness of Python doesn’t matter.
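The interpreter-startup cost mentioned above is easy to measure yourself; a rough sketch (the numbers vary by machine, interpreter version, and what gets imported at startup):

```shell
# Compare bare CPython startup against a shell builtin baseline.
time python3 -c 'pass'   # bare interpreter start-up, no imports
time true                # effectively instant; shows the timing floor
```

A GTK app would additionally pay for importing PyGObject, so real-world launch latency is higher than the bare-interpreter figure.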
Comment #43755335 not loaded
Comment #43755699 not loaded
Gracana, 28 days ago
This looks quite nice. I would like to see the system prompt and inference parameters exposed in the UI, because those are things I'm used to fiddling with in other UIs. Is that something that the llm library supports?
Comment #43755106 not loaded
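For reference, the underlying llm CLI does expose both of the knobs asked about above: a system prompt via `-s`/`--system` and per-model options via `-o key value`. A minimal sketch (the prompt text and option value are just examples, and running it requires a configured API key):

```shell
# System prompt (-s) and a model option (-o temperature) on one request.
llm -s "You are a terse assistant." -o temperature 0.2 "Explain GTK signals"
```

So the parameters exist at the library level; surfacing them would be a matter of adding UI controls in gtk-llm-chat.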
indigodaddy, 28 days ago
Does this work on Mac or Linux only?
Comment #43755885 not loaded