
Uillem – an offline, containerized LLM interface

9 points | by uillem | almost 2 years ago

5 comments

uillem · almost 2 years ago
I noticed that offline LLM builds running on personal computers are now possible, but it seemed like all the solutions required installing dependencies, so I created a containerized solution that makes it easy to swap out the model in use: https://github.com/paolo-g/uillem
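The project's actual configuration lives in the linked repo; as a rough sketch of the pattern described (no host-side dependency installs, swappable models), a docker-compose setup might mount a local models directory into the container and pick the active model via an environment variable. All service, image, path, and variable names below are illustrative, not taken from uillem:

```yaml
# Illustrative sketch only — not uillem's actual compose file.
# The idea: the host only needs Docker; models live in a mounted
# directory, and switching models is a one-line env change.
services:
  llm:
    image: example/offline-llm:latest   # hypothetical image name
    environment:
      # Swap the model by changing this value and restarting.
      MODEL_FILE: /models/llama-7b.gguf
    volumes:
      # Host directory of downloaded model files, mounted read-only.
      - ./models:/models:ro
    ports:
      - "8080:8080"                     # local web/API interface
```

With this shape, trying a different model is `docker compose down`, editing `MODEL_FILE`, and `docker compose up` again, with no reinstallation of runtimes or libraries on the host.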
verelo · almost 2 years ago
Nice, love seeing Paolo post this! He is a great guy, we used to work together, and I’m excited to see where he takes this.
aganore · almost 2 years ago
This is pretty neat! Now I just need a good library of models to plug in, haha
netlag · almost 2 years ago
Very impressive, can't wait to give it a try!
darkwata · almost 2 years ago
Cool