
Ask HN: Selfhosted ChatGPT and Stable-diffusion like alternatives?

1 point by null4bl3 over 1 year ago
I know this has been asked before, but things are moving so quickly in this space and people here seem to have good insight, so I am asking again.

The speed of answers and computation is not really an issue, and I know that most self-hosted solutions are obviously in no way fully on par with services like ChatGPT or Stable Diffusion.

I do have somewhat modest resources: 16 GB RAM and an NVIDIA GPU with 4 GB VRAM.

Are there any options that would let me run these self-hosted?

1 comment

firebaze over 1 year ago
https://github.com/invoke-ai/InvokeAI should work on your machine. For LLMs, the smaller ones should run using llama.cpp, but I don't think you'll be happy comparing them to ChatGPT.
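
The llama.cpp suggestion can be made concrete. Below is a minimal sketch (not from the thread) using the llama-cpp-python bindings, assuming a small quantized GGUF model has already been downloaded; the model path and the n_gpu_layers value are illustrative placeholders chosen with a 4 GB GPU in mind.

    # Minimal sketch: run a small quantized model via llama-cpp-python.
    # The model file path is hypothetical; any small GGUF model
    # (e.g. a 7B model at 4-bit quantization) downloaded separately
    # would take its place.
    from llama_cpp import Llama

    llm = Llama(
        model_path="./models/7b-q4.gguf",  # hypothetical path to a quantized GGUF file
        n_ctx=2048,        # context window; smaller values use less RAM
        n_gpu_layers=20,   # offload only part of the model to fit 4 GB VRAM; 0 = CPU-only
    )

    out = llm(
        "Q: What does self-hosting an LLM involve? A:",
        max_tokens=128,
        stop=["Q:"],
    )
    print(out["choices"][0]["text"])

With only 4 GB of VRAM, partial GPU offload (n_gpu_layers) is the main knob; a 4-bit 7B model can also run CPU-only within 16 GB of system RAM, just more slowly.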