
Ask HN: Self-hosted ChatGPT and Stable Diffusion-like alternatives?

1 point by null4bl3, over 1 year ago
I know this has been asked before, but things are moving so quickly in this realm and people here seem to have good insight, so I am asking again.

The speed of answers and computation is not really an issue, and I know that most self-hosted solutions are obviously in no way fully on par with services like ChatGPT or Stable Diffusion.

I do have somewhat modest resources: 16 GB RAM and an NVIDIA GPU with 4 GB VRAM.

Are there any options that would let me run something like this self-hosted?

1 comment

firebaze, over 1 year ago
https://github.com/invoke-ai/InvokeAI should work on your machine. For LLM models, the smaller ones should run using llama.cpp, but I don't think you'll be happy comparing them to ChatGPT.
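
As a rough illustration of what the llama.cpp route can look like in practice, here is a minimal sketch using the llama-cpp-python bindings. The model file name, quantization level, and n_gpu_layers value are assumptions for a 4 GB VRAM card, not something from the thread; you would need to download a small quantized GGUF model yourself and tune the layer offload to fit your memory.

```python
# Minimal sketch: running a small quantized GGUF model with llama-cpp-python.
# Assumptions (not from the thread): a ~7B model quantized to 4-bit fits mostly
# in 16 GB system RAM, with a handful of layers offloaded to the 4 GB GPU.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/some-7b-model.Q4_K_M.gguf",  # hypothetical local file
    n_ctx=2048,        # modest context window to keep memory use down
    n_gpu_layers=10,   # offload only a few layers; tune this for 4 GB VRAM
)

output = llm(
    "Q: What does self-hosting an LLM involve? A:",
    max_tokens=128,
    stop=["Q:"],
)
print(output["choices"][0]["text"])
```

How usable this feels depends heavily on the model size and quantization; the commenter's caveat about the gap to ChatGPT still applies.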