
Ask HN: How to Keep Up with LLMs? [Linux, Self-Hosting, Info]

16 points | by conor_f | almost 2 years ago
I've accepted that LLMs are useful when taken with precautions. A few tools I've used recently have convinced me of this, and I've also heard first-hand anecdotal accounts of LLMs providing great benefit, from asking about APIs all the way to practising languages or summarizing texts.

I'm now interested enough to try running one myself and see how it suits my personal workflow. So I have a few questions:

1) How can I set up an LLM locally with a good effort/reward ratio? I don't want to spend hours setting up something unreliable that needs constant modification - rather something I can just interact with easily from a web UI/CLI when I need to.

2) Is there an easy way to keep up to date with LLMs so I can switch to newer models as they become popular and get the best results?

Note that I'm only looking for self-hosted, Linux-compatible solutions!

2 comments

version_five | almost 2 years ago
My first stop would be llama.cpp and compatible models on your own machine. You should be able to run quantized 7B and 13B models; try them out and see if they work.

Though for "personal workflow", unless you want to be able to play with the internals of the models or are worried about privacy, I'd just use ChatGPT (in fact I do: despite having llama.cpp set up to run various models, I always use ChatGPT for personal stuff and programming questions).
Reply #36996757 not loaded
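To make the llama.cpp suggestion above concrete, here is a minimal Python sketch using the llama-cpp-python bindings to query a locally downloaded quantized model. The model filename, context size, and prompt are placeholder assumptions for illustration, not anything prescribed in the thread; any quantized GGUF 7B/13B model obtained separately would slot in.

    # Minimal sketch, assuming llama-cpp-python is installed
    # (pip install llama-cpp-python) and a quantized GGUF model
    # has already been downloaded; the path below is a placeholder.
    from llama_cpp import Llama

    # Load the quantized 7B model; n_ctx sets the context window size.
    llm = Llama(model_path="./models/llama-7b.Q4_K_M.gguf", n_ctx=2048)

    # One-off CLI-style question, with a cap on the response length.
    output = llm(
        "Q: How do I list listening ports on Linux?\nA:",
        max_tokens=128,
        stop=["Q:"],
    )

    print(output["choices"][0]["text"].strip())

The same model file can also be served behind a simple local web UI (for example, llama.cpp ships a built-in HTTP server example), which lines up with the "interact from a web UI/CLI" requirement in the question.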
tikkun | almost 2 years ago
1) What do you want to use it for?

2) /r/localllama is good, and then also the “open llm leaderboard” and the “lmsys llm leaderboard”.
Reply #36996793 not loaded