科技回声 (Tech Echo)
A technology news platform built with Next.js, providing global tech news and discussion.

© 2025 科技回声. All rights reserved.

RAG, fine-tuning, API calling and gptscript for Llama 3 running locally

30 points, by lewq, 12 months ago

4 comments
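The "RAG" part of the title refers to retrieval-augmented generation: fetching relevant documents and prepending them to the prompt sent to a locally running model such as Llama 3. A minimal sketch of that retrieval step follows; the corpus, word-overlap scoring, and prompt template are illustrative assumptions, not the project's actual code.

```python
# Minimal sketch of the retrieval step in RAG: pick the document most
# relevant to the question, then embed it in the prompt that would be
# sent to a locally running LLM. Corpus and scoring are illustrative.

def score(question: str, document: str) -> int:
    """Naive relevance: count question words that also appear in the document."""
    return len(set(question.lower().split()) & set(document.lower().split()))

def build_prompt(question: str, corpus: list[str]) -> str:
    """Retrieve the best-matching document and wrap it in a grounding prompt."""
    context = max(corpus, key=lambda doc: score(question, doc))
    return (
        "Answer using only the context below.\n"
        f"Context: {context}\n"
        f"Question: {question}\n"
        "Answer:"
    )

corpus = [
    "Helix apps are defined in a helix.yaml file and deployed with git push.",
    "Llama 3 can run locally on consumer GPUs via quantization.",
]
print(build_prompt("How are Helix apps deployed?", corpus))
```

Real systems replace the word-overlap score with embedding similarity over a vector store, but the shape of the pipeline (retrieve, then prompt) is the same.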

lewq, 12 months ago
But what I think is really interesting is the ability to define a helix app yaml like: https://github.com/helixml/example-helix-app/blob/main/helix.yaml

Then version control it and deploy your updated LLM app with a single git push. LLMGitOps?
lewq, 12 months ago
Deck: https://docs.google.com/presentation/d/11bBUP8gBekmI7GkwvGdrw2j5L3gek4CBNz47SORjuTk/edit
pavelstoev, 12 months ago
Very interesting project and good progress on making private LLM use cases more accessible and usable. Please keep going!
NocodeWorks, 12 months ago
I've been looking for something that lets me do all of this locally without a bunch of wiring. Will check it out.