
AMD launches Gaia open source project for running LLMs locally on any PC

52 points, by 01-_-, 3 months ago

5 comments

gforce_de, 3 months ago
https://github.com/amd/gaia

"OS: Windows 11 Pro/Home (GAIA does not support macOS or Linux at this time, ..."
94b45eb4, 3 months ago
looks like “on any PC” means “on any Windows PC”.
z3ratul163071, 3 months ago
Windows only.

Dependencies: Miniconda :|
rs186, 3 months ago
Related: https://news.ycombinator.com/item?id=42886680

Ollama does not support Vulkan on any platform, so this at least provides another choice.

Being Windows-only is still baffling. I guess they assume their biggest user base is on Windows, and that Linux users are few and don't care much about running LLMs on iGPUs (the experience is poor). But would it really cost them that much work to support other OSes?

Edit:

> GAIA_Installer.exe: For running agents on non-Ryzen AI PCs, this uses Ollama as the backend. (https://github.com/amd/gaia)

...eh, what's the point? Why don't I just install Ollama?
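If GAIA's non-Ryzen installer really just wraps Ollama, the direct route the commenter suggests is straightforward. A minimal sketch against Ollama's local REST API (`POST /api/generate` on the default port 11434); the model name `llama3.2` is illustrative, and this assumes you have already installed Ollama and pulled a model:

```python
import json
import urllib.error
import urllib.request

def ask_ollama(prompt, model="llama3.2", host="http://localhost:11434"):
    """Send one prompt to a local Ollama server; return the reply text,
    or None if no server is reachable at `host`."""
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()
    req = urllib.request.Request(
        host + "/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=30) as resp:
            # Non-streaming responses carry the full answer in "response".
            return json.load(resp).get("response")
    except (urllib.error.URLError, OSError):
        return None  # connection refused: no local Ollama running

if __name__ == "__main__":
    reply = ask_ollama("Why run an LLM locally?")
    print(reply if reply is not None else "no local Ollama server found")
```

The function degrades gracefully when no server is running, which makes it easy to drop into a larger script that falls back to a remote API.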
dogma1138, 3 months ago
Looks like yet another wrapper for ollama…