
Snowflake Arctic: LLM for Enterprise AI — Efficiently Intelligent, Truly Open

19 points by georgehill, about 1 year ago

2 comments

blackeyeblitzar, about 1 year ago
They claim several times to be "truly open" but I don't see anything about open sourcing the training code. The inference code isn't that interesting. What we need is total transparency on how the model weights are produced, since otherwise it's hard to trust the biases of a model. The only actually truly open model is AI2's OLMo as far as I know - and even they aren't totally transparent about how they produced their training data set, which includes curation and filtering by "safety and ethics" people:

https://blog.allenai.org/hello-olmo-a-truly-open-llm-43f7e7359222

But until training data sets and source code are released under an OSI license, Snowflake should stop with the open washing.
bfirsh, about 1 year ago
It might not be obvious from the title, but this model is absolutely massive: 480B parameters. The largest open-source model to date, I believe.

You can try it out here: https://arctic.streamlit.app/

Weights are here: https://huggingface.co/Snowflake/snowflake-arctic-instruct
Comment #40144363 not loaded
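Editor's note: bfirsh's links above point to the hosted demo and the released weights. For readers who want to experiment with the weights directly, here is a minimal, untested sketch of loading them through the standard Hugging Face transformers AutoModel interface. The loading arguments (device_map, trust_remote_code) are conventional assumptions rather than details from the Arctic model card, and at 480B parameters this realistically requires a multi-GPU node or aggressive quantization rather than a laptop.

```python
# Sketch: loading the released snowflake-arctic-instruct weights via
# Hugging Face transformers. Assumes the usual AutoModel API; the repo
# may ship custom MoE modeling code, hence trust_remote_code=True.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Snowflake/snowflake-arctic-instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",        # spread layers across available GPUs
    trust_remote_code=True,   # allow the repo's custom modeling code, if any
)

# Simple generation round-trip to confirm the weights load and run.
inputs = tokenizer("What makes an LLM 'truly open'?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```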