TechEcho

X.ai's Grok-1 Model Is Officially Open-Source and Larger Than Expected

22 points by RafelMri, about 1 year ago

5 comments

numpad0, about 1 year ago
Dupe, from 6 hours ago: https://news.ycombinator.com/item?id=39737281
threeseed, about 1 year ago
> OpenAI CEO Sam Altman with the express objective of creating open-source AI models

Not aware of OpenAI at any point saying they would open-source their models. Only that researchers would be encouraged to share their work with the world:

https://web.archive.org/web/20190224031626/https://blog.openai.com/introducing-openai/
jejones3141, about 1 year ago
What kind of hardware would one need to use the model with reasonable performance?
Comment #39739726 not loaded
Comment #39739736 not loaded
Comment #39739739 not loaded
Comment #39739729 not loaded
Comment #39739721 not loaded
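The hardware question above lends itself to a back-of-envelope estimate: at the published 314B parameter count, merely holding the weights in memory takes hundreds of GiB. A minimal sketch, where the per-parameter byte counts for the quantized rows are illustrative assumptions, not figures from the release:

```python
# Rough memory footprint of Grok-1's weights alone (no activations, KV cache,
# or runtime overhead). Only the 314B parameter count comes from the release;
# the precision/byte-count pairings below are generic assumptions.
def weight_memory_gib(n_params: float, bytes_per_param: float) -> float:
    """GiB needed just to store the weights at a given precision."""
    return n_params * bytes_per_param / 2**30

N_PARAMS = 314e9  # published parameter count

for label, bpp in [("fp16/bf16", 2.0), ("int8", 1.0), ("4-bit (approx)", 0.5)]:
    print(f"{label}: ~{weight_memory_gib(N_PARAMS, bpp):,.0f} GiB")
```

Even at an aggressive 4-bit quantization this is well beyond a single consumer GPU, which is the crux of the hardware question.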
mdaniel, about 1 year ago
The submission that doesn't have a paywall: https://news.ycombinator.com/item?id=39737313 or the discussion on the code itself: https://news.ycombinator.com/item?id=39737281

TFA: https://x.ai/blog/grok-os

TFC: https://github.com/xai-org/grok-1#readme
teruakohatu, about 1 year ago
From the official release page [1]:

> We are releasing the base model weights and network architecture of Grok-1, our large language model. Grok-1 is a 314 billion parameter Mixture-of-Experts model trained from scratch by xAI.

> This is the raw base model checkpoint from the Grok-1 pre-training phase, which concluded in October 2023. This means that the model is not fine-tuned for any specific application, such as dialogue.

> We are releasing the weights and the architecture under the Apache 2.0 license.

> To get started with using the model, follow the instructions at github.com/xai-org/grok.

A little disappointing they are not releasing the weights for the Grok-1 finetuned model.

[1] https://x.ai/blog/grok-os
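The Mixture-of-Experts detail in the quoted release matters for inference cost: only a fraction of the 314B parameters participate in any single forward pass. A rough sketch, assuming the roughly-25%-active figure that the xAI release notes give for Grok-1 (treat the result as a ballpark, not a spec):

```python
# Estimate of active parameters per token for an MoE model. The total comes
# from the xAI release; the 25% active fraction is the figure stated in the
# same release notes, applied here as a simple multiplier.
TOTAL_PARAMS = 314e9
ACTIVE_FRACTION = 0.25  # ~25% of weights active on a given token, per xAI

active_params = TOTAL_PARAMS * ACTIVE_FRACTION
print(f"~{active_params / 1e9:.0f}B parameters active per token")
```

So while the full checkpoint must reside in memory, per-token compute is closer to that of a dense ~80B model, which is why MoE architectures are attractive at this scale.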