TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.

© 2025 TechEcho. All rights reserved.

RedPajama at 440B tokens, higher quality than Pythia and StableLM

6 points by jamiedg about 2 years ago

1 comment

jamiedg about 2 years ago
A week ago we announced RedPajama, a project to create leading open-source models. We released the first step in the project: a training dataset of over 1.2 trillion tokens following the LLaMA recipe.

Today we shared progress on training our first model on this dataset, a 7B-parameter model using the Pythia architecture. So far we are a bit less than 50% through the training, at 440B tokens. We published HELM benchmark results on 16 different scenarios for this checkpoint, showing the model's accuracy to be quite high for this stage of training.
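The "a bit less than 50%" figure is consistent with a roughly 1-trillion-token training budget, which is what the LLaMA recipe used for its 7B model. A minimal sketch of that arithmetic, assuming a 1.0T-token target (the total budget is not stated in the comment):

```python
# Assumed total training budget for the 7B run (the LLaMA recipe
# trained its 7B model on ~1.0T tokens; the comment does not state
# the target explicitly, so this is an assumption).
target_tokens = 1.0e12

# Tokens processed at the published checkpoint, per the comment.
seen_tokens = 440e9

fraction = seen_tokens / target_tokens
print(f"{fraction:.0%}")  # 44% -- i.e. "a bit less than 50%"
```

Under that assumption the checkpoint sits at about 44% of the run, matching the author's description.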