
Replicating GPT-2 at Home

254 points by bkkaggle over 4 years ago

8 comments

minimaxir over 4 years ago
As someone who maintains a package that makes it easy to both fine-tune GPT-2 and create your own model from scratch (https://github.com/minimaxir/aitextgen), this submission is a good run-through of the technical considerations involved in building a GPT-2 model.

It's both substantially easier and faster than it was when OpenAI released their paper in 2019, thanks to Huggingface Transformers and Tokenizers making the architectures more efficient, and to other companies streamlining the training process and making it more efficient at every part of the pipeline.

You don't need a TPU cluster to train a working GPT-2 model, although it helps (unfortunately, TPU support for PyTorch-based training like aitextgen's is more fussy). A free GPU on Colab gets you most of the way, especially since you can now get a T4 or a V100, which lets you use FP16.
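[Editor's note: for readers who want to try this, below is a minimal sketch of instantiating a fresh, untrained GPT-2-sized model with Huggingface Transformers, which the comment mentions. The config values are the published 124M-parameter GPT-2 "small" sizes; everything else (variable names, the printout) is illustrative and is not aitextgen's own code.]

```python
# Minimal sketch: build a GPT-2 (124M-class) model from scratch with
# Huggingface Transformers. Hyperparameters are the published GPT-2
# "small" sizes; this is illustrative, not aitextgen's internals.
from transformers import GPT2Config, GPT2LMHeadModel, GPT2TokenizerFast

config = GPT2Config(
    vocab_size=50257,   # GPT-2 BPE vocabulary
    n_positions=1024,   # context length
    n_embd=768,         # hidden size
    n_layer=12,         # transformer blocks
    n_head=12,          # attention heads
)
model = GPT2LMHeadModel(config)          # randomly initialized, not pretrained
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")  # reuse GPT-2's BPE tokenizer

print(f"{model.num_parameters() / 1e6:.0f}M parameters")  # ~124M
```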
zirkonit over 4 years ago
First off -- the author has done an amazing tutorial, it's very enjoyable, so I am by no means throwing shade.

But a week of TPUv3-128 is anywhere between $10k and $20k in TPU costs alone; saying that this is an "at home" kind of experiment is cheeky at best, clickbait at worst.
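[Editor's note: a quick back-of-envelope check of that range. The per-core-hour rates below are assumptions based on circa-2020 Cloud TPU list pricing, not figures quoted in the thread.]

```python
# Rough sanity check of the $10k-$20k estimate for a week of TPUv3-128.
# Rates are assumed from circa-2020 Cloud TPU list pricing.
cores = 128
hours = 7 * 24                 # one week
rate_on_demand = 1.00          # USD per core-hour (assumed)
rate_preemptible = 0.45        # USD per core-hour (assumed)

print(f"on-demand:   ${cores * hours * rate_on_demand:,.0f}")    # ~$21,500
print(f"preemptible: ${cores * hours * rate_preemptible:,.0f}")  # ~$9,700
```

Under those assumed rates, a week lands between roughly $10k (preemptible) and $21k (on-demand), consistent with the comment's range.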
polytronic over 4 years ago
The author, at 17 years of age, can understand academic research, and has the skills and dedication to work through an exercise of reconstructing the state of the art.

I can't help but feel pride and hope for the future, both the author's and the world's.
alexpeattie over 4 years ago
The article has moved here: https://bilal2vec.github.io/blog/algpt2/2020/07/17/ALGPT2-part-2
kyberias over 4 years ago
How many off-the-shelf GPUs are needed to replicate GPT-2 in a year?
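[Editor's note: one hedged way to estimate an answer, comparing peak mixed-precision throughput only. The hardware numbers are public spec-sheet figures; the equal-utilization assumption, and the week-of-TPUv3-128 baseline from the thread, are both rough.]

```python
# Very rough estimate: how many V100-class GPUs match a week of TPUv3-128?
# Peak throughput figures are from public spec sheets; assumes equal
# utilization on both platforms, which is generous to the GPUs.
tpu_v3_core_tflops = 420 / 8            # a v3-8 is rated ~420 TFLOPS (bf16) across 8 cores
pod_tflops = 128 * tpu_v3_core_tflops   # ~6,720 TFLOPS for a v3-128 slice
v100_tflops = 125                       # V100 peak FP16 tensor-core throughput

gpu_weeks = pod_tflops / v100_tflops            # one week on the pod
print(f"~{gpu_weeks:.0f} V100-weeks")           # ~54
print(f"~{gpu_weeks / 52:.1f} V100-years")      # ~1
```

On paper, then, a single V100 running for about a year is in the same raw-FLOPs ballpark as the week of TPUv3-128 cited above, before accounting for interconnect and utilization differences.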
deeviant over 4 years ago
At home, in the cloud, for tens of thousands of $$$.
评论 #25886286 未加载
soohamr over 4 years ago
UWaterloo has such precocious students
amelius over 4 years ago
TL;DR:

&gt; Unfortunately, ALGPT-2 doesn't perform as well as GPT-2 (ALGPT-2 gets 31 ppl on OpenWebText compared to 21 ppl for my pretrained GPT-2 model), but I'm writing this series of blog posts to go through everything I've learned over the last few months.
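[Editor's note: for context on those numbers, perplexity is the exponential of the mean per-token cross-entropy, so the gap between 21 and 31 ppl is roughly 0.4 nats per token. The loss values below are back-derived from the quoted perplexities, not reported in the post.]

```python
import math

# Perplexity = exp(mean cross-entropy per token, in nats).
# Loss values here are inferred from the quoted perplexities.
loss_gpt2 = math.log(21)    # ~3.04 nats/token
loss_algpt2 = math.log(31)  # ~3.43 nats/token

print(f"GPT-2:   loss {loss_gpt2:.2f} -> ppl {math.exp(loss_gpt2):.0f}")
print(f"ALGPT-2: loss {loss_algpt2:.2f} -> ppl {math.exp(loss_algpt2):.0f}")
print(f"gap: {loss_algpt2 - loss_gpt2:.2f} nats/token")
```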