TechEcho

Microsoft releases code to enable training BERT at large scale on commodity GPUs

12 points · by TheIronYuppie · almost 6 years ago

4 comments

nonfamous · almost 6 years ago
The actual code is in this repo: https://github.com/microsoft/AzureML-BERT
TheIronYuppie · almost 6 years ago
Disclosure: I work at Microsoft/Azure on ML.

Really excited to announce the release of the code we use to train/fine-tune BERT at Azure & Bing. Please let me know if you have any questions!
kdatta1 · almost 6 years ago
Seems like the scripts have a dependency on the pytorch_pretrained_bert library. Is there a way to run the dataprep/create_pretraining.py or train.py scripts on-premise?
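A minimal sketch of one way to approach this question for an on-premise setup: before invoking the repo's dataprep/create_pretraining.py or train.py, check whether the pytorch_pretrained_bert package is importable in the local environment. The package and script names come from this thread; the helper below is an assumption for illustration, not code from the AzureML-BERT repo.

```python
# Hypothetical pre-flight check (not part of AzureML-BERT): verify that the
# pytorch_pretrained_bert dependency is importable before running the scripts
# on-premise. Also probes 'transformers', the package that later superseded
# pytorch_pretrained_bert, as a fallback signal.
import importlib.util


def has_dependency(name: str) -> bool:
    """Return True if the named top-level package can be imported here."""
    return importlib.util.find_spec(name) is not None


if __name__ == "__main__":
    for pkg in ("pytorch_pretrained_bert", "transformers"):
        status = "available" if has_dependency(pkg) else "missing"
        print(f"{pkg}: {status}")
```

If the dependency is missing, installing it into the on-premise environment (e.g. via pip) should be all the scripts need on that front, assuming no Azure-specific services are required at runtime.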
[Comment #20473172 not loaded]
maxluk · almost 6 years ago
If you have any questions, please let us know. maxluk [MSFT]