TechEcho


A tech news platform built with Next.js, providing global tech news and discussions.


© 2025 TechEcho. All rights reserved.

On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? [pdf]

2 points by freedude over 1 year ago

1 comment

freedude over 1 year ago
"ENVIRONMENTAL AND FINANCIAL COST Strubell et al. recently benchmarked model training and development costs in terms of dollars and estimated CO2 emissions [129]. While the average human is responsible for an estimated 5t CO2 per year, the authors trained a Transformer (big) model [136] with neural architecture search and estimated that the training procedure emitted 284t of CO2. Training a single BERT base model (without hyperparameter tuning) on GPUs was estimated to require as much energy as a trans-American flight."
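To put the quoted figures in perspective, here is a quick back-of-the-envelope calculation using only the two numbers in the comment (284t for the NAS training run, 5t per person per year); the variable names are illustrative, not from the paper:

```python
# Figures quoted from Strubell et al. in the comment above.
nas_training_emissions_t = 284  # tonnes CO2: Transformer (big) + neural architecture search
avg_human_annual_t = 5          # tonnes CO2: estimated per-person annual emissions

# How many person-years of average emissions does one such training run equal?
person_years = nas_training_emissions_t / avg_human_annual_t
print(f"One NAS training run is roughly {person_years:.1f} person-years of CO2")
```

By these numbers, a single neural-architecture-search training run corresponds to roughly 57 years of an average person's emissions.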