"In particular, they found that a tuning process known as neural architecture search, which tries to optimize a model by incrementally tweaking a neural network’s design through exhaustive trial and error, had extraordinarily high associated costs for little performance benefit."<p>This is specifically the ONE instance that uses five cars worth of CO2. The next highest example used 435x less CO2. Click-bait titles like this incite unnecessary ire.<p>If you're spending $1,000,000+ on cloud compute costs, you're (a) hopefully very aware that this is burning a lot of CPU/GPU cycles, (b) very unlikely to try it again if the gains are minimal as the paper states.
"In a new paper, researchers at the University of Massachusetts, Amherst, performed a life cycle assessment for training several common large AI models. They found that the process can emit more than 626,000 pounds of carbon dioxide equivalent—nearly five times the lifetime emissions of the average American car (and that includes manufacture of the car itself)."
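For scale, here's a quick back-of-envelope sketch in Python using only the figures quoted in this thread (626,000 lbs CO2e for the NAS run, "nearly five times" a car's lifetime emissions, and the 435x gap to the next-highest example). The derived numbers are rough implications of those quotes, not figures taken from the paper itself:

```python
# Back-of-envelope check of the ratios quoted above. All inputs come
# from the thread; the derived values below are approximations implied
# by those quotes, not numbers reported in the paper.

NAS_RUN_LBS_CO2E = 626_000   # headline figure for the NAS training run
CAR_MULTIPLE = 5             # "nearly five times" a car's lifetime emissions
NEXT_HIGHEST_RATIO = 435     # NAS run vs. the next-highest example

# Implied lifetime emissions of the average American car (incl. manufacture)
car_lifetime_lbs = NAS_RUN_LBS_CO2E / CAR_MULTIPLE        # ~125,000 lbs

# Implied emissions of the next-highest training example in the study
next_highest_lbs = NAS_RUN_LBS_CO2E / NEXT_HIGHEST_RATIO  # ~1,440 lbs

print(f"Implied lifetime car emissions: ~{car_lifetime_lbs:,.0f} lbs CO2e")
print(f"Implied next-highest training run: ~{next_highest_lbs:,.0f} lbs CO2e")
print(f"Runner-up as a share of the NAS run: {next_highest_lbs / NAS_RUN_LBS_CO2E:.2%}")
```

In other words, if the quoted ratios hold, everything below the NAS outlier lands in the range of roughly 1% of one car's lifetime emissions, which is the point about the headline being driven by a single extreme case.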