Pretty shallow analysis.<p>How much energy was used to train GPT-2, GPT-3, and GPT-4?<p>Also:<p>> We’ll assume that AI models have roughly the same split of operating costs as a typical data centre<p>I don’t think this is a good assumption. GPT is not a typical application; training it requires a massive number of power-hungry GPUs.<p>A better comparison for the power cost would be non-ASIC (GPU-based) crypto mining farms.
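For a sense of scale, here's a back-of-envelope training-energy estimate. Every number below (GPU count, average draw, duration, PUE) is an illustrative assumption, not a published figure, though the result lands in the same ballpark as public estimates for GPT-3's training run:

```python
# Back-of-envelope GPU training energy estimate.
# All inputs are illustrative assumptions, not published specs.
gpus = 10_000          # assumed accelerator count
watts_per_gpu = 300    # assumed average draw per GPU (W)
days = 14              # assumed training duration
pue = 1.2              # assumed data-centre power usage effectiveness

energy_mwh = gpus * watts_per_gpu * 24 * days * pue / 1e6
print(f"~{energy_mwh:.0f} MWh")  # ~1210 MWh
```

The point is that GPU clusters at this scale draw megawatts continuously, which is why a GPU mining farm is a closer analogue than a typical mixed-workload data centre.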