Does anyone know any stats or details about the sustainability/carbon footprint of serverless vs. traditional computing? I would imagine that the better resource utilization reduces total power consumption roughly in proportion to the number of servers you can remove.

So, for instance, if you currently have 5% CPU utilization across your application landscape, and moving to serverless raises that to nearer 50% over the same timescale, you need roughly a tenth of the servers. Could it then honestly be stated that _serverless is better at reducing carbon footprint_ by using roughly 1/10th of the total power? (Probably a bit less in practice, since an idle server still draws a significant fraction of its peak power, so per-server draw doesn't scale linearly with utilization.)
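Here's the back-of-the-envelope version of what I mean. The linear power model, the idle-power fraction (~50% of peak), and the wattage figures are all illustrative guesses on my part, not measured data:

    # Back-of-envelope: power saved by consolidating from 5% to 50% utilization.
    # Assumes a linear per-server power model P = P_idle + (P_max - P_idle) * u,
    # with idle draw around half of peak (a rough, hypothetical figure).

    def fleet_power(servers, utilization, p_idle=100.0, p_max=200.0):
        """Total power (watts) for a fleet, linear power model per server."""
        per_server = p_idle + (p_max - p_idle) * utilization
        return servers * per_server

    before = fleet_power(servers=100, utilization=0.05)  # 100 servers at 5%
    after  = fleet_power(servers=10,  utilization=0.50)  # same work on 10 servers

    print(f"before: {before:.0f} W, after: {after:.0f} W, "
          f"saving: {1 - after / before:.0%}")
    # before: 10500 W, after: 1500 W, saving: 86%

So under those assumptions the saving is closer to ~86% than a clean 90%, because idle power dominates at low utilization. Still in the right ballpark, which is why I'd like real numbers.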
I have used this line of reasoning with a large company before to argue that switching off servers when not in use was not only more cost-efficient but also helped with sustainability targets (especially if we could track it).