Serious question. I’ve been reading a lot about how software developers can adopt more efficient methods and architectures to reduce energy usage, e.g. “green software engineering.” All other things being equal, a workload that uses less energy to do the same job is a good thing. But companies don’t pass those savings on to the environment; they reinvest them in more projects, eventually consuming more resources.

Do economic concepts like Jevons Paradox and induced demand apply to these efficient software scenarios as well, and does that mean improved efficiency will make the situation worse in the long run? I really want the answer to be “no”. I’m not trying to debunk anything, just making sure there’s an answer I can share with others, especially my customers.
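
To make the concern concrete, here’s a toy back-of-the-envelope sketch in Python (the figures are made up, purely to illustrate the rebound arithmetic): if an optimization halves the energy per job, but the cheaper cost structure lets the business run 2.5x as many jobs, total energy still goes up.

    # Toy rebound-effect arithmetic; all numbers are hypothetical.
    def total_energy(kwh_per_job, jobs_per_day):
        return kwh_per_job * jobs_per_day

    before = total_energy(2.0, 1_000)  # 2,000 kWh/day before the optimization
    after = total_energy(1.0, 2_500)   # 2,500 kWh/day after: half the per-job energy, 2.5x the jobs

    print(f"before: {before:.0f} kWh/day, after: {after:.0f} kWh/day")
    print(f"change: {(after - before) / before:+.0%}")  # +25%: efficiency up, total energy up too

That kind of outcome, where the efficiency gain is swallowed by extra demand, is exactly what I’m asking about.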