It’s interesting for me to see this article, because it is similar to the “Bitcoin is bad because CO2” articles and arguments that arise from time to time. I’ve unsuccessfully argued that you can’t say Bitcoin is bad and shouldn’t be allowed to exist for that reason without also agreeing that other stupid things we do with computers should suffer the same fate. Surprisingly, that position attracts a fair number of defenders for Snapchat et al.

What we can do is be neutral about how electricity is used, and instead make sure it is priced to cover its externalities. That means a carbon tax. The same solution is right for Bitcoin as it is for the other silly computer uses mentioned in this article. If delivering Snapchat becomes too expensive once electricity is properly priced, then Snapchat won’t exist. And so on.
My cellphone (pocket computer) needs 10 Wh to run all day. For the same amount of energy, the lights in my office will run for 6 minutes. I'm not buying this "must decomputerize" pitch.
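A quick sanity check of that comparison, assuming the roughly 100 W of office lighting the figure implies (the lighting load is my assumption for illustration, not a measured value):

    # Rough check of the phone-vs-lights comparison above.
    # Assumed: ~10 Wh per day for the phone, ~100 W of office lighting.
    phone_energy_wh = 10           # energy to run the phone all day
    lighting_power_w = 100         # assumed office lighting load
    minutes = phone_energy_wh / lighting_power_w * 60
    print(f"{minutes:.0f} minutes of lighting")  # -> 6 minutes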
Alas, the author failed to do adequate research before getting on their high horse.

Yes, cloud computing sucks power, and the resulting heat is vented to the outside world when it could be recycled _and_ latency cut by moving the compute closer to where the heat is needed.

But you know what pushes out more CO2?

Poor house insulation.
There's a surprising amount you can do with just 8-bit microcontrollers, which on average (making extensive use of low-power modes) use less energy than a housefly. Performance can be as good as 1 microwatt per MIPS. We could run most "smart device" applications off the little solar cells used in calculators (a rough feasibility sketch is below).

For home applications, virtually any efficiency gain that computerization gives us would offset the energy consumed in computation.

The problem is not the presence of computing; it's that fossil energy does not pay for its externalities.

Which isn't to say that computation-intensive ads _aren't_ the worst thing ever... And it does seem that some of the most hype-driven applications of digitization rely on exceedingly (and often intentionally) inefficient algorithms (blockchain and deep learning), but overall digitization can often be done with very little energy use for the same benefit. I'm reminded of this cellphone that works on a few microwatts (although, granted, it harvests its power wirelessly): https://www.wired.com/story/this-cell-phone-can-make-calls-even-without-a-battery/
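Here is the feasibility sketch mentioned above. Every number in it is an illustrative assumption (solar cell output, workload, duty cycle, sleep current), not a datasheet value; the point is only that a heavily duty-cycled 8-bit part at the "1 microwatt per MIPS" figure sits comfortably below what a calculator-style cell can supply.

    # Illustrative only: can a calculator-style solar cell power a duty-cycled 8-bit MCU?
    cell_output_uw = 30            # assumed indoor output of a small amorphous-silicon cell
    mcu_uw_per_mips = 1            # the "1 microwatt per MIPS" figure from the comment
    active_mips = 1                # assumed workload while awake
    duty_cycle = 0.10              # assumed fraction of time spent awake
    sleep_uw = 0.5                 # assumed deep-sleep draw

    avg_draw_uw = duty_cycle * active_mips * mcu_uw_per_mips + (1 - duty_cycle) * sleep_uw
    print(f"{avg_draw_uw:.2f} µW average draw vs {cell_output_uw} µW available")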
I appreciate the sentiment, but this article is riddled with factual errors.

1) Digitization (which includes tracking objects in the real world with IoT tech) increases efficiency, which means moving fewer physical objects around the world and thus reduces total energy consumption. Think of Netflix versus manufacturing DVDs, shipping them to stores, and then driving to the store to buy them.

2) Data centers are far from the biggest carbon emitters. Transportation and construction are, by far, #1.

3) Data centers can run on green energy. Combustion-engine cars cannot.

4) Data centers can be built in faraway places that provide the most efficient and greenest energy.
The author doesn't mention decarbonizing electricity in the first few paragraphs. He simply claims ML is driven by fossil-fuel electricity, when in reality the electricity mix for computation is getting greener over time. Since computers don't need anything except electrons to run, this is entirely about power-sector emissions, and that's a bright spot in climate relative to transportation, industry, and agriculture.

It would be far more effective to have a renewables revolution that pushes the scale of wind, solar, and batteries much higher. That would be far more cost-effective than a Luddite revolution that bans computationally and data-intensive practices in all the most valuable companies.
"Computers are stupid: babies know what a face is within the first few months of being alive. For a computer to know what a face is, it must learn by looking at millions of pictures of faces."<p>Actually this is a completely inaccurate representation. The network to understand a human face can be transferred to a computer in milliseconds. The network to understand a human face gets baked into a human over many many months. Developing that network took a computer millions of images and several hours. Developing that network took biology many eons and billions of lifeforms.
What about a proposal to make journalists write their articles on a typewriter?

EDIT: Following the general request for substantive comments, let's extend this.

The problem is that the author sees the cost of computers in other applications, but doesn't see the benefits. For example, using machine learning to detect faces has a cost, but it can be applied to many things: from trivial applications like decorating your face on your phone, to security applications like recognizing fugitives from justice, to more realistic faces in movies, to deep fakes. You may like some of them and dislike others, but for many people the new technology provides benefits that are hopefully greater than the cost.

For a journalist, probably the most important use of a computer is to write an article. Perhaps also to keep track of the small pieces of information that must go into the article, and to do research on the internet instead of traveling to a library. A modern cellphone is quite a powerful computer. You could perhaps replace the cellphone with a landline phone and use a stack of cards to store contacts and information, but I think the journalist would immediately see the benefits of using a computer instead of a typewriter to write the articles.
Stop computering in places where fossil fuel is used for electricity. Build data centers next to cheap renewable or nuclear energy, with access to cold air and water. There are hundreds of abandoned industrial sites near lakes and rivers in Northern Europe, for example.
No one is voluntarily giving up progress, ML or not. The far easier sell to the public is a faster move to nuclear, wind, and solar, with battery-powered cars. Then we won't need a Luddite revolution, and progress can continue making lives better.
This article is completely wrong-headed. Any profit-heavy use of energy can be a boon to decarbonization. The price of renewables has been plummeting, due mostly to economies of scale but also to funding of new developments. At current prices, it only takes a nudge of incentive or regulation to push new energy use toward renewables. And it's a feedback loop: the more renewables get used, the cheaper their production becomes.
There are a couple of different facets to this that the article ignores. The energy consumed by computers can be recovered as waste heat to offset space heating or domestic hot water heating (which make up the bulk of your energy consumption in colder climates). You can do this at a building scale, and even at an urban scale with district energy systems.
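As a rough back-of-the-envelope illustration of the scale involved (every figure below is an assumption for the sake of the example, not measured data): nearly all of the electricity a server draws ends up as heat, so even a modest rack could in principle offset a meaningful share of nearby heating demand.

    # Illustrative only: heat a small server rack might supply to a building or district loop.
    rack_power_kw = 10                 # assumed continuous electrical draw of one rack
    hours_per_year = 8760
    heat_recovery_efficiency = 0.6     # assumed fraction of waste heat actually captured

    recovered_kwh = rack_power_kw * hours_per_year * heat_recovery_efficiency
    home_heating_demand_kwh = 15000    # assumed annual heating demand of one cold-climate home

    print(f"~{recovered_kwh:.0f} kWh/year recovered, "
          f"roughly {recovered_kwh / home_heating_demand_kwh:.1f} homes' worth of heating")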