This is a very interesting case. As an architect, though, I can't help putting in my two cents: cooling isn't simply a binary choice.<p>Firstly, the best thing to do environmentally would be to use the heat positively, connecting it into a centralised heat-distribution network for other users (for instance, other industries, public heated pools, or even residences).<p>Secondly, even without power, there is a lot you can do to maximise a building's heat loss: e.g. design to capture prevailing winds and use them to help heat escape, or internal arrangements that keep cooler 'microclimates' around areas used by maintenance staff and temperature-critical server elements. I am not sure what the design process for these buildings is, but I would bet this sort of design doesn't get much thought.<p>Thirdly, you can also actively cool buildings without 'chillers' (refrigeration units), for example by pumping earth-cooled water through the building (as 'chilled beams'). This is actually likely to be quite cost-effective for large projects like this, and much more environmentally friendly than burning coal.
> During these periods, the temperature inside the data center can rise above 95 degrees.<p>:-/ 95 degrees <i>what</i>? They don't use Fahrenheit in Belgium, you know (nor does most of the rest of the world, for that matter).<p>For anyone wondering, 95F is exactly 35 degrees Celsius.<p>Also, it would probably have been more relevant to mention the equivalent Belgian and EU agencies instead of OSHA ... seriously, I don't expect for a second that the regulations would be the same. Roughly, maybe, but it just doesn't make sense to mention OSHA at all if the servers aren't in the USA!
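(For anyone who wants the arithmetic: °C = (°F − 32) × 5/9, so (95 − 32) × 5/9 = 63 × 5/9 = 35 on the nose.)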
Is it just me, or does this article strike a dystopian tone? Computer centers of the future, perhaps achieving truly useful AI capabilities, consisting of unimaginable numbers of racks sweltering in heat so intense that people can't survive in the same room with them? Sounds like something Dante would have written if he were alive in 2012.
I wonder how much money companies waste by keeping their server rooms as cold as a refrigerator. Some IT guy who thinks servers should be cold probably told them that.
Did anyone else think of the scene in Sunshine where Chris Evans gets stuck under the computer that's being lowered into fluid and freezes to death?<p>The mechanism for making "the hot aisle" temporarily habitable fails, you get stuck on/in something and can't get out in time.... Life imitates art.
Did anyone read the last two paragraphs? Seems the article wasn't quite done being edited:<p>> Before entering the hot aisle, a technician uses a supply trigger, typically a switch located outside the hot aisle, to activate the SmartAire T units. Cool air then enters the hot aisle until a comfortable temperature is established. SmartAire T units maintain this temperature until the technician completes the assigned work and deactivates the units, eliminating any need for rest periods and increasing productivity.<p>> Before entering the hot aisle, a technician can use a supply trigger – typically a switch located outside the hot aisle – to activate the SmartAire T units. Cool air then enters the hot aisle until the temperature reaches a comfortable level.
I honestly have to wonder why we don't have robots doing the majority of these tasks. Obviously, a robot couldn't troubleshoot detailed or obscure problems, but for things like provisioning a server, rerouting around bad network hardware, or replacing drives, NICs, etc., it should be good enough. CERN already uses robots for the LHC tape backup system. I can't see that this would be much more complicated.
I can't help but think about the logical extreme of this type of trend, which is to put the server rack itself outside: what components would you need to weather-seal or protect against the elements? Fans and disks have moving parts, and cable connections would have to be sealed, but other than those changes, what would you have to do? Maybe it would be even more efficient...
One aspect of this approach they don't mention is how it affects the MTBF of the equipment. Many components suffer increased failure rates when run at consistently higher temperatures. While it may be relatively safe to run machines at temperatures higher than is comfortable for humans, I bet there's an upper limit past which components really start to fail in numbers.
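As a rough back-of-the-envelope (using the common Arrhenius rule of thumb from reliability engineering, not anything from the article, and with a ballpark activation energy I'm just assuming), you can estimate how much a temperature bump accelerates wear-out:<p>

    import math

    BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

    def arrhenius_acceleration(t_base_c, t_hot_c, activation_energy_ev=0.7):
        """Rough Arrhenius acceleration factor for running at t_hot_c vs t_base_c.

        activation_energy_ev is component-dependent; 0.7 eV is only a
        commonly quoted ballpark, so treat the output as illustrative.
        """
        t_base_k = t_base_c + 273.15
        t_hot_k = t_hot_c + 273.15
        return math.exp((activation_energy_ev / BOLTZMANN_EV)
                        * (1.0 / t_base_k - 1.0 / t_hot_k))

    # e.g. going from a 20C cold aisle to a 35C (95F) hot aisle:
    print(arrhenius_acceleration(20, 35))  # ~3.9x faster wear-out by this crude model

By that crude model a 15-degree bump is roughly a 4x acceleration in failure-prone mechanisms, which is why I'd expect the economics to hinge on whether cheaper cooling outweighs replacing hardware sooner.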
I'm interested in how much "efficiency" can be built in to new data centres.<p>Is it possible to use the waste heat for other purposes? Would painting the roof white make any difference?
These guys are wimps.<p>I used to work in an academic building with an ancient HVAC system in which temperatures in my office would soar to 105 degrees in April, between having no AC, sun shining in the big windows, plus the heat dissipation from 5 humans and 20 computers. My desk included a Dell Windows machine and two old IBM machines running Linux and a Linux laptop and a Sun Ultra 10 and a "Pizza Box" 32-bit SPARC machine and two Sun Rays.<p>(Even if they couldn't charge enough tuition to get a decent A/C system, at least I had access to a pool of last year's hardware.)<p>We never evacuated... You see, that's why the U.S. is #1 -- people in any other country would bail out at 95, but we stay the course. ;-)