Some years ago, when power and cooling issues first started coming to the fore in data center conversations, some of us were joking that the smart thing to do would be to move big data centers to the polar caps and use the cool air to keep the iron from overheating. Of course, the polar caps are melting now, thanks to global …
Combined Heat and Data
Just as we've had Combined Heat and Power projects in the past, where the local power station circulated its waste heat to local housing, perhaps the data centre could find a way to transfer its heat to local properties when the outside air temperature is low enough.
No doubt someone has already patented this; if not, I claim prior art in the name of the planet to stop anyone trying it in the future.
Doesn't MS promote open Windows with non-MS datacentres?
Didn't they do that down at the bronze-windowed AMACO building at Alperton, Wembley in 1998?
A large portion of the data centre didn't see the sun the next day (cos the tea-leaves had nicked all the SUN kit overnight through the windows they removed at the back of the building).
Much better to use ground heat-pumps with the pipes looped under a river or lake, heating the water/ground and cooling the data centre; just have more stages to get the temperatures down.
Better that than having lots of bugs and other low life getting in there, drawn in by all those pretty flashy lights all night.
90 degrees or cooler???
I could put a data centre in my kettle.
bug in the system, perhaps?
So what about the gigawatts required to operate the industrial bug-zapper shield? No screen is fine enough to keep them all out.
folding @home in space
How about Folding@home in space? -270 °C should be cooler than the North Pole. With solar panels, you get both energy and cooling at the same time? Just shoot a bunch of $70 Atom boards into space. Someone on sci.physics mentioned that a CPU might not function properly if the temperature is too cold, but I'm not too sure about that. Collecting the results might be a bit hard though; you'd probably need a large dish.
All fine and dandy until that black inky dust gets in
This sounds like a great idea, until you open the case for regular maintenance and find a nice thick layer of oil-coated dust from the smog and exhaust fumes found in inner-city environments.
These days I keep the window firmly closed...
Eh? ITYM Uptown.
Don't try this in London..
.. unless you want MORE equipment stolen.
The prime risk is that you provide an opening. Openings attract rats. Human ones. That is IMHO a problem you need fixing first.
Inquiring minds want to know...
Re the Microsoft tent experiment: if 'the servers just kept going', they weren't running Windows then?
Open-wall cooling; I can imagine it now:
HP Service Rep: So, your servers have failed because *what* happened?
Admin: Umm, a flock of pigeons took up residence and crapped on the Citrix farm, field mice made nests in the fibre to the SAN switches and magpies have pecked out all the disk activity LEDs. Oh, and we've got a disk failure on the Exchange box. Is that covered by the warranty?
@Combined Heat and Data
It's been done. One place I worked, we were replacing a mainframe. The new one was a CMOS job; compared to the old ECL hardware it used much less power. The prime selling point was that it would save the organisation £250k annually in power consumption. (N.B. This was back in the nineties, when by comparison with today, electricity was practically free.)
In it went, out came half the chillers that had been necessary to keep the old unit from frying itself. The power savings were, indeed, impressive - until it was pointed out that the swap-out had been done during the summer - during the winter the excess heat had been channeled upstairs to help warm the IT centre which was directly above it.
reclaim the heat
Dave: I'm sure I've posted comments on this before so between us we'll challenge any patent trolls. I don't know how much the waste heat is recycled even within datacentres: for example, how does the Googleplex heat the hot water for its washrooms and kitchens? Or the employee swimming pool?
I think the real answer though lies in decentralising the data processing itself, so that you replace the oil or gas burners in domestic boilers and furnaces with Xeons crunching arbitrary numbers for gazillions of virtual machines in a vast distributed cloud.
I assume they don't get the levels of rain we do here in Blighty.
Whilst I think a good breeze off the Teardrop Lakes would be good for cooling, I'm not sure that the high level of... well.. "bloody water falling from the sky in sheets" would be that good for our servers.
"the polar caps are melting now, thanks to global warming, not supercomputers"
Yes! Thank you! Global warming! Not supercomputers! And not anything else that consumes electricity generated by burning fossil fu.......oh yeah.
The problem with that is that the heat is actually really hard to reclaim. If, for example, you have water cooled racks you don't get a huge temperature rise in the water, and it is very hard to efficiently extract the heat back.
No such thing. Intakes from the great outdoors - but exhaust through heat exchangers. You now have an energy differential, which can be used to generate... power. Not a lot, but surely better than just venting it?
One wonders if these great IT 'innovators' are at all serious about the idea of cutting running costs?
A big section of the datacentre wall opens up? And then some chav steps in and steals a few blades. Not to mention what pigeon shit can do for your expensive hardware.
It's not just about temperature you know. Humidity control is important too. And what about the weather, do you really want to risk letting all sorts of precipitation and fog into your datacentre?
Obviously this stuff has a place, but I think you'd be looking at quite a high initial investment for induction and extraction systems with filters and humidity control. Ignoring the state of the planet for a moment and concentrating on cost, it would take some careful costings to work out how soon such an investment would pay for itself.
Security, and you still have to get the cold air into the middle of the rack?
Not so sure about the security appeal of a datacenter where the whole wall pivots up! No need to ram raid, just put an icecube on the outdoor temperature sensor and - voila! - instant access! And then I'd really want a wall-sized air filter, as I've seen the havoc of metallic dust from a grinder blowing into a datacenter room. But even with my cold outdoor air for free, I've still got to pay for fans to control the airflow throughout the datacenter, and seeing as the main AC units for use during the day are unlikely to be on the garage-door-cum-wall, that means I need two sets of fans - one on the AC units to blow air in one controlled airflow, and one to blow evening air round. So, whilst you save on the chilling of the air, I doubt you are getting everything for free.
Mind you, I'm told the GLC building has a cooling system of blowing air over water from the Thames, so not totally new. Places with a permafrost layer and a good telecoms network would seem to be an even better option as the floor itself can be used as a heat exchanger - look out for those Google datacenters springing up in Alaska, Iceland and Greenland!
We did this all the time when the A/C failed
In a major EU government organisation, the server room A/C went through a "dodgy" stage in its life. We would get a call that it had gone down, dash to the server room, open as many doors as possible, then have to run round the room opening the cabinet doors to let the air get to the servers.
If you just cool the room and still have cabinet doors designed last century (which almost everywhere I've worked still does), then they were designed for bottom-to-top cooling (convection assisted) rather than the current vogue of front-to-back cooling. So when there are no A/C units or blowers running, the efficiency of getting cold air from the room into an almost-sealed cabinet (the air is supposed to come through a floor vent underneath and exhaust out the top) is limited.
NB: convection-assisted cooling will be a familiar concept to anyone starting a fire: start it at the bottom and the flames naturally move up. Which is why it rapidly went out of fashion. A few data centre incidents of the "server completely burnt out and halon everywhere" variety soon changed the design.
waste of heat
Why not link the data centre up to the central heating system and use all that heat to heat the building and provide hot water? If not needed in the building itself, it could provide heating and hot water to surrounding buildings.
It's just not efficient to use heat generated from servers. The Carnot efficiency is too low, as the overall temperature rise is too small.
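For what it's worth, a quick sanity check backs that Carnot point up; the temperatures below (45 °C server exhaust against 20 °C ambient) are assumed purely for illustration:

```python
# Carnot efficiency: the theoretical upper bound on the fraction of
# heat any engine can turn into work, eta = 1 - Tc/Th.
# Temperatures must be converted to kelvin before dividing.

def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

# Assumed figures: 45 C exhaust air versus 20 C ambient.
# Even the theoretical ceiling is under 8%, before real-world losses.
eta = carnot_efficiency(45.0, 20.0)
print(f"Carnot limit: {eta:.1%}")
```

With a temperature rise that small, even a perfect heat engine recovers almost nothing, which is why direct reuse (space heating) tends to beat electricity generation.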
Data centre security??
So they open up the walls & what happens to your security?
It's out the window (ba-dum tish) as some scally runs off with your blade.
Ok I'm sure they have some sort of screen to stop this.
What about nesting birds? Wasps?
What about when it's raining horizontally?
It's obvious, but so few people do it! We're having new boiler equipment fitted to our offices, so they're nice and warm for the winter. But RIGHT NEXT DOOR is the aircon equipment venting heat into the atmosphere 'captured' from the computer room. Of course, no-one has thought to combine the two to offset some of the cost of using two lots of energy.
My computer equipment is due to come in from the cold (out in the shed, where it lives during the summer months) in the next few weeks to keep the house warm - it means I can reduce the time the central heating is on during the day by about an hour a day because of the continuous background heat - it soon adds up! Just a shame it's so noisy!
The military have been doing this for years.
I've worked in a couple of tented server rooms. When it gets hot they just open a few flaps; when it's cold they close them. Of course it doesn't help much if you're in a desert, as it's just bloody hot whatever you do!
A few years back, I was called out to a data centre in the former-DDR. Servers were tripping out 'over temperature', but the air-con seemed to be working perfectly. On arrival I found all the windows open to the chilly outside air - unsurprisingly the thermostats were inhibiting the air-con.
I asked one of the local operators, who was seated beneath a large 'Nicht Rauchen' (No Smoking) sign, why the windows were open. "So that when we smoke, it doesn't set the fire alarms off," he replied.
Are you seriously trying to tell me that the _normal_ mode of operation is to re-circulate all air?
If the A/C is pulling in air from outside then opening a wall will make very little difference: more dirt and dust, less physical security, but bugger all difference to the 'leccy bill.
Umm, couldn't the potential sunlight (not that we get any, lol) heat up all those nice black metal cabinets? As any black-car owner will know, even a bit of sunlight turns 'em into an oven!
I suppose it depends on the number of windows and their sizes.
Now, if you opened them, could you imagine the racket? I'd rather not live next to one :p
Old and obvious ?
We've done that at work for years - in part because the landlord won't allow AC units on the outside of the building. Blow cold air in at the front of the racks, suck the hot air out at the back - works in all British weather, even the hottest of summers we've had lately.
Computers contribute to global warming too
"Of course, the polar caps are melting now, thanks to global warming, not supercomputers, so there goes that idea."
Eh? Don't even think of trying to disassociate computing from contributing to climate change (which is what we more properly call global warming these days) - computers use electricity, and datacentres eat vast amounts of power. Even if they're going to open their windows at times instead of using power-hungry air conditioning, the servers are still going to be chewing through electricity - and all that electricity requires a turbine to be spinning to generate it. More often than not, that is generated by burning fossil fuels.
Before someone says 'nuclear', well there is a sophisticated argument that basically says that all the gubbins that makes nuclear electricity generation possible (massive construction, support services and storage for spent fuel) means that nuclear power can hardly claim itself as being carbon neutral. I dunno what I think about nuclear power, but nonetheless that's not an argument that can be ignored simply because it is inconvenient.
I'm all for generating as much electricity as possible from renewables (wind, wave, water), but we're not anywhere remotely near doing that.
So in the meantime don't go around claiming that computers have nothing to do with climate change - because they sure as hell do.
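To put a rough number on that "vast amounts of power" claim, here's a back-of-envelope calculation; the facility size and grid carbon intensity are invented figures for illustration only:

```python
# Back-of-envelope CO2 estimate for a datacentre.
# Assumed: a 5 MW facility running flat out all year, on a grid
# emitting 0.4 kg CO2 per kWh (typical of fossil-heavy generation).

power_mw = 5.0
hours_per_year = 8760
grid_kg_co2_per_kwh = 0.4

annual_kwh = power_mw * 1000 * hours_per_year
annual_tonnes_co2 = annual_kwh * grid_kg_co2_per_kwh / 1000

print(f"{annual_tonnes_co2:,.0f} tonnes CO2 per year")
```

Even at these modest assumed figures, that's tens of thousands of tonnes a year per site - the air conditioning is only part of the bill.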
Comment on ‘Data centers embrace The Great Outdoors’
I agree with the previous commenter; why not go one step further? In cooler climates the warm air can be used to heat adjacent offices. I've often heard gamers say that their machine is powerful enough to stop the central heating radiator from kicking in, so why not do this at a larger scale? Air-conditioning companies could specialise in re-using the surplus warm air in winter time.
Where exactly did they document the moving walls to help with cooling? It just says they can use outside air for 75% of the cooling at times.
Nowhere in that article was there any mention of humidity - which for microelectronics is a big deal. Just as big a deal is static buildup, which you get if your air supply is too dry.
Presumably these trials have not had sufficient scale to see whether there are increased rates of component failure in high density colo environments using untreated air supplies?
In the 1970s a bunch of London hackers with an ICL mainframe occupied what they called the Galdor Center, which was cooled by opening a roller-shutter door. This was fairly widely publicised and probably makes any attempt at patenting the idea difficult.
They've been doing that in the Middle East for hundreds of years which proves it works in very harsh climates. It's called a wind tower: http://hubpages.com/hub/Wind_Tower_-_An_Architectural_Element_of_Local_Identity_in_UAE. You can also find an example of such a wind tower a bit closer to home, at the Palazzo dei Normanni in Palermo, Sicily.
For more recent prior art, see the Swiss Re building @ 30 St Mary Axe, London, aka the Gherkin: http://www.fosterandpartners.com/Projects/1004/Default.aspx
Keeps it cool, but...
How do they keep it clean? Worked in a data center at college (late 60s). Had a cooling breakdown in the computer center and ran most of the winter (in Michigan) with the windows open in the computer room (security? we don't need no stinking security!). Kept it cool, but god the innards of the beast were filthy.
Paris, cause even she knows you've got to keep it clean.
Why not use chimneys?
Round here (post-industrial west yorkshire) we still have lots of mill chimneys.....
It's an old technique but should work.
Cool air in through the bottom, a vaulted roof ending in a tall chimney et voila a strong draft....
I've always said Iceland should be the world datacentre capital.
Iceland have hydro schemes so powerful that they use them to run some of the world's largest aluminium smelting operations, collocating the foundries with the dams to reduce transmission losses to practically zero. On the other side of each dam is thousands of litres of water barely above zero -- one massive heat-sink that has already effectively been separated from the natural ecosystem by a humungous concrete wall.
A lakeside datacentre deriving all of its power from a local renewable source and being cooled passively by the sub-arctic climate.
Come on people, it's both economically and ecologically viable, surely?
Some 20 years ago I worked for Rockwell, and was taken to one of their very large processing centres in the backwoods of America. At the time our own mainframe room in Peterborough was a fluorescent-lit, air-conditioned cave deep within the building, and I was pleasantly surprised to notice that the USA equivalent had floor-to-ceiling windows at intervals, a few feet wide, through which the forest and its critters were clearly visible.
At the subsequent meeting I was asked what I thought of the machine room. I said "I like the windows". It was like farting at a royal garden party. "They aren't windows" I was told, in a voice loaded with distaste, "They are environment awareness panels".
It seems that there had been a lot of weird behaviour by the machine room staff (as I would have expected) and various consultants had ruled that this was because of the disconnect between their working environment and the real world. What was needed, they were told, was randomly placed environment awareness panels so they could catch a glance of the outside world as they went about loading tapes etc.
Then I jetted in from England, and in an emperor's-new-suit moment pointed out that they had not spent half a million dollars on a sophisticated psychological work-aid, but on "windows".
My career never quite recovered.
While we're on the prior art thing...
Wouldn't it be great to harness the waste heat from these data centres and channel it into greenhouses for growing tomatoes, capsicums etc during colder months, thus reducing the "need" to air-freight them from half-way across the globe?
Yay greener tomatoes!
Don't claim prior art, patent it and make the patent free to use for all, that'll stop any court cases dead in their tracks long before anyone starts to think of 'em.
Harris here we come?
One of the impediments to generating electricity from the copious wind power in the Outer Hebrides is the cost and ugliness of great thick copper cables to bring the electricity to the cities where it will be used.
So why not move power-hungry Data Centres to the Outer Hebrides to exploit both the locally-generated electricity and the superior natural cooling from the colder/windier climate there?
Half way house
Big installations with a chilled water system and a cold climate have another option. When the weather is cold enough we switch off the chillers and pump the water through a free-cool radiator system on the roof.
Takes a lot of additional plumbing and you still need to power the pumps but you keep control over dust and humidity.
I think you are on to one important fact - the manufacturers should be looking to build more environmentally tolerant equipment.
As I understand it, the most environmentally sensitive items are the tape storage units, whilst the least sensitive is telco equipment.
Maybe we would do better to divide up the area in the data centre into separate areas with differing levels of service suitable for different classes of equipment.
Re: Combined Heat and Data
Sorry Dave, I think you'll find that several Universities with a Cray-2 in the basement were using this concept to heat the entire campus about 20 years back...
Flames, 'cause, servers are, like, hot...
Does this mean...
...that now it will be really easy to steal all those servers- we can just climb through the window!
What about fire suppression?
The only reason we can't do this is the need for a sealed computer room for the lovely halon system we have. How do they get around that? Do the walls/windows slam shut as the fire starts? What about humidity? I seem to remember some parts of California having a lot of fog. And are these single-storey buildings / is the computer room on the top floor? They have to have a waterproof ceiling, surely, and that can be a problem with open windows.
A better option may just be to watch whose equipment you buy; some suppliers rate their equipment as fine in 50C working environments, and some like it cooler. If staff don't need to go in, switch the thermostat up a notch or two and see the AC do less. Spending £1,000 to cool a £500 server where MTBF is still tens of thousands of hours makes little sense.
Let the flaming commence...
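The £1,000-versus-£500 point above can be made concrete with a rough expected-cost calculation; every figure here is an illustrative assumption taken from or inspired by the comment, not measured data:

```python
# Rough comparison: annual cooling spend per server versus the
# expected annual replacement cost if you just let it run warmer.
# All numbers are assumptions for illustration.

server_price = 500.0            # GBP, cheap commodity box
cooling_cost_per_year = 1000.0  # GBP, aggressive chilling per server
mtbf_hours = 50_000.0           # "tens of thousands of hours"
hours_per_year = 8760.0

# Expected replacements per year at that MTBF, times the box price
failures_per_year = hours_per_year / mtbf_hours
expected_failure_cost = failures_per_year * server_price

print(f"Cooling: £{cooling_cost_per_year:.0f}/yr, "
      f"expected failure cost: £{expected_failure_cost:.0f}/yr")
```

On these assumed numbers the chilling bill is roughly an order of magnitude larger than the expected failure cost, which is the commenter's point: past a certain spec, extra cooling is insurance against a loss cheaper than the premium.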
"Microsoft's data center managers wanted to test the idea of how computers could handle the Washington weather...running workloads from November 2007 through June 2008...the servers just kept going."
They must have been running linux then...
Canada. Here I come.
Simple. Build your data centres where it is already cold. Job done.
Wouldn't work in London though.
Quite aside from the obvious weather issues, if you've ever seen what pigeons can do to the underside of a railway bridge, can you imagine what our grey flying rats would do to a 10USD data centre?
Plus, it would be 15 minutes or less between opening up the walls, and the travellers moving in!
Paris, because I suspect she lets the cool air in at night.