Microsoft apes Google with chillerless* data center

Microsoft has unveiled a mega data center that operates without chillers, joining Google in the embrace of so-called "free cooling." Google is apparently operating a chillerless facility in Saint-Ghislain, Belgium, and yesterday, its Redmond arch-rival boasted a similar set-up at its new data center in Dublin. According to a …

COMMENTS

This topic is closed for new posts.
Flame

that's a lot of excess heat...

"The 303,000-square-foot data center can generate up to 5.4 megawatts of critical power"

Hold on. If it's generating that much excess heat, rather than dump it directly into the atmosphere, why not run heat exchangers and offer the warm water to neighbouring office blocks or residential areas as free heating?

0
0
Pint

Two things

One: In before the complaintards bitching about the use of Fahrenheit in the temperature.

Two: On a more serious note, pity the poor engineer who has to work on hardware in that environment! 90 degrees? Oh, hell no. The pathetic meatbags are going to be the ones who need coolers, never mind the hardware.

0
1
Thumb Down

I wish others would consider this

The place where I currently work is stuck with old thinking, and I swear it must've been less than 15°C in the server room the other day! I was bloody FREEZING!

0
0
Linux

Missing the point

No matter how much computing power they throw at it, their Azure architecture is going nowhere fast!

0
0
Silver badge

@Jules 1 re. that's a lot of excess heat

"...why not run heat exchangers and offer the warm water water to neighbouring office blocks or residential areas as free heating?"

I and many other commenters have made that suggestion in the past but suggested selling it. There is no incentive to install capital equipment to provide strangers with free heat. However, it does seem a waste not to use it in some way.

It's probably that the recovery/transfer capital and running costs would not make it worthwhile for the datacentre owners. Also, there is the additional complication of contractual obligations in supplying heat, and the control aspects, since you would then have an external organisation affecting how much heat was taken out of your cooling system.

This has probably all been analysed and considered in the past and since the datacentre operators are looking closely at running costs, it surely must be an idea that is always considered and rejected. We need a datacentre planner to comment on this.

0
0
Boffin

Water Cooled

Why not build your data centre near (or over) a large body of water? All the cool H2O you could possibly want with nothing more elaborate than a pump required.

0
0
Paris Hilton

What about outages?

Whilst reading about Google's Amazing Load-Shedding Data Centers (tm) I could not help but think that maybe that is what has been happening with Google Apps, GMail, etc.: the data centers got too hot and they just shifted the load to vapor.

Sorry, I just find it interesting to read about all the technology going into keeping unreliable services running. Sure, 99.99% uptime or 99.99% of users operating normally is a fantastic record, unless your business or other critical needs fall into that 0.01% of either. Yeah, nothing is 100%, but whom can you hold accountable when the G-Almighty falls?

Microsoft used to catch no end of hell when its services went down. Good for the goose...

Paris, sauce for the goose.

0
0
Grenade

Antarctica

Stoopid over all.

Go build your data center in Antarctica; it may melt the ice cap, but no need for cooling.

The issue is with ill-designed apps wasting millions of CPU cycles and MB of memory achieving bugger all. 50% of the apps' features never get used and 25% of the remainder don't work as expected. Pulled those figures out of my bum coz the statistics app was not running as expected.

0
0
FAIL

Data Center?

Surely Data Centre?

0
2
Gold badge

Two more things

The limit here is presumably set by hard discs. They start to go a bit wobbly if they spend too long above 50 Celsius. However, put a few SSDs close to the system and keep the HDs at the far end of a fat network cable and I don't see why the limit in the machine room shouldn't be over 70 Celsius.

Secondly, I would hope that the operators spent most of their time at some distance from the actual equipment, so this really isn't an issue as long as the room isn't dangerously overheated for those few occasions when you need to swap over some kit. (But yes, unless some robotic assistance is available, I do accept that 70 Celsius is too hot for the wetware.)

0
0
Thumb Up

@jules 1

It's probably a bit far out to be practical for homes, but the warm air could be ducted to nearby offices/factories to heat them during winter. MS could charge them a competitive rate for this free resource, and everybody's carbon footprint gets lighter.

Outrageous that we are just funnelling this back into the atmosphere.

0
0

Datacenter temps.

"On a more serious note, pity the poor engineer who has to work on hardware in that environment! 90 degrees? Oh, hell no. "

The cold aisle should be below 25C for 95% of the year; in short sleeves that should be comfortable. Also, Microsoft's service is designed for fail-in-place operation, as that's how they operate their containers. Repairs aren't time critical. When it gets above 25C in mid summer, come in early and then take the afternoons off at the beach.

0
0

Geographic Opportunism

When will some overpaid consultant come up with the bleeding obvious - put these things in cold parts of the world rather than the Arizona desert.

At a stroke we could save the Icelandic economy and sell the waste heat on to warm the offices. It is simple stuff like this that will save the planet rather than Byzantine carbon trading schemes that no one understands.

So at my consultancy rate of £1000 per day that will be £2.08 (please) for the 10 minutes this has taken.

0
0
Unhappy

D'oh

The reason our server rooms need to be cold is because when we go into them, we get hot and bothered. Fast. It would be ridiculous if we had to wait for the room to become cold before we could fix a problem or swap out duff components, or even install new hardware.

0
0
FAIL

Cut PUE in half - not!

" ... cut its Power Usage Effectiveness (PUE) rating approximately in half ..." - probably not.

Even Micro$oft's press release is misleading with "improve Power Usage Effectiveness (PUE) by approximately 50%".

"PUE = Total Facility Power/IT Equipment Power" from the official definition at Green Grid.

Really bad data centers have a PUE of 2.0, meaning it takes a megawatt of power to cool and light a megawatt of IT load. A data center that doesn't re-use excess heat (as many other commenters have suggested) has a minimum theoretical PUE of 1.0 - except for the Second Law of Thermodynamics.

More likely, the new data center went from a PUE of something like 1.4 to 1.2 - a reduction of 50% in the "wasted" energy.
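The arithmetic, for anyone who wants to check it (the 1.4 and 1.2 are my guesses for illustration, not Microsoft's published numbers):

```python
# Back-of-envelope check using the Green Grid definition:
# PUE = total facility power / IT equipment power.

def pue(total_kw: float, it_kw: float) -> float:
    """Power Usage Effectiveness: total facility power over IT load."""
    return total_kw / it_kw

def overhead_fraction(pue_value: float) -> float:
    """Fraction of IT load spent on cooling, lighting, losses, etc."""
    return pue_value - 1.0

before, after = 1.4, 1.2  # illustrative guesses, not published figures
saved = 1 - overhead_fraction(after) / overhead_fraction(before)
print(f"PUE {before} -> {after}: overhead cut by {saved:.0%}")
```

So the "50%" is a 50% cut in the *waste*, not in total power draw.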

0
0

Self centered

As is hinted at in a few of the other comments, for all the pious talk about green data centres, it is all about the bottom line. They won't join a low-grade heat generation scheme because it would cost more. It might make a huge amount of environmental sense, but it will cut no ice with the beancounters or shareholders.

Same reason they don't relocate to really cold places, and why they like Arizona: the cost of power varies dramatically. If you are near a huge hydro-electric system and can negotiate long-term big power contracts, you will get very cheap power. (Just look at aluminium smelters.) If you go to some out-of-the-way very cold place, you will find yourself paying many times more for the power. The cost of cooling is less than the cost of the power that generated the heat - a large system can achieve a heat rejection ratio of about 6:1. That means if the power costs 20% more in the cold place you have lost out.

One suspects that all the publicity about cheap cooling is more about trying to put some green gloss on something that would have been done anyway. Geographical diversity of the datacentres means that there was always going to be one somewhere cool, so they make the best they can of the location. But that is about it.
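A rough sketch of that trade-off, with invented numbers (a nominal 100 per MWh versus a 20% premium, and the 6:1 rejection ratio above):

```python
# Sketch of the cost argument. All numbers are illustrative: a chiller
# plant with a 6:1 heat rejection ratio spends roughly 1/6 of the IT
# load on cooling, so eliminating cooling saves ~16.7% of the power
# bill - less than a 20% premium on the power itself.

def total_power_cost(it_load_mw: float, price_per_mwh: float,
                     cooling_ratio: float) -> float:
    """Hourly cost of IT load plus chiller power at the given
    rejection ratio. cooling_ratio = 0 models a 'free cooling'
    site with no chillers at all."""
    cooling_mw = it_load_mw / cooling_ratio if cooling_ratio else 0.0
    return (it_load_mw + cooling_mw) * price_per_mwh

warm_site = total_power_cost(5.4, 100.0, cooling_ratio=6)  # cheap power, chillers on
cold_site = total_power_cost(5.4, 120.0, cooling_ratio=0)  # dearer power, free cooling
print(f"warm site: {warm_site:.0f}/h, cold site: {cold_site:.0f}/h")
```

With these (made-up) prices the warm site with chillers still comes out cheaper, which is the point.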

0
0
Anonymous Coward

Distribute the data centers more

We need more cabling, and it will give people jobs. More companies and individuals could then offer hosting services, and also heat their operations with the server rooms.

In a time of economic recession, big public works that increase utility for society as a whole are the order of the day. We could do with improved sewers, water treatment, telecommunications, roads, recycling centres and buildings.

We need to tear down the old and replace it with something modern. Constant rebuilding and improvement should be the order of the day; that way we can create a labour shortage, but it won't matter as we are only improving things, not relying on them.

0
0
Silver badge
Boffin

Water-cooled air?

Someone told me the new GLC building in London (the one that looks like a squashed glass football) has no chillers. They use underground chambers where air from the building is blown over water from the neighbouring River Thames, the cooled air then being pumped back into the main building space. If you're willing to accept slightly hot running temperatures then that would seem a good option, seeing as just about every Western capital I can think of has a major river running through it.

0
1
Pint

RotM

When data centres using AI become fully aware they will crack a cold one and reduce cycles.

Meatsacks will still be required for recreational purposes only.

0
0
Stop

Not Green at all

This is just as bad as Patio heaters are for Global Warming. Cold air in, hot air out, directly contributing to raising of ambient local temperatures.

To call this sort of data centre green, or environmentally friendly is wrong.

0
0
Alert

@Matt Bryant

And precisely how many buildings in a capital city do you think could be cooled from the river? How many megawatts of heat do you think should be pumped into a river? Are you suggesting introducing tropical fish as well?

0
0

@Water cooled air

Sounds exactly like the system used at Sydney Opera House that was on Engineering Connections a couple of weeks ago.

0
0
Grenade

@Two Things....

...Two: On a more serious note, pity the poor engineer who has to work on hardware in that environment! 90 degrees? Oh, hell no. "

Wuss.

Before I did IT I did small-scale paper coating (making the sticky label paper). Average temp once the machine furnaces were up and running (fed by a 4-inch gas main) was 70 in winter and often went over 100 in the summer; at one point it was peaking at 110.

That's why we had things called shorts and t-shirts. And drank a hell of a lot of water. Granted, we got funny looks unloading wagons in the snow dressed like this.

No wonder IT has such a weedy image! Get a grip...

Grrrrrr, I eat nails me!

0
0
FAIL

Temperature "excursions"??

"When the temperature starts to excurse"

Excurse? What the hell's that?

Words fail me. As they do Vijay Gill it seems.

0
0

Regardez votre sang-froid

Why did the comments start being about Arizona? The data centres in question are in Belgium and Dublin - hardly desert conditions. The problem with any scheme that tries to sell the heat generated as a resource (in effect using the external buildings of your customers as part of your heat sink) is that the majority of your problems will arise in the summer months, when your customers may be unwilling to accept this excess heat. If your data centre is generating enough waste energy to warm several buildings in the teeth of a Dublin winter, then there is something seriously amiss inside the data centre.

As for water-cooled air, Google's Belgian data centre does use river water cooling. The difficulty here is keeping any such system clean. Most Western capitals do, indeed, have major rivers running through them, but they're badly polluted major rivers, in many cases, whose water quality is outside the control of any one body wishing to utilise it, unless (as is the case with Google's Belgian data centre) you include a full water treatment plant on site, to treat the water before using it. This is cheaper than using mains water, but does represent an extra cost and energy overhead, in what is supposed to be an energy-saving measure. Even when the risk to human health can be eliminated (where you are simply cooling machinery, and have less to worry about with regard to Legionnaires' disease, or what have you), the tendency for algae, bacteria and even fungi to build up in these systems and eventually clog them can mean you incur not-inconsiderable costs down the line: once established, a population can become difficult to eradicate.

Finally - Register writers, please - interesting as they are, I have real problems reading these stories past the point where I encounter the words "Mountain View Chocolate Factory". Please, enough with retelling this not-very-funny joke, over and over again. I sometimes feel I'm being hit with a hammer by a bloke who demands that I laugh. We get it, okay?

0
0

@ Dennis Healey

'When will some overpaid consultant come up with the bleeding obvious - put these things in cold parts of the world rather than the Arizona desert.

'At a stroke we could save the Icelandic economy'

Already being done: http://www.verneglobal.com/

And they get points for extra smugness because all their power is renewable.

I hope you haven't spent all that consultancy money ;)

0
0
WTF?

@Stuart Ball

'This is just as bad as Patio heaters are for Global Warming. Cold air in, hot air out, directly contributing to raising of ambient local temperatures.'

But surely you can see it's better than using vast amounts of water and electricity (probably non-green energy that also pollutes the environment), in order to cool a room to lower temperatures, when the temperature outside is already similar. The computers WILL give off heat, it has to go somewhere. Better to have it go outside than to keep it inside but waste electricity to keep the temperature low...? Those chillers probably give off loads of heat anyway.

0
0
Silver badge
Flame

Recycle the heat

It never ceases to amaze me that heat generated by computers is just thrown away. Now, blowing it outside is being trumpeted as a step up from pumping it outside!

In the UK, an office building needs heating at least half the year, during which time the server chiller plant should be capable of pumping the "waste" heat into the office central heating system, as a first step towards reducing energy waste. Yes, this would be a less efficient heat pump than conventional chillers, but since all the energy used in pumping also ends up in the heating system, that should not be an issue - the saving on the fuel not burned in the building's boilers will be greater.

As for the other six months, it ought to be possible to pump the heat into the ground, and then retrieve it in winter. This is already the basis of the least CO2-emissive form of office air-con: ground source storage, pump unwanted heat down into the ground in summer, pump it back into the building in winter. In the UK, the amount of heat needed in winter exceeds the amount pumped out in summer, so adding "waste" heat from computer systems would work rather well.
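A rough scale check of what's on offer (the 100 kW server room and 180-day heating season are assumed figures, purely for illustration):

```python
# How much "waste" heat a modest server room could contribute to office
# heating over a UK winter. Essentially all electrical input to servers
# ends up as heat, so the IT load is a fair proxy for heat output.

HOURS_PER_DAY = 24

def season_heat_mwh(it_load_kw: float, days: float) -> float:
    """Heat rejected by a continuously-running IT load over a
    heating season, in MWh."""
    return it_load_kw * HOURS_PER_DAY * days / 1000.0

heat = season_heat_mwh(100, 180)  # assumed 100 kW room, 180-day season
print(f"~{heat:.0f} MWh of recoverable heat per heating season")
```

Several hundred MWh a season from even a small room is not trivial, which is rather the point of the comment above.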

0
0
FAIL

Smoke n Mirrors

If you use chillers and free cooling, with mechanical as a standby, then it's more efficient than the MS example:

1. If the outside temp is suitably low then you use purely free cooling.

2. If it's a bit higher then you switch on the chiller (can use rainwater collection with these).

3. If it's a particularly hot day then you use your mechanical on top.

If you don't bother having chillers then you'd have to switch straight to mechanical when you reach '2' which isn't as economical.

When hosting a DC in the UK or Ireland, 97% of the time the outside conditions are such that you can get by with nothing but free cooling and the occasional use of the chiller; move further north and that figure will become 100%. Having mechanical cooling that you never switch on is as efficient as not having it.

The headline suggests that MS are achieving a PUE of 1.0 or that they can run without any cooling, 365 days a year. This clearly isn't correct (even if it were, you'd still need air circulation and humidity control, which use power). IIRC the best Google has managed for DC efficiency was a PUE of 1.18; can't see anything in this article that suggests they've been 'aped' by MS.
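The tiered switching above, sketched as control logic (the 18C/24C thresholds are invented for illustration; a real plant switches on supply-air temperature, humidity and load, not outside temperature alone):

```python
# Minimal sketch of the three-tier cooling selection described above.

def cooling_mode(outside_temp_c: float) -> str:
    """Pick a cooling tier from the outside air temperature alone.
    Thresholds are illustrative, not from any real plant."""
    if outside_temp_c <= 18.0:
        return "free cooling"            # tier 1: outside air alone
    if outside_temp_c <= 24.0:
        return "free cooling + chiller"  # tier 2: chiller assists
    return "mechanical"                  # tier 3: full mechanical plant

for t in (10, 21, 30):
    print(f"{t}C -> {cooling_mode(t)}")
```

Drop the chiller tier and anything above 18C here would jump straight to full mechanical, which is the inefficiency being pointed out.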

0
0
Headmaster

@Dennis Healey

Hmm, consultancy rates are all very well, but you might need to contract in an accountant to do your billing for you....

£1000/day equates to approx £2/min assuming an 8 hour day. You have given the client a 90% discount on your consultancy.... best leave invoicing to the bean counters!
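Spelled out, for the avoidance of doubt:

```python
# The corrected invoice arithmetic: GBP 1000/day over an 8-hour day is
# about GBP 2.08 per *minute*, so 10 minutes should have been billed at
# roughly GBP 20.83, not GBP 2.08.

DAY_RATE = 1000.0
MINUTES_PER_DAY = 8 * 60  # assuming an 8-hour billable day

per_minute = DAY_RATE / MINUTES_PER_DAY
invoice = per_minute * 10
print(f"GBP {per_minute:.2f}/min -> GBP {invoice:.2f} for 10 minutes")
```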

0
0
Flame

@Anonymous Coward Posted Monday 28th September 2009 08:02 GMT

> how many megawatts of heat do you think should be pumped into a river? are you suggesting introducing tropical fish also?

~250MW could be dumped into the Thames giving ~1°C temperature rise.
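Working, for anyone checking (assumes the often-quoted mean flow of roughly 65 m^3/s for the Thames; the real figure varies widely with season and tide):

```python
# Steady-state temperature rise of a river absorbing a given heat load.
# The 65 m^3/s flow is an assumed long-term average, not a measurement.

WATER_HEAT_CAPACITY_J_PER_KG_K = 4186.0
WATER_DENSITY_KG_PER_M3 = 1000.0

def river_temp_rise(heat_mw: float, flow_m3_s: float) -> float:
    """Temperature rise (C) of a river carrying flow_m3_s while
    absorbing heat_mw of waste heat."""
    mass_flow = flow_m3_s * WATER_DENSITY_KG_PER_M3  # kg/s
    return heat_mw * 1e6 / (mass_flow * WATER_HEAT_CAPACITY_J_PER_KG_K)

print(f"{river_temp_rise(250, 65):.2f} C rise")
```

Which comes out a shade under 1°C, as stated.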

0
0

@Roby 1024

"The computers WILL give off heat, it has to go somewhere."

Isn't that the problem with the Cloud Architecture though, using more than one computer to do the job of one computer?

I work for an enterprise client that is heavily into Citrix to deliver thin applications that are run on full-fat desktop machines; these are then left on 24/7 because of bad GPO settings for power management and the apparent need to schedule client patching out of hours.

If you are going to a Cloud Architecture, you need low-powered clients, such as the Revo/Lenovo Q series; otherwise you are not moving the "heat" that a service generates around, you are up to doubling it when one user alone is on a Citrix App server.

I've yet to see a convincing argument for web-based applications, which is what these datacentres are being pumped up for - that and serving adverts.

0
0
Troll

@Tom Maddox & others

"Two: On a more serious note, pity the poor engineer who has to work on hardware in that environment! 90 degrees? Oh, hell no. The pathetic meatbags are going to be the ones who need coolers, never mind the hardware."

How can I put this in today's PC terms... Only if you are a 'pathetic meatbag' of sub-optimal body-volume-to-surface-area ratio. Those of us with a body shape defined by nature and exercise rather than McDonalds and IHOP will operate just fine, thanks.

0
0
Paris Hilton

Arctic SysAdmins

It's only a matter of time before all system admins, along with the hardware, are shipped to the Arctic (or Antarctic) on a permanent basis.

I can only note that the IQ of the rest of the world will drop, with an obvious and contrary increase in population.

On a plus note, the extreme conditions mean the I.T. pallor will become more or less mandatory.

Paris, cause she will lead the shaganidiotathon.

0
0

@Christopher Key

Looked into using rivers for cooling chillers before; the local authorities will only let you use river water if you return it at exactly the same temperature, so no good.

Can use rainwater harvesting though.

0
0

paradise!

People are worried about working in an environment over 25C?! Bunch of bloody wankers! In the summer (3/4 of the year) my *apartment* gets up to 50C every time the A/C goes out (which is a lot). And down to 14C in the winter (about three weeks in February), but only because I like it cold.

Can we convince governments to only let people live in areas where cooling costs are lower? It would reduce our personal footprints a lot! Europe and the west coast of the U.S. would be terribly over-populated, but I'm sure we can solve that without legislation.

0
0