Article: Data center efficiency - the good, the bad and the way too hot

Ah, data center efficiency. The big vendors have embraced this topic like an admin embraces a bag of Doritos. Luckily, we're here to separate fact from fiction. Episode 5 of Semi-Coherent Computing has Chris Hipp and me interview Rumsey Engineers founder Peter Rumsey. The folks at Rumsey Engineers know their stuff, having built …

COMMENTS

This topic is closed for new posts.

Typo

"Luckily, we're here to separate fact from fiction. Episode 5 of Semi-Cohernet Computing" -> "Coherent"

Anonymous Coward

And another....

Doritos (not Dorritos).....

Although I think more SysOPs prefer Pringles :o)


More typos!

Gasp. It must be Friday. "Dorritos." Indeed.

Anonymous Coward

typo? who cares?

It must be friday indeed... nothing better to do than bitch about typos? Or a feeble attempt at an insult and demonstrating your self-proclaimed superiority? What next? Typing out the Thesaurus to point out how bad the world is at grammar?

Honestly... get a life.


Who's bitching?

Their posts merely point out the grammatical errors in an attempt to keep El Reg up there as the clean-cut website that it is....

Yours appears to be the only comment that isn't useful.....

Besides, it's all a bit of fun; however, some "anonymous" people appear to take these sorts of things to heart....


@Anonymous

Shouldn't Friday be capitalised?


Economics

My 200 MIPS Acorn used to have a 1W StrongARM - no fan required. Good enough for web serving.

Now that we have modern dual-core CPUs, we pay more for electricity than bandwidth.
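The electricity-versus-bandwidth point above can be sanity-checked with a back-of-envelope calculation. The wattages and the $0.12/kWh tariff below are illustrative assumptions, not figures from the comment:

```python
# Rough yearly electricity cost of running a machine 24/7.
# Wattages and the $0.12/kWh tariff are illustrative assumptions.

def yearly_power_cost(watts, price_per_kwh=0.12):
    """Cost of a constant electrical load running for one year."""
    hours_per_year = 24 * 365
    kwh = watts * hours_per_year / 1000
    return kwh * price_per_kwh

print(f"1 W StrongARM:   ${yearly_power_cost(1):.2f}/year")    # about a dollar
print(f"150 W dual-core: ${yearly_power_cost(150):.2f}/year")  # roughly $158/year
```

At those assumed numbers, a modern dual-core box costs two orders of magnitude more to power than the old StrongARM, which is the commenter's point in miniature.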


In broadcasting

In broadcasting the switch from air cooling to liquid cooling has already been made, and the difference is amazing. Last year I visited a transmitter site which had turned its old equipment off and the new on.

From their perspective the main advantages are:

1. It's a _lot_ quieter.

2. Equipment lasts way longer as it's way cooler.

3. It takes less space.

However, they also had some problems. Rohde&Schwarz, their equipment maker, hasn't been in cooling before, so the amplifier in the transmitter has a lot of transistors bolted onto a copper block. That block has holes inside which are connected by small copper loops on the sides.

There's a valve enabling them to shut off the supply of liquid, and when they pull the amplifier out of the rack it will also be isolated. However, when they do that, the temperature rises until eventually the amplifier turns itself off. That rise causes the pressure inside the copper block to climb, effectively loosening the loops. Doing this a couple of dozen times gets them so loose that they actually fall out..... :(

So when you buy a liquid cooling system make sure it has something to compensate for pressure fluctuations.
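To see why pressure compensation matters so much, here is a rough isochoric-heating estimate. The coolant properties are textbook values for water and the temperature swing is an assumption, not a figure from the site visit:

```python
# Rough pressure rise when trapped liquid coolant heats up in a rigid, sealed volume.
# Textbook properties for water near room temperature (assumed values):
beta = 2.1e-4    # volumetric thermal expansion coefficient, 1/K
kappa = 4.6e-10  # isothermal compressibility, 1/Pa

delta_t = 40.0   # assumed temperature swing after the valve is shut, K

# With the volume held fixed, dP = (beta / kappa) * dT
delta_p = beta / kappa * delta_t
print(f"Pressure rise: {delta_p / 1e5:.0f} bar")  # on the order of 180 bar
```

Even a modest 40 K swing in a fully rigid, liquid-filled block gives an enormous pressure rise, which is why sealed loops need an expansion vessel, relief valve, or some other compensation.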


Simple: use less power - prevention, not cure.

Why do we make this so complicated, spending money on more energy-hungry cooling systems? Fix it at source: use lower-power servers that are as fast as or faster than existing servers. Also, delete data on disks, or archive stuff you don't access daily onto tape. All discussed here: http://blogs.sun.com/ValdisFilks/category/Environment


Interesting, for more info go here

Good overview of the datacentre cooling issues we are facing.

The Datacentre Specialist Group of the British Computer Society has already been working on this issue, and has come up with some very interesting stuff here http://dcsg.bcs.org/, particularly on energy efficiency.
