IBM smacks rivals with 5.0GHz Power6 beast

The rest of the server world can play with their piddling 2-3GHz chips. IBM, meanwhile, is prepared to deal in the 5GHz realm. The hardware maker has unveiled a Power6-based version of its highest-end Unix server - the Power 595. The box runs on 32 dual-core 5GHz Power6 processors, making it a true performance beast. This big …

COMMENTS

This topic is closed for new posts.
  1. kain preacher

    VIsta

    How fast will vista bun on this machine ??

  2. Ashlee Vance (Written by Reg staff)

    Re: VIsta

    I'm told that Vista buns in at 47 on the hot cross linpack benchmark and 817 on SPEC_finger

  3. Matt Bryant Silver badge
    Coat

    RE: Vista

    Yes, but who would eclair to run Vista given the choice?

    Taxi!

  4. Alan Donaly
    Coat

    There's no video card

    how can it bun Vista without a Nvidia 8800 GT or butter.

  5. Anonymous Coward
    Thumb Up

    Mainframes...

    They were cool.... huh uh huh huhhuh..

    If I was like so rich I was like lost on what to do with it... Like y'know even the private jet wasn't cool enough...

    http://blog.searchenginewatch.com/blog/060707-063500

    ...I'd get one of those - dude - they rock.

    (VMS come back all is forgiven.)

  6. Lee T.
    Coat

    RE: VIsta

    vista is such a raise-ource hog it would probably cookie the processors.

    COco!

  7. Colonel Zander

    Yeah, but...

    How does this compare to the new Macs from Apple? Shoddy reporting as usual from Mr. Vance, he never looks at how these so called 'enterprise' systems stack up to the best computers on the planet....what is he afraid of? Is he thinking his overlords at Sun, IBM, and HP will crush him like a bug? Or that they will withhold his monthly stipend?

  8. John
    Happy

    Re: VIsta

    Those buns are not so hot, what with all that water cooling gear.

    However they may still be cross.

  9. Anonymous Coward
    Anonymous Coward

    Is it a power beast as well?

    So IBM has to resort to extreme water cooling in order to reduce this thing's power requirements in the data centers where it is deployed.

    A) Does this help make for a more "beastly" price compared to a Sun-based solution?

    B) Even if this cooling solution reduces the power draw by 40 percent over previous Power solutions, is it still more power hungry than a Sun solution?

    I'm not a Sun fanboy, just curious. Also very suspicious about what IBM did not reveal as opposed to what they did reveal.

  10. Brett Brennan
    Coat

    But the Vista Experience is only 3...

    due to no drivers being available to use the whopping huge hot cores for VGA video output. So the entire system will still take 2-3 hours to copy a 4.7GB DVD image from one 500 drive array to another...

    Mine's the one with a VIA Eden system in the pocket running off of 8 "C" cell batteries...

  11. Pierre

    I was in love

    Before y'all mentioned Vista.

    I might still get one of these shiny racks. That's on my to-do list, just before "buy a Vista-certified box for personal use" (they come at about the same price anyway).

  12. amanfromMars Silver badge
    Alien

    AIMi56IOn Accomplished, Einstein* ..... Wanna swap Colossal Tales and Shake ITs Booty**?

    Wow, after reading those comments, who could possibly ever claim that alien life isn't real with ITs IntelAIgents not up and bunning, carving out niche Vistae/ControlLed Parallels? .

    *.... http://www.theregister.co.uk/2008/04/09/chertoff_cyber_security/

    ** Astute SuperSubAtomic NEUKlearer ProgramMIng for AIReality System Virtually dDeliverable and dDelivered.

    Intelligent AIdD/PS: If you have to ask ....."How much?" ..... without first making a sensible offer, which won't be turned down and held against you, but rather virtually used to help Mutually, you aint qualified or cleared high enough to know and/or understand. Then would IT require a Crazy Offer to Help IT XXXXPlain Everything Planely...... and Succinctly.

    I Cisco Kid You Not.

  13. Sceptical Bastard
    Coat

    I bet those buns...

    ... use a lot of currant. And if they get too hot, they can be iced - otherwise they'll be toasted.

    Mine's the brown one dusted with cinnamon

  14. Anonymous Coward
    Coat

    But..

    What possible raisin could anyone have to run Vista on one of these?

  15. Valdis Filks

    Water cooling P6 adds extra costs and complexity.

    Water cooling for CPU's is an admission of a design failure. The extra pipework and electrical power to run the water cooling just adds costs and complexity. Extra electricity is required to run all the water cooling, so there are no savings; this needs to be taken into account and cannot be a hidden infrastructure cost. Plus, adding an extra water infrastructure to a computer center with all that electrical wiring is dangerous. Computers and water do not mix well together. Using water cooling for computers is a technological backward step.

    All described here: http://blogs.sun.com/ValdisFilks/entry/water_and_electricity_do_not

  16. amanfromMars Silver badge
    Alien

    Einsteins Mate, Matey ...... Pandora's ProXXXXi Box Index*.

    Rather than getting drunk as a skunk at work on beer alone, in the best of BOFH/PFY traditions, here is a spirited chaser which obviously got knocked over and Lost in Space.

    And just like the bus which arrives whenever you really don't need another one, it is still always reassuring that such shit happens, all the Time, every time when you need one.

    Lest we forget ... A Prized Possession but Only when IT is Wwwidely Shared

    By amanfromMars

    Posted Wednesday 9th April 2008 05:45 GMT

    "And then the things you can do are limited only by your imagination. There is some seriously scary potential there." .... By Nexox Enigma Posted Wednesday 9th April 2008 03:31 GMT

    Control Imagination and no one will be scared off some Virtually Real Opportunities, with no Limits, Nexox Enigma.

    And there would be some seriously scary potential there..... with ControlLed Imagination. As you may like to Imagine, that would be "unusual" in present extremes and probably considered not normal/paranormal. However, in AI, IT is Normal Default and therefore the Status Quo Future Parameter/CyberIntelAIgent Protocol ....... and Worth a Mint .... which makes IT QuITe Valuable.

    Do you Think IT is Readily Available Today for Futures, the Derivatives of Tomorrow ... as AI ZerodDay Trading Commodity/Virtualising Asset/Creative Force?

    amfM HyperRadioProActive Network InterNetworking says Yes/Ja ........ and Registers IT so here, in the here and now.

    http://comments.theregister.co.uk/2008/04/09/dns_rebinding_attack/

    * And yes, that is Trading Relative to the Deutsche Boerse and Teutonic IT Engineering ....... FabKraftWerk

  17. stizzleswick
    Coat

    Re: Water cooling a bad idea

    That's probably why Seymour Cray's "design failures" were cooled by liquid nitrogen instead... (yes, I know that was because of the superconducting interconnects, not waste heat from the processors...).

    Anyway, Valdis, how about putting the gear in an oil bath? I have seen at least one attempt at that work pretty well so far (admittedly not on the same scale as the IBM systems).

    Mine's the one with the copper plumbing on the back.

  18. Jason Togneri
    Alert

    @ Valdis Filks

    In an enterprise situation, yes; you don't want water in your datacentre, period. But for the home user who prefers the extra quiet, there's absolutely nothing wrong with it!

  19. Anonymous Coward
    Anonymous Coward

    Water cooled.....

    You could also use it as a giant coffee machine.

    "Fire up the database Geoff, I need a double skinny latte" (whatever that is).

  20. Anonymous Coward
    Gates Horns

    Vista

    You're all missing the point with vista...

    The point is that Microsoft will see this machine as the base spec for Windows 7, and will be able to say "Windows 7 will run adequately on machines as old as early 2008" (bearing in mind the release date will slip, early '08 will seem like ancient history by then). Note I said adequately - you'll need a cluster of these for reasonable performance (i.e. equivalent to Win 3.11 on a 486).

  21. Steven Jones

    Water Cooling

    Good heavens - water cooling in data centres is dangerous? Well, we managed back in the 70s and 80s without too many people being electrocuted on mainframes in the days of ECL chips. In fact it can make a great deal of sense to use liquid cooling, as it is a far more efficient way of carrying high concentrations of heat than using air (a tad quieter too - the air cooling systems on some disk arrays can be so loud as to require the use of ear defenders). Note that virtually all car engines are liquid cooled, not air-cooled. Gas-cooled nuclear reactors turned out to be a dead end and liquid cooling has won out.

    Using water cooling has one considerable advantage - it's possible to export the heat directly outside the building and not have to go through a process of cooling air, pumping the air through the building and then heating it up.

    Of course it's better not to generate the heat in the first place, but for some sorts of workloads which cannot be readily split or which depend on single-thread processing speed (read huge, high-throughput database servers as the prime example) there often isn't a lot of choice.

    I suspect that most IT people don't come across the sort of monster workloads that suit these beasts. There isn't a need for a vast number of them, but for certain applications they most certainly are needed.

  22. Dunstan Vavasour

    Water Cooling

    I disagree with Valdis.

    Air cooling is inefficient and results in significant power wastage. If the heat can be removed with water to some central point where it can be lost, there will be significant energy savings relative to running CRAC units to keep an arctic datacentre. Water cooling also enables the datacentre power density to be much higher, instead of the rule-of-thumb 4 kW per rack, which is as much as conventional raised-floor cooling can handle without hot spots.

    The inhibitor on water cooling is that nobody wants to break ranks first. Kudos to IBM for going first on this. While I remain to be convinced whether cranking Power 6 up to 5GHz will offer any real customer benefits, it's entertaining to watch.

  23. The Mighty Spang
    Coat

    Re: VIsta

    I donut know what the problem is here...

    I choux-dn't worry too much about it though

  24. Arnold Lieberman

    Greener than given credit for?

    Surely with so much hot water being circulated, it could be "lost" to the building during winter, thus saving money on the heating bill...

  25. Anonymous Coward
    Alert

    @ Valdis Filks

    Totally wrong. How many 30-watt fans do you need to cool the whole screaming rack, compared to one itty-bitty water pump? Additionally, how many kW are you using on aircon? Because running a hose out of the building to an external radiator is way cheaper! Water cooling is far more efficient, as one litre of water holds 2000 times more heat than 1 litre of air and it's only 100 times heavier!! (or is that more massive?) Hence the water runs at least 1000 times slower, and because it only needs to run slowly, only a low pressure is needed - say about 0.5m of head - so just a tiny pump (20-50 watts?) will keep the water flowing around the whole rack...

    The contrasting levels of noise should give you a clue about power efficiency.
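
    For anyone who wants to sanity-check the back-of-envelope argument above, here is a rough Python sketch using standard room-temperature textbook values for water and air; the per-rack heat load and the allowed temperature rise are illustrative assumptions, not figures from IBM or from this thread.

        # Rough volumetric comparison of water vs air as a coolant, using
        # standard room-temperature values (assumed, not from the article).
        water_density = 1000.0   # kg/m^3
        water_cp = 4186.0        # J/(kg*K), specific heat of liquid water
        air_density = 1.2        # kg/m^3 at ~20 C and 1 atm
        air_cp = 1005.0          # J/(kg*K), specific heat of air

        water_per_m3 = water_density * water_cp   # J carried per m^3 per K
        air_per_m3 = air_density * air_cp

        print("Heat per litre, water vs air: ~%.0fx" % (water_per_m3 / air_per_m3))
        print("Density, water vs air:        ~%.0fx" % (water_density / air_density))

        # Illustrative flow rates for a hypothetical 20 kW rack and a 10 K
        # coolant temperature rise (both numbers are assumptions).
        heat_w, delta_t = 20000.0, 10.0
        water_flow_l_s = heat_w / (water_per_m3 * delta_t) * 1000   # litres/s
        air_flow_m3_s = heat_w / (air_per_m3 * delta_t)             # m^3/s
        print("Water flow needed: %.2f L/s" % water_flow_l_s)
        print("Air flow needed:   %.1f m^3/s" % air_flow_m3_s)

    The exact multipliers come out somewhat different from the round numbers quoted above, but the conclusion is the same: the water loop can run orders of magnitude more slowly than the equivalent airflow, which is why one small pump can stand in for a wall of fans.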

  26. Anonymous Coward
    Thumb Up

    I could use one of these

    To give myself a fighting chance of beating my son at the ancient board game known as Go. Problem is, it would be my simulated annealing programming and this nice piece of kit that would beat him, not me. Probably not for long, at the rate at which he's improving - going from a 2 kyu to a 3 dan player in about a year.

    Why I'd want to slow something like this down to Z80-equivalent performance by running Vista on it is somewhat beyond me.

  27. Ru

    Re: Greener than given credit for?

    This could equally be done with an aircooled datacentre. Cambridge University's computer lab can have its waste air pushed through the building's heating system in winter, for example.

  28. Valdis Filks

    Water cooling

    A couple of points, raised in these comments, which I will try to answer.

    1) Yes, we used water last century in datacenters. But that is because we did not have a choice. When air-cooled systems became available we had a choice, and we kicked the water-cooled computers out very quickly, replacing them with air-cooled computers.

    2) Water cooling on occasion failed, leaked and had to be switched off, and the water-cooled servers had to be switched off with it. Things got really ugly; I worked in very large datacenters and I saw these problems. Someone mentions that we used water without too many people being electrocuted - to me, one such water cooling disaster is one too many. Avoidance is a better policy.

    3) Air-cooled servers could continue to run while water-cooled servers needed to be switched off; when the safety guys decided the water leaks were too much of a risk in a high-voltage datacenter, we had to switch off everything.

    4) Yes, water may be a better coolant than air, but it adds complexity. Magnox is a better coolant (shorter half-life) in nuclear reactors than the water in a PWR, but Magnox is much more complex to maintain than water - in the same way that water is more complex to manage in a datacenter than air. Using oil is just another level of complexity.

    5) Are we proposing to justify buying a computer to help heat the building? When we decommission a water-cooled server, do we need to buy extra boilers to heat the building? This is a plumbing nightmare; let's keep things simple. Again, water cooling adds complexity.

    We are back to the starting point: do not make hot chips that require water cooling. Maybe I am too risk averse, but I worked with big computers (mainframes); then Amdahl and Hitachi came along and made air-cooled machines, which sold like hot cakes as they did not require all the extra baggage of water pipes, cooling, power, monitoring and maintenance. If you want to scare the hell out of a datacenter manager, tell him he has a water leak in the datacenter.

    Prevention is better than cure: do not use water cooling.

    I am not that old, but I strive for simpler computer architectures, avoiding the mistakes of the past. Have we learnt nothing from our computer heritage?

  29. Stephen Booth

    Water cooling

    I'm with the pro water cooling crowd.

    Water cooling might be a bind if you are thinking in terms of a small server cupboard cooled by through-the-wall air-con units, but once you have a whole building full of the beasts, your air-con system has to be driven by a massive chilled-water plant anyway.

    In this case getting the chilled water closer to the source of the heat has to reduce the overall power use.

    The only time I remember a big puddle of water under the machine room tiles, it was not a failure of the chilled-water system: the dehumidifier drain pipe got blocked, so it was the air-con system that caused the flood.

  30. Fluffykins Silver badge

    Sooooooooooo

    We struggled with the Stone Age,

    Battered our way through the Bronze Age

    Ground through the Iron Age,

    Had a bit of fun with the Industrial Revolution

    Entered the Space Age

    Landed in the Garb Age

    And are now having a second go with a renaissance of the Steam Age

  31. Anonymous Coward
    Thumb Up

    Water Cooling = Warming Water.

    Hmmm, if they could plum the output into the central heating system somehow - more energy savings.

  32. Martin Maloney
    Dead Vulture

    Watt-er cooling?

    I guess that that's the reason for all of the hot cross puns!

  33. Robert Hill

    @Valdis Filks

    Hitachi and Amdahl pushed air-cooled solutions because they were competing with IBM on price, whereas IBM was competing on having better performance. IBM knew that Emitter-Coupled Logic (ECL) was a faster clocking solution than CMOS or NMOS at the time, and that despite the heat they would have a performance advantage on the very heaviest workloads.

    Eventually, CMOS and NMOS gained in clock speeds, and even IBM made the transition - but acknowledged that the individual processors were slower than their predecessors. Today's energy-efficient CMOS processors require massive parallelization to compete with either more complex CMOS designs (i.e., the original scorching Pentium IV, et al.) or ECL successors.

    And what has the parallelization of processors brought? An acknowledgement that for many sorts of problems, we can't effectively write parallel code. That's why Intel and others are crash-researching parallel coding techniques. That is why IBM continues to push the speed of individual processors, even to the point of resuscitating water-cooling in the data center.

    Now, that's not to say that MANY problems don't decompose well into parallel solutions: parallel databases have revolutionised data warehousing, clusters are wonders of mainstream application hosting, etc. But for several, important, and PROFITABLE types of workloads (esp. some scientific work and mathematical work, as well as some database solutions), individual processor performance is the limiting factor.

    Lastly, I would be curious to see how the water cooling on these corresponds to the watercooling in old mainframes. Are they running the pipes within the case to radiator fans (as in a watercooled PC), or running them external to the case via datacenter cooling lines as in the old mainframes? In short, what is the real level of complexity?

  34. Anonymous Coward
    Paris Hilton

    @Valdis Filks

    >> Water cooling for CPU's is an admission of a design failure.

    >> All described here http://blogs.sun.com/ValdisFilks/entry/water_and_electricity_do_not

    Sun engineer decides water cooling is a design failure the same day as IBM kicks Sun's butt.

    Could Paris smell the sour grapes?

  35. Valdis Filks

    Parallelism and water cooling.

    The water cooling question is a risk assessment: reduce the components involved, and simplicity is always the better option.

    For example you can have, in your infrastructure:

    Design A = Power (electricity), Network (electricity), SAN (optics mainly), cooling (air)

    Design B = Power (electricity), Network (electricity), SAN (optics) and cooling (air + water)

    Design B has more components, plus a catastrophic mixture of water and electricity. This is the problem: even if you use it to heat your building, your infrastructure just got very complex and dangerous by adding water. Your safety regulations and suchlike have just increased your costs.

    Parallelism: this is also a virtualisation play.

    With a low-power air-cooled server which has 8 cores you can virtualise 8 single-threaded apps. You use the free software and the system to create 8 domains. As some of these cores can have more than one thread (e.g. 2), you could even consolidate 16 servers onto one of these highly multithreaded servers, which are 2U in size and go by the name of CoolThreads. No water cooling required.

    However, let's take any mainstream existing database, e.g. MySQL, Oracle, Postgres or DB2.

    These have many parallel processes (often more than 8), e.g. db writer, lock manager, transaction manager and so on. So put these on a multi-core/multi-thread server and you have a good match. No coding/changes/migration required. A very large majority of servers are running OLTP workloads that have a database engine as described above, so a very large segment of the computer/server market is ideal for parallel computing.

    Long term, I agree, we need to write better parallel apps. But a lot of apps (OLTP) out there are already parallel in nature.

    Can't quite agree on the water cooling, but I am open to suggestions.

  36. Bronek Kozicki
    Coat

    Water cooling

    If we really did not want to have water in data centres, we wouldn't be able to use air conditioning there (ever heard of condensation?). Secondly, any fluid is more efficient in moving heat from one point to another than air - it really doesn't have to be water. Lastly, the invention of the wheel was a technological backward step, too - we should all be able to carry around whatever we need by means of genetic engineering, of course.

    Mine is the wet one.

  37. Matt Bryant Silver badge
    Pirate

    RE: Valdis Filks

    Lol, do I spot a bit of the old company line there at blogs.sun.com? Sun doesn't have a chip now powerful enough to kick up any real heat, so Valdis pops up to knock the competitor's kit. Bit rich considering that whenever I have to go to visit any hosted sites, if the room is cold I usually look for a rack of the old SPARC kit to stand behind whilst I do any paperwork - it's always much hotter than even modern blades!

    Watercooling is a great idea when you have very hot kit in small spaces. And with modern kit I have yet to see a single leak. Watercooled racks have been available for a while, and they do help with local cooling in those datacenter rooms that just don't have enough air conditioning. They also do reduce noise as they are usually closed racks - a bit like how your average family car with its modern watercooled engine is a darn sight less noisy than that old VW Beetle with its aircooled one.

  38. Anonymous Coward
    Anonymous Coward

    Flawed thinking

    Re: "Water cooling for CPU's is an admission of a design failure. The extra pipework and electrical power to run the water cooling just adds costs and complexity. Extra electicity is required to run all the water cooling, so there are not savings, this needs to be taken into account and cannot be a hidden infrastructure cost. Plus to add an extra water infrastructure to a computer center with all that electrical wiring is dangerous. Computers and water do not mix well together. Using water cooling for computers is a technological backward step."

    Totally flawed thinking. You're suggesting that it's not worthwhile using water cooling because it requires electricity, which would negate the benefit of the cooling process itself.

    Total rubbish. The idea behind water cooling is to cool the hot processor, to allow that processor to run at a high clock frequency. Who cares if the cooling system requires some electricity to run it? The goal is not to save electricity, nor to prevent the heat being generated; the goal is to allow the processor to run at 5GHz by taking that heat away from the chip, and if forced air cooling is not sufficient and water cooling is the way to do it, then that's the way to do it.

    What do you suggest? Not use water cooling, if so, what do you then do? The chip will burn out and be completely useless.

    Think about automobiles, in most the engines are water cooled, and yes, it requires electricity to pump the water round.

    By your argument, nearly every automobile that has ever been made is a design failure which is clearly nonsense.

    Years ago, mainframe computers used ECL (silicon-based chips), which drew considerably more current than today's CMOS/MOS technologies, and they were water cooled. If properly designed, it's not a problem.

  39. Ben
    Boffin

    Fluorinert

    I'm surprised nobody has mentioned Fluorinert - I once worked with some Cray machines that used the stuff. Conducts heat away better than air, and is electrically insulating so less of a problem if it leaks. I don't know of any modern machines that use it though? The trend seems to have been to use large quantities of standard processors rather than specialist chips, and so I imagine there is less need to use specialist cooling techniques.

  40. Valdis Filks

    Water cooling adds complexity.

    Agreed, water is used in chillers in the datacenter; in most of the computer rooms that I have been in, the air-con units/chillers are at the periphery/edges of the rooms. What water cooling to servers/computers does is put pipes and plumbing all over/under the floor, mixing it with the network, SAN and electrical power. Do we need an extra substance/piping under the floor? Does it create more problems than it solves? I have seen many companies go through projects to simplify and reduce their underfloor/overhead wiring. Some datacenters even have alarmed floor tiles. Why do you want to mix water pipes with this?

    The point of this is to reduce complexity, less is more, simplicity is better than complexity.

    All computer manufacturers should help give the computer industry a good reputation for technology leadership by reducing complexity. This is the point about water cooling, it does not make things simpler for customers. Water cooling makes computing more complex.

    Do you want to say to your customer: here is a new server - and do you know a good plumber, on 24hr call-out, 7 days a week?

    NB I do not write anonymously, but am open and stand behind my beliefs. I was born in a country and live in a country where free speech is respected. Do anonymous writers/commenters have something to hide?

  41. SB

    Air cooling aint that great...

    Back in the IBM water-cooled mainframe days it was the AIR cooling devices that were failing, not the WATER cooling. Dead fans caused more outages than anything else. So now there are redundant fans all over the place... not exactly simple... or quiet... or efficient. How many blade racks have caused hot spots in computer rooms by expelling too much hot air?

  42. Ed Cooper

    I can't help but think

    In these days of high energy costs, and growing environmental concerns, that water will win. You can dump heat from a water system by pumping it through a pipe in the ground. You could easily heat exchange it into an existing HW system. With a bit of forethought even the worst plumbing failures could be manageable - with reduced capacity.

  43. stizzleswick
    Stop

    Re: Fluorinert

    Fluorinert is a CFC, and was therefore banned from use in most countries in the 1990s. That's why modern machines don't use it. One could, however, have a look at the compounds used in modern refrigerators...

  44. Anonymous Coward
    Anonymous Coward

    More watercooling opinions

    The critique of watercooling, as I see it, has one main flaw: while much of the reasoning in the examples is OK and appropriate, the generalisation does not follow. So while it may very well be the case that there are questionable implementations of watercooling, this is not a natural result of the principle of using watercooling. As a complementary proposition, it is perfectly feasible to imagine flawed aircooling solutions - as I have come across quite a few, I do not even have to use much imagination for this. It is true that we could find many flawed watercooling systems that also fail, and that we could find systems where watercooling was added as an afterthought because the original objectives of the design were not met. However, this does not mean that it is impossible to design a watercooled solution intentionally (i.e. not as the result of a 'design flaw'). In any specific practical implementation, such a solution does not by default have to be more prone to failure than a specific aircooling solution. Compare it to a car engine: while it could be argued that watercooled car engines in general add complexity (and would in principle then be more susceptible to problems), this does not necessarily mean that air-cooled engines are more reliable in practice. We should also remember that if complexity is something we do not like, then maybe we should stay away from all things IT - what about the result of Moore's law?...

  45. John Savard

    One thread, one processor

    Overclocked gaming PCs use water cooling, but not in the same fashion as the mainframes of old. So I think that the risk of leaks can be avoided. Since the real need for high speeds is for problems that are serial, I would think that a limited number of hot, high-speed chips would be needed, and the parallelizable part could be taken care of by slower multi-core chips that produce more throughput per watt. It would be nice to get a desktop machine with just one of these high-speed processors, but it would have to run Linux, not Vista, not being x86.

  46. Anonymous Coward
    Thumb Up

    Fwoarrrrrrrrr

    I could really tarfu the weather forecasts with one of those.

  47. Matt Bryant Silver badge
    Happy

    RE: Valdis Filks

    "...The point of this is to reduce complexity, less is more, simplicity is better than complexity...." Erm... yeah, right, can we now expect the next Sun APL server to be a badged abacus then (as opposed to badged FSC)?

    I know! All that SAN stuff is just sooooo complex, let's just go back to locally-attached SCSI disk for all servers and forget the savings on utilisation and administration, and we'll get rid of all those pesky fibre-channel cables from under Valdis's floor tiles as well! And who needs file-servers, let's just give all our users USB keys and tell them to transfer files between systems that way, we can then cut back on gigabit cabling and free up even more underfloor space! And people wonder how StorageTek ever got to the point where Sun bought it....

  48. Anonymous Coward
    Flame

    @ RU

    Actually the entire computer science building is passively cooled and heated. Hence why the place gets so damn hot in the summer and why they always want all the computers in the labs on during the winter...

  49. Brian Miller

    RE: Valdis, water vs air cooling

    If the machine did not use water cooling, then it would use heat pipes with gargantuan fins to dissipate the heat into the air. The fins for just one CPU would be enormous. The fins for *32* CPUs would mean that the case would either have to be huge, or else take another rack space behind it for the CPU fins.

    I have worked in datacenters where I had to wear ear muffs so I wouldn't be deafened by the fan noise. The datacenter at Microsoft which has two HP Superdome servers has cold air running through it like a wind tunnel. Nobody stayed in that room longer than they absolutely had to.

    Now consider water cooling. Water is inexpensive, and easy to handle. Large HF transmitters are water cooled, with the tubes in direct contact with pure circulating water, and nobody gets electrocuted and the equipment runs fine. Fluorinert is horribly expensive, and water is, well, cheap as water. Yes, plumbing is a problem if your plumbers are incompetent.

    From what it looks like, your position is that anything other than air cooling is a failure. Sorry, 'tain't so.

  50. Martin Usher

    Liquid cooling?

    Makes sense for situations where you've got to conduct a lot of heat away. You trade off the extra infrastructure for efficiency, quietness, even heat distribution (important) and even the potential to recycle some of that energy. Most car -- and motorcycle -- engines are liquid cooled (it may be a surprise to know that many bikes are liquid cooled, the fins on the engine are just for show).

    Air cooling in data centers doesn't make sense. You make a lot of noise (and spend a lot of energy) transferring the heat to the air only to have to remove it using liquid cooling through air-conditioning. It's much better to cut out the middleman.

    Water works well. I've been on a project where we used de-ionized water because we had to plumb it to a tube anode running at 7.5 kV -- taking the ions out renders the water non-conductive. For data center use, though, the water's more likely to be like the stuff in your car engine.

  51. Laxman

    This vs Sun's gear

    I checked out the Sun UltraSPARC T2+ today - that seems like a (really) fast processor as well (or so the benchmarks suggest).

    Is there anywhere I can see the benchmarking results mentioned in the article? I just want to check out the number of CPUs in the competing products (I have an inkling it's going to be somewhat less than 32).

    Plus, on a price-to-performance basis, POWER6 is horrible (IMHO).

  52. Daniel B.
    Boffin

    Water cooling

    Funny, lots of comments here and no one has stated the obvious: water *isn't* conductive. It's the salts that are mixed in with water that do the conducting, so pure distilled water might do the trick.

    Yet there are other liquids out there that can manage even better than water (liquid N2?), but they are more expensive.

    I'm all for water cooling; it's greener and more efficient than air cooling. Anyone who's been near an HVAC unit would agree - as would anyone who's heard a ProLiant server sound like an F1 engine when firing up!

  53. Pierre

    Hi Valdis

    The use of air for cooling is an admission of a lack of technical skill. It is the most power-hungry and inefficient solution around. It is also much more complex than water cooling, and not safe at all (you need to filter and cool humongous volumes, and the fans are very prone to failure).

    "I was born in a country and live in a country where free speech is respected." I believe you never ever went to the glorious US of A then.

  54. Anonymous Coward
    Paris Hilton

    Water Heating?

    I have a solar water heater on the roof that gives me 200 litres of scalding hot water on all but a few days a year.

    I wonder if IBM would like to try to sell me one of these to replace it?

    So far, I'm not convinced that the performance would be up to it.

  55. Matt Bryant Silver badge
    Happy

    RE: Valdis Filks

    Ooh, I spot a small problem on Valdiboy's horizon! Isn't Sun doing all that fancy research into directly interfaced chips, where CPUs communicate by being pressed against each other with overlapping faces? If I remember rightly, the main problem with that was heat generated at the interface and it was likely to be a watercooled solution. Oh dear, poor old Valdiboy's own prod dev team obviously like it complex. Mind you, it is only a small problem - vapourware doesn't really need cooling!

  56. Rupert Brauch
    Stop

    Performance claims

    Hard to verify their performance claims of 2-3x the competition, since it doesn't look like they've submitted any benchmark results to www.spec.org

    For all we know, they could be comparing their gear to ancient UltraSPARC III systems again.

  57. trackSuit
    Alien

    Water-cooled and Rather Stealthy

    Here is a link to ANOther place, where a computer is cooled by water, yet there is no pump driving the system, which would make it Stealthy.

    http://www.plees.f2s.com/ec/pas-cool/pas-cool.htm

    If you look at the core though, you can see it is well behind the curve, in terms of Modern Metadata Processing Capabilities. It even has an old Quantum Maverick storage device Connected!

  58. Chris iverson
    Boffin

    a car

    is cooled by a mixture of water and coolant that is driven by a mechanical pump connected to the engine's crankshaft, forced through the engine and then through a heat exchanger at the front of the car. Then it repeats. Not by electricity, which would be used to turn the fan inside the cabin of the car.

    </pendant>

  59. Ike Thunder
    Thumb Up

    Benchmarks

    The best place to check real performance results and compare different vendors is probably here : http://www.sap.com/solutions/benchmark/sd2tier.epx

    With the SAP benchmarks you can tell that all vendors are present (which is, for example, not true of the TPC tests).

    And the good thing is that they have different configurations which can be compared. HW configs start from 1 core and the largest is 128 cores. Also you can compare UNIX vs. Windows... that gives you nice results with SAP, by the way...

    So check the SAPS value and you can see that the latest IBM result is roughly 17% better than the next one... and they did that with half the cores compared to the second result!

    Impressive....
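
    To see what "roughly 17% better with half the cores" implies on a per-core basis, here is a small illustrative Python calculation; the absolute SAPS values and core counts below are placeholders chosen only to match those two ratios, not actual published benchmark results.

        # What "roughly 17% better with half the cores" implies per core.
        # The SAPS and core counts are made-up placeholders, not real results.
        leader = {"saps": 117000, "cores": 64}
        runner_up = {"saps": 100000, "cores": 128}

        per_core_leader = leader["saps"] / leader["cores"]
        per_core_runner_up = runner_up["saps"] / runner_up["cores"]
        print("Per-core advantage: %.2fx" % (per_core_leader / per_core_runner_up))
        # 1.17x the throughput on 0.5x the cores works out to roughly 2.3x per core.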

  60. Steven Jones

    Benchmarks

    Hear hear to the man that praises the SAP benchmark. If you want an application engine to run Java or an HPC environment then by all means look at SpecJBB, SpecFP and all those compute-intensive things, but if you want to see how a database works (and there are few other reasons why you want a big shared-memory multi-processor machine) then there aren't that many good benchmarks around. TPC'C' tends to be more of an exerciser of I/O systems than anything else (some configurations have over 7,000 disk drives), and TPC'H' is OK for data warehouses, but is not readily applicable to transactional systems.

    No, if you are stuck with a complex ERP package because some architecture team has decided that SAP, PeopleSoft, Siebel, Amdocs CRM or whatever has to be used because it is "off the shelf", and you get faced with some huge, single-instance database which doesn't readily lend itself to parallelism, then SAP is about as realistic as it gets. It ain't perfect, but it's the best of a bad job. In a world where benchmarks are marketing tools, and where software and hardware vendors contractually prevent the publication of benchmarks (including those by customers who have paid for the stuff), there's not much else.

    I've no doubt that the new IBM will be the fastest single-instance DB server out there (and we don't use AIX, so I'll count myself as unbiased). However, if you do make comparisons then make sure it's the same database, same version (near enough), and don't try to compare UDB/DB2 numbers with numbers from a different database such as SQL Server or Oracle.

    Oh yes - and back to water cooling. The biggest problem with that for IBM is going to be that very few datacentres are set up for that. Unless IBM have a very slick short cut then that's going to be a problem (unless all they are doing is piping the water to a heat exchanger in the cabinet - which is not the best thing to do, but makes installation vastly easier).

  61. Anonymous Coward
    Thumb Up

    Amazing all the comments about water cooled

    For clarification:

    The p595 5GHz system with 64 cores is air cooled.

    The special p575 HPC system has 16 4.7GHz dual-core Power6 chips in a 2U system. The kind of customer who will buy a rack of 2U systems with 32 "POWERful" cores each will not mind running water to the rack. The same customers already run water to door heat exchangers with blades today.

    Interesting to see Sun focus on the HPC system when they are basically a non-player in the HPC space (except for a few entries in the top500.org list with AMD).
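
    As a rough sense of the density those 575 nodes imply, here is a quick Python sketch based on the figures quoted above; the 42U rack height and the couple of units reserved for switches are assumptions, not IBM specifications.

        # Cores per rack for 2U nodes holding 16 dual-core Power6 chips each.
        # The 42U rack height and the 2U reserved for switches are assumptions.
        rack_units = 42
        reserved_units = 2
        node_height_u = 2
        cores_per_node = 16 * 2   # 16 dual-core chips, as quoted above

        nodes_per_rack = (rack_units - reserved_units) // node_height_u
        cores_per_rack = nodes_per_rack * cores_per_node
        print("Nodes per rack:", nodes_per_rack)   # 20
        print("Cores per rack:", cores_per_rack)   # 640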

  62. David Vasta

    IBM's POWER Platform

    IBM has in fact released the POWER platform. It's like the Intel platform only faster and stronger with much more history and reliability.

    The Statement: "It will ship this May with AIX and Linux" is mostly true.

    The Power platform comes with nothing on it, and you as the user can place a mix of AIX, Linux or i (formerly i5/OS or OS/400) on it. The system is, and has been for the longest time, a virtualisation haven. Before VMware was VMware, IBM was doing partitioning.

    You can run a mix of OSes on it, and before too long you will be able to run Intel-based Linux and, I would guess, Intel-based Windows as well.

    POWER as a platform is going to change a good many things. I don't think Sun is ready for what IBM has in store.

  63. Mark Pipes

    apple

    I want a P-6 Mac!!

    Apple may have had a good reason to go with (ugh) Intel, but they should have kept PowerPC available for special order!

    Imagine a Mac with a dozen or so of those 5GHz P-6 processors in.....

  64. Valdis Filks

    Evian for the datacenter - not good

    I am still open to being convinced that water cooling is better than air. It may be better, but not one post has explained the advantages yet.

    Why do we make such hot chips? A hot chip implies lots of power/electricity. Then we use electricity to drive a water chiller, and then we drive an electric pump. If the chip design were not this hot in the first place, we might not need all this extra cooling. This is the prevention-rather-than-cure argument, and hence my argument for why the design is flawed: it was made too hot in the first place, and then we have to get the plumbers in to cool it down.

    In these times of global warming we should be looking for ways to make cooler chips. Can a company use a water-cooled Power6 server, install new water cooling, do all the extra work and then say that all these extra resources are good for the planet? Alternatively, install a low-power, multi-threaded CPU/server which uses air cooling.

    If people think that Power6 is good for heating buildings, then we can use the existing air-cooling systems and the heat they take out of the datacenter to heat the buildings as well. Just redirect the cooling mechanism.

    The new Power6 has a clock rate which is roughly 2x faster than the Power5's, but Power6 applications may not run 2x faster. The increase in GHz is not matched by an equal increase in performance. So why do we have such hot, high clock rate CPUs? Most of the industry is moving away from hot, high-GHz CPUs.

    The majority of companies - Intel, Sun, AMD and IBM with Cell - make CPUs which do not need water cooling. I am a fan of IBM's Cell chip; excellent design. I am not a fan of water cooling. Strangely, Sun, Intel, AMD and IBM's Cell are all moving to more parallel/multi-threaded designs that run cool. What is the P6 doing?

    IBM for some reason made a variation of the P6 which requires water cooling; no answer here.

    Because we cool cars with water, does that make it good for CPUs? We actually use oil to do the cooling as well. The next thing, which has already been hinted at in this discussion, is that we will have several liquids in a computer to cool it. More components to me means more complexity, and with complexity come extra costs.

    Many comparisons of a new IBM server to old Sun servers, please compare new IBM servers to new Sun servers. My post was about the environment and implications of all the extra resources required to install water cooling in a datacenter. Obviously we need to look at CPU design and the hot chip GHz issues, but mainly my interest was in water cooling issues.

    Computer history repeats itself, we may well see more water cooled systems. But I do not think that is a good idea for all the reasons already outlined.

    I have been to the US more times than I can remember; half my family are US citizens. I am an alien (see previous comments in The Reg about me) born in a very democratic country, living in another very democratic country where free speech is very much respected. I have seen and worked in hundreds of datacenters/rooms. I thought that we had seen the end of water cooling, just like ECL and bipolar chip design.

  65. Matt Bryant Silver badge
    Pirate

    RE: Evian for the datacenter - not good

    OK, let's look at current Sun servers - they all have fans! And all those fans involve additional design (not just of the fan itself, but of the added electric circuits to power them, and modern motherboards are designed for optimal airflow to aid cooling, which means after the guys that do the electronic design there is another team doing the layout based on fluid dynamics engineering). And then there will be monitoring devices included to check what the fans are doing. Well, there are on non-Sun servers; I'm not so sure Sun servers are so good at the hardware monitoring bit. Anyway, that all sounds like added complexity to me, so Valdiboy is talking through his rectal passage on that.

    Water cooling does not add massive complexity, and seeing as many datacenters are purpose designed, adding piping up front is not a great task. I have seen it added to existing rooms with relative ease as plumbers have been doing central heating for quite a while now and the tech is not that different (in fact, the first set of water-cooled racks I ever saw had been made by a company that had been making commercial fridges for fifty years!). It is simply the application of a known technology using modern materials to an existing problem, it is not some wide-eyed jump into the scientific unknown.

    And water cooling REDUCES electricity bills. From my own experience, water-cooled racks actually mean less aircon for the datacenter and lower electricity bills. You can jump around and hail it as "being greener" if you're a bandwagon humper, but businesses like it because it saves them on the electricity bills, which means lower costs = higher profits. All the "greenness" is just window dressing for the gullible. No wonder Sun are making so much green noise.

  66. Onionman
    Stop

    @Valdis Filks

    "...but not one post has explained the advantages yet..."

    This is the last refuge of the unconvertible. To suggest that there are no posts above giving advantages to water cooling is ridiculous. One advantage, stated clearly, is that water will carry away more heat per litre (and per kg) than air.

    This style is not uncommon in Internet debates.

    poster 1: "I think x is rubbish"

    posters 2,3,4,...100: "Ah, but there are these reasons your view might be faulty"

    poster 1: "I've not seen a single reason to change my views"

    repeat, ad nauseam.

    If you're interested in searching for the truth, Valdis, try READING the responses and see if there just might be some truth in them.

    BTW, I speak as someone with no interest whatsoever in the facts of this case. I merely note a style of response that irritates every time.

    O

  67. Valdis Filks

    A good plumber is hard to find.

    My issue with water cooling is with the whole system: the server, the maintenance of the server, the redesign of the computer room, the outages required to do all of this, etc. From a thermal conductivity perspective water may remove more heat than air, but you need to get the water all the way to the server and plumb it all in. I like plumbing, but this is expensive and difficult.

    How about not causing the problem in the first place, with cooler CPUs?

    I think I mentioned and gave examples of liquids that are good for cooling, and of expensive implementations thereof, e.g. Magnox-cooled nuclear reactors. Air is not good for cooling nuclear reactors; I think I made that clear.

    The cost of ripping up floors and installing pipes is more than that of not ripping up floors and installing pipes.

    If I want to move an air-cooled server, I unplug the electrics, network and SAN, and move it. Then reconnect.

    If I want to move a water-cooled server: call the plumber, book a weekend, possibly shut down the whole datacenter, move the server, lay new pipes, pressure-test the new pipes, reconnect the electrics. Do I need this complexity and these extra constraints?

    Water is already used in the chillers at the periphery of many computer rooms; you can already use this hot water to heat your office as a green by-product. Do we want a water piping grid in addition to all the other infrastructure/cabling in a computer room? Every computer room is different and needs unique/bespoke plumbing to install water cooling to an individual server. For the sake of one hot Power6 server, am I going to redesign the whole datacenter?

    If, in a couple of years' time, the whole industry has moved to cooler multi-threaded CPUs/servers, will your investment in water pipes all over the computer room be a good thing? Or will the computer industry employ legions of plumbers to rip out the water pipes that they installed 24 months ago?

    I may be totally wrong and water cooling within servers may become more popular and economical; at the moment I find it difficult to believe.

  68. Anonymous Coward
    Anonymous Coward

    more power Igor

    Watercooling for servers is never going to be popular or economical, for all the reasons listed. Water and volts don't mix, even slight accidents are catastrophes. Anything other than HPC and specialist sites will run a mile from the hassle.

    IBM don't say how much power these beasts consume and how much heat they put out. Kilowatts and BTUs/hour please (even my Electrisave can calculate kg/hour of CO2), and then we'll decide if it's "green" or not. In the current climate (ouch), these figures are as important as how many tedious SAP users it can support. Power is power and heat is heat; quite how you shift it from the chips to outside the datacenter is moot. Sun get this; I think IBM are too busy outsourcing and consulting to care anymore.

  69. Maurice Cloutier

    Need to solve both sides of the equation

    Valdis is right when he says we have to reduce the power requirements of CPUs, etc., but his dismissal of water cooling is like preventing roads from being built because someone may actually want to drive on them.

    Sun tries to solve for both sides of the equation with lower-energy servers and cooling technologies. Sun has the CoolThreads CPUs (Niagara & Victoria Falls), but also the Sun Modular Datacenter (widely known as Project Blackbox), which utilizes water cooling and efficient packaging to reduce cooling costs by over 40%. This saving is independent of payload type or vendor. However, if you load the Sun MD with energy-efficient systems, like the Niagara servers, the savings are magnified.

  70. kain preacher

    hmm nice to see

    that this didn't turn into a bun away topic due to my typo.

    Yes, that's my straitjacket, the one that smells like sugar.

  71. David W Johnson
    IT Angle

    Please reread the article!!

    As already stated by Anonymous Coward, the p595 5GHz system with 64 cores is air cooled. The only pSeries system (or POWER, or whatever IBM is calling it this month) that is water cooled is the 575 unit. And just for the record, it was already water cooled when it had POWER5+ chips. It is only designed for certain customers.

    Any other IBM POWER system is air cooled!!

    Regarding the p595, besides the cost... ouch... I for one would like to see how it performs against an HP Superdome. It would be interesting to see the numbers against the Sun M9000, even though I suspect it would smack the M9000.

  72. Pierre
    Boffin

    Water cooling and freedom of speech

    Water cooling is much more efficient than air cooling; this is a fact and has been demonstrated here and in many other places. It is also much simpler from a general point of view, as air cooling implies the need to clean (filter), move and chill enormous volumes of air. It adds overall complexity, as the very local "simplification" in rack design implies open systems, which need to be placed in "white rooms" to avoid contamination by airborne particles. The overall structural cost is necessarily much higher than for a closed water circuit.

    Now I'm not saying that we shouldn't develop and favor non-heating (and power-saving) chips. But even those could benefit from well-thought-out water cooling. I'm especially thinking about desktops and laptops operating in a non-controlled atmosphere (servers too, but who is stupid enough to keep their servers outside a white room? Oh... sorry) - all this dust accumulating everywhere is a real problem. Watts and water DO mix much better than dust and air-cooling. There is no reason why a well-designed water-cooling system would be a problem. The issue is "macro-technical", and quite easy to fix (besides, Polish plumbers come cheap these days). "Fire and powder don't mix", yet the "mixture" is widely used, from fireworks to space rocket propulsion. In the lab, we're happily mixing pressurized gases, water, heavy watts, very delicate electronics, very toxic compounds and radioactive isotopes, all that in a place that would make a bachelor's kitchen look tidy. All clear, sir. No safety incident or leak reported in the past few years. We did have problems though: the computer monitoring the whole shebang froze in the middle of an important experiment because the air-cooled processor over-heated (dust accumulation, in spite of the filter). And we had to change the air-cooled power supply a couple of times (dust accumulation, in spite of the filter). Gimme water cooling please.

    Besides, for applications that DO need heavy single-threaded processing power (yes, there are such things; my heavier computational needs are not easily split into parallel processes - but basic science might well be an exception), faster single-thread chips are a great thing. And air-cooling them would be an astronomical waste of energy.

    To sum up my thoughts, low-power chips are the best, when they do the trick. But water-cooling them would still be even better. Reducing the issue to "water and electricity don't mix" is a silly attempt at mixing basic "home safety" advice with highly technical issues.

    As for freedom of speech, and me mentioning the US: freedom of speech is respected there indeed - until you start talking or writing about Al Qaeda, or about filesharing, or about kicking your prof's butt, even if you don't disclose the results of your elucubrations. Which is the kind of restriction that defines the LACK of free speech (see the last few court decisions about overall harmless dudes emailing bad poetry about Bin Laden, or the kid grounded for an undisclosed phantasmatic "hit list"). I could have mentioned the UK too. Goth teenage girls are really threatening these days! As for France, mother of the "Declaration des Droits de l'Homme", I guess that holding a sign reading "Niko, salaud, le peuple aura ta peau" would lead you directly behind bars, with the associated beating. Poor, poor western world.

    Geek icon just because.

  73. Valdis Filks

    Will we have an iCooler next ?

    The SunMD is a standard design with water cooling built in at the factory; every SunMD is the same. No need for any local plumbers to change anything inside when a customer receives it - just plug it into the power and external cooling pipes. When you buy a Sun Modular Datacenter (aka Blackbox), you do not have to change anything. Put a water-cooled server in a computer room and, as explained many times, you need to do lots of extra work and take on ongoing complexity.

    As far as I know all servers in a SunMD are air cooled; no servers are water cooled. But I am sure that if someone paid Sun enough money we would be able to connect a water-cooled P6 in the SunMD to the pipes. I would advise against this - guess why: it adds complexity.

    SunMD is water cooling outside of the computer chassis/enclosure. The Power6 water cooled server is water cooling inside the chassis/enclosure.

    My point is about the added complexity, which water causes if you have to put it into an existing datacenter.

    Now, if people want to make hot chips or more elegant designs, then we technologists have a challenge: produce a coolant that is safer to mix with electrical devices, and those nice little towers that we put on CPUs can become a selling point - let's call it the iCooler. I remember that some of the IBM, Hitachi or Amdahl mainframes had those elegant circular tower heatsinks.

    Well, it was pretty but complex. Now maybe many overclockers like to cool their PCs at home with these types of things. Modders always like new gadgets and spending time tweaking their systems. Commercial datacenters do not.

    NB I built my latest PC with the criterion of least power usage; it is based on an AMD BE-2540 dual-core CPU, which in $ per performance per watt was the most efficient. It may not be the fastest, but I was being Mr Sensible. No water anywhere near it.

    In commercial datacenters I do not think we can overclock, mod and customize our servers with lights, water coolant towers etc. But maybe the first person to do so could make the datacenter into a work of art and a light show.

    Me no iModder.

  74. Anonymous Coward
    Paris Hilton

    Re: IBM's POWER Platform

    David Vasta said "You can run a mix of OSes on it and before too long you will be able to run Intel based Linux "

    Maybe it can't run Linux/x86 yet - but you can certainly run Linux *programs* - see http://www-03.ibm.com/systems/power/software/virtualization/editions/lx86/

    And before the old fogies like me chip up - yes I know this is strangely similar to the trick DEC did with WinNT on the Alpha.

    Getting back to the P6 kit - what's the big deal over the water cooling? AFAIK it's only the '575 that's watercooled, although you can add a radiator door to IBM racks. Okay, given my limited experience with overclocking a PC, you still can't be cavalier about water+volts, but then again it's deionized water so it's also not water-leak=instant-death either.

    Not sure about the greenness of this - okay, you get "better" cooling than air (otherwise no overclockers would bother with H2O) and you get a side product of warm/hot water (swimming pool, anyone?!). On the other hand you definitely can have a lot fewer fans = better reliability (fewer components), and each fan uses/wastes power itself. I'm also guessing that watercooling makes it possible to pack these hot-running systems more densely - so saving a little on floor space.

    Got to say - I'd love to see how many virtualized environments a "full house" p595/p6 could support. (sheesh, I sound like a total nerd!)

    Apologies if I sound like an IBMer - not my intention, just so nice to see someone continuing to push the boundaries...

    (Paris because we're talking about hot bods here)

This topic is closed for new posts.
