Some like it hot ... very hot: How to use heat to your advantage in your data center

Heat has traditionally been the sysadmin's enemy. We may have turned technology to our advantage and chipped away at heat's wasteful nature over the years, but our old foe has remained. From turning data centers into walk-in fridges and hot/cold aisle separation, to cold aisle containment and positive pressure, we've tried …

  1. Jon 37

    Geoscart apparently went bust on 1 July 2015

    "UK: Greenfield Energy trading as Geoscart™, the company behind the pioneering technology that Sainsbury's uses to recycle heat from its refrigeration systems has gone into administration."

    And their website is no more, so I assume the company is gone.

    1. Pascal Monett Silver badge

      Sounds like another brilliant example of cooperation between government and private industry then.

      I've watched too much Yes Minister, I think.

    2. Anonymous Coward
      Anonymous Coward

      Re: Geoscart apparently went bust on 1 July 2015

      Was this another scam by Jim McCann and his crew, following on from the business in Belarus?

      What's really irksome is that the Greenfield Project Management plan for remediation of areas of Belarus affected by fallout from Chernobyl would have worked, and the £15 million funding that Greenfield Energy raised could have been put to good use in saving energy and delivered a genuine return. Both the Belarus bioethanol scheme and the Sainsbury heat recycling plan were, on the face of it, financially viable. Sadly, the ripoffs are likely to discourage investment in novel and ecologically beneficial schemes.

      1. Geoscart

        Re: Geoscart apparently went bust on 1 July 2015


        I am pleased to say that Greenfield Energy (as was!) was nothing to do with the Greenfield Project Management team...

        The £15m which Greenfield Energy raised was put to very good use reducing the energy and carbon footprint of numerous UK supermarkets for the aforementioned client, and they are still performing excellently!

        Please get in touch if you want to know more about this, and if we can help with keeping any data centers cool!

    3. Geoscart

      Re: Geoscart apparently went bust on 1 July 2015

      Hello Jon,

      I am pleased to announce Geoscart is alive and well! Our new website is not up and running yet, but hopefully you will see this once it is.

      If you have any comments or questions regarding Geoscart do not hesitate to get in contact!



  2. Nunyabiznes Silver badge


    If you tie your data center into your building HVAC you will suddenly be at the mercy of a: the HVAC guys who designed the system that cools in the winter and heats in the summer, and b: the facilities manager who thinks this is a fantastic idea because it is much cheaper than the other way around.

    I'm sure there are plenty of anecdotal reminiscences about this, so I won't bother sharing my notably prolific incidents.

    1. Geoscart

      Re: Facilities

      I could not agree more! And I am from an HVAC background... The experience we had with Geoscart in food retail was very much like being from ACAS: we had separate meetings with the refrigeration specialists and the HVAC specialists, and designed a Geoscart interface to suit and optimise the energy efficiency of both. The client made it quite clear how expensive all the frozen goods were (and put the LADs in the contract) should we cause them to melt... We therefore have a system led by, and designed by, a refrigeration expert with all the fail-safes necessary, which we interface with to provide improved condensing conditions.

      I share your sentiment that there will be lots of stories of failure where this has been tried... but that doesn't mean the concept is wrong, just the people executing it!

  3. Knoydart

    Being done already

    There are numerous examples of data centers piping off their waste heat already. A certain London internet exchange building had heat exchangers installed from the start to help provide heating to the residential block next door.

    Waste heat reuse is tricky (it's a very low-grade form of energy) but can be done. Swimming pools and district heating systems are easyish ways of doing it, but getting it back to electricity is not beyond the realm of technology, and means you can reuse the energy either on site or resell it to the grid.

  4. Jim O'Reilly

    Data centers can be run much warmer

    With today's servers there's no reason NOT to raise the maximum inlet temperature to 40C (104F). Airflow is typically well-designed, so this means that for most of the year, most places DON'T need chillers!

    Disk drives used to restrict temperatures a bit, but recently drives rated up to 65C have become mainstream, so they are able to handle the inevitable temperature increases inside a server or storage box OK.

    I've delivered COTS servers with specs up to 50C inlet air, without excessive cooling support, so 40C is safe in the general commercial space. The lack of chilling is a huge saving in power costs. The only issue is filtering ambient air to keep out the dust of the prairies!

    1. DougS Silver badge

      Re: Data centers can be run much warmer

      Yes, I was going to say much the same thing. I was surprised at the author thinking 23C was some sort of crazy inlet temperature. Is the UK behind the times in moving away from the "your datacenter must be so cold you wish you had a winter coat" mindset of the 90s?

      Modern dense rack/blade systems pay very careful attention to airflow, and could probably do free air cooling in most of the world. The problem is that unless you know for sure all devices in your datacenter can handle that, it is better safe than sorry. This is where the converged systems guys could really help. They need to design for and advertise this capability, and hopefully that will push the rest of the industry in that direction.

      It won't change things overnight, because a datacenter that is designed with HVAC isn't likely to switch to free air cooling, and would have legacy stuff whose ability to handle the heat is unknown. But you have to start somewhere, and that means getting new datacenters built for free air from day one. Obviously you still need condensers for humidity control, but aside from a small 'cool zone' for legacy equipment you could design/spec for free air cooling if it becomes standard for enterprise equipment to advertise its ability to handle it. C'mon Dell, you just swallowed EMC, how about being the first to do this and force the rest of the market to follow?

      1. Naselus

        Re: Data centers can be run much warmer

        " The problem is that unless you know for sure all devices in your datacenter can handle that, it is better safe than sorry"

        That's the real issue, I suspect. Every company has that one box built by hand in 1986 which is, for some reason, still vitally important but impossible to shift onto something newer. It's so inefficient that it's using half the power input of the DC to produce one thousandth of the compute, it requires a temperature 10C cooler than every other box on premises, and its OS is so ancient that you have to pay the one remaining man in Britain who knows how to use it a small fortune in retainer fees just to service it. But if it stopped working for 30 seconds, then the sky would fall in, so literally everything is designed to take it into account.

        1. DougS Silver badge

          Re: Data centers can be run much warmer

          A handful of legacy systems that need better cooling is an easy problem to solve in a greenfield datacenter design (or even a refit). Build a small room in the corner that is kept cold, and put the stuff that needs to be cold there. The remaining 99% is free air cooled.

          1. ratfox Silver badge

            Re: Data centers can be run much warmer

            As I understand it, the current limit on DC temperatures isn't set by the machines, but by the people servicing them.

            1. razorfishsl

              Re: Data centers can be run much warmer

              It is set by physics, something called the "Arrhenius equation" and relates to activation energy.

              All those little chips and shit are made from chemicals and are bound by the laws of chemistry and physics.
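              For the curious, the Arrhenius model referred to above can be sketched in a few lines of Python. The 0.7 eV activation energy is a common textbook assumption for silicon wear-out mechanisms, not a measured figure, and real values vary by failure mode:

```python
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def acceleration_factor(t_use_c: float, t_stress_c: float, ea_ev: float = 0.7) -> float:
    """Arrhenius acceleration factor: ratio of the failure rate at
    t_stress_c to the failure rate at t_use_c (temperatures in Celsius)."""
    t_use = t_use_c + 273.15       # convert to Kelvin
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / BOLTZMANN_EV) * (1.0 / t_use - 1.0 / t_stress))

# With Ea = 0.7 eV, going from 25C to 35C more than doubles the modelled
# failure rate, which is roughly where the "every 10 degrees halves the
# life" rule of thumb comes from.
print(acceleration_factor(25, 35))
```

              Under this model the "halving per 10 degrees" is only approximate; the exact factor depends on the activation energy of the dominant failure mechanism.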

              1. Stephen Booth

                Re: Data centers can be run much warmer

                The aim is not to increase the operating temperature of the components but the inlet temperature of the cooling. "Cooling" is not about temperature, it is about energy removal. Energy removal depends on the specific heat of the cooling medium, the flow rate and the temperature change. If you use a cooling medium with a high specific heat (water) and move a lot of it really close to the source of energy, you can get that energy out without having to chill it to really low temperatures first.

                Secondly, there are other important factors driving equipment replacement cycles: if you are planning to replace the equipment faster than it wears out, you might live with a higher degradation rate if it reduces your power bill. This kind of thinking really only comes into its own for HPC or hyperscale, where you are buying equipment by the room rather than by the rack.

                Free-cool and hot-water cooling are not exactly the same thing. Free-cool requires highish input temperatures (higher than ambient); hot-water cooling is about having a high output temperature so you can use the waste heat usefully. The strangest proposal I ever saw (unfortunately I can't seem to find a reference) was to use the waste heat to "digest" cow waste, producing methane to generate more electricity.
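                A back-of-the-envelope version of that energy-removal point, Q = mass flow × specific heat × ΔT, with illustrative numbers (the 30 kW rack and 10 K rise are assumptions; the specific heats are standard values):

```python
def heat_removed_kw(mass_flow_kg_s: float, cp_kj_kg_k: float, delta_t_k: float) -> float:
    """Heat carried away by a coolant stream: Q = m_dot * c_p * dT (in kW)."""
    return mass_flow_kg_s * cp_kj_kg_k * delta_t_k

CP_AIR = 1.005    # kJ/(kg*K), air at room temperature
CP_WATER = 4.186  # kJ/(kg*K), liquid water

# Coolant needed to remove 30 kW from a rack with a 10 K temperature rise:
air_flow = 30.0 / (CP_AIR * 10)      # ~3 kg/s of air (roughly 2.5 m^3/s)
water_flow = 30.0 / (CP_WATER * 10)  # ~0.72 kg/s of water (~0.72 L/s)
```

                Water's specific heat is roughly four times that of air, and its far higher density means the volumetric flow required is smaller still, which is why moving water close to the heat source works at such modest temperature differences.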

    2. razorfishsl

      Re: Data centers can be run much warmer

      Please don't talk crap.

      For every 10 degrees above 25 you halve the life of electronics; this is a GIVEN.

      The fact that disk drives claim to run at 65 deg is a fallacy and goes against the well-established physics of electronic design. Go ahead, write data onto a disk surface at 65 deg... then see what happens to your error rate on a cool day.

      Just because you "delivered" kit does not give any indication of the failure rates. Show some data: do you have any long-term analysis available?

      1. Kevin McMurtrie Silver badge

        Re: Data centers can be run much warmer

        The efficiency of a datacenter is calculated as computational power per cost. Extra cooling makes that ratio worse and it's for nothing. Why would you want to extend the life of a system from 5 years to 20 years with extra cooling? Old systems use more power and space than they're worth.

      2. woodman

        Re: Data centers can be run much warmer

        Google have done a lot of research into this using real-world data, i.e. their own hardware failure rates.

        I read one of their papers a couple of years back, and they found that disk drives failed less frequently when they ran at an input temperature of 37C (if memory serves me correctly) as opposed to 19-20C.

        This was using a sample of tens of thousands of disks.

        So check out their research for some real data.

  5. JeffyPoooh Silver badge

    Hmmm... An idea...

    We have 'a box' (can't say more) that runs at about 60°C or higher (removable media literally too hot to handle) and is as reliable as a wood burning stove. Reason being, it was designed and built with industrial grade ICs and other bits that don't care about such high temperatures.

    Maybe it would be CHEAPER to design servers to run hot reliably to save vast sums of money on the cooling side...

  6. John Robson Silver badge

    My "data centre in my loft" used to get so hot...

    ...that the rubber feet on the keyboard melted into the top of one of the PC cases.

    Only occasionally did the CPU temp alarms go off (set to 85), despite all CPUs being maxed out by SETI@home, and the loft space was seriously toasty!

  7. Gravesender

    Nothing new

    When I was at college 50 years ago I picked up beer money by working as a transmitter minder for various broadcast stations around Pittsburgh. It was common practice there to use waste heat from the transmitters to heat the transmitter buildings.

    There was one AM station that used plate modulation. The 5KW modulation transformer made enough noise that you could monitor the programming in cold weather by the sound coming from the heating vents. In the summer, when the transmitter was vented to the outside, you could hear it all over the antenna farm.

  8. Norm DePlume

    Bowmore have been heating the next door swimming pool with the waste heat from the distillery for years. The IT connection? Apparently it's computer controlled.

  9. John Jennings

    Brew beer, if it's kept constant at 20C.

  10. Stevie Silver badge


    I invite the readers to look back through my comment chain for the original thinking on this "ambient cooling" theory.

    Or as I call it, the "For fuck's sake stop howling about a busted A/C and imminent server outages and open a bloody window before the SAN appliances melt" theory.

  11. Anonymous Coward
    Anonymous Coward

    Interesting.. But I've also seen two studies that make me think twice about raising the inlet temperature..

    The first was around reliability. Basically, heat is an enemy of electronics, and the reliability of your systems in the DC goes down as temperature goes up. You won't notice this on day 1, but over time you will. I've even seen studies where systems at the top of the rack are more likely to fail than at the bottom (heat rises, and the top of the rack is hotter than the bottom, affecting reliability).

    The second study was about energy consumption in relation to ambient temperature. Did you know that a server will use more energy as the inlet temperature increases? Oh, and guess what happens to that extra energy. Yes, it's converted into heat. Modern servers have many mechanisms built in, and may also throttle CPU speed if temperature increases, so your performance goes down. (Or you need more systems to do the same work, resulting in more heat.)

    So while increasing ambient temperature may help reduce the energy bill for the air conditioning, it will increase the energy bill for your actual computing, and reduce the reliability. So it would be interesting to see a study that combines these effects to get a view on overall effects.
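    One mechanism behind that extra draw is fan power, which by the fan affinity laws scales with the cube of fan speed. A toy calculation (the wattages and speed ratio below are made up for illustration, not measurements from any real server):

```python
def fan_power_w(base_power_w: float, speed_ratio: float) -> float:
    """Fan affinity law: power scales with the cube of the speed ratio."""
    return base_power_w * speed_ratio ** 3

# Hypothetical server: 10 fans drawing 12 W each at baseline speed.
base = 10 * 12.0

# If a higher inlet temperature forces the fans to spin 20% faster,
# power draw rises by ~73% - and all of that extra power becomes heat.
hotter = fan_power_w(base, 1.2)  # 120 W -> ~207 W
```

    Leakage current in the silicon also rises with temperature, so the fans are not the whole story, but they are often the most visible part of the effect.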

    1. Anonymous Coward
      Anonymous Coward

      > Did you know that a server will use more energy as the inlet temperature increases?

      Yes, I've been able to correlate UPS load vs outside temperature for our server room. We have no chillers, just ambient air cooling - with all the issues mentioned, it's surprising how dirty "clean" air is :-(

      I can tell how warm the server room is by ear.

      (Anon for obvious reasons)

      But back to the article. Effectively it's suggesting a return to "combined heat and <something>" schemes that have been around for... well, longer than I have. As pointed out, the problem is that the temperature isn't really high enough for a lot of uses. At say 30 to 40˚C it's just not warm enough without re-engineering your requirements - in simple terms, fitting massive radiators in place of the old slimline models.

      Boosting the temperature with heat pumps would be a good system - feed the warm water about and let householders extract what they want, either with massive radiators or water-water heat pumps to boost it to suitable levels for things like domestic hot water (which needs to be around 60˚C or more to ensure legionella bacteria are eliminated). With a nice warm-water heat source, a heat pump could probably heat a store to 60 or 70˚C quite cost effectively.

      But look at it from the householder's budget POV. Depending on tariff, lecky is typically something like 3 to 4 times the price of gas per unit of heat. Allow for the heat pump having a COP of around 3 (which it probably would for those operating conditions), and it costs about the same in lecky as it would to run a gas boiler. So no running cost savings then - assuming you get the warm water for free.
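      That arithmetic, spelled out with placeholder tariffs (these prices are illustrative, not real figures; the point is the ratio between them, not the absolute numbers):

```python
# Placeholder unit prices in pence per kWh - not actual tariffs.
GAS_PENCE_PER_KWH = 4.0
ELEC_PENCE_PER_KWH = 14.0  # ~3.5x the gas price
COP = 3.0                  # assumed heat pump coefficient of performance
BOILER_EFFICIENCY = 0.9    # assumed condensing gas boiler efficiency

# Cost per kWh of *delivered heat* for each option:
cost_gas_heat = GAS_PENCE_PER_KWH / BOILER_EFFICIENCY  # ~4.4 p/kWh
cost_heat_pump = ELEC_PENCE_PER_KWH / COP              # ~4.7 p/kWh
```

      With electricity at 3-4x the gas price and a COP of about 3, the two come out within a few tenths of a penny per kWh of heat, which is the commenter's point: the heat pump only wins if the source heat is free and the COP is high.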

      Run the central heating directly off the low cost heat (fit bigger rads) and you can save - again assuming you don't have to pay too much for the warm water.

      But whichever way you slice it, the savings are not huge (or even non-existent), but carry significant up-front investment - replace all the rads, install a heat pump for the hot water.

      And unless you get a guaranteed heat supply (which you won't if you are paying nothing for it), you'll still need the gas boiler you already had for backup!

      And don't get me started on the costs of digging the roads up to get the bulk pipes in!

      Now, for greenfield installations the economics are a bit better as you can design the systems from the outset to suit the heat supply. But it'll still cost more than "throw in a gas boiler", for possibly small savings.

  12. Ken 16 Silver badge
    Thumb Up


    Could we insert that term in a few more places? I think it's been underused.
