Cool-headed boffins overcome sticky issue: Graphene-based film could turn heat down

A graphene-based film could help to cool down overheating microelectronic devices, scientists have claimed in a new study. Researchers at Chalmers University, led by professor Johan Liu, have developed the 20 micrometre-thick film, which apparently has a thermal conductivity capacity of 1,600 W/mK – four times that of copper …

  1. James 51

    Could it be used in central heating too? If there were a form of graphene that could absorb heat as well, it could lead to an air conditioning/central heating revolution.

    1. Anonymous Coward
      Anonymous Coward

      WTF?

      It absorbs heat from a higher-temperature surface; it isn't a sponge that can suck heat in and push it out whenever you want! If you're looking for Maxwell's demon, keep looking: graphene film is no more able to do this than a chunk of copper or aluminum sitting in your living room.

      1. Six_Degrees

        Re: WTF?

        Damn. And that refrigerator-size block of pure copper I put in the living room wasn't cheap.

      2. James 51

        Re: WTF?

        Which makes it sound like a good material to make radiators out of, and probably light enough to start doing some interesting things with the design.

        1. Anonymous Coward
          Anonymous Coward

          Re: WTF?

          Just your single rad will cost more than your house

  2. Bronek Kozicki

    That's interesting, but of course, this being research, there is no way to tell how many years we are from commercial applications (if we ever get there).

  3. John Smith 19 Gold badge
    Thumb Up

    Note it's 4x copper and a *static* system

    Heat pipes can do this but there is always the risk of a leak.

    Now can they mass produce it?

    1. Mark 85

      Re: Note it's 4x copper and a *static* system

      I guess you haven't heard that Special Services will be re-working the plumbing in your house/building? Once they start making pipes of the stuff, we'll schedule Archibald Tuttle to come by and do the work.

    2. cray74

      Re: Note it's 4x copper and a *static* system

      "Now can they mass produce it?"

      A good question, and the answers for similar carbon allotropes like carbon nanotubes have kept them out of commercial markets for decades. Which leads to another question: why not start with graphite or diamond? Both have been in development for longer, have more manufacturing options, and can deliver similar thermal conductivity. Diamond has the advantage of less anisotropic thermal conductivity, so you won't have to worry about the difference between in-plane and out-of-plane conductivity as you do with graphite and graphene.

      Or...

      "Heat pipes can do this but there is always the risk of a leak."

      Heat pipes can do this, and at much, much higher effective thermal conductivities. Depending on the size and working fluids, a heat pipe can have 10 to 10,000 times the thermal conductivity of copper, which translates to about 2.5 to 2,500 times the thermal conductivity of graphene.
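
      A quick back-of-envelope check on those multiples, assuming copper at roughly 400 W/mK and taking the article's 1,600 W/mK for the film (a rough sketch, not a definitive figure):

        # Heat-pipe effective conductivity is quoted as 10 to 10,000 times copper;
        # re-express that as multiples of the graphene film's quoted 1,600 W/mK.
        # Assumption: copper at roughly 400 W/mK.
        k_copper = 400.0
        k_film = 1600.0
        for times_copper in (10, 10_000):
            k_heat_pipe = times_copper * k_copper
            print(f"{times_copper}x copper = {k_heat_pipe:,.0f} W/mK "
                  f"= {k_heat_pipe / k_film:g}x the graphene film")
        # -> 10x copper is 2.5x the film; 10,000x copper is 2500x the film.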

      1. Anonymous Coward
        Anonymous Coward

        Re: Note it's 4x copper and a *static* system

        And Corning has had a process for quite some time that can plate out diamond using a maser and carbon monoxide gas. They've never done anything with it, so far as I can tell.

  4. Tromos

    If the LEDs were highly efficient...

    ...they wouldn't be in need of cooling!

    1. Mage Silver badge
      Boffin

      Re: If the LEDs were highly efficient...

      1) You'd be able to turn up the power

      2) Cooled LEDs are more efficient.

      Though presumably the graphene needs to be on the rear of the chip.

      1. Robert Helpmann??
        Childcatcher

        Re: If the LEDs were highly efficient...

        "Though presumably the graphene needs to be on the rear of the chip."

        Not necessarily; graphene is fairly transparent, absorbing about 2% of light per layer. The addition of APTES to the mix shouldn't make much difference, as graphene's transparency is due to it being so thin. It might make sense to coat the majority of an LED if the loss of efficiency from covering the business end were more than offset by gains from heat dispersion, especially if it is easier to manufacture them that way compared to coating only a smaller area.
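
        For what it's worth, a rough sketch of how a per-layer figure like that compounds, assuming the commonly quoted ~2.3% absorption per monolayer (the exact figure for this APTES-bonded film isn't given):

          # Optical transmission through N stacked graphene monolayers,
          # assuming ~2.3% absorption per layer (commonly quoted single-layer
          # figure; the film described in the article may differ).
          absorption_per_layer = 0.023
          for n in (1, 10, 100, 1000):
              transmission = (1 - absorption_per_layer) ** n
              print(f"{n} layers: {transmission:.1%} transmitted")
          # A 20 um film stacks tens of thousands of layers, so the
          # transparency argument really only applies to very thin coatings.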

        1. cray74

          Re: If the LEDs were highly efficient...

          "Not necessarily; graphene is fairly transparent, absorbing about 2% of light per layer."

          2% per layer of what, individual graphene carbon atom sheets?

    2. Six_Degrees

      Re: If the LEDs were highly efficient...

      No. That would be true if they were claimed to be completely efficient. They can still be extremely efficient while emitting some heat.

  5. Charles 9

    Correct me if I'm wrong, but an innovation such as this is meant to be a better means of transferring heat produced IN the chip to the exterior to be drawn off by whatever means are at hand? IOW, it's not intended to replace heat sinks and heat pipes but rather to complement them and make them better at their job?

    1. Anonymous Coward
      Anonymous Coward

      You're correct, but the important thing to note is that as chip dimensions shrink, the hot spots become more and more concentrated, and because silicon's thermal conductivity is less than a tenth of this material's, heat doesn't spread laterally through the chip very well.

      By having a high conductivity material affixed to the silicon die, the heat will be drawn out of the silicon and into the graphene film - where it can then be drawn out of the graphene by your existing heatsink. The heatsink would have a lower thermal conductivity than the graphene film but its mass will be far greater than the chip+graphene and thus be able to soak up a lot of heat while still maintaining an acceptable thermal gradient for good performance (which can be helped via fans, heat pipes, etc.)

      I wonder if it would be possible to put the graphene layer in between stacked chips, as heat removal from stacked dies is one of the limiting factors of the current state of the art.

      1. Bronek Kozicki

        Yup, exactly; that's what is interesting about this technology: removing heat faster and more efficiently from inside the chip will make it work at lower temperatures, that is, more efficiently. Faster GPUs, CPUs and memory, ahoy!

        But, as I wrote above, we have no idea when this will come to fruition (if ever).

      2. JP19

        "high conductivity material affixed to the silicon die"

      Trouble is, they are talking about a 20µm-thick film, so there is almost no temperature difference across the film anyway.

      If I got the sums right, the thermal resistance of this film under a 1 square cm chip would be 125 µK/W.

      So dissipating 100 W, there would be 0.0125 K across the film, and 0.05 K if it were copper instead.
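
      For reference, a sketch of that arithmetic, treating through-plane conduction as R = t / (k * A) and assuming the article's 1,600 W/mK for the film and roughly 400 W/mK for copper:

        # Through-plane thermal resistance R = t / (k * A), then delta-T = P * R.
        # Assumptions: film at 1,600 W/mK (from the article), copper at ~400 W/mK,
        # a 1 cm x 1 cm chip dissipating 100 W.
        thickness = 20e-6   # m
        area = 1e-4         # m^2
        power = 100.0       # W
        for name, k in (("graphene film", 1600.0), ("copper", 400.0)):
            r = thickness / (k * area)   # K/W
            print(f"{name}: R = {r * 1e6:.0f} uK/W, "
                  f"delta-T at {power:.0f} W = {power * r:.4f} K")
        # -> film: 125 uK/W and 0.0125 K; copper: 500 uK/W and 0.05 K.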

  6. Kaltern

    Another day, another use for graphene that will never actually be used in practical ways.

    For a 'wonder material', it's pretty underused...

    1. FelixReg

      Never is a long time

      Kaltern, true enough, for now.

      But consider this: Back in the '70s there was a similar bit of tech that had been "promising" for so long that its name was synonymous with "cool, but useless".

      The name? "Lasers".

  7. Six_Degrees

    "Researchers at Chalmers University, led by professor Johan Liu, have developed the 20 micrometre-thick film, which apparently has a thermal conductivity capacity of 1,600 W/mK"

    "Apparently"? Didn't they measure it?

    I'm hearing bold predictions about uses of graphene on a monthly basis, and have been for years now. But I'm not seeing any of them come to market. Please let me know when someone says graphene WILL do something, rather than COULD do something.

  8. bob, mon!

    W/mK?

    Watts/milli-Kelvin? That's quite a thermal conductivity.

  9. Chris Evans

    Where?

    For those wondering: Chalmers University of Technology is in Gothenburg, Sweden.
