Facebook 'open sources' custom server and data center designs

Facebook has "open sourced" the specifications and design documents for the custom-built servers, racks, and other equipment used in its new Prineville, Oregon data center, the first data center designed, built, and owned by the company itself. On Thursday morning, at Facebook headquarters in Palo Alto, California, the social- …

COMMENTS

This topic is closed for new posts.
  1. JeffUK
    Flame

    The temperature will range from 65-80 degrees!

    OMG, and we all thought Google were being cavalier when they let theirs get to 26 degrees!

    Flame because facebook's DC is LITERALLY on fire.

    1. Tom 38
      Boffin

      @JeffUK

      I think they are talking Fahrenheit rather than Celsius.

      In real measurement units it's 18°C <-> 27°C

      Or at least I hope so, for the sake of their techies!
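
      A quick sanity check of that conversion, using only the 65-80°F range quoted above and the standard Fahrenheit-to-Celsius formula:

      ```python
      # Convert the quoted 65-80 °F intake range to Celsius.
      def f_to_c(f):
          return (f - 32) * 5.0 / 9.0

      for f in (65, 80):
          print(f"{f} °F = {f_to_c(f):.1f} °C")
      # 65 °F = 18.3 °C, 80 °F = 26.7 °C, i.e. roughly 18-27 °C.
      ```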

    2. Anton Ivanov
      Grenade

      Literally, if it catches fire, there will be no way to extinguish it

      A battery in every rack? No boom today. Boom tomorrow. There will always be a boom tomorrow. BOOM!!!

      I would not want to be anywhere near this if it catches fire. What's next? Operating a datacenter doused in petrol? Who cares about the cooling; how did they manage to achieve fire safety (and health and safety) compliance for the fire suppression system?

      As for the higher air intake temperature, that is actually not so bad. Because of the wider spacing and overall better cooling for each shelf, you can probably run even >26°C intake air and it will still operate within thermal norms where it matters - at the silicon.

  2. LaeMing
    Thumb Up

    LaeMing likes this.

    :D

  3. Adus

    Nice

    I don't much like Facebook due to their cavalier attitude to privacy, but as a network programmer who works on cloud services frequently, it's infuriating that everyone has different server designs when for the most part they are all doing the same thing. I share their hope that this will help standardise things.

  4. RJ

    Not being a data center expert

    Is that some sort of basic airlock in a hot aisle / cold aisle setup in the second picture?

    1. Getter lvl70 Druid
      Badgers

      Badgers.....

      ...are held against their will behind there until the Lvl2.01 GrandWizard Zuck requires them for an arcane ritual, held every month on the second Tuesday night in the local Apple Store basement, that removes their paws.

      Where PETA is on this, I don't know....

      ;)

      PS: Don't know why you got downvoted - upvoted you to even it out. Have a great day!

  5. DaemonProcess

    EMR ?

    Thousands of open motherboards.

    I take it that the designers have checked for potential safety issues with all those microwaves. Or do the engineers have to wear Faraday suits?

    1. defiler

      Would it be cheaper that way?

      Option A - put metal shielding on *every* server.

      Option B - keep a couple of Faraday suits on hand for when people need to go in there. And you shouldn't need many people to go in there terribly often.

      If it's cheaper, then why not?

    2. Anonymous Coward
      Anonymous Coward

      "all those microwaves"

      What microwaves?

    3. Steven Jones

      Microwaves?

      Microwave frequencies are generally defined as starting at around 1GHz. Given that system bus speeds are generally in the few hundreds of MHz, there's not much danger of significant amounts of microwaves being emitted. Once they start powering buses from cavity magnetrons, that's maybe the time to worry.

      However, I'm sure they will have had to pay attention to RFI issues in the motherboard designs. Otherwise it's not conducive to reliable operation.

  6. petur
    Boffin

    motherboards on shelves

    This is exactly what Google has been doing from day one (their first server consisted of a rack with shelves, each shelf hosting 4 motherboards)

  7. Anonymous Coward
    WTF?

    Why not vertical?

    Never understood why people don't build these data-centres with the motherboards vertical, to aid natural air circulation. Seems the horizontal layout gets in the way of air flow that is trying to happen anyway. And why aren’t the racks configured as a chimney/plenum chamber, to further aid natural cooling?

    1. Anonymous Coward
      Anonymous Coward

      Hmm

      Wouldn't that mean that the motherboards at the top get a lot less cooling? At least in the hot/cold aisle environment I think is depicted in the picture.

      In a hot/cold aisle setup each server has equal access to cold air at the front of the rack, and the hot air passes out the back having cooled only the servers in that horizontal plane.

      (ASCII side view, fear my ascii skills)

      If you mounted them vertically, the warm air from the lower servers would migrate up towards the servers at the top before it had passed through to the back, which would cause trouble.

      Not a datacentre expert but that was a good question which did make me ponder for a bit.
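
      A rough back-of-the-envelope sketch of why sharing an airstream in series hurts (the per-server power and airflow figures below are assumptions for illustration, not from the article):

      ```python
      # Illustrative only: how intake temperature would climb if servers shared
      # the same airstream in series. Power and airflow figures are assumptions.
      CP_AIR = 1005.0   # specific heat of air, J/(kg*K)
      RHO_AIR = 1.2     # density of air, kg/m^3

      power_w = 200.0   # assumed heat output per server, W
      flow_m3s = 0.014  # assumed airflow per server (~30 CFM), m^3/s

      # Temperature rise of the air after one server: Q = m_dot * cp * dT
      delta_t = power_w / (RHO_AIR * flow_m3s * CP_AIR)

      ambient = 27.0    # cold-aisle air at the top of the quoted 18-27 °C range
      for n in range(1, 4):
          print(f"server {n} intake: {ambient + (n - 1) * delta_t:.1f} °C")
      # In a hot/cold aisle layout every server sees ~27 °C at its intake;
      # stacked in series, the third server would already see roughly 51 °C.
      ```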

  8. Anonymous Coward
    Go

    err

    I don't understand the Register's animosity towards Google.

    They have contributed more to open source than Facebook ever will.

    Just because some things are kept hidden shouldn't give you a reason to chide Google.

  9. Anonymous Coward
    Stop

    277 volts to the board..?

    The picture on the 2nd page shows the cut-down dual-socket AMD board that Facebook are using. Below it, the text states "There are two input connectors on the motherboards, one for the 277 volt input and another for the 48 volt battery backup system."

    I am highly sceptical about that. I would have thought that the tracks and spacing on a circuit board like this would not be sufficient to handle and isolate such large potential differences.

    Also, one would expect to see large regulator components on the board such as those found in traditional PSUs that are responsible for stepping down hundreds of volts to the rather more meagre 1.5V required for the core.
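
    For what it's worth, 277 V is not an arbitrary figure: it is the line-to-neutral voltage of a US-style 480 V three-phase supply. A minimal sketch of that arithmetic (the 480 V building feed is an assumption, not something stated in the article):

    ```python
    # Sketch: where a 277 V figure typically comes from.
    # Assumption: a US-style 480 V line-to-line, three-phase building feed.
    import math

    line_to_line = 480.0
    line_to_neutral = line_to_line / math.sqrt(3)

    print(f"line-to-neutral: {line_to_neutral:.1f} V")  # ~277.1 V
    ```

    That still leaves the commenter's point about on-board isolation and step-down regulation, which the arithmetic alone does not answer.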

  10. Anonymous Coward
    Anonymous Coward

    Hmm...

    Every datacentre that I've ever worked in has had its UPS in a separate room for good reason. An explosion in a UPS is a very bad thing indeed; you really don't want to distribute lead-acid batteries around a datacentre, it's very dangerous. Anyway, how is a UPS a single point of failure if you're running from multiple power sources to each rack/piece of hardware?

    I wonder what they do about EPO, if they're running distributed UPSs. EPO is a very important datacentre safety feature and has saved many a life.

    I'm also wondering about the statement that there are no chillers, just an evaporative cooling system, which would be a chiller, surely?

  11. Anonymous Coward
    Anonymous Coward

    power supply and fans?

    At this generic-server scale you would think they would just eliminate the individual fans and power supplies on each server, and instead have big suction at the back of each rack and a single power supply for each rack, or even several racks.

  12. William Wallace
    Big Brother

    30 years on

    Can't help, reading this, being reminded of the "Naked at 30: Osborne 1 stripped to its chips" story I perused just a couple of minutes ago - especially the picture of the Osborne and the MacBook Air. What size will data centres be in 30 years' time?

  13. Anonymous Coward
    Anonymous Coward

    Storage???

    Anyone notice that there is no mention of which storage vendor is used??? Who cares about the "servers"... let's hear about the storage!!!

  14. Getter lvl70 Druid
    Happy

    <evil> hehe

    Back in the late 90's we were flashing our 33.3k US Robotics modem racks up to 56.6k (woot! that was exciting at the time!) and to do so, had to run down to our colo at Qwest's facilities in Tucson. Security was not much and you got to walk by hundreds of modem racks separated by chain-link fencing to get to your cage amongst all the other ISPs like AOL, CompuServe, etc.

    You can't help but wonder how much damage one man with a supersoaker could do in that situation lol.

    Open racks are easy to work with but damn! One awshit can really mess up your day/week/year/life. And awshits are as likely as death and taxes.

    Off topic and meandering memories coming back.... sitting in that colo with the lights out, watching thousands of people connecting to and disconnecting from the racks, seeing your call failover work repopulating cards after flashing firmware... kinda profound wondering wtf they are all doing at 3am. Good stuff.

