We flew our man Jack Clark into Facebook's desert DATA TOMB. This is what he saw

In the high desert of Oregon, Facebook will store the photos of you, me, and everyone else on its content farm in a cold, low-power digital morgue. This is no typical tomb, but a "cold storage" addition to the company's sprawling data center complex in the city of Prineville. We first reported on the new …

COMMENTS

Anonymous Coward

Good God, that's depressing

I've been to too many sites consisting of nothing but me (usually struggling to find a KVM for the Windows boxes), a security guard and a lot of servers (and the telcos have some rather large places). Fortunately, it doesn't happen often these days, as most things can be done remotely. A tip if you're ever stuck in one of these places and cold: find a rack of HP servers. Nothing blows hot air like a rack of HP servers.

Bronze badge

Re: Good God, that's depressing

Any rack of servers should keep you nice and toasty on its hot side. I regularly used to have a little kip behind the server racks in my old job; the fan noise is basically a white noise generator.

Silver badge
Pint

Re: Good God, that's depressing

I find these places very conducive to meditation. It's as close to the mathematical substrate as you can currently get.

Additionally, the plasticky smell, never-used kitchenettes and the often very affordable snack and drink dispensers are a plus!


Re: Good God, that's depressing

That's true - our old DR site had the best hot chocolate machine I've ever encountered - it left our local cafes for dead, let alone the chains.

Anonymous Coward

Impressive. Especially when compared with MySpace's data centre: http://m2.i.pbase.com/g1/19/728119/3/113764222.7KHOM2L6.jpg

Bronze badge
Joke

Re: Especially when compared with MySpace's data centre

Nah!!!!!

I thought that was a picture of the hosting site for Mickeysoft's Office 365 servers.


"a digital archive of rarely accessed but useful, static information."

2 out of 3 ain't bad :-P

Anonymous Coward

It takes a lot of resources

To be truly Evil like Facebook. They make the NSA proud.

Anonymous Coward

Re: It takes a lot of resources

But at least they use Anderson connectors for high-current DC - they do have some taste.


Expansion

If the existing SMR drives are so sensitive to vibration that other drives spinning freak them out, then how does FB manage installation of new racks? Do they have to shut down whole racks of disks?

Silver badge

Re: Expansion

Maybe they run a RAID-1 type arrangement with another rack, far away, as a mirror. Or something like that.

Silver badge

Re: Expansion @The Dark Lord

These are telco-style machine rooms: no suspended floor, and wiring from above.

The floors are solid sealed concrete, so probably don't vibrate too much.

Anonymous Coward

Re: Expansion

I'm wondering if that's the reason for so much space between individual racks in the photos.

Silver badge

Re: Expansion @The Dark Lord

> These are telco-style machine rooms

That's interesting; I know the local incumbent operator puts its switches onto a raised floor. This may of course vary with location...


Re: Expansion

Us IT engineers aren't getting any thinner, you know!

Silver badge

Re: Expansion

"If the existing SMR drives are so sensitive to vibration that other drives spinning freak them out, then how does FB manage installation of new racks? Do they have to shut down whole racks of disks?"

It's mentioned in the article. One drive of the fifteen on each 1U blade can be powered up at any time; there is enough physical vibration separation between the blades to allow this. From my count (from the photos), there are 32 blades per rack, so that's 32 drives powered up at any one time, out of the 480 drives per rack.

Extrapolating the maths: of the stated total of 1PB per rack, that's roughly 30TB per blade and 2TB per drive. It's also stated that the facility has a total capacity of a thousand of those racks, for a total of 1EB.

That's a WHOLE lot of kitten photos...
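For what it's worth, here's that arithmetic as a quick Python sketch (the 32-blade count is my estimate from the photos, not an official figure):

# Back-of-the-envelope capacity maths for one cold-storage rack
DRIVES_PER_BLADE = 15       # per the article
BLADES_PER_RACK = 32        # my count from the photos
RACK_CAPACITY_TB = 1000     # ~1PB per rack, as stated
RACKS_IN_FACILITY = 1000    # "a thousand of those racks"

blade_tb = RACK_CAPACITY_TB / BLADES_PER_RACK      # ~31TB per blade
drive_tb = blade_tb / DRIVES_PER_BLADE             # ~2TB per drive
spinning = BLADES_PER_RACK                         # one live drive per blade
total_drives = DRIVES_PER_BLADE * BLADES_PER_RACK  # 480 drives per rack
facility_pb = RACK_CAPACITY_TB * RACKS_IN_FACILITY / 1000

print(f"{blade_tb:.1f}TB per blade, {drive_tb:.2f}TB per drive")
print(f"{spinning} of {total_drives} drives spinning per rack")
print(f"facility total: {facility_pb:.0f}PB, i.e. about 1EB")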

Silver badge

@Destroy All Monsters

My knowledge of telco machine rooms may be rather dated, but it used to be that almost all telephone exchanges put the kit directly on a concrete floor because of the weight (a practice that evolved from having vast and very heavy mechanical exchanges). With a solid floor for load bearing, it made sense to take the cables up to the ceiling. Old habits die hard, and many modern exchanges were installed in old buildings.

It may be that modern electronic exchanges more closely resemble computer machine rooms, but in this case, you can see from the picture that it is a solid floor, with the cabling to the ceiling.

Silver badge

Re: Expansion @John Tserkezis

The Dark Lord was commenting about moving new kit into the machine room (normally this involves rolling it across the floor, which on a suspended floor would cause significant vibration - certainly more than having the disks powered up and the heads moving).

I would actually have thought that the main reason the disks were powered down was power consumption and temperature, rather than vibration. Disks are not that fragile.

480 drives in a rack is not that dense. 384 disks in a 4U rack-mounted enclosure in a 30" rack is much higher density (I have a rack with 5 of these disk enclosures in each of the HPCs I look after, totalling 1920 disks in 20U of space - about half a rack), and all of those are spinning all the time.
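To put numbers on that density comparison, a short sketch in Python (reusing the 32-blade count assumed earlier in the thread):

# Drives per rack unit (U) for the two setups mentioned above
fb_drives, fb_units = 480, 32    # 32 x 1U cold-storage blades
hpc_drives, hpc_units = 384, 4   # one dense 4U disk enclosure

print(f"cold storage: {fb_drives / fb_units:.0f} drives per U")    # 15
print(f"4U enclosure: {hpc_drives / hpc_units:.0f} drives per U")  # 96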

Anonymous Coward

Re: Expansion @John Tserkezis

The SMR (Shingled Magnetic Recording) disks apparently are that sensitive. They do not seem to be your standard consumer disk drives.

http://www.hgst.com/science-of-storage/emerging-technologies/shingled-magnetic-recording

Bronze badge

Impressive. Brings to mind thoughts of the Library of Alexandria.

Another thought is that the data stored is essentially a log of human life. Even if we don't care about those drunken party photos, historians 2,000 years from now might. The next step would be to get the data off magnetic media entirely and onto true long-term storage that would survive everything from nuclear strikes to the degradations of time, in order to be useful to the future.


I believe DNA is the big hope for super-duper long-term storage; that stuff holds for millions of years.

Bronze badge

Also, DNA is great fun when you need to make a backup/child copy ...

Silver badge
Headmaster

DNA only holds for any amount of time if it has been well-packaged in a protective environment (no oxidizers, low radioactivity, not too hot, and probably some chaperoning-molecule magic too) and duplicated to hell and back.

Thumb Up

Use papyrus - we know that lasts for thousands of years!


Mmm

I love a nice datacentre. And I very much like the ideas around power consumption; I've got some tough targets to hit.

Silver badge
Mushroom

Fox Mulder mystery venue

Anyone have ICBM coordinates?

Pretty sure Amazon's EC2 is right next door, too.

Nuke pic because appropriate - I would NEVER hurt a datacentre.

Bronze badge

Seems to be an Oversized Data Room, not as efficient as it could be.

In addition to "free cooling" using cold ambient outside air (the reason this building is in Oregon), the latest idea for reducing the cost of cooling is to close up the racks and deliver the air directly to each rack at the bottom, ducting the hot air away from the top. This reduces the overall cooling CFM required by a factor of 5 to 10. Instead of cooling lots of empty space, you bring the air in ONLY where it's needed. The old approach is "hot aisle/cold aisle", which is not very efficient: it relies on an open rack and an open room, with no air velocity or real directionality. Closing the sides of the racks lets all of the fan volume blow cool air over the servers, and all of the hot air is sent to exhaust so it cannot add heat to the room. The fans are ECM, so they can run at variable speeds, reducing the cost of electricity further.

The next change is to use lower-power servers and processors, and to raise the allowable maximum temperature to around 75 to 78F from 50 to 60F. That is a huge difference in cooling cost alone, and the latest CPUs are now efficient enough to allow the higher room temperature.

These modifications could drop the electricity used for cooling by another 10-15%.

However, some server manufacturers are now introducing liquid cooling direct to each server in the rack. This should add at least another 15% efficiency, as air is a horrible heat-transfer fluid while liquid cooling is highly efficient. It would allow the main refrigeration system to reach each processor directly, without having to change cooling media from liquid to air, and be another 8-10% better.
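As a rough illustration of why variable-speed ECM fans save so much: by the standard fan affinity laws, airflow scales linearly with fan speed but fan power scales with the cube of it, so a modest airflow reduction cuts power dramatically. A minimal Python sketch:

# Fan affinity laws: airflow ~ speed, power ~ speed^3
def fan_power_fraction(flow_fraction):
    # Fraction of full fan power needed for a given fraction of airflow
    return flow_fraction ** 3

for flow in (1.0, 0.8, 0.5):
    print(f"{flow:.0%} airflow -> {fan_power_fraction(flow):.1%} fan power")
# 100% airflow -> 100.0% fan power
# 80% airflow -> 51.2% fan power
# 50% airflow -> 12.5% fan power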

Silver badge
Facepalm

Re: Seems to be an Oversized Data Room, not as efficient as it could be.

Yes, I'm sure Facebook didn't think about any of this stuff. Too bad you weren't on their team and could have saved them more money.

Silver badge

Re: Seems to be an Oversized Data Room, not as efficient as it could be.

Not to be a prick, but I have been doing Process Automation & Instrumentation for 23 years, and Building Automation for the last 7, so I have some credibility on how to set up a data room cooling system. You would be surprised at how many companies just "do what we always have" instead of investigating newer concepts that could save a lot of money. That picture of lonely racks spaced far apart just proves it.

FWIW, Yahoo just built a large open-air server farm in Lockport, NY, 10 miles from my home, based on the same "free cooling" principles. The buildings look like chicken coops, with cupolas and louvers running the length of the building above the main roof line and at ground level. Racks are spaced one unit (18") from the next.

Megaphone

Can the Register writers use up-to-date international standards and get the Editor to proof-read it?

In Britain we use Centigrade; we threw out Fahrenheit with the dinosaurs, guys. :)

Or is the author the only IT journalist still using units from the Stone Age? :)

Alternatively use both units please.

(Written by Reg staff) Silver badge

Hi Stuart!

"Can the Register writers use up to date international standards and get the Editor to proof read it?"

Well, I didn't want to ruin the flow of the copy but just for you, I've shoved in Celsius :-)

HTH,

C.

Silver badge

"In Britain we use Centigrade, we threw out Fahrenheit with the dinosaurs guys"

You're still in the dark ages, it's called Celsius now.

http://en.wikipedia.org/wiki/Celsius#Centigrade_and_Celsius

Bronze badge

Fahrenheit vs Celsius

Perhaps the fact that Facebook's server farm is located in the State of Oregon, USA might have something to do with the choice of temperature measurement units?

Sorry to be a dinosaur, but since Fahrenheit has approximately twice as many basic measurement divisions as Celsius does, it is roughly twice as precise a measurement scale. Also, I'll take the Germans over the French any day.
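For reference, the conversion is C = (F - 32) x 5/9; a Fahrenheit degree is 5/9 the size of a Celsius one, which is where the "twice as many divisions" (really 1.8x) comes from. The setpoints quoted upthread, in a few lines of Python:

def f_to_c(f):
    # Standard Fahrenheit-to-Celsius conversion
    return (f - 32) * 5 / 9

for f in (50, 60, 75, 78):  # temperatures mentioned earlier in the thread
    print(f"{f}F = {f_to_c(f):.1f}C")
# 50F = 10.0C, 60F = 15.6C, 75F = 23.9C, 78F = 25.6C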

This topic is closed for new posts.