If you want to be a player in cloud computing, you have to build data centers. Server maker IBM probably thinks it will sell a lot more internal cloud than external cloud, but it realized that customers will want a mix of public and hosted cloud infrastructure to run their workloads. Hence, IBM has shelled out $362m to build a …
Could do better than 50% in a lower temp state
In the U.S., North Carolina is considered a pretty warm part of the country. They could have saved money and gotten better PR in a state with an overall lower average temperature.
Should have done better in North Carolina
Yes, the data centre could have been built elsewhere, but they should be able to do better than this even in North Carolina. Perhaps they are stuck in the stone age, providing close-control humidity and low air temperatures for their mainframes and midrange kit, and thus compromising the entire data centre design and massively overcooling the normal equipment, which is perfectly happy at much higher temperatures.
Probably has more to do with RTP
RTP is unincorporated, so I think the resident companies see tax breaks. Also, NC has relatively cheap electricity due to its many coal-fired plants.
IBM, your HVAC vendor
The IBM repairman now carries a pipe wrench and has a bad case of plumber's butt.
Our IBM CE has been doing pipework for the last year, but then we have water-cooled Power 6 systems!
But it's mostly pre-installed in the racks at the plant and just connected to the customer-provided cold water feed. All the frame-to-server pipework uses intricate zero-leak connectors that allow servers to be removed from the water system without losing water or introducing air into the system.
I'm waiting for the day when people start submerging these containers in lakes and stuff to keep them cool
In Canada, they use water from a lake, which happens to be at 3°C year-round, to cool an entire building or downtown area. Or so I heard.
Wouldn't it make sense to build these places where it's colder?
Or by a river?
Or the sea?
IBM (and many others) are missing a bet.
One acronym: GSHP.
It works, when implemented correctly.
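The reason GSHP (ground-source heat pump) works is that a few metres down, the ground sits near the local annual mean air temperature, so heat is rejected against a much smaller temperature lift than an air-cooled condenser faces on a hot afternoon. A minimal sketch of that effect, using the ideal (Carnot) cooling COP as an upper bound — all temperatures here are illustrative assumptions, not measurements from any real site:

```python
def carnot_cooling_cop(t_cold_c, t_hot_c):
    """Ideal (Carnot) COP for a cooling cycle lifting heat from
    t_cold_c up to t_hot_c. A theoretical upper bound only; real
    machines achieve some fraction of this."""
    t_cold_k = t_cold_c + 273.15
    t_hot_k = t_hot_c + 273.15
    return t_cold_k / (t_hot_k - t_cold_k)

chilled_water_c = 15.0  # assumed evaporator/chilled-water temperature
summer_air_c = 35.0     # hot-afternoon air-cooled condenser (assumption)
ground_loop_c = 20.0    # ~15 C ground plus a 5 C approach (assumption)

print(f"Reject to 35 C air:      COP limit "
      f"{carnot_cooling_cop(chilled_water_c, summer_air_c):.1f}")
print(f"Reject to 20 C ground:   COP limit "
      f"{carnot_cooling_cop(chilled_water_c, ground_loop_c):.1f}")
```

The smaller the lift between the chilled side and the rejection side, the higher the COP ceiling — which is the whole pitch for burying the loop instead of fighting the summer air.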
>In the U.S. North Carolina is considered a pretty warm part of the country.
North Carolina is considered very comfortable, not "pretty warm."
>I'm waiting for the day when people start submerging these containers in lakes and stuff to
>keep them cool
Not going to happen. Not in a lake, anyway: power plants here are under pressure to move away from open-circuit water cooling to air cooling (using water or another coolant in a closed circuit with an air heat exchanger). Warmer water alters the marine ecosystems.
Needs more iced tea
"North Carolina is considered very comfortable, not "'pretty warm.'"
From early June until about mid-September, I don't think I'd call Raleigh's climate "comfortable." Routine 90°F (32°C) days, and 75°F (24°C) on cooler nights, all with dewpoints hovering in the lower 70s. Pretty much the entire I-95 corridor from DC to Florida is miserable during the summer, and would render any attempt to use outside air for cooling futile.
All that said, for probably half the year, using outside air for cooling is feasible.
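That "half the year" guess can be sanity-checked in a few lines: take hourly dry-bulb temperatures and count the hours below an economizer setpoint. The sine-based weather model and the 18°C setpoint below are illustrative assumptions, not Raleigh data — a real study would use measured weather files and account for humidity/dewpoint as well:

```python
import math

ECONOMIZER_SETPOINT_C = 18.0  # assumed max usable outside-air temperature

def hourly_temps_c(daily_mean_c, daily_swing_c, days=365):
    """Generate synthetic hourly temperatures: a yearly sine plus a
    daily sine. A stand-in for real hourly weather data."""
    temps = []
    for day in range(days):
        seasonal = daily_mean_c + 10.0 * math.sin(2 * math.pi * (day - 100) / 365)
        for hour in range(24):
            diurnal = (daily_swing_c / 2) * math.sin(2 * math.pi * (hour - 9) / 24)
            temps.append(seasonal + diurnal)
    return temps

def free_cooling_fraction(temps, setpoint=ECONOMIZER_SETPOINT_C):
    """Fraction of hours cool enough for direct outside-air cooling."""
    usable = sum(1 for t in temps if t <= setpoint)
    return usable / len(temps)

temps = hourly_temps_c(daily_mean_c=15.0, daily_swing_c=8.0)
frac = free_cooling_fraction(temps)
print(f"Free-cooling hours: {frac:.0%} of the year")
```

With these made-up numbers the answer lands in the same rough "half the year" range the comment suggests; the point is that the feasibility question is a simple counting exercise once you have the weather data.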
Paris, because she'd complain about the heat, and what the humidity does to her hair. (Seems to be common for west coast women who travel out this way.)
Reminds me of something closer to home.
Hooray, a future of the UK
It's always cold here, IBM could build a DC in Cardiff or somewhere in Northumberland.
Finally I can see the future becoming bright again (or is that the glow of the RAID clusters?)
A future in Scotland
There is one datacentre, and a second being planned, near Ecclefechan. Should be plenty of cold air there for 50% of the year; all they need to do is open the loading bay doors :)
"It would seem that what data centers need are less expensive IT as well as better data centers. But to suggest that is, of course, heresy to IT vendors that want to preserve revenue streams and keep riding Moore's Law to boost capacity every two years "
Of course, that's where Google comes in: custom-developed data centres aimed at reducing IT spend, built by engineers at least as competent as the IT vendors' own. If IBM et al want to compete, they will have to match Google's cost of ownership eventually, or Google will own the clouds... or at least a large thunderstorm of them...
I live in RTP...
...within walking distance of the IBM compound. No kidding, it's a compound. A good portion of it is behind a razor wire fence, guarded by armed security. It makes perfect sense for IBM to build here. Laws are very business friendly and they already have something like 11,000 employees here.
As for power, most of RTP is hooked up to Duke or Progress Energy, both of whom operate nuclear plants in the area.
"No kidding, it's a compound. A good portion of it is behind a razor wire fence, guarded by armed security." Maybe you should drive around to Davis Drive, where you could just about drive through the gate any day of the week. It is not a "compound" and has about as much security as any other company in the area.
And yes, IBM does have a few thousand employees in RTP - in fact, it is the largest IBM site in the world (by both employees and square footage). The buildings are absolutely massive - and with all the layoffs in the last few years, there is plenty of empty space in those buildings.
IBM has been in RTP for over 40 years, so I doubt tax incentives had anything to do with them picking RTP for the new site. Energy here is cheap. And the server group is based in RTP (as in, they design server software and hardware here). When you put it all together, it makes perfect sense for them to build it in RTP.
Laying Your Hands...
..on the hardware is a tough habit to break. Especially during the design build phase.
Even Google assembles its podified data centers in buildings full of tech types.
But burning black bituminous bites. Sites should stay situated so solar solutions send sparky.
I think Google has a lead with Spanner, although moving threads around the data center to balance heat loads is not that much different from shuttling bits across the continent. Once in service, data transmission is the cheapest part of the equation. Electricity costs have traditionally driven data center location.