I recently stumbled upon a transcript of a very recent interview with HPC luminaries Jack Dongarra (University of Tennessee, Oak Ridge, Top500 list) and Horst Simon (deputy director at Lawrence Berkeley National Lab). The topic? Nothing less than the future of supercomputers. These are pretty good guys to ask, since they're …
If it's going to cost $450m per annum to run an exascale machine, why not site it somewhere with cheaper electricity, like Iceland? What would that do to the figures?
Re: Electricity costs
I think the problem is the sheer scale of the supply required - 450MW is the output of a small coal-fired power station, so you'll need both the available power source and the infrastructure to deal with it.
And then the cooling to get rid of all that heat.
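A quick sanity check on how the 450MW draw and the $450m-per-annum figure line up - the ~$0.11/kWh electricity price below is an assumed illustrative rate, not a figure from the article:

```python
# Back-of-the-envelope: annual electricity bill for a 450 MW machine
# running flat out. The price per kWh is an assumption for illustration.
power_mw = 450
hours_per_year = 24 * 365          # 8760 hours
price_per_kwh = 0.114              # USD, assumed industrial rate

energy_kwh = power_mw * 1000 * hours_per_year
annual_cost = energy_kwh * price_per_kwh
print(f"${annual_cost / 1e6:.0f}m per year")  # prints "$449m per year"
```

So the quoted running cost is consistent with paying roughly eleven cents per kilowatt-hour - which is exactly why cheap Icelandic hydro/geothermal power looks tempting.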
HPC = using electricity to produce heat... so why not co-generate heat and number-crunching horsepower?
Not my invention, I admit - was it here that I read about a supercomputer that doubled as a heat source for a university campus?
Now if you needed to build a supercomputer with hundreds of MW of heat dissipation... you could just as well use it to provide central heating to a fairly big city. Or several smaller ones. For example, there's a coal-fired 500MW power plant about 40 km from Prague, with a heat pipeline running all the way to the city; the waste heat is used for central heating. Not sure if the pipeline still works that way - it was built back in the communist era, when such big projects were easier to pull off...
The trouble with waste heat is that it tends to be available at relatively low, "lukewarm" temperatures. Computers certainly don't appreciate temperatures above, say, 40 degrees Celsius. Then again, there are heating systems that can work with an input of only about 30 degrees Celsius. Underfloor heating probably sums it up - not much of a temperature, but a LAAARGE surface area. No need to heat the water up to 70 degrees or so.
The obvious implication: generate the heat locally, so that you don't need to transport it over a long distance, which is prone to thermal losses and pumping resistance (for a given thermal power throughput, the lower the temperature difference, the larger the volume of medium you have to move per second).
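To put numbers on that last point, here's a sketch of the usual P = m·c·ΔT relation for water-borne heat; the 100MW figure and the two temperature drops are hypothetical examples, not values from the thread:

```python
# Mass flow of water needed to carry a given heat load, for two
# supply/return temperature drops. All figures are illustrative.
C_P_WATER = 4186.0  # specific heat of water, J/(kg*K)

def flow_kg_per_s(heat_w, delta_t_k):
    """Mass flow needed to move heat_w watts at a temperature drop of delta_t_k."""
    return heat_w / (C_P_WATER * delta_t_k)

heat = 100e6  # 100 MW of waste heat
print(flow_kg_per_s(heat, 40))  # ~597 kg/s at a 40 K drop (e.g. 70 -> 30 C)
print(flow_kg_per_s(heat, 10))  # ~2389 kg/s at a 10 K drop - 4x the flow
```

A quarter of the temperature drop means four times the water per second - hence the fat, lossy pipelines if the heat is lukewarm and has to travel far.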
The final conclusion: sell small electric heaters, maybe starting at a few kW of continuous power consumption, mains-powered and interconnected by fibre optics, where the heat is generated by HPC number crunching. Build a massive distributed supercomputer :-) Yes, I know there are some show-stopping drawbacks - but the concept is fun, isn't it? :-)
And for power mad HTPC fans, no exascale for you either because you're f****** bananas and keep dribbling into the fan vents and chewing on the cat.
Just build a nuclear plant.
Of course research leading to more power-efficient HPC needs to continue, but what's wrong with building a new 2GW power plant (a quick googling says ~$2-5bn) with a new exa-number-cruncher on the side? The extra power could be sold to help offset the computing hardware costs, and as HPC efficiency improves, more computing capacity could be added. Not to mention that the lifetime of the plant far exceeds that of any particular HPC setup, and you get the added bonus of packing that many more smart people under one roof (physicists/mathematicians/comp engineers etc.). To get around the NIMBY/eco-nut problem, just tack the whole thing onto an existing research lab such as Oak Ridge or Sandia National Laboratories.
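A rough sketch of how the "sell the surplus" arithmetic might work out - the $0.05/kWh wholesale rate is an assumption for illustration, not a figure from the post:

```python
# Surplus-power revenue: a 2 GW plant feeding a 450 MW machine,
# with the remainder sold at an assumed wholesale rate.
plant_mw, machine_mw = 2000, 450
wholesale_per_kwh = 0.05  # USD, assumed - not a quoted figure

surplus_kwh = (plant_mw - machine_mw) * 1000 * 24 * 365
revenue = surplus_kwh * wholesale_per_kwh
print(f"${revenue / 1e6:.0f}m per year")  # prints "$679m per year"
```

At those assumed numbers the surplus sales would cover the machine's own power bill with plenty left over - though they wouldn't come close to amortising the $2-5bn plant quickly.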
Re: Just build a nuclear plant.
To build a nuke plant in the States these days requires 2 acts of Congress and 4 acts of $deity. So while technically doable, your faster bet would be to wait for the tech to catch up...
It will have to continue while half the power is wasted driving clock lines.
I think people estimate it takes about 1 petaflops to simulate the human brain.
And the human brain's power consumption is around 20W.
Sounds like they have a long way to go, does it not?
20W for that much neural computing horsepower...
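Taking the two figures from this thread at face value (the ~1 petaflop/s brain estimate and a 1 exaflop/s machine drawing 450MW), the efficiency gap works out like this:

```python
# FLOPS-per-watt comparison using the thread's own figures:
# ~1 petaflop/s at 20 W for the brain, 1 exaflop/s at 450 MW for the machine.
brain_flops, brain_watts = 1e15, 20
exa_flops, exa_watts = 1e18, 450e6

brain_eff = brain_flops / brain_watts  # 5e13 FLOPS/W = 50 TFLOPS/W
exa_eff = exa_flops / exa_watts        # ~2.2e9 FLOPS/W = 2.2 GFLOPS/W
print(f"brain is ~{brain_eff / exa_eff:,.0f}x more efficient")  # ~22,500x
```

Four-and-a-bit orders of magnitude - which rather supports the point below about emulating fuzzy analogue hardware on a von Neumann machine being wasteful.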
To me, it seems pretty wasteful to emulate a rather fuzzy/analog neural network on a razor-sharp vaguely von-Neumannian architecture (albeit somewhat massively parallel). Perhaps just as wasteful as trying to teach a human brain (a vast neural network) to do razor-sharp formal logic - with all its massive ability to create fuzzy and biased associative memories and search/walk the associations afterwards, with all its "common sense" being often at odds with strictly logical reasoning ;-)
Ballistic/optical/quantum computing. Oh, we could use DNA computing for massively parallel problems, but not sure if the energy requirements scale on that one... as it might have a big appetite. ;)