Fujitsu's K computer has confirmed its place as emperor of the Top 500 supercomputer list with an incredible 10.51 petaflops. The K computer – named by RIKEN, a Japanese government science and technology research institute, after the Japanese word for 10^16, "Kei" – was conceived in 2006. Detailed design took place from 2007 to 2009, when …
is this the Milliard Gargantubrain
I know what resources used to be required to run a simulation on a 1000x1000 matrix. It boggles the mind how much power it would take (or perhaps, how much would be left over on this machine) to simulate an underwater earthquake and its aftereffects...
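For scale, here's a rough back-of-the-envelope sketch (assuming a dense 1000x1000 double-precision matrix multiply as the workload; the figures are illustrative, not from the article):

```python
# Rough scale estimate: dense 1000x1000 matrix multiplies vs. the K computer.
# Assumptions: 2*n^3 FLOPs per naive dense matmul; K's 10.51 petaflops Linpack figure.
n = 1000
flops_per_matmul = 2 * n**3          # ~2e9 floating-point operations
k_flops_per_sec = 10.51e15           # K computer: 10.51 petaflops
matmuls_per_second = k_flops_per_sec / flops_per_matmul
print(f"~{matmuls_per_second:.2e} such multiplies per second")  # ~5.26e+06
```

In other words, at peak Linpack rate the machine could in principle chew through millions of such multiplies every second, which is why that 1000x1000 simulation barely registers.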
I bet it would run Crysis just fine.
(beer because there's no saki)
Re: (beer because there's no saki)
You're right... there is no precedent
What OS is it running?
That's a lot of hardware to track.
Re: What OS is it running?
Wiki says it runs a Linux-based enhanced operating system.
The rare people who still think that only the poor who can't afford Windows run Linux should think about this. Seriously, it's 2012, not 1996, and they genuinely think that way.
Pedal to the metal
I wonder what it would be like, playing GTA on this bad boy.
even this colossal machine can't run flash.......
alright alright i'll get my Mac
I like the MONSTER caps - good choice!
From vacuum tubes and valves to this, feck me. I hope it's got some flashing LEDs on it at the very least?
Four times the runner-up's performance
Not entirely sure of the numbers; they seem to dance. Tangentially, GFLOPS comparisons proved elusive when asked of a nearby search engine. Must be the hour, I'm sure.
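The "four times" figure does check out against the published Linpack numbers (assuming the runner-up is Tianhe-1A at 2.566 petaflops, its figure on the November 2011 TOP500 list):

```python
# Ratio of the K computer to the runner-up on the Nov 2011 TOP500 list.
# Linpack Rmax figures in petaflops; Tianhe-1A value is an outside assumption.
k_rmax = 10.51
tianhe_rmax = 2.566
print(f"K is {k_rmax / tianhe_rmax:.1f}x the runner-up")  # K is 4.1x the runner-up
```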
Disagree that no single company could buy and operate such a thing. The price tag is what, 1.2 billion and 10 million per year? Several companies are sitting on piles of cash big enough to build a dozen of these things. Apple, for one. They could do it more easily than the US government, even (not counting the uncountable black-ops budgets). Of course, few companies would do it just for the heck of it, and then nobody else would get to play with it. So that's where you need government, or something else, to manage a shared resource. But that's a problem of distributing utility.
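A quick sanity check of that claim (the build and running costs are taken from the comment above; Apple's late-2011 cash pile of roughly $80 billion is an outside assumption, widely reported at the time):

```python
# Could a cash-rich company afford a dozen K computers?
# Assumed figures: ~$1.2B build cost and ~$10M/yr to operate (per the comment),
# plus a hypothetical ~$80B corporate cash pile.
build_cost = 1.2e9
opex_per_year = 10e6
dozen_build = 12 * build_cost                  # $14.4B up front
dozen_opex_decade = 12 * opex_per_year * 10    # $1.2B over ten years
cash_pile = 80e9
share = (dozen_build + dozen_opex_decade) / cash_pile
print(f"A dozen K computers for a decade: "
      f"${(dozen_build + dozen_opex_decade)/1e9:.1f}B, {share:.0%} of the pile")
```

So under those assumptions a dozen machines, run for ten years, would consume under a fifth of the cash pile; the commenter's point stands on the arithmetic.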
And then the sales pitch. Of course. Though much of the trouble with them nukular reactors and the tsunami appears to've been simple negligence and corruption: emergency generators not quite secured enough, just the wrong interconnects available, problems pointed out for ages but ignored by officials, that sort of thing. Spendy supers aren't going to help against that. Even so.
What would be more interesting perhaps is figuring out if it'd make sense for, say, amazon to deploy one of these things, instead of a comparable number of cpus in commodity boxes currently in their datacentres. The big difference is that they're going for rock-bottom cost, not so much for interconnect speed. The latter is the big selling point of the K computer and it's not something amazon tends to sell. Might it be? Discuss.
I note that one of the six areas where it will be used is "New material and energy creation". If they can do that, then the physicists would be *really* interested. All that they know how to do at the moment is convert one form of energy into another. If these guys were able to bust the conservation laws and create new energy, that would be great!
Predictions can be wrong
Until 2011, the earthquake predictions centered offshore south of Tokyo. So no need to fix the reactors sitting dead on the fault that actually went in 2011. Still worshipping these digital monsters ("mine has more cores than yours") even though there's no electricity to run the damn thing.
Wait no flash?
Not even for bootloaders and management systems?
So I guess it would cost like $30B to run Oracle EE on it w/ a few options and an 85% discount to start... much more over five years ;-)
Maybe PostgreSQL would be a better choice.