But can it run Crysis 2
at max resolution?
Oak Ridge National Laboratory, one of the big supercomputer centers funded by the US Department of Energy, has tapped Cray to build a monster cluster that will weigh in at 20 petaflops when it is completed next year. According to a presentation (pdf) by Buddy Bland, project leader for the Oak Ridge Leadership Computing …
. . . Flash still freezes
Yeah, but it freezes so much faster!
That would most likely cause the 'mini black hole' that the Large Hadron Collider failed to deliver.
Just enough time for a pint before the end of the world...
i.e. how many Olympic swimming pools will it heat?
not like the one in Portsmouth...
full of beer
@"will scale from 100 to 250 petaflops"
Considering a high-end mobile phone CPU plus its GPU can now already beat a top-of-the-range early-1980s Cray supercomputer, I look forward to the day future mobile phones have 250 petaflops. :)
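For scale, a quick back-of-the-envelope calculation. The Cray-1's peak of roughly 160 MFLOPS is well documented; the phone figure is a rough assumption for a circa-2011 handset, not a benchmark:

```python
# Back-of-the-envelope comparison (the phone number is an assumption):
cray_1_flops = 160e6      # Cray-1 peak, ~160 MFLOPS
phone_flops = 10e9        # assumed ~10 GFLOPS for a 2011 phone CPU+GPU
target_flops = 250e15     # the quoted 250-petaflop scaling target

print(phone_flops / cray_1_flops)   # phone vs Cray-1
print(target_flops / phone_flops)   # how far phones still have to go
```

On those assumed numbers the phone beats the Cray-1 by around 60x, but is still some 25 million times short of 250 petaflops.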
needed to play Angry Birds
about the Met Office's 400 MHz Cray-1, back in 1980 or so.
My BlackBerry is now more powerful than that was. Thus, in 30 years' time, we can look forward to this monster on your wrist. This will one day be viewed by future kids as stone age.
My laptop conundrum, that all this three-tier bollocks will fade away as power increases to the point where all worldwide computing requirements can be served from a single Dixons laptop in client-server fashion, is coming true.
That said, if the American invisibles, who I'm sure know who they are, started doing Buckingham Palace-type tours round their datacentres, just to impress people with the scale of what can be done, I'm sure I could fit it in alongside a trip to Disneyland, and would promise not to do an Eddie (Shoestring).
But at a much faster rate, no doubt.
... and after the TITANS came the GODS
Why do they need to keep updating their systems every 2-3 years? Is what they already have not good enough? Most scientists would love to have just a fraction of that power. Maybe it's time-critical work, and the amount of data is increasing all the time, so they need to plan ahead now.
Well, at least the tech eventually filters down somewhat to ordinary consumers. Intel's consumer (1st- and 2nd-gen i7) and server CPUs are amazing and seem to only make my jaw drop further! My 8-minute 1080p video now takes only 20 mins to render at max quality on a 2600 @ 4.8GHz (16GB RAM @ 1.8GHz) using MainConcept MPEG-2 (8-bit @ 10MB/s). It used to take 1hr 55mins on a QX6700 @ 3.3GHz. This is using the CPU alone, not the onboard GPU. Shows you it is well worth upgrading for the performance increases to be had from CPUs with a 4.5-year gap.
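The speedup quoted above works out as follows (a minimal sketch; the two render times are taken straight from the comment):

```python
# Render-time speedup between the two CPUs mentioned in the comment:
old_minutes = 1 * 60 + 55   # QX6700 @ 3.3GHz: 1hr 55mins
new_minutes = 20            # newer i7 @ 4.8GHz: 20mins

speedup = old_minutes / new_minutes
print(speedup)              # times faster over the 4.5-year gap
```

That is a 5.75x reduction in render time, comfortably outpacing the clock-speed difference alone, which suggests most of the gain came from extra cores and per-clock improvements rather than frequency.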