With the supercomputing community in the Western economies freaking out just a little bit that China has come out of nowhere to take the lead in supercomputing, and the US supposedly getting ready to allocate $5bn in an effort to push up into the exascale realm, IDC could not find a better week to deliver its report to the …
And as soon as they are done, India will "come out of nowhere" and kick some ass. Then Russia maybe... But as the author mentioned, it's good that there is some competition.
Still, I suppose one could say that China has assembled a supercomputer rather than having built it themselves.
Coming from behind, using all the knowledge there is, is such an advantage. Why build on AMD64 when there is the RISC architecture and so forth.
In other words, big Western companies like Intel, IBM and similar should just fund new companies, giving them a billion, a free hand, and the single task of beating them, because they themselves are stuck. Stuck with an organisation they cannot change, stuck with bean counters that will not allow any risks, and a hell of a lot of happy 11-to-16 people.
Bravo for a very well put argument. It depresses me just how often it turns into a cheque to build a machine specifically designed to address the gigaflop harlotry of the Top 500. Considering the rich history of real innovations that did come from Europe, there is no reason why it can't be got right.
Missing the point
Europe has done very well largely by avoiding national or European "champions" that cannot survive without state funding. To complain about lack of computing power for the region that hosts CERN, launches Ariane and has some pretty impressive telescopes is simply disingenuous. In fact the budgetary constraint of not having a department of defence willing to write a blank cheque has encouraged the co-operation between research institutions. Not that the lack of cash doesn't cause problems or lead to underfunding - the current discussions about maintaining funding for fusion research and therefore cutting back on other areas being a case in point. There is more to excellence in the field of computing than how many computers you have in the Top 100.
Is this continent dick waving
or is there some useful (and efficient) purpose to these machines?
Surely the energy consumption of an array of synchronous, gigahertz-clocked, dynamic-logic processor cores is going to affect the weather prediction it is running.
Rather than having a single-threaded multifunction ALU, have many single-function arithmetic and logic resources shared by many threads. Use some sort of asynchronous, data-flow-type configuration, and critical-path optimisation. C-source-to-logic compilers have already started to appear, but it is difficult to describe multi-threaded algorithms in C except in a small number of areas: the terms of an expression can be parallelised, but iterations of a loop are not adequately described.
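A minimal C sketch of the distinction being made here (the function names are illustrative, not from any real compiler toolchain): the terms of an expression are independent, side-effect-free computations that a data-flow tool can evaluate concurrently, whereas a loop as written in C carries an implicit dependency from one iteration to the next that the compiler must first prove away.

```c
#include <assert.h>

/* A side-effect-free term; several such terms in one expression
   have no ordering constraint between them. */
static int square(int x) { return x * x; }

/* The four terms here form independent nodes in a data-flow graph,
   so they can in principle all be evaluated in parallel. */
int sum_of_squares_expr(int a, int b, int c, int d) {
    return square(a) + square(b) + square(c) + square(d);
}

/* The same computation as a loop: acc depends on the previous
   iteration, so the C source describes a sequential chain. A
   compiler must recognise this as a reassociable reduction before
   it can parallelise the iterations. */
int sum_of_squares_loop(const int *v, int n) {
    int acc = 0;
    for (int i = 0; i < n; i++)
        acc += v[i] * v[i];   /* loop-carried dependency on acc */
    return acc;
}
```

Both functions compute the same value; only the expression form makes the independence of the work visible in the source.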
"That is what China is up to. And, for that matter, that is what the US is doing when it gives Cray and IBM so much money for petaflops supers these days."
IBM will outsource all of the development to India and China.
not so hilarious
Timothy, the so-called hilarious Bull company has already designed and produced two petaflops systems; one is operational today and the other will be by the end of 2011. Moreover, Bull is working on future exascale systems with partners such as Intel. Open your eyes: there are still good players in Europe!
5 billion dollars for a computer that will depreciate at something like 40% a year?! How about rolling out 4 billion dollars' worth of 1Gbps fibre optic cables to US homes so we stop free-falling down the broadband rankings? The contract for getting the link could be that you must buy a fast computer and let the government process tasks on it for 10 hours a day. The other 1 billion dollars should be enough to build a core computer to coordinate the tasks.
We've already got an exascale* computer and no-one is using it...
The Internet has two billion CPUs and uses more electricity than most countries. Average utilisation of all that computing power is about 2%.
(*it's actually zettascale)
It can only be used for problems that admit partitioning into discrete batches that need little inter-node communication.
Not all problems are like that.
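A rough C sketch of the kind of problem that does fit this model (the batch structure and worker count here are hypothetical, just to illustrate the shape): each worker handles its own batch with no communication at all until a single final combine step.

```c
#include <pthread.h>
#include <assert.h>

#define NWORKERS 4

/* One independent batch of work: a slice of the input plus a slot
   for the worker's local result. */
typedef struct {
    const long *data;
    int begin, end;
    long partial;
} batch_t;

/* Purely local work: no shared state is touched, so no inter-worker
   communication is needed while the batch runs. */
static void *run_batch(void *arg) {
    batch_t *b = arg;
    b->partial = 0;
    for (int i = b->begin; i < b->end; i++)
        b->partial += b->data[i];
    return NULL;
}

/* Split the input into batches, run them concurrently, then do the
   one and only inter-batch step: combining the partial results. */
long parallel_sum(const long *data, int n) {
    pthread_t tid[NWORKERS];
    batch_t batch[NWORKERS];
    int chunk = (n + NWORKERS - 1) / NWORKERS;

    for (int w = 0; w < NWORKERS; w++) {
        batch[w].data  = data;
        batch[w].begin = w * chunk;
        batch[w].end   = (w + 1) * chunk < n ? (w + 1) * chunk : n;
        pthread_create(&tid[w], NULL, run_batch, &batch[w]);
    }

    long total = 0;
    for (int w = 0; w < NWORKERS; w++) {
        pthread_join(tid[w], NULL);
        total += batch[w].partial;     /* the final combine */
    }
    return total;
}
```

Problems with tight coupling between iterations (e.g. time-stepped simulations where every node needs its neighbours' results each step) don't decompose this way, which is exactly the objection above.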
Of course, but it doesn't matter
The list of suitable apps is very long (limitless, even) and includes plenty of Grand Challenge problems. Who cares what's not on it?
Harness the Internet, save the world.
(Paris, cos it's that simple...)