DARPA: Can we have a one-cabinet petaflop supercomputer?

Famous US military crazytech agency DARPA has issued a challenge to the IT community: do you think it's possible to build a petaflop supercomputer that fits in a single air-cooled 19-inch cabinet and requires no more than 57 kilowatts of power? One which requires no special programming skills to use? The challenge is laid out …

COMMENTS

This topic is closed for new posts.
  1. Rob
    Terminator

    Skynet's coming

    nuff said

  2. Shadowfirebird

    Can we have an intelligent one-petaflop cabinet computer?

    Simple enough question to answer, I would have thought. "No."

    I'll take my payment now, please.

  3. pctechxp
    Pirate

    Shouldn't this be....

    A rise of the machines story?

    Abstraction from the hardware via a self-aware OS?

    Then there's the generator; perhaps they might think it should be a small nuclear reactor.

    I hope I live to be an old man and die before that happens.

    Perhaps the film Eagle Eye is very prophetic indeed, and then of course there's I, Robot.

    For goodness sake, don't give these nutcases positive feedback or we're all doomed!!! :)

    Skull and crossbones because it'll only be a matter of time before it considers us meatsacks surplus to requirements.

  4. Ian Ferguson
    Black Helicopters

    All in good time

    Computing will get there given time - given today's server power and Moore's Law, I'd be interested to see a calculated guess at when DARPA's requirements will happen naturally.

    However, given the more rapid advancement in communications, I wonder if the answer in the meantime won't be to put a high-powered, compact, hot brain in every automated vehicle, but to outsource the brain to a data centre, using high-bandwidth wireless communications (which I'm sure the military are more than capable of acquiring).

    These extra icons just make it more difficult to choose. And what's with the lack of dithering on the thumbs up and sad face?

  5. Francis Vaughan
    Terminator

    Proofread.

    "'self aware operating system' which will determine the best way to use the massive resources available to it without the need for human programmers to intervene."

    Simply remove the last three words. Then add a plural. All done.

  6. amanfromMars 1 Silver badge

    US Intellectual Block ...... Phish 42 Phorm Target for Out of Temporal Sync/Quantum Leap Purchase?

    I wonder if that would be "a New Virtualised Operating System with Irregular and Unconventional Networks InterNetworking Joint Adventures and Special Application Programs" they would be looking for?

    The one that is lightly touched upon/hinted at in "Use DPI Better, Now …… alsjeblieft. Which probably means buying in SMEs*" ... Posted Friday 26th June 2009 07:31 GMT .... http://www.theregister.co.uk/2009/06/25/uk_cyber_security_strategy/comments/

    It certainly meets all the core requirements and that would make it very valuable, and to rivals into the same dominance posturing/hyperbole, priceless.

  7. Mark Barratt 1

    Funniest thing in weeks

    You know, whenever I meet a young aspiring programmer, I tell them "get into AI - it's a job for life".

    Lol, I can already give you the core code for Darpa's "self-aware OS". It looks like this:

    switch (machinestate) {
        case state1:
            dostate1();
            break;
        case state2:
            dostate2();
            break;
        ...
        default:
            do_something_sensible_given_the_circumstances();
    }

    Now, if DARPA can just get one of their bright young things to write the simple do_something_sensible_given_the_circumstances() function, I will happily code the rest of their OS for free.

    Bottled water, anyone?

  8. Michael Nielsen
    Alert

    Galactica

    perhaps the predictions that the new Galactica, Terminator, etc. made will come true, about humans creating machines, machines becoming self-aware, and machines killing off their ineffective, destructive, irrational masters, for the greater good of the world... lol.

  9. Steve Loughran

    100s of tons for a petaflop? Hardly

    I've seen some of the top-100 supercomputer clusters; if all you want is raw CPU, it's not that big (excluding the aircon and transformer bits). But your disk storage, all those spinning disks to give you the petabyte alongside the petaflop, that's what takes up space.

  10. Anonymous Coward
    Welcome

    I for one...

    greet our new mouse overlords.

  11. Anonymous Coward
    FAIL

    The answer is "No" ...

    And here's the reason why: http://www.rebelscience.org/Cosas/Reliability.htm

  12. BlueGreen

    No

    not possible. Can't have.

    Most especially not if you don't need a deep knowledge of the architecture.

    Must try harder.

    Fer Pete's sake, do the sums: 1 petaflop = 10^15 floating-point operations per second.

    1 nanosecond = 10^-9 seconds. About the time taken for a photon in vacuum to travel 1 foot.

    So that's about 10^6 calculations in the time it takes a ray of light to cross your box (being very generous, cos it's going to be bigger than a foot cube). Comms alone is going to be a problem, innit.
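
    For the doubters, a quick check of those sums in C (a minimal sketch; the foot-per-nanosecond figure is the usual rule of thumb, and the one-foot box is the generous assumption above):

    #include <stdio.h>

    int main(void) {
        double flops = 1e15;       /* 1 petaflop = 1e15 operations per second */
        double crossing_s = 1e-9;  /* light in vacuum crosses a one-foot box in ~1 ns */
        /* operations completed while light crosses the box: ~1e6 */
        printf("ops per light-crossing: %.0f\n", flops * crossing_s);
        return 0;
    }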

    Either some William Gibsonesque quantum crap (right now I've no good reason to believe quantum stuff will ever work in a big way), or a shift to something else (analogue computing anyone? entailing a severe 'paradigm' shift, and with results not measurable in flops anyway), or something I haven't thought of.

    Grow out of it, darpa. The real world awaits.

  13. Mage Silver badge
    Flame

    The AI myth

    Driving with a camera isn't AI. It's brute-force programming. Like chess isn't AI.

    AI doesn't exist. If it did, an ordinary supercomputer would just do it, only very slowly.

  14. Filippo Silver badge

    meaningless media measures

    Ew. There's no meaningful comparison between organic brains and computers, because they're apples and oranges. I mean, any pocket calculator can do a good number of tasks much faster than my brain.

    And intelligence isn't equal to the number of neurons anyway. Simulating a neural network with as many nodes as a mouse's brain has neurons doesn't make a virtual mouse. And simulating a neural network with as many nodes as a human brain has neurons won't make an AI.

  15. demat
    Terminator

    I for one

    welcome our new rack-mounted overlords.

  16. Geeks and Lies
    Alien

    Bet the....

    Aliens could do it!

  17. Andy Barber
    Pint

    "programmable by ordinary coders"

    So I can use a sensible language like Sinclair SuperBasic then?

    BTW get rid of the BG's icons!

    Beer? Too long in the tooth!

  18. Ian Bradshaw

    well ...

    "Any decent-size data centre will in future be more intelligent than its human admin"

    Anyone who's tried to ring a certain German hosting company beginning with a 1 and ending in a 1 will know this is already the case.

  19. Mike 61
    Boffin

    Yes!!

    I could do it. Petaflop in a standard 19" rack, not a problem. Can't reveal how, but it's do-able. I'm sure DARPA can get my location from the Reg if they are really interested.

    Hint: bucky + (CdSe) + (InxGa1-xAs) = datapath

  20. Peter Ford
    Go

    The correct answer is 'Yes', but...

    you need a roomful of computers to do the AI bit.

    Might be cheaper and easier just to tack on a cubicle with a geek in it.

    Hook it up with a coffee and pizza feed, then just ask the geek to sort your code out...

  21. Martin 47
    Thumb Down

    Arrrgggggggggghhhhhhhhhhh

    Our overlords have started already: I now seem to have a number after my name!

    When did that happen??

    Thumbs down because I am not a number, except on the reg that is

  22. Tim Bergel
    Welcome

    @demat

    C'mon, get with the program, you have this icon now....

  23. amanfromMars 1 Silver badge

    Safe Bets for Crown Jewelers and Right Royal Wasters.

    "AI doesn't exist." ... By Mage Posted Friday 26th June 2009 14:53 GMT

    Thanks, Mage. Such delusional nonsense permits such covert development as you would obviously not believe.

    And that is not all, for there is also the following ...<<< And ITs Immaculate Stealth comes in Third Party Disbelief of such Stated Facts prefering as so many can and do so easily do, to dismiss such Tales as a Manic Fiction rather than Indisputable Shared Truth. But although Freely Available to Everybody and Anybody, IT does require QuITe a Lot of Personal XXXXPerience to Master NEUKlearer HyperRadioProActivity for Polyamoral Ubiquitous Programs ………with Promise in Deep Capture Projects.>>> which is AIMove in Parallel with .... "Shouldn't this be.... A rise of the machines story? Abtraction from the hardware via a self aware OS? Then there's the generator, perhaps they might think it should be a small nuclear reactor." ..... By pctechxp Posted Friday 26th June 2009 13:49 GMT

    "Bet the.... Aliens could do it!" ... By Geeks and Lies Posted Friday 26th June 2009 14:57 GMT

    In their sleep, when IT is so easy, Geeks and Lies.

  24. Anonymous Coward
    Anonymous Coward

    Someone's been drinking Kurzweil's Kool-Aid

    The assumption that intelligence is just the result of a larger number of FLOPS is a pretty poor one. Still, if they're throwing money around, I'd be willing to take a swig of the Singularity cult juice.

  25. Mark Barratt 1

    Piled higher and deeper...

    Greg Fleming wrote:

    "["fail" graphic] And here's the reason why:

    http://www.rebelscience.org/Cosas/Reliability.htm"

    Abstracted from that webpage:

    "[my proposal] will not only result in an improvement of several orders of magnitude in productivity, but also in programs that are guaranteed free of defects, regardless of their complexity."

    Sorry, I stopped reading after that. Did anyone get further?

  26. Anonymous Coward
    WTF?

    Author needs to brush up his own cognitive skills

    Author says: "If DARPA get their way, this sort of computing power will now be available in a single cabinet. Any decent-size data centre will in future be more intelligent than its human admin"

    The accumulated weight of a man in pork meat doesn't make the meat smart now, does it?

  27. lifelesspoet

    Can we build it....?

    YES WE CAN!!!

  28. nicholas22
    Coffee/keyboard

    Hilarious

    The answer is "No" ...

  29. milliganp

    ATI can do it!

    A Radeon graphics card does 1.2 teraflops for 160 W of power, so 800 of them is a petaflop for about 130 kW. A 50% increase in processing power and a 30% decrease in energy should just about do it.

    You just need about 100 of these initially, then leave them for a year or two and they'll write their own operating system.
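
    Making the card arithmetic concrete (a minimal sketch in C using the commenter's figures; the projected per-card gains are his guesses, not announced specs):

    #include <stdio.h>
    #include <math.h>

    int main(void) {
        double tflops = 1.2, watts = 160.0;    /* per Radeon card */
        double cards = ceil(1000.0 / tflops);  /* cards needed for 1 PFLOPS */
        printf("today: %.0f cards, %.0f kW\n", cards, cards * watts / 1e3);

        /* his projection: +50% speed, -30% power per card */
        double cards2 = ceil(1000.0 / (tflops * 1.5));
        printf("projected: %.0f cards, %.0f kW\n", cards2, cards2 * watts * 0.7 / 1e3);
        return 0;
    }

    That prints roughly 834 cards at 133 kW today, and 556 cards at 62 kW with the projected gains, which is indeed "just about" the 57 kW target.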

  30. Anonymous Coward
    Thumb Up

    Almost doable :)

    Here's someone suggesting a 4 petaflops machine using current generation video cards: http://helmer3.sfe.se/

    It doesn't quite fit in a 19" rack and uses twice as much power per PFLOP as DARPA wants, but it's not all that far off. Since DARPA does not expect to have its petaflop cabinet until 2017, it should be perfectly doable on the hardware side.

    I think the programming/execution model and the smart OS may be more of a challenge than the hardware.

  31. John Smith 19 Gold badge
    Coat

    Ah back to the future

    So that was what scuppered the Japanese 5th generation project? Not enough FLOPS.

    And of course the "any ordinary language" bit, optimally spread across all the processors.

    Given that most software for this lot will be written in FORTRAN/C/C++ and possibly LISP

    Optimal splitting of software *without* the software being aware of the hardware structure.

    Sounds familiar. I think several projects have attempted this (with varying degrees of success)

    Or perhaps stuff them through a FORTRAN/C/C++ 2Occam converter and run that as a collection of (I dunno) parallel sequential processes perhaps?

    Mine's the one with a CAR Hoare book in the pocket.

  32. amanfromMars 1 Silver badge

    Send them an e-mail. It is quicker and more secure.

    "I sure DARPA can get my location from the reg if they are really interested." .... By Mike 61 Posted Friday 26th June 2009 15:35 GMT

    Mike 61,

    The more I learn, the more that I may be easily convinced that they can do a lot less than is imagined. It is though a convenient fiction pimping a certain perception which imparts a sort of remote control.

  33. Allan George Dyer
    Paris Hilton

    "human-like cognitive performance"?

    Easily distracted by a unit of the opposite sex ... uhh ... polarity?

  34. Charles 9
    WTF?

    Re: Piled higher and deeper...

    That web site that's being quoted seems to overlook a rather important piece of computing problem-solving that Turing's machines demonstrate: the Halting Problem, the theorem that it is impossible for a "computer" to determine in advance whether a given program will halt or run forever (the proof is by contradiction).
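
    For anyone who hasn't met it, the contradiction fits in a few lines of C (a hypothetical sketch: halts() is exactly the function being proven impossible, so this can compile but never link):

    /* Suppose someone claims this decider exists: returns 1 if 'program'
       halts when run on 'input', 0 if it runs forever. */
    int halts(const char *program, const char *input);

    /* Then we can build a troublemaker that does the opposite of
       whatever halts() predicts about it: */
    void paradox(const char *program) {
        if (halts(program, program))
            for (;;) ;  /* halts() said we halt, so loop forever */
        /* halts() said we loop forever, so return at once */
    }

    /* Feed paradox its own source: whichever answer halts() gives is
       wrong, so no such halts() can be written. */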

  35. Giles Jones Gold badge

    Makes a change...

    They didn't ask for it to be Windows and need to run Word or Access.

    I really don't know what the DARPA people do. I thought it was their job to do research and development, not put down a bunch of specifications and ask commercial firms to see if it can be done.

  36. Anonymous Coward
    Pint

    *Net

    Equivalent human cognitive capability in computers may require more than algorithms and operations per second. It may require the coordinated superposition and decoherence of key elements, such as electrons in the microtubules formed from tubulin proteins, as theorized in Orch-OR by Penrose and Hameroff.

    Can this set of conditions that exists in human neurons be met in quantum computing software/hardware? If human consciousness is an emergent property of this set of conditions, what emergent properties will come from such a computer? Will it be Skynet from "Terminator", the lifeform from the ST:TNG episode "Emergence", or something much more 'interesting'?

  37. John Smith 19 Gold badge
    Go

    If the problem is power + cooling

    The answer is asynchronous hardware.

    Given their stated requirement, that's roughly 57 pJ per operation (sums sketched at the end of this post).

    Modern processors are reported to need c. 50% of their transistor count for clock drivers & de-skew.

    IIRC worst case overhead for asynch is 25% max. Usually less than this.

    Of course this means it's *very* difficult to split a product line by clock speed (there isn't one).

    But to go the whole hog you'll be looking at the whole deal. Chips face down, multi-chip modules etc.

    Note that raw power is (in principle) fairly easy. The whole ARM chipset (32-bit processor, MMU, maths chip) ran about 100k gates.

    Keeping it fed (and harvesting the results) is likely much harder.

    But that will be *nothing* compared to the algorithmic slice-and-dice software and the "Self Aware" do-what-I-mean OS.

    Just my tuppence.
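
    The promised sums (a minimal check in C; 57 kW and one petaflop are the figures from the article):

    #include <stdio.h>

    int main(void) {
        double watts = 57e3;      /* DARPA's power budget */
        double ops_per_s = 1e15;  /* one petaflop */
        /* energy available per operation: ~57 pJ */
        printf("energy per op: %.0f pJ\n", watts / ops_per_s * 1e12);
        return 0;
    }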

  38. This post has been deleted by its author

  39. Sonya Fox

    May be possible

    I tend to hesitate to say something is impossible when referring to computers, given the staying power of stupid predictions (like Bill G's proclamation that 640KB is enough memory), but a petaflop in a cabinet is pretty unlikely in the foreseeable future. Shrinking current tech to the proper size is certainly possible, but then the whole power/cooling thing becomes a huge problem: eventually Moore's law slows down and you can't fit enough electrons through your tiny traces to run your tiny transistors without literally melting the thing into oblivion.

  40. Seán

    Tsk

    They're not looking for AI, they're looking for patterns and exceptions, deviations and spikes. It doesn't have to be human intelligence, it has to be artificial intelligence, and that fucking Turing test guff is a red herring. I don't want to chat to the damn thing, I want it to find things and notice stuff.

  41. Secretgeek
    Heart

    I love DARPA.

    They don't just want the moon on a stick; they want the Sun on a pinhead, carved by lasers. Gotta love the dreamers. Even if they are dreaming about the next armageddon.

  42. Anonymous Coward
    Coat

    Re: ATI can do it!

    "A Radeon graphics card does 1.2 Teraflops for 160w of power, so 800 of them is a Petaflop for 130Kw. A 50% increase in processing power and 30% decrease in energy should just about do it."

    And who knows, the resulting assemblage of silicon might be able to run Crysis/Vista/A.N.Other renowned resource hog[*] with all settings at max.

    I know, I know - I'm already out the door ...

    [*] - delete as appropriate

  43. Columbus
    FAIL

    Crysis - high quality..

    Nuff said

  44. Charles Manning
    Thumb Up

    No problem

    How many dwarfs can you pack in a rack?

  45. Anonymous Coward
    Terminator

    CrazyTech?

    I hate that term "CrazyTech" - yeah, just slate those who dare to dream, call them crazy. Don't bother applying for jobs with DARPA, you're going to be labelled nuts!

    Well, I beg to differ. If you don't try, you don't achieve.

    Sooner or later some team somewhere (probably our Jap chums) is going to evolve a "proper" AI. Impossible? How so? The human brain is just lying there waiting to be reverse-engineered! Unless of course you're one of those bible-bashing nutjobs who thinks the "soul" drives us.

  46. storng.bare.durid

    RoTM

    It is possible that, maybe, one day, we will create a machine smart enough to create another machine, which would be even smarter... etc., and bootstrap the creation of something pretty interesting.

    Whether we end up with anything as benevolent as Iain M Banks's 'Minds' or something more akin to Skynet, lol, is an interesting question.

  47. TeeCee Gold badge
    Coat

    Only one question really.

    If somebody makes one, how long before somebody else buys a roomful and builds a DBHPC* cluster of 'em, chewing megawatts of power and producing oodleflops of grunt?

    Alright, two questions. How long after it's done before DARPA tout for *that* in a 19" cabinet......

    *Dog's Bollocks High Performance Computer.

  48. Hermes
    Stop

    I...

    can't let you do that Dave....

  49. Solomon Grundy

    @Giles Jones

    You described exactly what DARPA does. They put out a bunch of crazy specs and see if any commercial firm (or individual) can come up with a viable proposal. If someone can, they give you a sweet job and let you try and build it - doesn't matter if you make it work or not, you still get paid and get a good line on your resume. Apart from the bureaucrats, there are no "careers" at DARPA: you get hired to work on a project, then your time is up when the project is over.

    As an interesting footnote, often people who get hired by DARPA and successfully complete their project start companies immediately after their termination from DARPA to produce those projects for various govt. agencies. Pretty good work if you can get it. Check out their website sometime and look at the jobs section...

This topic is closed for new posts.
