63 TRILLION maths ops a second - in 5 inches? Mm, show me

If you need lots of compute power, you’re either already using GPU accelerators or taking a close look at them. And if you’re serious about tapping into the full processing power of the GPU, perhaps this gear will fit the bill. Dubbed the ioMillennia, it’s a 3U, PCIe Gen3 switch from One Stop Systems that can handle sixteen …

COMMENTS

This topic is closed for new posts.
  1. Steve Todd

    Minor quibble

    The GPU in the iPad 2 was rated at about 19 GFLOPS (see http://www.anandtech.com/show/6426/ipad-4-gpu-performance-analyzed-powervr-sgx-554mp4-under-the-hood) which puts it about in line with desktop costs per MFLOPS.

    1. RussellMcIver

      Re: Minor quibble

      Yes, and the integrated GPU in an Intel desktop CPU is somewhere around 100 GFLOPS, so the difference is still substantial.

      1. Anonymous Coward
        Anonymous Coward

        Re: Minor quibble

        The difference is still substantial.

        But the numbers are toss, so what's the point in the figure?

  2. Anonymous Coward
    Anonymous Coward

    Can it mine bitcoins cheaper than my desktop?

    1. Rampant Spaniel

      Re: Patients?

      Probably but aren't they all switching to asics?

    2. Chairo

      Can it mine bitcoins cheaper than my desktop?

      Probably yes, and it will certainly be much faster, too.

      You might even get back your invested money in a decade or two...

      1. JeffyPooh
        Pint

        A new Computing Metric springs to mind: Bitcoins per Bitcoin

        Bitcoins mined per unit time divided by total cost (in Bitcoins, natch) to run the system. The capital costs can be amortized over an arbitrary period, say one Moore period (18 months or a year, feel free to argue). It should come out as a dimensionless ratio.
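        A minimal sketch of that ratio - all numbers below are hypothetical, purely for illustration:

```python
# A minimal sketch of the "Bitcoins per Bitcoin" ratio proposed above.
# All figures are made up, purely for illustration.

def bitcoins_per_bitcoin(btc_mined_per_day, power_cost_btc_per_day,
                         capital_cost_btc, amortization_days):
    """Bitcoins earned per bitcoin spent: daily yield divided by daily
    total cost (power plus capital amortized over a chosen 'Moore
    period'). Dimensionless; > 1 means the rig pays for itself."""
    daily_cost = power_cost_btc_per_day + capital_cost_btc / amortization_days
    return btc_mined_per_day / daily_cost

# Hypothetical rig: mines 0.02 BTC/day, power costs 0.005 BTC/day,
# 20 BTC capital amortized over an 18-month "Moore period" (540 days).
ratio = bitcoins_per_bitcoin(0.02, 0.005, 20, 540)
print(f"Bitcoins per Bitcoin: {ratio:.2f}")
```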

        Downside: it may change over time, as the absolute cost to mine Bitcoins is a function of time in the long run.

        Also, power costs vary with time and place.

        Hmmm... forget about it.

  3. paj

    Can't wait for the software!

    This box takes another step towards commodity HPC. A lot of businesses could afford to put a box like this under their analysts' desks, giving them serious compute firepower. The question, of course, is what you do with that. At the minute you need advanced programmers to make use of this kind of hardware, at least if you're doing a task custom to your business, as opposed to, say, running Folding@Home.

    You can imagine a kind of high-end spreadsheet that puts all this compute firepower in an easy-to-use package. When someone invents that, all these data analysts will be able to really use the HPC boxes. This would greatly benefit all kinds of financial analysis. And can you take it further? Retail sales data? Industrial sensor data?
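    As a toy illustration of the kind of per-row, embarrassingly parallel work such a tool might offload (pure Python, made-up data):

```python
# A toy, pure-Python illustration (hypothetical data) of the kind of
# per-row, embarrassingly parallel calculation a GPU-backed "high-end
# spreadsheet" could farm out to thousands of cores: revaluing a book
# of positions under a uniform 5% price shock.

positions = [  # (quantity, price) pairs - made-up sample data
    (100, 12.50), (250, 7.80), (40, 310.00), (500, 1.95),
]

def revalue(qty, price, shock=0.95):
    """Value of one position after the price shock. Each row is
    independent, so a GPU could revalue every position at once;
    here we simply map over the list."""
    return qty * price * shock

shocked_values = [revalue(q, p) for q, p in positions]
total = sum(shocked_values)
print(f"Book value after shock: {total:.2f}")
```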

    1. JLH

      Re: Can't wait for the software!

      Yes, but the problem is getting that retail sales data / industrial sensor data into and out of the GPUs.

      You still have to have a very capable system which can read the data from disk and display the results.

      Just throwing lots of flops at a problem isn't the total solution.

      1. keithpeter Silver badge
        Boffin

        Re: Can't wait for the software!

        "You can imagine a kind of high-end spreadsheet, that puts all this compute firepower in an easy to use package."

        J built for vector processing or Whitney's proprietary Kx?

    2. Christian Berger

      Won't be simple

      To analyse large amounts of data you need to get them quickly into that box, and that's not going to be easy. You'd need lots of very local RAM.

      So it's very questionable whether you can do "Big Data" on such a box. Simulations are more likely, to a degree, but eventually it's going to be bandwidth-starved.

      BTW, there already is a simple-to-use interface for processing large amounts of data. It's called SQL, and it was designed so that everyone with half a brain can learn it and create astonishing analyses.

  4. Anonymous Coward
    Anonymous Coward

    Some context.

    This would top the June 2004 Top500 list. Basically, nine years ago this would have been the fastest machine on Earth - not just the fastest production machine, the fastest machine, full stop. The top system at that time was the Earth Simulator in Japan, at just under 41 TFLOPS and burning 3.2 MW of power.
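    A rough FLOPS-per-watt comparison, using only the figures quoted above:

```python
# Back-of-the-envelope efficiency comparison using only the figures
# quoted above: 63 TFLOPS / 6 kW for this box, ~41 TFLOPS / 3.2 MW
# for the 2004 Earth Simulator.

box_flops, box_watts = 63e12, 6_000
earth_sim_flops, earth_sim_watts = 41e12, 3.2e6

box_eff = box_flops / box_watts                    # FLOPS per watt
earth_sim_eff = earth_sim_flops / earth_sim_watts  # FLOPS per watt

print(f"This box:        {box_eff / 1e9:.1f} GFLOPS/W")
print(f"Earth Simulator: {earth_sim_eff / 1e6:.1f} MFLOPS/W")
print(f"Roughly {box_eff / earth_sim_eff:.0f}x the energy efficiency, nine years on")
```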

  5. Charlie Clark Silver badge

    Very flawed comparison

    The comparison should be like-for-like on workloads. FLOPS is an interesting basis for comparison, but it is just that: a basis. The power cost of the whole system should be factored in, and if you need peta-FLOPS of computing power it might become a real head-scratcher as to how you do that with commodity hardware today.

    Long-term, actually owning any of this hardware is going to be too expensive for the calculations that "always manage to outgrow the available hardware", but getting a price for, say, 1000 peta-FLOPS for 100 days may soon become a reasonable possibility. Isn't this where Google is aiming to be? Could be mucho millions in it from the scientific community if they, or anyone else, can deliver.

  6. Anonymous Coward
    Anonymous Coward

    I run the IT for an Accountancy Firm specialising in the entertainment industry,...

    in order to calculate the Tax Return for the band "Disaster Area" this financial year, it took 3 of these.

  7. Anonymous Coward
    Anonymous Coward

    But this is parallel speed.

    I'm not sure you can measure it with the same graph as sequential speed?

    It's OK if you want to do many things at once, which is probably quite often for the buyers of this computer, aaaand can I have ten please?

  8. Anonymous Coward
    Anonymous Coward

    Crysis etc..

    Sorry

    1. Anonymous Coward
      Anonymous Coward

      Re: Crysis etc..

      57fps with all gfx options on max.

    2. danolds

      Thank God!! Re: Crysis etc..

      I went back to look at the comments on this story and I was thinking that I'd have to insert my own Crysis comment. Thank God someone else did it...

    3. danolds
      Thumb Up

      Thank God!! Re: Crysis etc..

      I was looking at the comments on this story and was afraid I'd have to insert my own Crysis reference, I'm glad that someone else finally did.....

  9. Anonymous Coward 15
    Devil

    Wow, how many Bitcoins could you get out of that?

  10. I think so I am?
    Facepalm

    What?Why

    6,000 watts? Why not use 6 kW?

    1. Don Jefe
      Happy

      Re: What?Why

      Why not?

      1. Christian Berger

        Re: What?Why

        Because 6,000 implies that it's not 6,001 or 5,999, but precisely 6,000 watts.

        6 kW could be anything from 5.5 kW to 6.49 kW, which is probably a more realistic span.
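        Read as rounded figures, the two notations imply very different intervals:

```python
# Sketch of the implied-precision point above: a stated figure is
# normally read as rounded, i.e. the true value lies within +/- half
# the least significant digit quoted.

def implied_interval(value, resolution):
    """Range of true values that would round to `value` when quoted
    at the given resolution."""
    return (value - resolution / 2, value + resolution / 2)

lo, hi = implied_interval(6000, 1)           # "6,000 watts": watt precision
print(f"6,000 W implies {lo:.1f}-{hi:.1f} W")

lo_kw, hi_kw = implied_interval(6000, 1000)  # "6 kW": kilowatt precision
print(f"6 kW    implies {lo_kw:.0f}-{hi_kw:.0f} W")
```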

        1. Horridbloke
          Windows

          Re: What?Why

          Can we redefine a kilowatt as 1024 watts? You know, just to be awkward?

  11. David Hicks
    Alert

    You could cut quite a bit off the price...

    ... by stuffing in Geforce GTX Titan cards instead of K20X. One assumes nVidia must have neutered the Titan somehow, but it appears to have the same processor and capabilities, and possibly be clocked a little faster even.

    1. Anonymous Coward
      Anonymous Coward

      Re: You could cut quite a bit off the price...

      Yes, Titans are $1K vs $5K for the K20X. The Titan has one fewer SMX, no RDMA, and no HyperQ/Proxy support for MPI, so whether the 5x cost premium is worth it will depend on your application. Should be fine for mining Bitcoins.

      1. Malmesbury

        Re: You could cut quite a bit off the price...

        Double precision support is the big difference for the commercial cards

  12. SirDigalot

    Whats the I/o?

    I thought one of the issues with a "generic" way of doing things is I/O bottlenecking: the very hot and fast cards can interface with the system well, but there is a lot of bandwidth needed to get the data in and out of the system.

    However, the fact that things are so powerful now is very interesting, and slightly unnerving. That said, no matter how underpowered a Cray II is, I want one because they look really COOL!

    Now, a 3U-based system means you could potentially cram 14 in one rack... one very hot and power-hungry rack...

    And maybe 5 racks per row, 5 rows in a reasonable office datacenter... hmmm

    Oh, and the power station next door to make it work...

    you could probably reuse the heat generated to heat the building/water/small community.

    1. keithpeter Silver badge
      Windows

      Re: Whats the I/o?

      "you could probably reuse the heat generated to heat the building/water/small community."

      Back to my childhood. Liverpool University used to heat a swimming pool with heat from a mainframe (one of those with big cabinets with tape spools in) in the late 60s early 70s. Happy times.

    2. JLH

      Re: Whats the I/o?

      SirDigalot, your comment about a power station is right on the money.

      There is a lot of discussion on supercomputing lists re the push to exascale - not because the compute power is impossible, but because of the power requirements.

  13. Fred Flintstone Gold badge

    One wonders..

    How long the average crypto will hold up under this assault of computing power..

    1. Christian Berger

      Re: One wonders..

      Cryptography would actually be one of the main uses, as a brute-force attack won't need much data.

      Other than that, decent crypto should still take millennia to break on such a machine.
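      A rough sanity check of that claim, generously assuming one key trial per operation at the box's quoted 63 trillion ops/s:

```python
# Rough sanity check of the claim above: even at this box's quoted
# 63 trillion ops/s, and generously assuming one key trial per op,
# exhausting a 128-bit keyspace is hopeless.

ops_per_second = 63e12        # 63 TFLOPS, one guess per op (generous)
keyspace = 2 ** 128           # number of possible 128-bit keys

seconds = keyspace / ops_per_second
years = seconds / (365.25 * 24 * 3600)
print(f"~{years:.1e} years to try every 128-bit key")
```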

  14. Trustme

    I bet you STILL don't get smooth transitions playing Solitaire.
