
SUPERCOMPUTER vs your computer in bang-for-buck battle

A couple of weeks ago I posted a blog here (Exascale by 2018: Crazy...or possible?) that looked at how long it took the industry to hit noteworthy HPC milestones. Chatter in the comments section (aside from the guy who assailed me for a typo, and for not explicitly calling out ‘per second’ denotations) discussed what these …

COMMENTS

This topic is closed for new posts.

But

Can your wife's desktop run Crysis?

sorry -- had to :-)

Anyway, I would imagine that the bulk of the cost of an HPC system is in the design and construction phase, since just getting all those components to work together is tricky enough - when you only need to worry about one processor, memory bank and disk drive, things aren't as complicated.

Still - it does look as though the industry is passing on the progress to its users - which is nice.

ttfn

5
1
Silver badge

Re: crysis

An i5 with a decent amount of memory can, and if it can't, then it's close. Most likely all it needs is a half-decent graphics card.

2
1
Silver badge
Happy

Cool!

I will test the 64-core, 512GB single box (4U rack server) we are getting shortly (for processing large astronomical and remote-sensing images rapidly). I will compare the cents per MFLOP/s to the figures here. We already know it will kick the backside of the 32-processor Cray SV1e we used to have, performance-wise, at less than 1% of the cost. I am really curious what the figures will be. Linpack has its limits of course, but it is still nice to know where you stand, even roughly.

2
0

Cool

Interesting article (though no mention of either a ZX Spectrum or the computers used for the Moon landings).

I think your homebrew hardware is showing where some of the price differentials are coming in: cooling and infrastructure.

I'll bet supercomputers being used today need a bit more cooling than a couple of noisy fans, and more infrastructure than a domestic power socket.

Scale up that Generic PC to even Roadrunner speeds, via an imaginary Beowulf array, and you will have a shed full of a quarter of a million grey boxes. Going to be hot in there, and you'll need a few 4-way adapters too.
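For what it's worth, the shed-sized arithmetic checks out. A minimal sketch, assuming Roadrunner's Linpack score was roughly 1.1 PFLOP/s and a generic PC manages about 4.4 GFLOP/s at around 250 W under load (all ballpark assumptions, not figures from the article):

```python
# Back-of-envelope check of the "quarter of a million grey boxes" claim.
roadrunner_flops = 1.1e15   # FLOP/s (approximate Linpack score, assumed)
desktop_flops = 4.4e9       # FLOP/s (assumed for a generic office PC)
desktop_watts = 250         # assumed draw per box under load

boxes = roadrunner_flops / desktop_flops
shed_kw = boxes * desktop_watts / 1000

print(f"{boxes:,.0f} boxes, drawing ~{shed_kw:,.0f} kW")
# prints: 250,000 boxes, drawing ~62,500 kW
```

So yes: a shed, a lot of 4-way adapters, and a grid connection a domestic socket won't provide.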

5
0
Stop

Speaking of Cool...

So what are the energy costs?

It's a bit silly to go to all this effort calculating the cost of performance but skip the energy cost - especially as you note that the seemingly well-performing ones (wife's desktop, Hydra-with-cooler) are clearly power hogs.

It is probably easy to get figures for the home computers (go to Maplin/Radio Shack and buy a power monitor for 15 or so quid), but the figures for the larger ones may be tricky to find. If not, it's an easily added column that would actually tell us something. The rest of the TCO (maintenance personnel and parts, expected lifespan) is too vague to add.
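The suggested extra column is easy to sketch: fold electricity into the purchase price before dividing by sustained MFLOP/s. Every number below is a made-up assumption for illustration, not a figure from the article:

```python
def cents_per_mflops(price_usd, gflops, watts, years=3,
                     usd_per_kwh=0.15, duty_cycle=0.5):
    """Purchase price plus lifetime electricity, per sustained MFLOP/s."""
    hours_on = years * 365 * 24 * duty_cycle
    energy_usd = watts / 1000 * hours_on * usd_per_kwh
    return (price_usd + energy_usd) * 100 / (gflops * 1000)

# Hypothetical desktop: $800, 40 GFLOP/s sustained, 300 W under load
print(round(cents_per_mflops(800, 40, 300), 2))  # prints: 3.48
```

With these assumptions, electricity adds nearly three quarters again on top of the sticker price - which is exactly why skipping the energy column flatters the power hogs.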

0
0
Bronze badge

How about GPU?

I take it all these tests were only using the CPU?

Would be interesting to see the same sort of testing and costings against GPU-aware versions of Linpack, such as a CUDA version for the nVidia GPU in your Lenovo W510, and a more current 580 or similar running in a desktop.

0
0

Re: How about GPU?

Good point... the wife has an NVIDIA 285 (I think), which should run CUDA no problem. But it's the Hydra machine that I really want to try. It has two NVIDIA 590s and should really be able to pull a good CUDA-enabled Linpack score if I can find the code. Maybe I'll reach out to NVIDIA and see if they can point me in the right direction...

0
0
Thumb Up

What about a Beowulf cluster of Raspberry Pis?

A.K.A. "Bramble" - it would be interesting to see what price/performance ratios you could get from that.

5
0
Bronze badge
Facepalm

Re: What about a Beowulf cluster of Raspberry Pis?

A cluster? It seems impossible just to get one Raspberry Pi, let alone more than one.

7
0
Anonymous Coward

Re: What about a Beowulf cluster of Raspberry Pis?

Would that be a Raspberry bushel?

Can Pi be plural?

1
0
Headmaster

Re: What about a Beowulf cluster of Raspberry Pis?

No, it's a "bramble" - see for example:

http://www.raspberrypi.org/forum/projects-and-collaboration-general/bramble-cost-estimates

1
0
Silver badge

Re: What about a Beowulf cluster of Raspberry Pis?

Price/performance ratio isn't that great for a bramble, since you cannot access the GPU and it's only a 700MHz ARM (although even a single Pi is rated faster than a Cray-1!)

However, power consumption/flop ratios are pretty good.
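A rough FLOP/s-per-watt sketch backs this up, assuming (ballpark figures, not measurements) that an original Pi's ARM11 manages about 40 MFLOP/s at 2.5 W, while a Cray-1 peaked at 160 MFLOP/s for roughly 115 kW:

```python
# FLOP/s per watt, using the assumed figures from the lead-in
pi_flops_per_watt = 40e6 / 2.5        # original Raspberry Pi (assumed)
cray1_flops_per_watt = 160e6 / 115e3  # Cray-1 (assumed)

print(f"Pi: {pi_flops_per_watt:,.0f} FLOP/s per watt")
print(f"Cray-1: {cray1_flops_per_watt:,.0f} FLOP/s per watt")
# The Pi comes out roughly four orders of magnitude ahead on efficiency.
```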

1
0

Accuracy of results

It seems Linpack gives drastically different results to whohasthefastestcomputer.com

Any idea what causes these differences?

0
0

Re: Accuracy of results

Yeah, you're right - my results are only grossly comparable to 'real' Linpack runs by professionals. There are many reasons; here are a few major ones:

1) I'm running an abbreviated Linpack on Windows - if I were doing this as a serious exercise, I'd be running it on as stripped-down a version of Linux as possible.

2) I'm not tuning the system or the benchmark at all. I should have run many, many iterations of Linpack with different problem-set and array sizes to see exactly which config gives the biggest number.

3) Theoretical max on Linpack is "cores" x "frequency" x "FP operations per cycle". There are ways to tune each of those factors, none of which I did.

I think I probably got to about half of the Linpack potential on the big machine - maybe a bit better on the smaller ones.
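That rule of thumb is easy to put into numbers. The chip figures here are assumptions for a generic SSE-era quad-core, not the actual machines benchmarked in the article:

```python
def peak_gflops(cores, ghz, flops_per_cycle):
    """Theoretical Linpack ceiling: cores x frequency x FP ops per cycle."""
    return cores * ghz * flops_per_cycle

# Assumed: 4 cores at 3.0 GHz, 4 double-precision FLOPs per cycle
peak = peak_gflops(4, 3.0, 4)    # 48.0 GFLOP/s theoretical
measured = 24.0                  # hypothetical measured Linpack score

print(f"efficiency: {measured / peak:.0%}")  # prints: efficiency: 50%
```

Hitting about half of theoretical peak on an untuned Windows run, as described above, is plausible; tuned HPL runs on Linux typically land much closer to the ceiling.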

0
0

Re: Accuracy of results

whohasthefastestcomputer.com is just a flash plugin - very little relationship to the true speed of the computer it runs on, and totally unrelated to HPL.

0
0

Other rows that would be interesting in that table...

Playstation 3 (often used for clustering.. or was until they removed OtherOS)

HP Touchpad at firesale price (should be good bang for buck :)

0
0

Re: Other rows that would be interesting in that table...

Pretty sure people are still clustering PS3s (the ones doing research, at least). They don't really need PSN access to run nuclear detonation simulations.

1
0
Thumb Up

We’re all using supercomputers.

State-of-the-art computer performance from a little over a decade ago is now available to everyone able to afford a modern PC. We're all using supercomputers. Could we be doing more with our computers than playing games and Microsoft Office?

I blogged about this a while back: http://chrisvernon.co.uk/2010/08/supercomputers/

0
1
Bronze badge
IT Angle

Commodity hardware without a fashion label is best value?

Shocker.

And yes echo the above comments - as soon as you try and scale up that commodity PC you'll have massive costs.

3
0

Generic business desktop

Quad core i5 with 8GB? Methinks your wife doesn't work in local government!

1
0
Silver badge

Avoid iFLops

My two year old machine cost <£200 and bangs out >7GFlops

so that's about five years ahead of Apple in bang per buck.

1
0
Meh

Modern desktops are excellent...

...which makes the dumbing-down of Windows (Win7 and Win8) particularly annoying. Vast power with an OS aimed at the occasional or "average user" (no insult intended). If you actually want to make use of that power with Windows you will have problems.

6
1
WTF?

Re: Modern desktops are excellent...

I don't get it. AFAIK there isn't anything in Win7 or Win8 that'll prevent you from using the hardware to its maximum (aside from insanity like using assembly, but you might even be able to do that). What the basic UI exposes has nothing to do with the capability underneath - you could slap a port of Microsoft Bob on a 16-core Linux server with 64GB of memory, but it wouldn't affect what you could do once you used lower-level functionality.

If Win7 didn't let you use the capability of the processor, I wouldn't be doing half the things I'm doing with vehicle simulation, gaming, graphics processing, and so on. I mean, I suppose it's possible that the OS is using 30% of the CPU all the time, but I don't think that's true, and even if it were true, it wouldn't have anything to do with 'dumbing-down' per se.

2
0

Re: Modern desktops are excellent...

What I was alluding to with the Windows 7 comments in the article is that I had a full-on general-purpose operating system running while I was pushing the hw with this massive benchmark. It wasn't taking up a huge portion of compute cycles - but it was taking up some of them. If I had gone whole-hog and installed the most stripped-down Linux o/s I could, then it would free up more cycles for Linpack. From what I've heard from professionals in the industry, it would also give me more knobs and sliders to play with to optimize the o/s to run the benchmark.

1
0
Paris Hilton

If someone made cheaper InfiniBand or 10GbE switches, an above-average office could do some serious calculations. Just cluster all the machines together, then give everyone VDI so they don't get too confused. Or is this an argument to revisit thin clients and rent out your spare compute cycles?

Paris, because she knows about resource utilisation

1
0

Hydra

Why spend $10k on a personal project to get yourself the fastest computer in the state?

OK, you've obviously got the cash to spare, so fair enough... but in a couple of years it'll be slower than everything new and will have cost you a fortune for no discernible benefit. And in 10 years it'll be junk, making it $1,000 per year. You could've bought a decent new PC every year for ten years for that. Or every two years for twenty years, and ended up with a much superior machine at the end of it.

2
1
Silver badge
Mushroom

Re: Hydra

1) It's his money

2) Maybe it's because he enjoys the process. Lots of people spend thousands on their car while only increasing its value by hundreds (or, for a few numb-nuts with a penchant for underfloor lighting and cornflake boxes on the bonnet, actually decreasing its value).

3) Would you tell someone who'd spend $2k on a Cartier watch they could have bought 200+ Casios with that? Actually, you probably would.

4) Some people spunk multiples of that on a hi-fi which is indistinguishable from a $3k set up in terms of quality (those that disagree are simply delusional). Some people spunk that on a case of wine. Some people spunk that on a sparkly piece of crystallised carbon.

5) It's his money.

8
1
Silver badge
Linux

Re: Hydra

Exactly.

There are diminishing returns when it comes to bleeding edge high end hardware but you can still get some pretty powerful kit for not much money. If you hit that sweet spot, you can still have a very powerful machine that will stand the test of time and you don't have to spend 10K on it or even $3600.

0
0
Silver badge
Linux

Re: Hydra

> 3) Would you tell someone who'd spend $2k on a Cartier watch they could have bought 200+ Casios with that? Actually, you probably would.

It helps to have a clue in these things lest you get taken advantage of.

> 5) It's his money.

Yes, and we retain the liberty to call him a fool too.

0
1
Angel

Re: Hydra

And there are some REALLY crazy guys who spunk it on women.

0
0

Re: Hydra

I know animation/graphic artists who paid $200k for Symbolics/SGI/Barco presentation systems and paid them off within months. $10k is almost a bargain in the TV industry; there aren't many things you can buy for that price.

Also, unlike people like me who trusted Apple for professional work and got stuck, he can just change the motherboard and CPU, and perhaps the memory, to upgrade the machine.

0
0

Re: Hydra

First, if I had bought Hydra from a workstation vendor, I'm betting the all-in cost would be closer to $12,000. I'll check that out, I'm curious now. Second, I didn't actually spend that amount of money on that system. I'm very lucky in that I work in the industry and can get engineering samples and reviewer samples of some products every once in a while. For this system, a very helpful HPC vendor helped me get the Xeon processors and NVIDIA gave me two evaluation video cards. That helped defray the overall cost of the system considerably - phew....There will be more details on this when I start publishing the Hydra blogs...

0
0
Silver badge
Boffin

Re: Hydra

"3) Would you tell someone who'd spend $2k on a Cartier watch they could have bought 200+ Casios with that? Actually, you probably would."

Yes, I consider those people snobs. However, putting down $10k on a supercomputing project actually serves a purpose: the ubercomputer will actually do stuff faster, while the $2k Cartier will have *fewer* functions than a Casio.

0
0

Re: Hydra

But the longevity argument still stands. A $2k Cartier watch will still be worth a substantial sum in 20 years' time and will still carry out its primary function of telling the time perfectly well. The computer won't be up to any modern task and won't be worth anything either. Stick a 486DX-100-based PC with some incredible-at-the-time graphics on eBay, and also stick the watch on... which would get the interest?

0
0

Re: Hydra

People are free to spend their money on anything they like, of course. I'd never say otherwise but I'm free to question their sanity :)

Enjoying the process is fine and a perfectly good reason, I don't have a problem with it, I just see it as a waste of money.

The watch though I don't see as a waste of money as its value will last and it'll be just as good in 10 years as it is now. I own a $500 watch and love it. So far it's outlasted 3 desktop PCs and 2 laptops and works as well as it ever did.

0
0
Happy

@ David W

A dubious choice of words there, David....

1
0
JLH
Linux

Bob H

I get your argument re. the amount of CPU power in an average office.

And IB switches are quite cheap these days - see colfaxdirect.com for example

I would counter, though, with exactly the same argument: CPU horsepower is relatively cheap these days, and it is the effort and wages of the programmers and administrators that are the real cost.

So I would say it is better to have dedicated hardware in an environmentally stable room, close to the data, rather than coping with a mongrel set of desktops which vary in speed and memory.

Depends on your application of course.

And cloud (ye Gods, why did I have to use this word...) changes things - I wouldn't bother these days to do office-level cycle scavenging. Hire those cloud machines by the hour.

At the Sandy Bridge launch the other day there was a talk by Amazon - their HPC instances, when ganged together, reached number 42 in the Top500.

0
0
Alert

Damn you..

...for bringing the word 'cloud' into our nice little hardware conversation...lol...god, is there no way to escape a cloud discussion?

1
0
JLH
Facepalm

Re: Damn you..

God have mercy on my soul for using that word.

0
0

uh, cloud is expensive

you know Amazon's profit margin is HUGE, right?

0
0

Big systems' prices don't scale linearly with processing power. If you need That Much Power, be prepared to pay 10x or 100x per unit of performance in total... and then you pay 0.3x to 1.0x of the initial price for 24/7 support and maintenance each year.

0
0
Gold badge

There are so many cores out there doing bugger all; I don't know why people can't be paid to process data via some background service. So long as it's not classified, of course.

0
0
Anonymous Coward

Depending on their electricity rates, they might come out negative if they're not careful. Plus, a 10°C rise in chip temperature will drop its lifetime by half (or thereabouts). Crank your CPU all the time and you've basically got a space heater turned on 24/7 - except it's a space heater that makes your computer die faster. (That said, I can't remember the last time I had a machine fail due to long-term CPU fatigue.)

You don't get something for nothing. Now, it might be more efficient to have a whole bunch of people doing the processing when they've *already paid for the infrastructure* (power supplies, environment, upkeep, purchasing, etc), even if you have to compensate them for their electric bills. People will rent out their computers without considering the cost of the house to put them in, the time they spend setting them up, or the effort they go to to make sure they get fixed if a lightning bolt puts a hole in the mobo. They'll probably want to be compensated for the electricity (if anything) and consider extra cash as 'free' - somewhat like people considering gas to be the only real cost of driving their cars.

So, a little public myopia might be a big benefit to people who need the cycles and can implement something well.

0
0
Holmes

Welcome to the 1980s

There is very little here that's new or surprising.

In the 1980s it was clear that a cluster of small systems (such as workstations) could often do the job of a supercomputer, so long as you could carve up and distribute the work. Since then, this has been reinvented under many names: Beowulf, Grid, Map-Reduce etc.

The basic ideas - and related ones such as cycle-stealing - probably go back even further; however it was the arrival of microprocessor-based systems that changed the economics and architecture of supercomputing so that big systems are effectively clusters of small ones, with the added overhead of fast interconnects and other infrastructure.

0
0
Happy

The power of an 80's super computer....

And it still takes forever to load up Word!!!!

3
0

Re: The power of an 80's super computer....

I began computing on an Atari STF (8MHz, 1MB), and I can tell you that "forever" was a lot longer then than it is now.

0
0

Re: The power of an 80's super computer....

My first computer was a Sanyo MBC-550... which was billed as 80% IBM compatible - meaning that a program would get 80% of the way loaded before it crashed. And you had 1MB of memory?!! I only had 128k of RAM - and a single floppy; couldn't afford the dual floppies... and YOU'RE complaining about long processing times? lol lol

0
0
Silver badge

Cue four yorkshiremen...

A floppy disk drive? Luxury! Some of us dreamed of a disk drive while waiting for tape (ordinary audio cassettes), which then gave errors 80% of the way through loading on the system it was designed for. No doubt someone else will come along with tales of punched tape to continue this...

0
0

Re: Cue four yorkshiremen...

Did I say I had a disk drive? It wasn't all that fancy. I had to spin it myself with a foot-powered pedal to keep it moving. And if I spun it too slow or too fast, it would screw up the read or write and I'd have to start over. I used to dream of having a reliable disk drive or even a cassette tape drive that worked.

Of course, I didn't get much time for computing....in the morning we'd have to get up, clean the bag, and then sweep the road clean with our tongues...

2
0

Re: The power of an 80's super computer....

Elite on the Commodore 64 used to take over an hour to load off cassette tape, and the load would fail 40% of the time. You had to reserve yourself a couple of hours just to get the game loaded when you wanted to play :)

0
0
