Researchers break records with MILLION-CORE calculation

Stanford’s Engineering Center for Turbulence Research (SECTR) has claimed a new record in computer science by running a fluid dynamics problem using a code named CharLES that utilised more than one million cores in the hulking great IBM Sequoia at once. According to the Stanford researchers, it’s the first time this many cores …

COMMENTS

This topic is closed for new posts.
JDX
Gold badge

I bet F1 teams would love a go.

1
2

This post has been deleted by its author

Anonymous Coward

Maybe :-)

Signed - Supercomputer manager of an F1 team.

I would also like the trip out to California as I'm absolutely sure somehow that remote access won't work until I've had a couple of weeks out there on the beach. Ahem. In the server room.

1
0
Silver badge
Anonymous Coward

Re: Erm..

No No No No No.......

0
0
Silver badge
IT Angle

The future of home computing

Modern smartphones have more computing power today than the Cray-1 had 30 years ago.

Can we therefore conclude that in about 15 years' time this will be the kind of power we will have in home computers?

Shit, will we even have "home" computers in 15 years, or will we all be directly connected to the Cloud/Web via Matrix-like cervical cords?

<<<---- There is no IT Angle because this moves into the realm of the unimaginable.

0
0
Silver badge
Pirate

Re: The future of home computing

Probably. But they still won't be able to get the weather right 5 days ahead, or the climate in 5 years... :-)

1
0

Re: The future of home computing

I did a calculation a couple of years ago and found it took about 12 years for the MIPS rating of the world's top supercomputer to be available on a single graphics card.

1
0
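The arithmetic behind that 12-year figure is easy to sketch. Assuming a Moore's-law-style doubling of performance every ~18 months (my assumption, not anything stated in the comment), 12 years works out to 8 doublings, i.e. a ~256x gap between the top supercomputer and a contemporary graphics card:

```python
# Back-of-envelope sketch of the "12 years behind" claim.
# The 18-month doubling period is an assumed Moore's-law-style figure.
doubling_period_years = 1.5   # assumption
lag_years = 12                # from the comment
doublings = lag_years / doubling_period_years
ratio = 2 ** doublings        # performance gap implied by the lag
print(f"{doublings:.0f} doublings, roughly {ratio:.0f}x the performance")
```

Whether the real gap is 12 years depends entirely on that assumed doubling period, of course.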

Re: The future of home computing

I'm sure this is just a satirical jibe at the Met Office, but something sciency in me compels me to inform you that the mathematics of chaos, not the ineptitude of the Met Office, is the reason they can't predict the weather.

What the Met Office actually does is run a large number of simulations, each perturbed slightly from the current weather situation, and determine the probability of certain events. E.g. say they run 20 simulations and it rains in 15 of them; then they say there's a 75% chance of rain. This was taken out of our TV weather reports because people are stupid and don't know what that means. So they say "It will rain today". Then, when it's bright sunshine (as predicted in 25% of the simulations), people say "blah blah, crap forecasters!" and laugh.

8
0
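The ensemble approach described above can be sketched in a few lines. Everything here is a toy stand-in for illustration — the `toy_weather_sim` function and its rain threshold are invented, not the Met Office's actual model:

```python
import random

# Toy ensemble forecast: perturb the initial state slightly, run many
# simulations, and report the fraction in which it rains as a probability.
random.seed(42)

def toy_weather_sim(initial_humidity):
    # Hypothetical stand-in for a real numerical weather model.
    perturbed = initial_humidity + random.gauss(0, 0.05)
    return perturbed > 0.5   # True = rain in this ensemble member

runs = [toy_weather_sim(0.52) for _ in range(20)]
chance_of_rain = sum(runs) / len(runs)
print(f"{chance_of_rain:.0%} chance of rain")
```

The point is that the output is inherently a probability: with the initial state this close to the rain threshold, different perturbations legitimately produce different outcomes.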
Thumb Down

Re: The future of home computing

But they are crap. They give you a 5-day forecast and change it every couple of hours.

They give a % because the whole process is so complicated they can't give a definitive answer, so why bother?

I have given up trying to predict lottery numbers, it's too complicated. So I guess, and I'm not too proud to admit it.

0
3
Silver badge

Re: The future of home computing

I somehow feel things are heading down a mundane route, where we may have a lot of power but do nothing more with it than we do now.

1 Million FPS iPhone transition animations! 16000x9000 phone screens that can run 3D games that still don't have good collision detection!

2
0
Unhappy

Re: The future of home computing

Here across the pond, about the only things the National Weather Service gets wrong to the point that people gripe are exact snowfall locations and amounts (perhaps not something you need to worry about in the UK, but here there is a bit of a difference between 2" and 6") and tornado warnings that turn out to be false alarms. The former is because a slight shift (a dozen miles, if even that) in the upper atmosphere can change both the type and the amount of snowfall, while the latter is erring on the side of caution, as tornadoes don't just ruin your picnic, but will probably put the sandwiches through the house down the street (maybe a slight exaggeration).

Maybe it's the island location that plays havoc with the weather forecast, or maybe the expectations are just absurdly high. Given that the NWS is able to predict the general path and area of impact of most hurricanes five days in advance to within a 300 nautical mile error, I'd say we're doing pretty well. Just because it rains on you when you forgot your umbrella doesn't mean it's the end of the world.

5
0
Anonymous Coward

Re: The future of home computing

300 miles is fine across the pond, but that's halfway across the entire country here...

0
0
Silver badge
Boffin

Re: The future of home computing

There is a reason weather forecasting was one of the first practical studies that led to much of the discovery of chaos theory. Change one variable slightly in a weather model and you get totally different results 5 days later. It's basically the same concept as in cryptographic hash functions, where changing a single input bit causes the output to change completely.

http://en.wikipedia.org/wiki/Edward_Norton_Lorenz

0
0
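That sensitivity can be demonstrated with the Lorenz system itself. This is a minimal sketch using naive Euler integration (step size and perturbation sizes are my arbitrary choices): two trajectories that start one part in a billion apart end up macroscopically separated.

```python
# Sensitive dependence on initial conditions in the Lorenz system,
# integrated with a crude forward-Euler step (fine for illustration).
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-9, 1.0, 1.0)   # differs by one part in a billion
for _ in range(5000):        # ~50 time units of simulation
    a, b = lorenz_step(a), lorenz_step(b)

separation = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
print(separation)
```

The tiny initial difference grows roughly exponentially until it saturates at the size of the attractor, which is exactly why forecasts degrade after a few days no matter how big the computer is.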
Bronze badge
Flame

Re: The future of home computing

>> Can we therefore conclude that in about 15 years' time this will be the kind of power we will have in home computers?

Yes. We can also conclude these computers will have performance issues when running the latest version of Windows.

3
0
Silver badge

Re: The future of home computing

And HTML 5 will still NOT be standardized.

1
0
Silver badge
Holmes

Can we have a piping hot update on the following problems in computation, please:

1) Classical General Relativity problems. Computing the behaviour of a 4-D warping, Lorentzian spacetime? Hell yeah.

2) Lattice Quantum Chromodynamics, which, AFAIK, is impossible to parallelize efficiently.

0
0

Googling (2) was fun, but didn't answer my question... why is it hard to parallelize efficiently? The FermiQCD toolset seems to be based on parallel matrix algorithms, but I couldn't get a quick idea of the limitations there.

0
0
Silver badge

What sort of million ?

Would that be 1000 x 1000, or 1024 x 1024 ?

1
0
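For what it's worth, the article only claims "more than one million", so either reading works — but the two candidate millions do differ by about 4.9%:

```python
# A decimal million versus a "binary million" (2**20).
decimal_million = 1000 * 1000
binary_million = 1024 * 1024
print(decimal_million, binary_million)  # 1000000 1048576
```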
Anonymous Coward

Thank God..

.. at last a decent machine to run EDLIN..

1
0
Anonymous Coward

Re: Thank God..

The only decent machine to run EDLIN is one that is switched off...

1
0
Silver badge
Coat

Re: Thank God..

> The only decent machine to run EDLIN is one that is switched off...

Yeah, real programmers use vi... in line mode... with a nokia 6310i as a terminal... over an ir link...

1
0
Anonymous Coward

Re: Thank God..

> >The only decent machine to run EDLIN is one that is switched off...

>Yeah, real programmers use vi... in line mode... with a nokia 6310i as a terminal... over an ir link...

Okay you got it right until you mentioned an ir link. Real programmers use an RS-232 cable. Homemade of course.

0
0
Anonymous Coward

Can it model the problem with the 787? I think Boeing would love some help.

1
0
Joke

"Can it model the problem with the 787? I think Boeing would love some help."

If you put in a small enough power supply. But why would you want your computer to overheat and catch on fire?

1
0
Silver badge

Veni vidi duci

To quote an old Internet saying: "Uunngghh!"

0
0
Silver badge

Re: Veni vidi duci

Spell check: I dont haz it.

DOH!

0
0