Robert Heinlein was right to be worried. What if there really is a planet of giant, psychic, human-hating bugs out there, getting ready to hurl planet-busting rocks in our general direction? Surely we would want to know? Luckily, big science projects such as the Large Synoptic Survey Telescope (LSST), which (when it's fully …
Reporting planet-threatening rocks reminds me of the Central Bureau for Astronomical Telegrams - a body whose role has obviously changed greatly since the need for actual telegrams disappeared. I read about it in the Grauniad's obituary of Brian Marsden, who had been its director for a while (http://www.guardian.co.uk/science/2010/nov/23/brian-marsden-obituary). Quite fascinating to imagine telegram boys delivering news of a supernova on paper, 50 years ago!
In what way is the Milky Way "cube shaped"?
Is this a permanent feature of the galaxy?
It's almost as ridiculous as the square wheel, see http://en.wikipedia.org/wiki/Square_wheel
Maybe it works in the same way?
I thought there was a badger icon, where'd it go?
I'll have the laser warning sign, cos it's the prettiest.
re. Milky Way
This 'cube' business is a misunderstanding of some kind. Most people know that a Milky Way is a rectangular block with ripples on its upper surface. Those astronomers need to get out more and go to the shops.
...and that, my liege, is how we know the Earth to be banana shaped.
There is a project for which I would honestly love to be network administrator. To that lucky sysadmin...whoever you are...this pint's for you!
Where can I get
the processing engine for overclocking? I've a couple of games here...
Milky Way is cube-shaped
Oh Nooes! The Milky Way is actually a sky-box and we are all living inside a 3D game!
Reminds me of some films
13th Floor springs to mind and even The Matrix.
They're reinventing the wheel
The parallel software framework is already in place and working.
BOINC - Berkeley Open Infrastructure for Network Computing
Windows, Linux, Mac; and it's open source, so it can be compiled on just about any other platform.
And there are already millions of cores working on various projects.
They might want to get in touch with Pasquale at http://orbit.psi.edu
whose project seems to be working at the same purpose, using available telemetry.
Boinc? Not so much
For 3000 nodes, 6.4GB per 15 seconds, that's around 142KB/sec continuous per node. I for one couldn't handle that big a drain on my internet connection. BOINC's great for heavy number crunching, but this is light number crunching on huge data sets.
"not so much" ? Maybe because
you're off by a factor of over 1000.
Check out the http://www.allprojectstats.com/top.php page.
On the right, choose -BOINC- and "CPU statistics" from the picklists and click Show to filter down to the top 200 CPU types. Just the top 10 CPU types listed there amount to nearly 3,500,000 cores. (disregarding that some CPUs can do multiple threads per core, and counting only the cores.)
It's not clear if the article is using the perverted definition of a GB (1,000,000,000 bytes, or 10^9), instead of the original definition of a GB: 1,073,741,824 bytes (or 2^30), which the IEC has designated with the unit "GiB" so HDD manufacturers can't co-opt it like they did GB.
Still, even if they mean the larger GiB, it would require about 1.2Mb/s (152KB/s) for 3000 nodes (assuming 1 core per node), but that would fall to about 131 bytes per second with 3.5 million nodes, using BOINC. Even a POTS dialup could support the transfers for 25+ cores, at that rate.
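The arithmetic in these two posts is easy to check. A quick sketch, using only the figures quoted in the thread (6.4GB every 15 seconds, shared evenly across 3,000 or 3.5 million nodes - the even split is an assumption):

```python
# Back-of-envelope check of the per-node bandwidth figures above.
# Thread figures: 6.4 GB of data every 15 seconds, divided evenly
# across all participating nodes.

DATA_BYTES_GB = 6.4 * 10**9   # if the article means SI gigabytes (10^9 bytes)
DATA_BYTES_GIB = 6.4 * 2**30  # if it means GiB (2^30 bytes)
INTERVAL_S = 15

def per_node_rate(total_bytes, nodes):
    """Bytes per second each node must sustain for its share."""
    return total_bytes / INTERVAL_S / nodes

# 3,000 nodes, SI gigabytes: roughly 142 KB/s per node
print(per_node_rate(DATA_BYTES_GB, 3_000))

# 3,000 nodes, GiB: roughly 152-153 KB/s per node
print(per_node_rate(DATA_BYTES_GIB, 3_000))

# 3.5 million BOINC cores, GiB: only ~131 bytes/s per node
print(per_node_rate(DATA_BYTES_GIB, 3_500_000))
```

At 3.5 million nodes the per-node rate is three orders of magnitude lower than at 3,000, which is the "factor of over 1000" point made above.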
Milky Way is cube-shaped?
Matt, can you tell us where you got that information? I am having a great deal of trouble believing it ... or am I being particularly slow at getting the joke?
You missed the joke.
That might get you on the road to understanding, however...
I feared I was just being dumb
and sadly it was so. In my defence it doesn't look like I would have had much chance to see this in the UK - only broadcast on children's daytime TV. But still... hat, coat
Hm, I hope they aren't using code generators; those tend to spoil everything (plenty of ugly, dead code, etc.).
Also, simply adding more lines of code almost never improves scalability - the fastest code is the one which is not there, as opposed to more of it.
Finally, it often helps, at least to some degree, to reuse code maintained by the wider community. That way you take some of the load off yourself.
What a fascinating project!
I wonder what spin-offs will make it into the commercial sector?
Maybe another arm should be added to the project along the lines of: commercial application of research work?
... some stupid sod will hack it just because it is there or is actually doing something useful.