In an experiment IBM researchers used the fourth most powerful supercomputer in the world - a Blue Gene/P system at the Forschungszentrum Jülich in Germany - to validate nine terabytes of data in less than 20 minutes, without compromising accuracy. Ordinarily, using the same system, this would take more than a day. Additionally …
just pipe the input to /dev/null
...makes it run real fast!
Kooooooool But Not Gr8 !
Nice But I have Given up on International Boffins Machine.... They have Disappointed me with Cell cesarean Session ......................
I barely understood any of that...
...but I like to see that data processing may get quicker, even for me.
/where is that 'Boffin to English' dictionary?
orders of magnitude improvement in other analytic tasks
So they can apply this to the XIV's then?
It's not that hard to understand..
Just take a simple example: We examine people leaving a pub and note, for everyone, the amount of beer ingested and the amount of pee left in front of the pub. Clearly these two factors have some sort of connection. To get a grasp of this, one may produce a special 2-by-2 matrix from the gathered data, a Covariance Matrix. Real-world examples involve a lot more than two factors, and as such the Covariance Matrix gets very big. Now, this matrix can also be seen as a function that, when applied to some data, yields some results. As it is, people tend to have some Covariance Matrix and a lot of those results, and they want to know what the input was. So they want the inverse Covariance Matrix (which resembles the inverse function).
Up to now, the time for calculating this inverse matrix grew (roughly) with the cube of the size of the original matrix.
The IBM boffins found a way to calculate the inverse in time that corresponds to the square of the input size (by estimating a matrix property and using that for an easier calculation). And as a nice plus, the new algorithm scales pretty well on massively parallel machines.
(At least that's what I read from the abstract, don't have time to look into the paper right now.)
That's why the abstract says "Cubic complexity" not "Cubit ..." (It doesn't involve Quantum Computing. Yet.)
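For anyone who wants to poke at the pub example themselves, here's a minimal NumPy sketch of the standard (cubic-time) approach the explanation above describes. The numbers and variable names are made up for illustration; this is plain matrix inversion, not IBM's algorithm.

```python
import numpy as np

# Toy version of the pub example: two correlated quantities per person.
rng = np.random.default_rng(0)
beer = rng.normal(5.0, 1.5, size=1000)            # pints ingested
pee = 0.8 * beer + rng.normal(0.0, 0.3, 1000)     # roughly proportional output

data = np.stack([beer, pee])   # shape (2, N): two factors, N observations
cov = np.cov(data)             # the 2-by-2 covariance matrix

# The inverse covariance (precision) matrix. Plain inversion like this
# costs O(n^3) for an n-by-n matrix -- the cubic complexity mentioned
# above; the IBM result is about doing better than that at scale.
precision = np.linalg.inv(cov)

# Sanity check: cov times its inverse should be the identity.
print(np.allclose(cov @ precision, np.eye(2)))  # True
```

With two factors this is instant; the point of the article is what happens when "two" becomes thousands.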
Thanks for shedding some light
I think your reply was far more informative than the article.
Maybe they just verify the parity bits?
> shots of them standing around looking smarter than any of us.
not all of us... ;o)