2 posts • joined 14 Nov 2008
It's not that hard to understand.
Just take a simple example: we examine people leaving a pub and note, for each person, the amount of beer ingested and the amount of pee left in front of the pub. Clearly these two factors have some sort of connection. To get a grasp of this relationship, one may produce a special 2-by-2 matrix from the gathered data, a Covariance Matrix. Real-world examples involve a lot more than two factors, and so the Covariance Matrix gets very big. Now, this Matrix can also be seen as a function that, when applied to some data, yields some results. As it is, people tend to have a Covariance Matrix and a lot of those results, and they want to know what the input was. So they want the inverse Covariance Matrix (which resembles the inverse function).
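To make the pub example concrete, here's a minimal sketch in Python: build the 2-by-2 sample covariance matrix from two made-up data series (the beer and pee figures are invented for illustration), then invert it with the closed-form 2x2 formula.

```python
# Sketch: sample covariance of two variables (beer vs. pee),
# then the closed-form inverse of the resulting 2x2 matrix.
# All data values below are hypothetical.

def covariance_matrix(xs, ys):
    """Unbiased sample covariance matrix of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    return [[sxx, sxy], [sxy, syy]]

def invert_2x2(m):
    """Closed-form inverse; only works for an invertible 2x2 matrix."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

beer = [1.0, 2.0, 3.0, 4.0, 5.0]   # pints drunk (hypothetical)
pee  = [0.4, 0.9, 1.3, 1.9, 2.4]   # litres deposited (hypothetical)

cov = covariance_matrix(beer, pee)
inv = invert_2x2(cov)
```

For the full-size problem the closed-form trick doesn't exist, of course, which is where the general (and expensive) inversion algorithms come in.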
Up to now the time for calculating this inverse Matrix grew (roughly) with the cube of the size of the original matrix.
The IBM boffins found a way to calculate the inverse for which the required time corresponds to the square of the input size (by estimating a matrix property and using it for an easier calculation). And as a nice plus, the new algorithm scales pretty well on massively parallel machines.
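The jump from cubic to quadratic matters more the bigger the matrix gets: the speed-up factor is roughly the matrix dimension itself. A back-of-the-envelope check (constants ignored, sizes picked arbitrarily):

```python
# Back-of-the-envelope operation counts for cubic vs. quadratic scaling.
# Constants are ignored; n is the matrix dimension (hypothetical sizes).
for n in (100, 1_000, 10_000):
    cubic = n ** 3       # ~old inversion cost
    quadratic = n ** 2   # ~new inversion cost
    print(f"n={n:>6}: speed-up ~ {cubic // quadratic}x")
```

So for a 10,000-factor problem the new approach would, in principle, be on the order of 10,000 times cheaper.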
(At least that's what I read from the abstract, don't have time to look into the paper right now.)
That's why the abstract says "cubic complexity", not "qubit ..." (It doesn't involve Quantum Computing. Yet.)
right in time
As my Koss Spark Plugs just crapped out after roughly seven months of light use. Not the kind of quality I expected after owning a Sporta Pro for years.
Well, not just right in time, as it seems I missed a bargain on play..