Employing graphics chips as co-processors to do tough computing tasks is not as simple as plugging in some electronics, adding a few libraries of code, and letting it rip. But it ought to be something like that, which is why graphics chip maker nVidia and desktop and server operating system maker Microsoft are working together …
Doesn't OpenCL sound further along?
This sounds like a non-event, something to distract people from OpenCL. OpenCL is further along, from what I can tell. Unfortunately, I don't work in a software area that can immediately benefit from this tech.
That took some doing...
An entire article about MS asking "oh dear, how do we use all those GPU cores for more general-purpose computational tasks?" without one mention of the fact that this problem has already been solved in the non-MS world thanks to LLVM/Clang, Grand Central Dispatch and OpenCL. Hopefully those technologies will gain more widespread use beyond Apple and into Linux et al. I'm no Apple fanboi, but I recognise an elegant engineering solution when I see it.
Let's face it...
This technology won't be considered mature (or even stable) until I can run the GPU version of Folding@home out of the box.
(shakes fist at bluescreen-causing thingies (drivers? folding core? who knows...))