Boffins at IBM have come up with a better way to embed laser communications onto processor and memory chips using plain vanilla CMOS manufacturing processes, paving the way for three-dimensional chips integrating hundreds of processors, their main memory, and on-chip optical networks that will, it is hoped, allow for the …
very cool but
I honestly wonder why IBM still pours money into its semiconductor division (I'm sure AMD is glad it does). They make virtually nothing on it any more, and if not for the game console contracts they couldn't afford more than a small pilot fab. I know Big Blue prides itself on R&D, but judging by employee comments they haven't exactly taken care of their staff lately (management needs its bonuses, after all). I'm glad my country is still leading the way in the tech industry (yes, IBM is worldwide, but come on, there's NY and then everywhere else in this company), but you have to wonder how much longer IBM will be in the game.
Clearly you know nothing about IBM's business. They could close all of their manufacturing plants and still make more profit than AMD. They can do this because of their R&D: IBM owns most of the patents in the chip-making business, so every time Intel makes a chip, it pays IBM some money. Every time someone makes a hard drive, they pay IBM some money. There are very few electronic products on sale from which IBM doesn't collect patent royalties. IBM is one of the few companies you can bet will still be in the game when your grandkids ask how much longer it will be in the game.
^ ^ This ^ ^
What Lusty said, really. IBM still owns all the patents it had back from the days when you bought a Tandy, AdLib, or 100% IBM-compatible clone machine, and it has only done more research (begetting more patents) since then.
As for the article, the only thing I don't grasp is why use a 130nm process for the CMOS and 65nm for the optics; am I missing something fundamental and simple here? I also don't buy the 100 million interconnects claim. That's as future-anachronistic as those old "what a home computer may look like" mock-ups from back when valves and DIY soldering were bleeding-edge nerdcraft. I think by the time optics-on-chip becomes a manufacturing reality you'll see more innovations in the field, and that number will drop accordingly.
Genius for Lusty
I of course know they make a heck of a lot more on the R&D patents than they do on manufacturing. I'm too lazy to look it up, but I bet they make even more on global services than on the patents. R&D has always been a big part of who they are, though. Still, IBMers didn't used to be so cynical about the motivations of their company's executives. Generally your R&D nerds are more productive when they don't see their pensions disappear because some VP needs to hit an earnings-per-share number for his stock options.
You're probably right about IBM continuing to innovate in semiconductors for a long time. Like you say, never bet against IBM. The industry is littered with the long-gone corpses of competitors such as Digital and Wang.
Did anyone else...
see the words 'ring modulator', and find that the very first thing that leaps to mind is...
"COOL! Does this mean we can fit 2,048 MOS6581 SID chips onto a processor as a field-programmable analogue synthesizer of awesome proportions?"
Oh, just me then.
This is innovation worthy of a patent
I compare this to yesterday's report of the patent application by Apple for a 3D display (http://www.theregister.co.uk/2010/12/01/apple_3d_patent/), and it just highlights the difference between companies that talk the talk and those that walk the walk.
But investors and analysts are won over by the number of patents held, whether or not they're backed by real innovation and concrete results, so Apple's stock continues to gain.
Hopefully IBM can profit from this research and continue to innovate in the future.
If my computer needs more grunt, I just tip in some more bacteria and sugar, and shake...