1 post • joined 22 Aug 2008
Dust in the eyes
Integrating everything onto one chip is the future, but as always Intel does it in the lamest way possible. PCIe graphics?! Right next to the CPU, on the same die?! What's wrong with these people? Instead of cutting out those pesky, overhead-only PCIe controllers and implementing far more efficient on-chip communication between the two, they simply stuck them together on one die. One might argue that this preserves backwards software compatibility, but it's exactly this kind of backward thinking that got us to the current state of the world, where you can spend $1000 on a multi-billion-transistor CPU that still boots in 16-bit mode with crappy 64 KB memory segments. Then again, a more cynical man might suspect that the real reason behind this so-called "integration" is not technological revolution but a mere attempt to lock the user in to the manufacturer. As AMD is also trying to do.
But hey, what do you expect when the product is designed not by visionary engineers but by greedy economist leeches?