1 post • joined 22 Aug 2008
Dust in the eyes
Integrating everything on one chip is the future, but as always Intel does it in the lamest way possible. PCIe graphics?! Right next to the CPU on the same die?! What's wrong with these people? Instead of cutting out those pesky, overhead-only PCIe controllers and implementing far more efficient on-chip communication between the two, they simply stuck them on one die. One might argue that this preserves backwards software compatibility, but it's exactly that backwards kind of thinking that got us to where we are today: you can spend $1000 on a multi-billion-transistor CPU that still boots in 16-bit mode with crappy 64 KB memory segments. Then again, a more cynical man might suspect that the real reason behind this so-called "integration" is not technological revolution but a plain attempt to lock the user in to the manufacturer. AMD is trying the same thing.
But hey, what do you expect when the product is designed not by engineers with vision but by greedy bean-counting leeches?