1 post • joined 22 Aug 2008
Dust in the eyes
Integrating everything on one chip is the future, but as always Intel does it in the lamest way possible. PCIe graphics?! Right next to the CPU, on the same chip?! What's wrong with these people? Instead of cutting out those pesky, overhead-only PCIe controllers and implementing far more efficient on-chip communication between the two, they simply stuck them on one die. One might argue this preserves backwards software compatibility, but it's exactly that kind of backwards thinking that got us to the current state of the world, where you can spend $1000 on a multi-billion-transistor CPU that still boots in 16-bit mode with crappy 64 KB memory segments. A more cynical man would conclude that the real reason behind this so-called "integration" is not a technological revolution but a plain attempt to lock the user in to the manufacturer. AMD is trying the same thing.
But hey, what do you expect when the product is designed not by visionary engineers but by greedy economist leeches?
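For context on the 16-bit grumble above: x86 real mode addresses memory through 64 KiB segments, with the physical address formed as segment * 16 + offset. A minimal sketch of that arithmetic (the function name and the 1 MiB wraparound assumption, i.e. A20 disabled, are illustrative):

```python
# Real-mode x86 segmented addressing: each segment register selects a
# 64 KiB window, and the physical address is (segment << 4) + offset.
def real_mode_phys(segment: int, offset: int) -> int:
    """20-bit physical address for a segment:offset pair (A20 gate off)."""
    return ((segment << 4) + offset) & 0xFFFFF  # wraps at the 1 MiB boundary

# The classic reset vector location, F000:FFF0 -> 0xFFFF0
print(hex(real_mode_phys(0xF000, 0xFFF0)))
```

Note how many different segment:offset pairs alias the same physical byte, which is part of why the scheme is so widely disliked.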