Intel's next generation of desktop - and, undoubtedly, mobile - processors will retain the chip giant's Core brand, the company said this morning. No great surprise there, but they won't be 'Core 3' parts. No, 'Nehalem' architecture chips will be marked 'i7', instead, though Intel further muddied the waters by telling us to …
another corp falls victim to...
I don't care if it should be 7, 8, 12 or 23985723. The 'i' annoys me!
I mean - I even found a bloody iBed on the interweb!
iStop iIt! iI iCahn't iTake iIt iAnymore!
All these names...
...make it confusing for the general punter.
I mean really, when it was based on clock speed you could at least see at a glance which chips were supposed to be faster than what. But it looks like CPUs are going the way of GPUs: confusing and hard to compare.
For example, comparing the GF5200 with an ATI equivalent... the 9600? Or is that better, or worse, or what? It's very tricky to tell without opening the tech specs, looking at clock & bus speeds and checking the number of pipes it has.
Bleh. No wonder I bought a console...
This is a good thing
It should stop any madness with duo, quad, two, 2, x2.
I'd imagine most readers have, at some stage, had to explain the difference between a Core Duo and a Core 2 processor to someone, only to have that someone nip off to PC World and triumphantly return with a "bargain" "latest processor" "discount" Core 1 chip.
"Chip 5" would be better than these stupid names. Boring marketing is better than confusing marketing.
My bologna has a first name...
...it's B U L L S H
Coat, Hat, Pub.
RE: All these names
Unfortunately clock speed doesn't even come close to telling you which chip will perform faster than another. The design of the chip is so complex that naming has become a real problem. I thought Intel had nailed it with the first lot of Core 2 chips: the letter and number told you which performed the best. Then they introduced Penryn and a whole new level of complexity, and I gave up caring.
Codename Gaza Strip
Intel core i7 duo 2
Where are those Intel realtime rendering CPUs?????
Come on Intel, kill those insanely hot VGA cards once and for all!!!
hardy heron and gutsy gibbon
Getting a Core solo is not nearly as bad...
as them picking up a Pentium 4, and gleefully exclaiming how their 3GHz Pentium 4 with hyperthreading is so much faster than your piddly 2.5GHz Core 2 E7200.
It's called Larrabee, but unfortunately for you it'll be in the same form factor, produce the same amount of heat and consume the same amount of power as the other "insanely hot VGA cards".
As for the 'i' prefix thing, technically they had it before Apple. i386, ia-32, ia-64, etc.
I see a few posts describing the problem of the name not clearly indicating how an i7 is better than a Core 2 or whatever, but no solutions. So, how are consumers supposed to know that i7 is better than whatever they're upgrading from?
I'm sure some good TV commercials will let everyone know that i7 is newest, but that doesn't always mean 'better' (Vista is newer than XP, but...), and if an average consumer upgrading from a mid-range Core-something or even a high-end (but 4+ years old) Pentium 4 HT can't tell whether a mid-range i7 is going to be faster, why should they part with their money?
i is the way to go
C'mon guys, we are okay with "iPod", "iPhone" and "iAnything" from Apple, so why not Intel?
I guess it is number 7 in some ways.
The PPro was the first P6 and came with a new bus architecture, which it seems is still the same bus used throughout the Pentium II, Pentium !!! (yeah, they really named it that), Pentium 4, Pentium M, the various Core variants (and miscellaneous Celerons) and the Pentium Dual Core. The new design with QuickPath (or whatever it is named now) will be a new bus, and hence could be considered the 7th generation of bus interface.
Core i7 does seem like an odd choice though.
I think intel should name all new chips M 4.
M stands for "Model" and 4 because of http://xkcd.com/221/ (I-triple-E can't be wrong).
To be honest, I couldn't give a monkey's what they name it. I just want to know when I can actually buy a chip and board - and a decent Zalman cooler, as, despite billions spent on R&D, they still can't develop a good CPU fan.
I won't even comment on the clowns at Nvidia. I mean, you spend £300 on speakers and £200 on a top-end soundcard - and what do you get? Drowned out by the 60dB turbine whine of the GPU coolers. Sheesh!