Intel has released a few more details about the new processors it announced at this week's Game Developers Conference, and which The Reg told you about earlier today, plus provided a "sneak peek" at a 14-nanometer fifth-generation "Broadwell"–based desktop processor with integrated graphics.
The only reason I avoid Intel chips is the cost. I may only get 85% of the performance with my AMD 8-core FX CPU, but I pay only 1/8th of the cost of a top-of-the-line Intel i7 processor.
I shudder to think what they would climb to if AMD ever went under.
The Core i5-4670 doesn't cost more than the FX-8350 but performs better overall anyway.
Pricing myth busted.
Wow, it's been 20 years of Pentium chips? I feel OLD!
I have been an AMD stalwart for a long time, but recently I have been frustrated by the lack of even a convincing roadmap, let alone competitive CPUs.
I was just looking the other day at perhaps swapping over to a multi-socket Opteron rig to at least get some more cores and RAM into the picture, but the Opterons are thin on the ground and terrible bang for the buck.
I am due to upgrade and will likely be switching to Intel after all these years.
I am mystified by AMD's silence. I am assuming they have essentially abandoned high-end CPUs and are doubling down on graphics. If so, it's a sad way to end.
Maybe AMD could at least consolidate server/workstation chips so that enthusiasts could put together a four socket 64 core rig with more than 256 GB of RAM.
Whilst I am griping ... whether it makes sense on paper or not, multi-threading definitely increases performance. AMD seems to have adopted the idea, but clumsily, with its modules/cores approach; it looks and sounds awkward and does not appear to leverage the silicon the way Intel's Hyper-Threading does.
Sigh. I am hoping AMD fights back, but thus far it seems ... worrisome.
Intel has the best process technology, bar none. Shipping 14nm CPUs is quite astonishing.
Intel made a big mistake with the Pentium 4 because they thought that dumb would be good enough once they got the clock up to tens of GHz. They found out that clocking silicon much above 4GHz wasn't doable, and AMD almost gained the lead by doing a much smarter CPU design (i.e., using the available transistor count to accomplish more useful work) despite having to implement it on an inferior process technology.
Intel resurrected the Pentium 3 and worked on it. They got back level with AMD, then overhauled them and stayed there. AMD doesn't appear to know how to do smarter squared (nor does anyone else). It may be a software problem, not hardware: how to automate code generation for very many cored CPUs. Which ARM could make tomorrow, but they presumably know that 128 one-watt cores on one chip wouldn't sell. Heck, NVidia make them, but GPGPUs running CUDA code just point at the problems in using that approach more generally.
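The "software problem" of very-many-core CPUs comes down to Amdahl's law: however many cores you add, the serial part of the code caps the speedup. A minimal sketch, assuming an illustrative 10% serial fraction (not a measured figure for any real workload):

```python
# Amdahl's law: why piles of small cores don't help typical code much.
# The 10% serial fraction below is an assumed, illustrative number.

def amdahl_speedup(parallel_fraction, n_cores):
    """Maximum speedup when only part of the work parallelises."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

for cores in (4, 16, 128):
    print(cores, round(amdahl_speedup(0.9, cores), 1))
# 4 cores -> ~3.1x, 16 cores -> ~6.4x, 128 cores -> only ~9.3x
```

With even a modest serial fraction, the jump from 16 to 128 cores buys very little, which is why 128 one-watt cores on one chip would be a hard sell for general code.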
My theory is that Intel could put AMD out of business but never will, because it needs to point at AMD to justify not being treated as a monopoly (and maybe to stop itself behaving too much like one!). If the only place to buy high-power workstation and server chips was Intel, they'd end up being regulated as a monopoly, and then innovation at Intel would cease.
Maybe one day ARM will be competitive outside the mobile and low-power arena. Time will tell. Until then, Intel is top dog.
Dear Nigel. Please ask your mum if you were dropped on the head as a child.
"they thought that dumb would be good enough when they got the clock up to tens of GHz". You will find that Intel employs these people in white coats, called "physicists", who study and understand "physics". They, together with 30-something years of experience, would have made it very obvious that attaining high GHz was not going to be a breeze.
I am impressed by your collection of buzzwords. I will be even more impressed when you can assemble them into meaningful sentences.
The x86 architecture does, indeed, limit what Intel can do and it is really quite amazing what they have achieved given that limitation.
...we will be introducing our new line of processors code named:
Chasm of Infinite Peril
Valley of Certain Doom
Re: Next year...
No because those aren't real place names on a map.
Well perhaps they should be!
We could rename Barnsley or something...
Secret thermocouple compound
Perhaps with the "extreme edition" they'll return to soldering the heatspreader on, the way it was in the old days (I guess). Or at least use a "liquid metal" thermocoupling stuff (think of CoolLaboratory Pro or Galinstan) rather than the white smear that they've been using since Ivy Bridge...
Myself, I'm not fond of number-crunching muscle. Rather, I drool over CPUs that don't need a heatsink (and are not crap performance-wise). I like the low-end Haswell-generation SoCs (processor numbers ending in U and Y), and am wondering what Broadwell brings in that vein.
Re: Secret thermocouple compound
Precisely my thoughts. I'm personally hoping for an E3-1220L V4 that clocks in the 1.0 - 1.5 GHz range and gets under 10 W TDP but I'd settle for the equivalent i-core around 13 W.
Re: Secret thermocouple compound
At the low wattage end, re-implementing any current design at 14nm will reduce the power it consumes very considerably. So we can assume that if the market exists, they'll make it.
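The reasoning above follows from the classic CMOS dynamic power relation, P = C·V²·f: a shrink lowers capacitance and usually permits a lower voltage, which enters squared. A back-of-the-envelope sketch with assumed, illustrative numbers (not Intel specs):

```python
# Rough dynamic power scaling for a die shrink from 22nm to 14nm.
# All figures are assumptions for illustration, not process data.

def dynamic_power(capacitance, voltage, frequency):
    """Classic CMOS dynamic power: P = C * V^2 * f."""
    return capacitance * voltage ** 2 * frequency

# Normalise the 22nm design to 1.0 in arbitrary units.
p_22nm = dynamic_power(capacitance=1.0, voltage=1.0, frequency=1.0)

# Shrink: suppose capacitance scales roughly with feature size (14/22)
# and the new process tolerates ~10% lower voltage at the same clock.
p_14nm = dynamic_power(capacitance=14 / 22, voltage=0.9, frequency=1.0)

print(f"14nm dynamic power vs 22nm: {p_14nm / p_22nm:.0%}")  # ~52%
```

Even with these crude assumptions, the same design at 14nm lands near half the dynamic power, which is why a low-wattage re-implementation looks plausible if the market exists.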
Re: Secret thermocouple compound
Low wattage is only part of the picture. Most of the low wattage demand is in mobile. There, price is very important too, meaning very low margins.
When you look at some of the ARM silicon out there, you wonder how anyone can eat on it. The AllWinner dual-core SoCs with graphics etc. cost around $10 in low volume, less than that at high volume. The quad/octo-core parts cost a bit more.
Intel is a company geared to building high-margin parts. They spend big on new cutting-edge fabs, but make it all back with high-margin chips.
Intel cannot afford to direct their new fabs to low-margin chips, because they would just lose money.
So the question is: Can Intel be profitable making sub-$20 multi-core SoCs on their most expensive fab?
Hopefully this will bring down the price on mainstream CPUs. No way am I in the market for a $1000 chip.
The customer for these is going to have a discrete GPU. Are they still pairing their high-end integrated graphics with the highest-end CPUs that won't use it at all, while holding their midrange chips that would actually use it to the lower-spec graphics? That always seemed counterintuitive to me.
Re: Sweet stuff
My guess is that the main market for mid-range chips is office-secretary grade systems where the merely adequate graphics is, well, adequate. Decreasing the wattage is the preferred design trade-off over increasing the graphics capability.
There's still plenty of mileage in adding a fairly inexpensive ATI or NVidia graphics card, if you don't need a faster CPU but do want faster graphics. Or get a system with an AMD CPU - inferior CPU but better (ATI) on-chip graphics. We have folks with boring desktop systems plus beefy NVidia cards, that actually use CUDA to good effect (molecular modelling results displaying).
The idea that "extreme edition" is for overclockers is a half-truth. The extreme edition comes clocked to within a few % of what the silicon is actually capable of. No sensible OC-er gets the extreme edition, not least due to the extreme cost.
What sensible OC-ers do is get the bottom model of a class (lowest clock multiplier), then crank up the FSB/BCLK until either:
1) The FSB limit is reached and everything destabilizes (E5 Xeons are an exception, being limited to about 6% extra), or
2) You reach the CPU clock speed limits (aiming for the same clock as the top-of-the-class CPU is a reasonably good target without having to reach for extreme cooling techniques).
That gives you top model performance at bottom model price with sensible cooling requirements. Everything else is either a waste of time, fishermen's tales or just plain old marketing bullshit.
Beer Friday at Devil's Canyon coming up on the 28th... http://www.devilscanyonbrewery.com/