Intel's Tri-Gate gamble: It's now or never

There are two reasons why Intel is switching to a new process architecture: it can, and it must. The most striking aspect of Intel's announcement of its new Tri-Gate process isn't the architecture itself, nor is it the eye-popping promises of pumped-up performance and dialed-down power. And it certainly isn't the Chipzillian …

COMMENTS

This topic is closed for new posts.
Anonymous Coward

Interesting thought...

It would be interesting to see an alliance/merger of ARM and AMD.

Def

Re: Interesting thought...

No it wouldn't. Merging with someone like AMD would kill ARM.

AMD licensing ARM designs would be more interesting, and any of ARM's current licensees moving to the same 22nm Tri-Gate process would effectively kill Intel's chance of ever gaining any momentum in the high-performance, low-power market.

It's about time ARM made significant inroads into the desktop market. Intel have been making our (programmers') lives hell for long enough.


Agreed.

If the desktop standardised on ARM, I would actually ditch my sysadmin hat and come back to software engineering.


No you wouldn't

Nothing would change for you in either job. Or did you mean "software engineering" is about OS and compiler tuning?

The lower-layer jobs are too specialized nowadays.


"Intel could license the ARM architecture..."

Doesn't Intel already hold an ARM ISA licence from its XScale days, or is that only for the ARMv5 CPU design that they were building at the time?


Sold to Marvell

I should imagine the ISA licence went with the rest of the XScale stuff, which they sold to Marvell.

Scu

Intel's ARM licence is still valid

As far as I know. Should be easy enough to verify, right?


The last three paragraphs sum it up

Absolutely spot on. What is the main point of a salesman, one may ask? The answer: to blow sugar up your ass.


Fudge

"The answer- to blow sugar up your ass."

I'm not sure that's all it's cracked up to be.


How ridiculous, Intel making ARM chips

Err, without calling them by the names they used to have: StrongARM (via a lawsuit) and XScale...

Can't see ARM merging with AMD, why would they? AMD are now fabless so they don't gain a fab; AMD could just license an ARM design like everyone else...

Anonymous Coward

So in essence....

My reading of the performance improvements is that 22nm tri-gate will be faster than 32nm planar (just as in previous process shrinks) but, more importantly, will allow Intel to use a lower voltage without sacrificing performance, which will be crucial for competing with ARM.

The real question is what the performance differences are between 22nm planar and 22nm tri-gate, if any, as this is where we will see the competition between AMD/GlobalFoundries and Intel.
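That voltage point can be sketched with the textbook first-order CMOS dynamic-power relation, P = C·V²·f; the numbers below are purely illustrative assumptions, not Intel figures:

```python
# First-order CMOS dynamic-power model: P = C * V^2 * f.
# Illustrative numbers only -- not Intel's actual figures.

def dynamic_power(c_farads: float, v_volts: float, f_hertz: float) -> float:
    """Approximate dynamic (switching) power of CMOS logic, in watts."""
    return c_farads * v_volts ** 2 * f_hertz

baseline = dynamic_power(1e-9, 1.0, 2e9)  # 1 nF switched, 1.0 V, 2 GHz
lowered = dynamic_power(1e-9, 0.8, 2e9)   # same clock, supply cut to 0.8 V

# Because power goes as V^2, a 20% supply-voltage cut gives a 36% power cut.
print(f"power ratio at 0.8 V vs 1.0 V: {lowered / baseline:.2f}")
```

Static (leakage) power complicates the picture, but the quadratic voltage term is why running at a lower supply voltage without losing performance matters so much.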


22nm planar vs. 22nm tri-gate

Exactly my point :-) It really seems to me that they've merely found a way to make the die-shrink work out once again, i.e. once again somewhat in proportion to the basic geometry, before they finally have to give up any hope of dragging Moore's law any further using just silicon and lithography. 32nm->22nm = 0.6875^2 =~ 0.47. By boasting just a 50% cut in power consumption, they're in fact admitting that they've *almost* made the die-shrink work out up to the theoretical expectations :-D
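For what it's worth, the commenter's shrink arithmetic checks out under ideal scaling (both linear dimensions shrinking by the same factor):

```python
# Die-shrink arithmetic: each linear dimension scales by 22/32, so area
# (and, to first order, the transistor budget per mm^2) scales by its square.

linear_scale = 22 / 32          # 0.6875
area_scale = linear_scale ** 2  # ~0.47, the figure quoted above

print(f"linear: {linear_scale:.4f}  area: {area_scale:.4f}")
```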

Thanks for the _deep_ explanation BTW - if it wasn't for El Reg, I'd wade through Intel's PR fog blindfolded till the end of my days :-)


@Frank Rysanek

"before they finally have to give up any hope of dragging Moore's law any further using just silicon and lithography. 32nm->22nm = 0.6875^2 =~ 0.47 ."

I think you've hit the nail on the head. The question is how much *better* is this tech *above* what you would expect from a device shrink.

Not much seems to be the answer.


No, don't think so...

...I'm sure the problem of leakage increases with smaller-geometry parts due to the old uncertainty principle or something, and the tri-gate thing is mentioned as reducing transistor leakage thanks to better control of the channel in the off state.

I'd love to see actual figures for planar 22nm, but regardless, I certainly expect them to be considerably worse than the 50% saving Intel has achieved.


Whatever Intel can do with FinFET then ARM can also do

@"The move to a 22nm Tri-Gate process architecture is an important step for Intel's entire microprocessor line"

It is important because once the ARM A15 design is here, Intel will start to lose the future server market on processing power per watt. (The ARM A15 will allow the design of servers with more processing power for less electrical power than an Intel CPU based design. That's a win-win for ARM and checkmate for Intel's bloated x86 design.)

Intel's market lead and dominance up until now has largely depended on Intel's ability to define what each new generation of x86 design should be, so they were always first to market with each new x86 generation. That meant each time the x86 design changed, AMD had to spend time playing catch up to add each new addition to the ever more bloated and increasingly complex x86 design. This constantly changing x86 design gave Intel the marketing lead over AMD, because Intel were always going to be first to market with each new generation.

Unfortunately for Intel, ARM are not playing that same game. ARM processor designs are more power efficient than x86 designs. So whatever chip making process Intel uses, they still can't win, as an ARM design based on that same chip making process will win over the x86 design. Ironically for Intel, their bloated x86 design that was so useful for holding back AMD, is now holding them back from competing with ARM.

Which leaves software (not hardware) as Intel's remaining x86 strength, and even that is just in its legacy support. It's a pain to recompile programs, and some old programs won't be recompiled (the companies who created the code may not even be around any more). But for all new code, it's really not an issue. Plus ARM has a lot of software support. For example, Linux has been on ARM for years. Android and iPhone already support ARM. Even Microsoft are looking to support ARM. Also, ARM software development has had 28 years to evolve to a very high level of industry support with good free tools. So ARM is strong in software support as well as beating the x86 design on processing power per watt.

So Intel are in trouble. AMD isn't Intel's biggest competitor; it's the more than 200 companies that all license the ARM designs that are Intel's biggest threat, because together those companies could seriously harm Intel's market dominance. So Intel really are in trouble.


Except that ARM does NOT have the same mfg process.....

"ARM design based on that same chip making process will win over the x86 design"

The question is whether Intel on this new process is faster/cheaper than ARM on the processes available to the rest of the world, not whether ARM is better than Intel designs on the same process.

Since Intel is not going to teach the rest of the world how to cheaply implement FinFETs in production (there has to be a reason why nobody else did, in spite of lab samples being available for about 8 years, per El Reg), ARM chips will continue to be made using existing processes, which puts them at a disadvantage compared to Intel chips.

Anonymous Coward

"Android and iPhone already support ARM"

Not to mention almost every piece of midrange and highend consumer electronics shipped in the last few years, from broadband routers to NAS boxes to TVs and much much more. Not as trendy as Android or iPhone but far far more ubiquitous.

Anyone remind me what recent successes Intel have had outside the Windows/x86 world? Anyone?


Sounds like times they are changing

Great article. I would have thought this reflects the challenge that ARM and devices like the iPad are posing to the computer market, which is massive.

Intel and Microsoft have had an effective duopoly on the business, timing the increase in Windows bloat and performance demands with processor releases so we all keep upgrading and buying more powerful machines. That cash cow is dying. The other great market was servers, and with the evolution of the cloud fewer standard server designs will emerge, and the constant cyclic refresh of servers can be reduced as grid computing and virtualisation allow mixed-capacity systems to happily share workloads; there is probably less spare capacity, too, as workloads are more tightly managed. The economics of energy may push an initial refresh cycle and sales if the new chips are a lot more energy efficient, but you wonder about the economics of the size of investment and the price competitiveness of efficient chips against ROI, particularly where the high CPU counts are set against low-cost ARM cores and you need a partner to switch platform to use your CPU/chipsets.

Intel therefore is faced with reducing demand in its core markets and a new, rapidly growing market in which it is weak. No surprise it's throwing cash at the problem; can't wait to see what they do about graphics to partner the faster processors. ARM's advantage at the moment is in some ways tied to its better package as much as the faster CPU, and the only strong driver for more powerful mobile devices is to push more engaging games or augmented reality at us; I suspect Google is best positioned in that marketplace, if it manages its platform well.

On the enterprise server front, with the release of more powerful CPUs with lower energy footprints, I think the sales volumes will be smaller than in the past as the overall shrinkage of data centre numbers continues.

The fastest way for Intel to grab market share of fab-plant production and increase its stranglehold on the market would be to fab ARM/ATI packages with its 22nm processes ahead of the pack.


Unlikely?

"Or, for that matter, Intel could license the ARM architecture and start building its own ARM variants in its own fabs, using its 22nm Tri-Gate process"

I'd have thought, more like a dead cert, than unlikely.

If Intel has the best process technology for low-power devices, ARM without question has a better CPU architecture for low-power devices like Smartphones. Put them together and what do you get? The best possible Smartphone CPU, that can either double battery runtime, or allow for a large cut in the weight of the phone without any loss of runtime.

If Intel suffers from "not invented here" syndrome, smartphone manufacturers will have to choose between the x86 architecture running on the best silicon, or ARM running on less good silicon. It won't be long before TSMC or some other chip foundry catches up with Intel enough to put ARM back at the front of the pack. Best for Intel if it's Intel that makes the best mobile device chips.


Cheers.

The article made even someone as thick as me think that they understand the concepts; thanks.


I'm with stupid

Very enjoyable.


Agreed

Very good article, but I think it could have benefited from a few diagrams. Go on El Reg, fork out on someone who can illustrate articles such as these which talk about fairly complex architectures.


Agree x4

I'm usually critical about The Register's articles, but this was a good read.


Intel making ARMs?

I think we'd all win then, even Intel.


Erin in Rowayton

A TERRIFIC review. And it is interesting to read the responses involving the David & Goliath issues. In my humble experience, the semiconductor Goliath likely has a big edge. In a few years it'll be interesting to see if Intel can bottle the innovation rabbit: Apple. Thanks much!!!


Intel better make ARMs

Because Samsung, Texas Instruments, Qualcomm etc. will make ARM FinFETs... At the moment there is no value in putting FinFETs in an ARM SoC, as it adds to cost.

The other aspect of x86 vs ARM SoCs, apart from power and SoC integration, is the much lower ARM prices.


FinFET and ARM

Only justifiable if it gives benefit. Will ARM save 50% on power?

WRT the comments about x86 bloat - remember it was AMD who put the 64-bit extensions into x86 whilst Intel was preparing to bet the farm on Itanium.

I can understand why AMD did it, but it's arguable that the net result has been a five-year delay in PC architecture moving to more efficient CPUs.


Socket change then? What a treat for us!

Can't wait. New socket type, so time to redesign motherboards, so new RAM, new cards, new PSU with different power requirements, and so on forever.

Changing the blasted socket every 9 months isn't progress. It's a royal pain in the bum. It's getting to the point where you can't get parts for a 2- or 3-year-old machine.


@Russell

Intel has already released the info on their Ivy Bridge sockets. Yep. Pin compatible with 1155. You could drop Ivy Bridge in a P67 mobo or a Sandy Bridge in the new 7-series chipset. (BIOS updates apply, as usual).

I guess they actually learned something about compatibility from AMD's AM socket. Although I'd suggest doing your research before looking the fool by spouting speculation. Perhaps you got suckered into buying a 1366 socket?


It's the BOTTOM LINE $$$$

What most miss in these times of economic consciousness is that, over time, money is saved through energy efficiency per watt and productivity per watt, all contributing to bottom lines. AMD, and in future ARM (x86 designs created by a design partnership between AMD & ARM), have or will have a cost-savings structure built into their lines of chips that adds to the bottom line, as well as an increase in overall productivity; the chips pay for themselves immediately... Intel cannot compete against the above benefits of AMD's Fusion, or the threat of low-power ARM designs... 22nm at what cost and yields?

asH


Yeah

Companies really miss that bottom line thing.

I post dumb things too, but I'm at least polite enough to be succinct.


There is no such bloody

word as "hella".

Now, please stop using it! You make yourself sound like a 16-year-old...

Queen's English or none at all, thank you very much indeed...

Anonymous Coward

Re: There is no such bloody

May I ask, what exactly is a "cornz"?


Re: "Another nifty Tri-Gate trick is that you can have multiple fins in the same transistor"

Actually this is as much a disadvantage as an advantage.

For planar transistors you can (and people do) vary the width to get different drive currents, but for a FinFET the width (of the fin) is fixed. So the only way to get the equivalent of a variable width is to use multiple fins in parallel, giving the equivalent of integer widths (1, 2, 3, ...). But even here you lose the option of non-integer widths such as 2.4.
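That quantisation constraint can be sketched as follows. The fin dimensions here are made-up illustrative values, not anyone's actual process parameters; the 2 × height + thickness formula is the usual rule of thumb for a gate that wraps three sides of the fin:

```python
# Width quantisation in a FinFET: the gate wraps three sides of the fin,
# so effective width per fin is commonly taken as 2 * height + thickness.
# Fin dimensions below are illustrative only.

FIN_HEIGHT_NM = 34
FIN_THICKNESS_NM = 8
WIDTH_PER_FIN_NM = 2 * FIN_HEIGHT_NM + FIN_THICKNESS_NM  # 76 nm

def finfet_effective_width(n_fins: int) -> int:
    """Effective gate width in nm -- only whole fin counts are possible."""
    return n_fins * WIDTH_PER_FIN_NM

# A designer wanting the equivalent of "2.4 fins" of drive must round to
# 2 or 3 whole fins; there is no in-between, unlike a planar gate width.
for n in (1, 2, 3):
    print(f"{n} fin(s): {finfet_effective_width(n)} nm")
```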


ARM is *not* Intel's enemy, but Intel *is* ARM's.

ARM would *happily* license their latest architecture to Intel (at the right price). They are a fabless design house. Licensing IP is what they do.

Intel don't *want* that.

Customers would not *pay* Intel x86 prices for an ARM. Why should they?

They want you to buy their x86 *architecture* (as in proprietary, single source with *their* extensions) at their *proprietary* prices.

This is just a way of encouraging operators to do so.

They know that if the workload does not *force* people to run Intel hardware (i.e. the only processor some core apps, including Windows, run on), and people are not afraid that AMD will release a competitor with more bugs in it than theirs, then why pay the Intel *premium* price?

The piece missing from this announcement will be how they will "encourage" MS to eliminate support for ARM in Windows versions. Not that they are likely to put that in a press release.

MS and Intel have benefited from having *near* monopolies on the desktop. Intel could *never* have funded this spending spree without the fat profits from controlling the architecture and a monopolistic core software supplier.

No one would pay Intel prices for an ARM architecture without *massive* additions to the core.

As for the tech itself.

This seems to be more about *leakage* current than switching. That's important because while you can slow clocks (use clockless methods in extreme cases) and power down sections of a chip, you can't *stop* the leakage current (and IIRC it rises with temperature, although at a guess aircon costs more than the electricity you lose to leakage unless you're Google). This is a way of getting SOI levels of leakage (or near there) *without* the SOI issues of lattice strain or creating a high-resistance substrate layer *inside* a wafer.
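The temperature remark matches the standard subthreshold model, in which off-state leakage grows roughly exponentially as the thermal voltage kT/q rises; the threshold voltage and ideality factor below are illustrative assumptions, not process data:

```python
import math

# Standard subthreshold-leakage model: off-state current scales roughly as
# exp(-Vth / (n * kT/q)), so it grows quickly with temperature. Vth and the
# ideality factor n below are illustrative, not any real process's values.

K_OVER_Q = 8.617e-5  # Boltzmann constant over electron charge, V/K

def relative_leakage(v_th: float, temp_k: float, n: float = 1.5) -> float:
    """Off-state leakage relative to the model's I0 prefactor."""
    thermal_voltage = K_OVER_Q * temp_k
    return math.exp(-v_th / (n * thermal_voltage))

cool = relative_leakage(0.3, 300.0)  # ~27 C
hot = relative_leakage(0.3, 360.0)   # ~87 C

print(f"leakage grows ~{hot / cool:.1f}x from 300 K to 360 K")
```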

The first rule of monopoly is "We have *no* monopoly. It's a free market".

The second is "do *whatever* it takes to protect the monopoly."


Totally agree.

I've also been saying for a while now, that Intel is pretty well doomed if it continues its current business model. ARM chips are available from more vendors than you can shake a stick at, which lends itself to extreme price competition. Not only that, but the sheer range and scope of ARM-based SoCs available puts anything based on Intel silicon to shame.

Intel should be familiar with the concept of extreme price competition - x86 PCs killed alternative architectures like m68k, MIPS and, ironically enough, ARM, on the desktop, because vendors competed on price to offer Intel x86-based machines at a cost that no proprietary m68k-, MIPS- or ARM-based competitor could ever match. A similar story happened in server rooms, with SPARC, Power (and even, ironically enough, Itanic) architectures now playing second fiddle to x86-64.

However, in the mobile and low-end space, Intel has found itself in the boots of the proprietary vendor, surrounded by hordes of third-party vendors now offering its principal competitor's - ARM's - architecture.

This would not have happened in the 80s and 90s: software compatibility used to be the must-have bullet point on the spec sheet, because older software was usually written in low-level assembler or very customised to the hardware (remember CGA, EGA and VGA versions of software, in the bad old days?). These days, it's no biggie - most OSes have a HAL of some description, and many conform to open standards like POSIX, with sane driver models. But even so, quite a lot of code is also open source - and even closed-source folk like Microsoft have recently been far more open to porting code to different platforms (i.e. Windows 8 will be available on ARM). So, x86 compatibility isn't the be-all and end-all anymore - nor is any other, for that matter; nor will it ever be again.

This will sound the death knell for Intel unless it adapts - and it should adapt by splitting itself into two parts - one part that manages licensing and further developing x86 architecture, and another part that manages fabs - both for itself, and for customers. Intel should try to outdo GlobalFoundries on the manufacturing side, and try to compete with ARM on the IP side. Yes, this will mean it has to play fair - but this is a bit like the situation that Grove and Moore tackled in 1985, when cheap Japanese memory threatened to put Intel out of business. What did they do? They changed the business. Ultimately, if Intel is to survive, it will need to change again. If the industry changes to ARM, the jig's up.

ARM should be thanked for doing what AMD rarely ever managed: Keeping Intel honest. There would be no question of seeing these kinds of process innovations if Intel thought it had the whole market to itself.


Too complicated

The whole splitting and competing thing is just too organizationally complicated. Far simpler, and probably cheaper, for Intel to buy ARM outright...
