
ARM vet: The CPU's future is threatened

ARM's employee number 16 has witnessed a steady stream of technological advances since he joined that chip-design company in 1991, but he now sees major turbulence on the horizon. "I don't think the future is going to be quite like the past," Simon Segars, EVP and head of ARM's Physical IP Division, told his keynote audience on …

COMMENTS

This topic is closed for new posts.


Bronze badge
Trollface

Battery solution:

1x Dairy Milk Bar

1x fishing rod

1x treadmill with dynamo

1x 30-something single woman

Not all that portable admittedly, but I've got a patent pending on a nationwide network of charging stations :)

Anonymous Coward

Moore's law??

I'm not sure if Moore's law was/is really applicable to mobile processor performance. I would like to benchmark an Axim x30 with Intel's ARMv5 (XScale) running at 624MHz from 7 years ago against the latest smartphone around.

If you look at the desktop world and how vast an advancement in architecture and clock speed we've had since 2004, well, there's really no comparison.

As an aside, is Intel not kicking itself for selling XScale?

Silver badge
Boffin

Moore's Law:

"The number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years."

Note there's nothing specifically performance-related there. Yes, in the desktop world those advances were often used to increase performance.

But in the mobile sector, they've been used as much or more for miniaturization, power efficiency, or adding functionality, which is why today's smartphones are smaller, and run longer, than an Axim x30, even though they have to give some of their battery life and space to the relative hog of the 3G/3G+ radio (not to mention the Wi-Fi, Bluetooth, GPS, accelerometer, etc.).
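As a back-of-the-envelope illustration of what that doubling implies (a rough Python sketch; the 4004's ~2,300 transistors and the two-year doubling period are the only inputs, and the projection is purely illustrative):

# Rough Moore's-law projection: transistor count doubling every two years.
# Starting point: the Intel 4004 (1971), roughly 2,300 transistors.
start_year, start_count = 1971, 2300
doubling_period_years = 2

def projected_transistors(year):
    doublings = (year - start_year) / doubling_period_years
    return start_count * 2 ** doublings

for year in (1971, 1991, 2011):
    print(year, f"{projected_transistors(year):,.0f} transistors")

# 2011 comes out around 2.4 billion - the right ballpark for a big 2011 chip -
# but note the "law" says nothing about whether those transistors go on speed,
# power saving, or extra radios.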

Re your aside, I certainly hope so.

Bronze badge
Devil

The mistake with invoking Moore's Law, if we assume it really works for a moment,

is for an employee of any one company to assume it applies to their company...

Silver badge

Dedicated hardware best suited?

Isn't this rather obvious? The microprocessor exemplifies the concept of jack of all trades, master of none. Frankly, the only reason my netbook is capable of showing me anime is that there is enough grunt power to decode the video data in real time. But then my PVR, with a very slow ARM processor, can do much the same, as it pushes the difficult stuff to the on-chip DSP.

Likewise the older generation of MP3 players were essentially a Z80 core hooked to a small DSP, all capable of extracting ten hours out of a single AAA cell.

Go back even further, the Psion 3a was practically built upon this concept. Bloody great ASIC held an x86 clone (V30) and sound, display, interfacing, etc. Things were only powered up as they were actually required. In this way, a handheld device not unlike an original XT in spec could run for ages on a pair of double-As.

As the guy said, batteries are crap. Anybody who uses their smartphone like it's their new best friend will know that daily recharging is the norm, plus a car charger if using sat-nav. So with this in mind, it makes sense to have the main processor "capable" without being stunning, and push off complicated stuff to dedicated hardware better suited for the task, that can be turned off when not needed. Well, at least until we can run our shiny goodness on chocolatey goodness!
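To put a rough number on that "push the hard stuff to dedicated hardware" point, here's a toy battery-life comparison in Python; the milliwatt and milliwatt-hour figures are invented for illustration, not measurements from any real handset or SoC:

# Toy battery-life comparison: software decode on the CPU vs. a dedicated
# decoder block with the CPU mostly power-gated. All figures are made up.
battery_mwh = 5000          # roughly a 1350 mAh cell at 3.7 V
cpu_decode_mw = 900         # CPU grinding through the codec in software
dsp_decode_mw = 150         # dedicated decode block doing the same job
cpu_idle_mw = 50            # CPU asleep while the block works

hours_cpu_only = battery_mwh / cpu_decode_mw
hours_offloaded = battery_mwh / (dsp_decode_mw + cpu_idle_mw)

print(f"CPU-only decode:  {hours_cpu_only:.1f} hours of video")
print(f"offloaded decode: {hours_offloaded:.1f} hours of video")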

Silver badge
Boffin

re: dedicated hardware

The problem with dedicated hardware for task x is where to draw the line. Having purpose-built chips for every task soon stacks up to be a lot of chips in one device, and ramps the costs up too. Not to mention the design costs for a hardware solution, plus the inability to upgrade it later.

Besides, the same problem still applies - he's comparing the cost of a 2G modem with a 4G modem as an example. Even specialised hardware is still going to be more energy intensive; the scaling problem still exists.


But an ARM is specialized...

ARM chips are RISK processors, specialized towards flow control operations and simple arithmetic. They use pipelining to push a lot more operations through per cycle than the CISC chip you get in your desktop.

But they are rubbish at the kind of high-throughput mathematics that is required for video decode, and even wireless networking these days. CISC chips have massive instruction sets, giving access to a combination of DSP hardware and optimised microcode for vector math. It's not as extreme as a vector engine, but it's there.

For me, this proposal that packages should contain a range of semi-specialized hardware to carry out different types of generic computing task is a migration back to CISC, a surrender of the RISK concept that has dominated mobile devices.

Backwards compatibility has crippled desktop CISC, and I hope that the new specialist CISCs will be a bit more pragmatic rather than being shaped by the migration from previous hardware. A nice way to achieve this would be for SoC vendors to offer a huge base of C++ libraries, with the proviso that the instruction set was prone to change between devices, and that using it directly was asking for trouble...
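As a loose software-level analogy for the throughput-maths point above (NumPy's vectorised routine standing in for the SIMD/DSP hardware; the analogy and the array size are mine, not the poster's):

import time
import numpy as np

# A million multiply-accumulates - the bread and butter of video and radio DSP.
a = np.random.rand(1_000_000)
b = np.random.rand(1_000_000)

t0 = time.perf_counter()
acc = 0.0
for x, y in zip(a, b):            # scalar path: one element at a time
    acc += x * y
t1 = time.perf_counter()
vec = float(np.dot(a, b))         # handed off to optimised vector code
t2 = time.perf_counter()

print(f"scalar loop: {t1 - t0:.3f} s, vectorised dot: {t2 - t1:.4f} s")
print(f"same answer to rounding: {acc:.2f} vs {vec:.2f}")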

Facepalm

@SAM 16

Er.... do you mean RISC and not "RISK"?

RISC = Reduced Instruction Set Computing/ers

CISC = Complex Instruction Set Computing/ers

Ru
FAIL

RISC vs CISC? Really?

It isn't the nineties anymore. Get with the times, granddad. Whilst you're there, learn about the difference between a DSP and a general purpose microprocessor. Compare and contrast with the sort of highly parallel simple processing units used in modern graphics cards. The semiconductor world is not a simple place, especially when it comes to mobile device SOC cores such as those designed by ARM.

When you're done, I invite you to take a look at the 'crippled' processors of today, and have a quick think about how monstrously powerful they are. A new non-backwards compatible instruction set would make everything sweetness and light, you say? Hello x64! You're not suggesting anything new, or even useful.

Pirate

y.a.f.t.

Is there actually a continuing market for slightly faster kit at higher cost in the current climate? IMHO most kit has been running fast enough for the last couple of years, despite constant efforts to force us to buy more CPU to support the same functionality.

Extreme gamers can link a few GPUs together, data warehousers can add terabytes of SSD, and the rest of us can upgrade to Linux or Windows XP running LibreOffice ;-)

This article suggests it's time for software to catch up with the hardware.

Silver badge
Thumb Up

@proto-robbie Good points, I have to agree.

If, in addition, they concentrated on battery life (we are, after all, talking about *mobile* computing here, are we not?) instead of "my dick is bigger than your dick" "improvements", we would all be better off.

Linux

Back to the 70's then?

Maybe the way to make these devices run faster is to tighten up the code. After all, we've been getting rather a lot of bloat whilst Moore's Law has applied. In the '70s, when processor time cost money, shaving time off your code had a distinct advantage, and they didn't have cut'n'paste coders in that era.

I'd predict a trimming back of all those functions that don't get used unless it's the 5th Tuesday in February, to make what does get used rather a lot quicker.

Tux - possibly the home of better software.
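In the spirit of that '70s-style tightening, a trivial generic example of the kind of waste that bloat hides (nothing to do with any particular product; just hoisting a repeated lookup out of a loop):

# Before: repeats the same lookup on every iteration.
def total_price_slow(prices, tax_table, region):
    total = 0.0
    for price in prices:
        rate = tax_table[region]      # same answer every time round the loop
        total += price * (1 + rate)
    return total

# After: the loop body only does the work that actually changes.
def total_price_fast(prices, tax_table, region):
    rate = tax_table[region]
    return sum(price * (1 + rate) for price in prices)

prices = [9.99, 4.50, 23.00]
print(total_price_slow(prices, {"uk": 0.20}, "uk"))
print(total_price_fast(prices, {"uk": 0.20}, "uk"))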

Silver badge

Warning: Implicit car analogy

There's not much of a market for faster kit, but there is certainly a market for kit that is as fast, but consumes less power in the process. Moore's law benefits this too: smaller features require less power to switch on or off. This is why you can run them faster, but it also means that, speed-for-speed, the smaller part consumes less power than a larger one.

From mobile phones to data centres, power consumption is now the number one enemy. It's only really the desktop market that gets a free ride on this; but even there, large corporate buyers are waking up to just how much of their annual electricity bill is spent generating 3D images of pipes through the night, and it's having an effect on buying decisions.
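The "smaller features switch for less power" point follows the usual dynamic-power relation, roughly P = activity x capacitance x voltage^2 x frequency; a small sketch with made-up process numbers (not figures for any real node):

# Dynamic switching power: P = alpha * C * V^2 * f. Numbers are illustrative.
def dynamic_power_w(activity, cap_farads, volts, freq_hz):
    return activity * cap_farads * volts ** 2 * freq_hz

old_node = dynamic_power_w(0.1, 1.0e-9, 1.2, 1.0e9)   # larger features, higher voltage
new_node = dynamic_power_w(0.1, 0.6e-9, 1.0, 1.0e9)   # smaller features: less C, lower V

saving = 100 * (1 - new_node / old_node)
print(f"old node: {old_node:.3f} W, new node: {new_node:.3f} W "
      f"({saving:.0f}% less at the same clock)")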

Silver badge
Go

Dimension Z

He spins a likely tale. But we are beginning to explore the potential of the vertical dimension. Moore's law is safe enough for a good while.

I like the heterogeneous cores idea, as I've said here long ago.

Happy

3D Chips

The future must surely be 3D, where layers of a chip are sandwiched together. That also allows cheaper production, as faulty layers can be checked for and rejected before the final sandwich assembly is completed. So it becomes a question of finding ways to increase mass production of layers (which can be improved), not a question of reducing the geometries of layers, which cannot continue.

That would even work with older, larger geometries, so older fabs would still be very useful. (Plus older fabs are still very useful for a lot of smaller, more dedicated chips, which is a very big market.)
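The "test the layers before you stack them" argument is essentially the known-good-die yield argument; a toy model (the per-layer yield and layer count are invented to show the shape of the effect, not real fab numbers):

# Toy known-good-die model for a 4-layer stack. Figures are illustrative only.
layer_yield = 0.9
layers_per_stack = 4

# Stack blind and test only the finished sandwich:
stack_yield_blind = layer_yield ** layers_per_stack
layers_per_good_stack_blind = layers_per_stack / stack_yield_blind

# Test each layer first and stack only the good ones:
layers_per_good_stack_tested = layers_per_stack / layer_yield

print(f"blind stacking: {stack_yield_blind:.1%} stack yield, "
      f"{layers_per_good_stack_blind:.1f} layers burnt per good stack")
print(f"pre-tested layers: ~100% stack yield, "
      f"{layers_per_good_stack_tested:.1f} layers burnt per good stack")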

@"since the Intel 4004 appeared 40 years ago this November"

I hope that historic anniversary is recognised by the world's press. Technology has after all totally changed the world in the past 40 years and we all owe a lot to that historically important work.

Go

titular thingy

Layered chips are already here. Apple use it and have a patent on their particular process, and flash memory cards have an ARM processor for wear levelling layered with the flash wafer.

Go

Battery Technology

It's interesting to think that if there was any *major* advancement in battery technology in the near future (and we hear about new "breakthroughs" every other month), ARM could be wiped out in the mobile space as there would be no need (or at least, far less of a need) for their power efficient hardware...

A real-world 10-fold increase in battery storage density (naturally involving nanotechnology of some kind) is probably the kind of breakthrough that Intel dreams of, and ARM has nightmares about.

Until then, go ARM!

Anonymous Coward

title

I'd say ARM would be quite safe in the event of a breakthrough in battery technology because they'd extend that life too. Which would you rather have, an intel based device that would need charging every 2 days or an ARM based one that would need recharging every 5 days? I know which one I'd pick ;)

(Note: Numbers plucked out of rectal sphinctor as a means of giving an example, any resemblence to reality is purely coincedental. And yes, my spelling sucks :P )


new battery tech

Even if a major breakthrough came out tomorrow that made batteries 10x more efficient, that would make my current mobile with an ARM chip need charging about every 10 days. With a current Intel mobile CPU I'd probably only get half of that run time, and no real other benefit (other than being able to run full-fat Windows on my phone, which I would have no interest in doing).

Also, most phone manufacturers have no interest in putting an Intel CPU in their phones, as they can get ARM chips from several sources much cheaper than Intel CPUs. Look at the recent article about how much Intel want to charge for CPUs for Ultrabooks to see how expensive Intel are compared to ARM.

It looks like the ARM vet is saying we need to go back to the design days of the Amiga 500, which had a relatively low-powered CPU but lots of custom chips for handling other tasks, which, coupled with well-written software, made it seem much faster than PCs costing the same price. Maybe if Commodore's management hadn't been so useless, running the company into the ground, they might have been a major player in today's mobile scene.

FAIL

Power efficiency is still critical

1. Portable computing devices, e.g. smartphones, tablets, laptops etc.: even with a 10-fold increase in battery capacity, this allows for either:

a) An increase in performance with no degradation in battery life.

b) A huge increase in battery life (becoming increasingly important for many users)

c) A balance of the two.

If you can halve the power consumption of the chip you can use a smaller battery for the same job, making the device cheaper and lighter.

2. Data and processing centres are among the largest consumers of CPUs, and their operating costs are mainly power consumption; batteries are not going to affect this. Many studies have recently shown that a 10% reduction in CPU and memory power consumption has huge implications for their operating costs, as there is much less heat generated and a resulting reduction in cooling requirements.

You may see as much as a 30-40% operating cost reduction for an equivalent setup.
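For a feel of how a modest chip-power saving multiplies up at data-centre scale once cooling overhead is included, a rough sketch (every figure below is a placeholder, not taken from the studies mentioned above):

# Toy data-centre electricity bill with a PUE-style cooling overhead.
servers = 10_000
watts_per_server = 300
cooling_overhead = 0.6            # extra watts of cooling per watt of IT load (PUE ~1.6)
pence_per_kwh = 10
hours_per_year = 24 * 365

def annual_cost_pounds(it_watts_per_server):
    total_watts = it_watts_per_server * (1 + cooling_overhead) * servers
    kwh = total_watts * hours_per_year / 1000
    return kwh * pence_per_kwh / 100

baseline = annual_cost_pounds(watts_per_server)
with_saving = annual_cost_pounds(watts_per_server * 0.9)   # 10% less CPU/memory power

print(f"baseline bill:        £{baseline:,.0f} per year")
print(f"with 10% chip saving: £{with_saving:,.0f} per year "
      f"(£{baseline - with_saving:,.0f} saved)")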

Flame

If you can't take the heat

It might be nice to have a mega battery in your pocket, but how do you use a phone with a 50W processor without resorting to oven gloves?

Power is an issue for desktop and server machines as well, mainly because of the cooling problem.

Meh

Meh!

If battery storage density increased tenfold, the smartphone manufacturers would just fit smaller batteries of the same capacity.

My first phone had six AA NiCads which took up 40% of the volume of the case. My current phone is powered by what appears to be an After-Eight mint which takes up less than 10% of the case.

Silver badge

Sorry to burst your bubble

but batteries could be ten times better and it would still matter that your CPU burns as few watts as possible.

It's like when guns and stuff get lighter it doesn't follow that grunts get to lug lighter backpacks around.

Flame

Not only the heat

Being able to deliver ten times the electrical energy would also mean having ten times the chemical energy in the battery when fully charged. Flames indeed when you trip over with one of them in your pocket.


My first wasn't THAT big, but...

Yeah, that's the way it goes.

Remember back when the phones had keypads where you could press keys with your finger instead of a toothpick?

These days, I have to pat my pockets to find where my phone is. A few years ago, you could feel at least a little bit of weight.

Silver badge

naturally involving nanotechnology....

I'd hazard a guess that strong hallucinogenics would be a more productive route!

Silver badge
Happy

What on earth has happened here?

A thoughtful, intelligent, fascinating and well written article from which I learnt rather a lot. Without any jokes, satire or the faintest smell of clickbait in it. Have I logged on to the wrong site?


agreed

That's what I thought. For a minute I wondered if I'd opened Ars Technica by mistake ;)

Great article, and Simon Segars knows his stuff.

jai
Silver badge

sometimes

sometimes, you need a break from the trolling and the Playmobil and the fanboy wars, and it's nice to chill out with a cuppa tea, a digestive biscuit and a nice, geeky tech article on a Saturday morning.

I'm sure normal service will resume soon though :)

Joke

There is always the possibility....

... that your clickbait sensor is malfunctioning. :)

Silver badge

@jai Re "Sometimes". I will admit that the nirvana you describe (although the.......

........beverage in my case is coffee) is extraordinarily attractive. That was in fact my reaction to the article. It was indeed interesting and instructive, and I felt refreshed after having read it instead of totally wound up and ready to bite someone's head off. Obviously El Reg made a big mistake and it won't happen again. No doubt somebody will be disciplined so that this type of error is not repeated.

Go

Re: Ars Technica

Thumbs up for mentioning Ars.

Also, the CPU experts at Ars have been telling their readership for ages that ARM does not have any magic dust they can sprinkle over their chip designs to make them more efficient. The only reason they consume less power is that they are _way_ less powerful than x86 chips. The day they start approaching them in computing power, they will consume pretty much the same. Also, to the CISC vs. RISC debate people: Ars again mentions that instruction decoding into micro-ops today comprises a very, very small percentage of the CPU time, and therefore the ABI or instruction set is mostly irrelevant these days when talking of powerful chippery.

Again, just quoting Jon Stokes and the other Ars experts, but they do seem to know their stuff.

Flame

Power problem

"We want bigger batteries so we can burn more power"

Power => heat. How hot do you want your phone to be? I wouldn't fancy holding a running POWER7 CPU in my hand, even if it had a dirty great heatsink and fan.

Dedicated hardware (yes, this is quite expensive) and highly optimised, clever software (yes, this is also expensive and difficult to get right). Good luck with that.

Bronze badge
FAIL

I think he was saying...

....that the problem is run time rather than an inability to increase instantaneous power consumption.

Batteries are a problem; imagine how things would have been before the advent of Li-ion and Li-poly....

Boffin

Power => heat

Uh, wrong - unless you have some vague meaning of "power" in mind. By the simplest analysis, heat and power are simply two words for the same thing. Perhaps you meant to say that heat <> temperature, except that that would allow you to say, as I can, that I wouldn't mind holding any running CPU in my hand, provided that it was attached to a large enough heatsink.

WTF?

Heat != Power

In the simplest layman's terms they may equate, but let's try some simple analysis:

Power = rate of energy conversion, ie. energy / time.

Heat = form of energy.

Over time, the energy stored in the battery is converted to heat.

The more power the processor requires, the more heat produced in a shorter period of time.

Therefore increased processor power consumption = more heat to dissipate over the area of the phone = hotter phone.

I really would prefer increased run time over higher performance. My phone does everything required as long as I remember to charge it every day..
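To make the run-time and heat points concrete, a small sketch: the power drawn fixes how long the battery lasts, and for a passively cooled slab it also roughly fixes how far above ambient the case sits (the battery capacity and thermal-resistance figure are guesses for a phone-sized device, not measurements):

# Run time and rough steady-state case temperature for a passively cooled handset.
battery_wh = 5.5                   # roughly a 1500 mAh cell at 3.7 V
thermal_resistance_c_per_w = 15    # degrees C above ambient per watt dissipated (guess)
ambient_c = 22

for draw_w in (0.5, 2.0, 10.0):
    runtime_h = battery_wh / draw_w
    case_temp_c = ambient_c + draw_w * thermal_resistance_c_per_w
    print(f"{draw_w:4.1f} W draw: ~{runtime_h:4.1f} h run time, "
          f"case settles near {case_temp_c:.0f} C")

# At 10 W the case would sit somewhere around 170 C, which is why the 50 W
# processor mentioned above is a non-starter whatever the battery can supply.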

Silver badge
Thumb Up

Chocolate? Chips?

What's not to like?


Good article

Particularly the foundry side of the business.

Makes me wonder: where are all these miracle technologies, fabrication methods, new materials et al. we read about that are going to revolutionise computing? Are they all dead in the water, or are there just no takers because everyone's enjoying silicon?

Ru

everyone's enjoying silicon?

Give it time. Semiconductors take a very, very long time to trickle through from the drawing board to consumer devices. The stuff in your shiny brand new smartphone started life several years ago, and that was based on tried and tested technology.

Thumb Up

"wiped out in the mobile space"?

"ARM could be wiped out in the mobile space as there would be no need (or at least, far less of a need) for their power efficient hardware..."

It would take a while.

ARM don't build chips, ARM licensees do. These companies/people have years of experience in designing and building system-on-chip designs for specific markets. SoC designs that have all the important components a system needs (often including a DSP or two if the ARM's existing DSP capabilities aren't appropriate for the task at hand).

ARM-specific features not found on alleged competitors also bring excellent code density (less memory for the same workload, ie cheaper) and other stuff along similar lines.

Yer man from ARM raises an interesting point re the economics of chip manufacture at ever-smaller geometries with an ever-smaller number of customers (one solution to which is presumably for Intel to buy outfits like ASML), but I'd have been interested in some hard info on chip and wafer volumes currently being built at specific geometries. I'd be amazed if something tried, tested, proven and relatively cheap around 100nm didn't dominate the market - but I could be wrong. No one except a select few *needs* ~20nm technology.

Today's Intel have nothing in their bag of tricks once they run out of process enhancements; they've not had a technical success outside of x86 enhancements for decades.

Obviously the legacy commercial muscle of the Wintel empire is not to be sneezed at, but the future is in System on Chip design.

It's what your right ARM's for.

Bronze badge
Boffin

Moore's law

It's not the end of anything. Pretty much, this article summarizes what I have been observing for years: that CPUs are farted out of the fab full of bugs, and with barely any more optimization than what the smaller geometry inherently gives to the processor.

Intel/AMD and the like just spit things out.

Well, I for one welcome the end of Moore's law and the beginning of the "build a good product" law.

What this means is that if they cannot cram more transistors into less space, they will have to start thinking of ways to make the same number of transistors do more, more efficiently.

Perhaps that will also put an end to the huge cooling monsters modern CPUs require.


African or European Swallow?

1x Cadburys Dairy Milk has 35x the energy of a phone battery.

Are they referring to a 50g bar or the 200g fat b*stard special?

(Well, you have to know these things when you're a king, you know. )

Facepalm

Energy *density*

You go flying off the bridge, screaming, into the abyss for not noticing that.

Silver badge
Boffin

Either

35x the energy density, so either.

Silver badge

Density is density...

You think the 200g bar is the fat bastard special?

So who eats the 1kg bar? Apart from me of course?

Could run a laptop for a week on one of those, just long enough for my tummy to stop feeling bad!

Omnomnomnomnom

Silver badge

Chocolate

Assuming that chocolate is ~100% fat & completely oxidized

1g = ~37kJ

So the chocolate equivalent of my 300g laptop battery would be ~~11MJ

That should keep it going a while !
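Carrying that arithmetic one step further and putting a lithium battery alongside it (the ~37 kJ/g figure is the poster's pure-fat assumption; the battery density below is a typical ball-park figure, not a spec for any particular pack):

# Chemical energy in "chocolate" vs. electrical energy in a Li-ion pack.
fat_kj_per_g = 37          # the poster's ~100%-fat assumption; real chocolate is nearer 22 kJ/g
liion_kj_per_g = 0.7       # roughly 200 Wh/kg, a typical order-of-magnitude figure

pack_grams = 300
print(f"{pack_grams} g of 'chocolate': {pack_grams * fat_kj_per_g / 1000:.1f} MJ")
print(f"{pack_grams} g Li-ion pack:    {pack_grams * liion_kj_per_g / 1000:.2f} MJ")
print(f"ratio: roughly {fat_kj_per_g / liion_kj_per_g:.0f}x")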

Bronze badge
Flame

You could just as easily produce energy for your laptop

By setting fire to the box it came in. But like the chocolate bar, the only way that works is to consume large amounts of oxygen and produce waste products. It takes a little longer to 'recharge' your chocolate bar if you consider the complete carbon cycle. So it's a little facetious to compare its energy density with a sealed battery.

Silver badge

Plenty of people are thinking about ...

Fuel cells (although AFAIK not chocolate powered ones)

Seriously, it's just a device to show how poor the energy density of batteries is compared with (say) the equivalent weight of hydrocarbon. It's the same story as an electric car's 50-mile range versus a diesel car's 600 miles.

Silver badge
Thumb Up

Terminator

Heh, that just reminded me of that scene in Terminator 3, where one of Arnie's "fuel cells" gets damaged.

Imagine having one of those in your laptop on the train and dropping it on the floor. It would take out 2 or 3 of the train carriages you were travelling in :D

Boffin

He missed something...

FPGAs.

He talks about a move to dedicated hardware; if designers can achieve this balancing act between CPUs and GPUs as he suggests, a move to a device like an FPGA may also be possible.

For dedicated tasks that overload CPUs and GPUs, ASICs can't be beaten; not even FPGAs can match them for size and power usage.

However, ASICs are by their very nature inflexible. He mentions putting encryption hardware on the chip, yet how many times a week do we see on this site that "some team, somewhere has cracked x algorithm or technique"?

An FPGA would give the majority of the performance of an ASIC, but allow for an update to a new algorithm or technique.

As stated in the article, one of the major costs is verification, and once the silicon is set it can't be reworked; in an FPGA it can be, and certain issues can be fixed with firmware updates.

Also, not all features are used at once, so the device can be reconfigured on the fly to serve the purpose in use at a specific time, reducing silicon area and, as a result, power consumption, without compromising functionality.

