Apple reportedly plans ARM shift for laptops

Apple may - and we emphasise that last word - have decided to transition its laptops from Intel processors to ARM-based CPUs. Intel certainly has a fight on its hands in the media tablet market, currently dominated by ARM chippery, but does it need to worry about the laptop space too? It will if the allegation about Apple, made by …

COMMENTS

This topic is closed for new posts.

Thumb Down

Apple != Laptop market

Apple is the part of the laptop market that _is_ movable. If you count 64-bit versions separately, ARM would just be Apple's sixth architecture (m68k, PPC, PPC64, x86, x86_64, ARM).

I think the early netbook situation demonstrated that the PC crowd is not very appreciative of anything that is not bog standard and Windows based.

So even if Apple makes this move, I don't think it is a telltale for laptops as a whole.

2
1
Silver badge

Apart from...

...the fact that Windows will apparently run on ARM too.

So not only will users get low-power and portability, but also the rich experience of *the* standard environment.

Pffffttt............giggles

5
0
Silver badge

Could be more telltale than you think

Even if this were an official announcement from Apple, it'd have a little of the 'me too' about it, given that Microsoft has already announced an ARM port of Windows 8. Obviously the difference is that if Apple decide they want ARM then you stop being able to buy an Intel Mac anywhere, but supposing Apple were to switch and demonstrate gains in doing so, the door would be completely open for companies that ship Windows machines to introduce competing devices into their ranges.

So: Apple's move could start a trend, or at least have more of an impact than just on the tiny OS X audience. Though you'd have to buy into the version of events where Apple are highly influential in everything they do rather than just occasionally influential in some areas; assuming genuine benefits do appear from ARM laptops then I'd expect Windows manufacturers to offer devices anyway, and quite possibly sooner.

0
0
Anonymous Coward

Re: "have more of an impact than just the tiny OS X audience"

To understand the impact don't look at the market share stats, just go to a place like the British Library.

Why? Because it is a place where thinking, intelligent people congregate in a cosmopolitan centre of a city with global clout. You have a mixture of young people, old people, students and thinkers and entrepreneurs using the cafe for ad-hoc business meetings. The last three times I was there, I did a quick survey of the make of machines in use in the cafe. Two out of the three times Apple MacBooks came out at over 50% share, the other time at 40% share. The total sample size is now probably about 50 machines, so statistically significant.

What does this say? Apple's share is much greater than top-level market share stats suggest when you look at opinion formers, people with get up and go and deeper thinkers. Given what I know about the market share stats, I was amazed. But then it struck me: of course those stats will be wildly different. They include the PCs running the booking systems of your local car repair service. The PCs of service centre staff who have little interest in their work or the machine they are running. The PCs of office workers counting down the hours till they can go home. Of course PC usage isn't confined to this set, but looking at market share alone says very little about the true influence and significance of OS X. PCs are associated with tired thinking and uninspired work far more than is the case for Macs. In pretty much any local Starbucks in London, you will find Mac usage at at least 30-40% (can't speak for other towns/cities as it's only recently I started noticing the truth of this).

6
5
FAIL

Lol

This argument (AC 07:58) is quite possibly one of the worst arguments that I have ever read. Macs are good because people in Starbucks use them. And Starbucks' clientele are clearly the height of society. Lmao. Oh dear..

5
0
Bronze badge
IT Angle

@Doms

Which part of "the British Library" do you not understand? Note "the", not "a".

You kind of corroborate the argument, so try again.

0
0

If

there isn't a Starbucks in the British Library, it's probably not for lack of target audience.

1
0
Silver badge

For those stateside

If you don't understand "the British Library", think "Library of Congress".

0
0
Thumb Down

@ AC 07:58

I understand you're trying to suggest that OS X and FruitMachines are popular with those who know whereof they speak, and those who act as tastemakers.

I agree to some extent (I'm a sysadmin in a university and it's astonishing how many post-docs and professors want to buy Macs for work usage, as long as they're not paying) but there are three crucial problems with your argument:

1) The "taskemakers" you're talking about don't necessarily know anything about the computers they use, and are just as vulnerable as the rest of the plebeian masses to marketing. Believe me, there are some exceptionally intelligent minds conducting pioneering research where I work, and yet they have all the knowledge/interest in computing of a bored ten-year-old.

2) For the influence of the tastemakers to filter down throughout the userbase, Apple would have to offer computing options for all wallet sizes, and it's evident they have no interest in doing this. Want to know why service centres use Dell or HP or even DNUK boxes rather than FruitMachines? Because Apple machines cost more, without providing a specific advantage to justify the expenditure. Hell, even with the academic discount in place Apple hardware tends to be at least a bit more expensive than similarly-spec'd equipment from rival vendors.

3) As for "posers in Starbucks tend to use Apple hardware", so what? Am I supposed to extrapolate that because they've got shit taste in coffee alongside a willingness to pay over the odds for it, their opinion is important?

1
0
Anonymous Coward

The title is required, and must contain letters and/or digits.

Wow, that's some pretty serious assuming going on then.

I suspect there are plenty of pseudo-intellectual posers down the British Library; I personally don't see the point of going to a library to get information when I've got the fecking Internet.

0
0

Something the Navy could use...

With the Defence review scrapping carriers it may be possible for the Royal Navy to be on the market to buy a batch of ARM powered Macbook Airs...

and call it the "Fleet Air ARM"

7
3
Black Helicopters

"Fleet Air ARM"?

WAFU SNAFU?

0
1
Silver badge
WTF?

wait, hold on..

If you recompile OSX for ARM64 and you keep the APIs identical, why would you need to emulate anything for additional software?

This isn't 1995. We don't target the hardware directly anymore. That's the whole point of HALs, and in fact, Macs have always been a bit like the Catholic Church in that you need the OS (priest) as an intermediary to talk to anything important.

Colour me slightly confused.

4
1
Thumb Up

Remember they once ran on PowerPC

Unless I am mistaken, Apple's OS once ran on PowerPC, so they are no strangers to supporting different CPU architectures with the same API.

2
0
FAIL

Because the HAL doesn't abstract the CPU instruction set?

Obviously, Apple will make the migration easy for new software - potentially as easy as a recompile, if there's no inline x86 assembly, but an existing binary can't run on an ARM system without resorting to x86 emulation.
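To make the "easy as a recompile" point concrete, here's a rough sketch (mine, not from the article) of the kind of source that recompiles cleanly for ARM: the x86-only inline assembly is fenced off behind the architecture macros the compiler predefines, with a plain C fallback for everything else.

    /* Illustrative only: an x86 fast path in inline assembly with a
     * portable C fallback. The C branch compiles unchanged for ARM;
     * an already-built x86 binary, of course, still needs emulation. */
    #include <stdint.h>

    static uint32_t byteswap32(uint32_t x)
    {
    #if defined(__i386__) || defined(__x86_64__)
        __asm__("bswap %0" : "+r"(x));   /* x86-only instruction */
        return x;
    #else
        /* Portable fallback - works on ARM or anything else */
        return (x >> 24) | ((x >> 8) & 0x0000FF00u) |
               ((x << 8) & 0x00FF0000u) | (x << 24);
    #endif
    }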

5
0
Silver badge

They'll use LLVM if they're smart

9 out of 10 apps really don't care what architecture they're running on.

If Apple are smart they'll offer an LLVM compiler target in OS X, i.e. apps wouldn't be compiled into x86 instructions or ARM instructions, they'd be compiled into LLVM bitcode. At runtime the OS would compile the bitcode into a native binary and cache it somewhere for subsequent execution. It would mean the app would work on any supported architecture - ARM, x86, anything. It would mean no more fat binaries, and no more worries the next time the OS moves again.

LLVM is an incredibly powerful abstraction layer and I suspect Microsoft will have to do something similar.
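For the curious, the idea looks roughly like this with today's stock LLVM tools (a sketch of the mechanism, not Apple's actual plan - the target triples below are just examples):

    /* hello.c - one source, one bitcode file, many back ends:
     *
     *   clang -O2 -emit-llvm -c hello.c -o hello.bc    # target-neutral-ish IR
     *   llc -mtriple=x86_64-apple-darwin hello.bc -o hello_x86.s
     *   llc -mtriple=armv7-apple-darwin  hello.bc -o hello_arm.s
     *
     * Caveat: bitcode still bakes in ABI details such as type sizes and
     * struct layout, so it is less portable than Java/.NET bytecode -
     * one reason a runtime scheme is harder than it first sounds. */
    #include <stdio.h>

    int main(void)
    {
        printf("same bitcode, any back end\n");
        return 0;
    }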

5
0
Silver badge

@DrXym

I think you're right; with Clang now fully capable of C++ and Objective-C++, they've switched to a Clang/LLVM pair for Xcode 4, to power not just the compilation stage but the static analyser, the as-you-type error highlighting, and a bunch of other workspace things.

At present they're pushing all the way to a native binary, but it feels like it'd be a relatively trivial step from here to build LLVM into the OS and move towards LLVM bytecode generation being the default in the developer tools.

3
0
Coat

So

Steve Jobs is the pope then...

I hope the analogy with Catholic priests stops there - I'm sure Apple don't want to limit their OS to over-18s only to avoid any unpleasantness.

1
0
Silver badge

@DrXym

MS already have their own thing, MSIL. .NET stuff can be compiled to this, which theoretically runs on any arch. Of course, I haven't tested this beyond x86 and amd64, so the ability to run on a truly different arch is still up in the air...

1
0
Gold badge

Re: still in the air

MS have offered managed code for ARM-based devices for many years, so actually I think the whole idea is completely "grounded" in reality.

The problem is that there's little incentive for vendors to make it easy for customers to move to a new arch. In the closed source world, most vendors would prefer if the customer "upgrades" when they switch. Witness the number (a minority, but not an insignificant one) who offer new versions when a new version of Windows comes out, and *that's* for the same processor arch and after MS have bent over backwards to ensure full backwards compatibility.

OTOH, since Jobs has all the third party vendors over a barrel with his AppStore (tm?) the situation may be different for Apple.

2
0
Thumb Up

@ThomH

Yes, I agree with that too. In fact, they already do this for parts of their OpenGL implementation, which can generate code on the fly for the appropriate target (CPU, GPU), so it wouldn't be too hard to push it towards apps as well.

I for one welcome LLVM in Xcode 4: it's fast and the static analyzer is great. The project itself shows a huge amount of promise for the future, and it may well be one of Apple's best decisions, along with KHTML/WebKit.

0
0
Boffin

Emulation

I wonder how hard it would be to convert an application from x86 to ARM at, or shortly after, installation time, rather than emulating x86 at run-time. A good proportion of the code should be easy enough to automatically recode, and I would expect that anything dodgy (such as self-modifying code) could be detected and passed to an emulator.

1
0
Boffin

It depends....

On how well written the program is. I've worked on legacy apps in the past that will never get ported to ARM simply because they make too many assumptions about byte order/size. Heck, I don't think it would be practical to port one of those apps to 64-bit, let alone an architecture that swaps the endianness!
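For anyone who hasn't fought this battle, a made-up but typical example of the sort of assumption that kills a port - the "legacy" routine only works on a little-endian CPU with a 32-bit long, while the portable one works anywhere:

    #include <stdint.h>

    /* Legacy: assumes little-endian byte order and sizeof(long) == 4
     * (and, as a bonus, unaligned loads). */
    unsigned long read_length_legacy(const unsigned char *buf)
    {
        return *(const unsigned long *)buf;
    }

    /* Portable: assemble the value byte by byte with a fixed-width type,
     * assuming the on-disk format is little-endian. */
    uint32_t read_length_portable(const unsigned char *buf)
    {
        return (uint32_t)buf[0]
             | ((uint32_t)buf[1] << 8)
             | ((uint32_t)buf[2] << 16)
             | ((uint32_t)buf[3] << 24);
    }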

0
0

A few things that help...

...first, Mac OS previously ran on big-endian CPUs (m68k, PPC) and now runs on a little-endian CPU (x86). So, endianness is already dealt with in OS X.

Second, any endianness issues that have crept in since the PPC->x86 transition won't affect ARM - ARM is typically run in little-endian mode. (And ARM can run in a big-endian mode, too.)

Finally, Apple started a 64-bit transition not long after starting the x86 transition. (And, IIRC, they did the 64-bit transition twice - once on PPC, then once again on x86.)

2
0
Silver badge

Re: It depends

"Heck, I don't think it would be practical to port one of those apps to 64bit, yet alone an architecture that swaps the endienness!"

That would be apps coded by cowboys then?

0
0
Flame

Apps coded by Adobe - more likely

Google for Linux, Adobe Flash, 64-bit. One of the things discovered when supporting Adobe's sorry attempts to go 64-bit was that they were doing memcpy on overlapping ranges. With a coding style like that, even changing a few things in the underlying libraries will topple the bugware. Rebuilding for a different arch? Forget it.
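To spell out the bug class (illustrative code, not anyone's actual source): memcpy on overlapping buffers is undefined behaviour in C, so it may happen to work with one libc or architecture and quietly corrupt data on another; memmove is the call that is defined to handle overlap.

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char buf[] = "abcdefgh";

        /* Buggy: source and destination overlap - undefined behaviour.
         * memcpy(buf + 2, buf, 6);
         */

        /* Correct: memmove handles overlapping regions by definition. */
        memmove(buf + 2, buf, 6);

        printf("%s\n", buf);   /* prints "ababcdef" */
        return 0;
    }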

0
0
Thumb Up

Why not?

It's only Windows that keeps most vendors tied to x86 (exceptions apply, e.g. Dell in the past had other $$$ reasons for staying with Intel but maybe both partners learnt their lesson wrt fraudulent accounting).

x86 is already largely irrelevant in the non-Windows market, and apparently even MS are smart enough to see that if there isn't a Windows/ARM combo soon, they may be in trouble (even if the MS-touted combo is only a negotiating tactic).

Apple are not in the Windows market and they've already changed platform more times than a late-arriving train at Euston; it makes sense for Apple to look at ARM, especially for notebooks.

Go ARM.

2
0
Thumb Down

What about Windows?

A lot of us also need to/like to run Windows on our Macs, either in Boot Camp or as a VM, and I think this is one of the big plus points of Intel-based Macs. I can't see an Intel emulator running on top of an ARM chip being that responsive.

1
1
Anonymous Coward

Windows 8....

....will run on ARM.

0
0
Anonymous Coward

Running Win 8 on ARM in a VM is fine, but....

... not much use for attracting switchers, or anyone else using x86-based legacy apps on, for example, Win XP or Win 7.

0
0
Anonymous Coward

Hmm...

Is this not the reason that Apple are looking at making ARM laptops? They know that a lot of their userbase like to/have to run Windows and don't want to actually force them to make a decision either way?

0
0
Thumb Down

ARM has no 64-bit plans

ARM said they won't do 64-bit, just an extended memory addressing mechanism (40-bit?).

http://www.pcworld.com/article/216472/arm_ceo_pc_market_not_our_target.html

"we've decided it's not been sensible to have 64-bit programs. Extended memory addressing at 40 bits is in the latest Cortex-A15 ... but we haven't had the need for a 64-bit [arithmetic logic unit]."

1
0
Gold badge

"not been sensible to have 64-bit programs"

Outside applications like databases and video editing, this is true for x86 as well. x64 code is larger and consequently slower in most cases, delivering a net penalty to end-users. Microsoft have been strong-arming developers to do Win64 ports for a decade now with only limited success. Even their own Office division *recommend* that OEMs ship the 32-bit version, even on a 64-bit OS.

2
0
Gates Horns

M$ and Intel promoting bloatware?

Who would have thought that Intel could seek to benefit from selling hardware that people don't really need with the help of their buddies at Redmond...

0
0
Happy

How much processing power do you need?

It will almost certainly be the case that an Intel processor in 2013 will be more powerful than an ARM chip. However, if the computing power of ARM chips continues to rise as it has over recent years, then they will provide ample processing power for the vast majority of Apple's customers.

If this is the case, then the argument for using a more powerful but power-hungry Intel chip becomes somewhat moot for all but the most extreme laptop users. And for those (for argument's sake) running, say, a Teradata install on their laptops, I'm sure Apple will provide a suitably priced upgrade option for an Intel chip.

2
0
Silver badge

Backwards compatibility

Users won't care about the processor driving their laptop assuming their existing apps all run on it. That means it has to have strong emulation. Without that, I see Apple being stuck in the same boat as Microsoft with their ARM aspirations. Yes the larger companies will make ARM fat binaries for their customers but legacy apps won't work and neither will some smaller apps.

Of course, being Apple, perhaps they'll "helpfully" remove all free will from owners of such laptops and force them to obtain apps through the Mac App Store, where they will only be able to install the apps presented to them.

0
0

FPU?

I thought that ARM didn't have any FP support in the chip (or FP support was poor) - part of the reason it was so power efficient? That's not a problem for people doing web browsing, email, phone calls, SMS, etc., but it can be a killer for certain tasks. So, you have the choice of:

- everything moves to ARM, including high-end workstations - things like Photoshop will struggle and prompt a migration to Wintel.

- mobile platforms (e.g. Air) move to ARM leaving high end on Intel, and app vendors have to ship two sets of binaries for their apps

Either way, it doesn't sound ideal.

0
1

NEON

That is true of older chips. Cortex designs have a NEON vector floating-point unit.

2
0
Boffin

No FP support?!

Actually, the Cortex-A series have two different FPUs - VFP and NEON. Not sure how they would compare to the latest Screaming-Sindy-n extensions performance-wise though.
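As a rough illustration of what the NEON side looks like from C (a sketch for an ARM target built with NEON enabled, e.g. -mfpu=neon; it won't build for x86), adding two float arrays four lanes at a time - broadly the same job SSE does on the Intel side:

    #include <arm_neon.h>

    void add_f32(float *dst, const float *a, const float *b, int n)
    {
        int i;
        for (i = 0; i + 4 <= n; i += 4) {
            float32x4_t va = vld1q_f32(a + i);      /* load 4 floats */
            float32x4_t vb = vld1q_f32(b + i);
            vst1q_f32(dst + i, vaddq_f32(va, vb));  /* add and store 4 at once */
        }
        for (; i < n; ++i)                          /* scalar tail */
            dst[i] = a[i] + b[i];
    }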

1
0
Flame

And does it need FP support?

With a proper GPU from Nvidia or AMD inside and support for using it for generic floating point in the OS... Hmm... What exactly is that FPU performance problem once again?

The only reason we continue to abuse poor Screaming Syndy for high-performance tasks is that code abusing a Tegra or AMD is not sufficiently portable, and there is no software emulation for the cases when GPUs/APUs are not present. You never know what you are going to run it on, so for a commercial binary executable you end up using MMX/SSE instead.

With Apple controlling the hardware and OS that assumption is no longer correct. You can be sure that it is present and/or emulated correctly. So the performance may not be such a problem as it seems.

0
0
Thumb Up

Grand Central + OpenCL

I believe their Grand Central tech goes a long way towards letting the OS decide where to execute code, including GPUs with OpenCL support.

So using that technology to compensate for the lack of oomph in ARM chippery should not be that hard for Apple.
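For what it's worth, the CPU half of that is already a public, architecture-neutral API - libdispatch. A tiny sketch (the workload is made up; the dispatch calls are the real ones):

    #include <dispatch/dispatch.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        const size_t n = 8;
        double *results = malloc(n * sizeof *results);
        dispatch_queue_t q =
            dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

        /* Fan the iterations out across whatever cores the machine has;
         * the calling code never names an architecture. */
        dispatch_apply(n, q, ^(size_t i) {
            results[i] = (double)i * i;   /* stand-in for real work */
        });

        for (size_t i = 0; i < n; ++i)
            printf("%zu -> %.1f\n", i, results[i]);
        free(results);
        return 0;
    }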

0
0

Another choice.

You presented two possible choices here:

"But then it either has to convert the range into iOS machines, to run existing apps, or develop yet another emulator to allow new Air buyers to run their existing OS X apps. Forcing them to re-buy new, ARM-compiled versions of apps seems a very unlikely strategy."

1) Emulate existing apps

2) Pay for recompiled apps

If the ARM user market is big enough to encourage software houses to support it, a further, more palatable option is that you pay once and run the version compiled for whichever CPU you happen to be using.

2
0
Bronze badge

third option

http://en.wikipedia.org/wiki/Fat_binary

Of course, this doesn't help with existing apps (do I have to put a TM after this since it's an Apple (TM) article?) which would need to be recompiled and someone ultimately has to pay for the recompilation. But at least in principle it doesn't strike me as being too difficult a transition for people to make. The wiki page has more useful observations which would seem to be quite relevant to this article.
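Roughly what that looks like in practice (the commands are illustrative - clang's -arch flag and lipo are real Apple tools, but an ARM slice of a desktop app is hypothetical here): the same source is built once per architecture, the slices are glued into one file, and the loader picks the matching slice at launch.

    /* report.c - one source, two slices, one file:
     *
     *   clang -arch x86_64 report.c -o report_x86
     *   clang -arch armv7  report.c -o report_arm    # would need an ARM SDK
     *   lipo -create report_x86 report_arm -o report # fat/universal binary
     */
    #include <stdio.h>

    int main(void)
    {
    #if defined(__x86_64__) || defined(__i386__)
        puts("running the x86 slice");
    #elif defined(__arm__)
        puts("running the ARM slice");
    #else
        puts("running some other slice");
    #endif
        return 0;
    }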

0
0

I don't buy it

It doesn't make sense to run Mac OS X on ARM: there are not only cost issues but performance and compatibility ones as well.

* Cost: they'd have to get someone to fab a larger, higher-performing ARM laptop/desktop processor to get the same kind of performance as current Intel/AMD processors. If they're doing that just for Apple, that is a monumental cost, whereas currently Apple gets a very nice cost deal with Intel.

If they're doing it just for a single type of OS X laptop, then that is absolutely insane. The cost to fab an Apple-specific ARM processor just for a single line of laptops would be astronomical.

* Performance: ARM processors are designed with power efficiency in mind. Look, I'm a Brit, so "rah rah ARM". But they are not designed for performance and will be absolutely hammered by Intel/AMD in this regard for laptop processors.

* Compatibility: Apple would have to create an x86 emulator for ARM. That would be a massive hit to speed and likely battery life, as the processor runs at 100% trying to eke out some semblance of speed emulating x86 for all those current x86 applications.

Overall, it doesn't make any sense.

(I know historically Apple has gone this route before with 68k > PPC > x86, but it really doesn't make sense, as they went that route every time for performance and cost reasons)

0
1

Performance? That was the whole point of ARM

"* Performance, ARM processors are designed with power efficiency in mind. Look, I'm a Brit so "rah rah ARM". But they are not designed for performance and will be absolutely hammered by Intel/AMD in this regard for laptop processors."

Have you ever run comparable machines side by side? Even at the beginning with the Archimedes, ARM chips kicked the normal desktop-class processors clock for clock. As an example, in 1988, if you took a Mac, an Atari ST and an Amiga and put them up against the Archimedes A305 (the bottom of the range machine), all running at around 8MHz, the Archimedes left the other machines for dead due to the processor architecture.

There are numerous reasons why the Archimedes didn't take over the world, including price, lack of compatibility with current and emerging standards at the time, and Acorn's relative obscurity outside of education, especially in the UK, but performance was not an issue.

As for the cost of a fab, there are more ARM chips manufactured every day than any other processor, and there is no reason why an Apple chip cannot be manufactured alongside someone else's chip, as the core processor is the same. ARM Holdings owns the processor design, but other people make and distribute it for them.

OK, the point on emulation is pretty valid, though did PPC emulation really hit battery power on a MacBook or MacBook Pro? I didn't notice much, if any, difference; if anything, XP under Boot Camp hit performance the most (and even then not by as much as many people claimed at the time).

It would be good to compare a 2.4GHz 64-bit ARM chip with a 2.4GHz 64-bit Intel chip, but as only one of those exists for the moment, we can only speculate on the outcome.

2
0
Gold badge

1988 was a long time ago

As the article notes "Just as Intel has yet to prove its x86 chips can match ARM for power efficiency in mobile devices, ARM has yet to show it can match Intel - and AMD - chips for sheer compute performance"

I'm told there is a fairly fundamental reason for this. For any given process technology, you can design for half the performance and expect to run at about a tenth of the power. Put another way, the second 50% of your performance will cost 90% of your power. Historically, Intel have found their profits by targeting performance and ARM has found its niche by targeting power consumption. We have seen that Intel's power ratings have improved in recent years as they've started to explicitly target ARM's market, and I expect we'll see the same in reverse as ARM start to target Intel's market.

2
1
Bronze badge

Yes, but not quite

That's true if you ignore the efficiency of the instruction set and hence the number of clock cycles needed to perform a given task. x86 is terrible - not its fault, it's ancient and of its time 30 years ago, and Intel have worked wonders keeping it going this long. But the ARM instruction set is much more efficient (it is RISC after all), so clock for clock, transistor for transistor, ARM will normally outperform x86. Intel might have some advantage in floating-point performance, but with codecs being run on GPUs / dedicated hardware, who really does much floating-point calculation these days?

You can see some of the effects of x86 from the performance Intel have managed to extract from Atom. That is, not very much. And all for more power and less throughput than ARMs of a similar clock rate are achieving.

1
0
Boffin

Re: Performance? That was the whole point of ARM

"Even at the beginning with the Archimedes, ARM chips kicked the normal desktoip class processors clock for clock, as an example in 1988 if you took a Mac, an Atari ST and an Amiga and put them up against the Archimedes A305 (The bottom of the range machine), all running at around 8MHz, the Archimedes left the other machines for dead due to the processor architecture."

This may be true, although the A305 had the same CPU as the top of the range A440 upon the Archimedes series' introduction, so the "bottom of the range" label has only rhetorical value.

"There are numerous reasons why the Archimedes didn't take over the world, including price, lack of compatability with current and emerging standards at the time, and Acorn's relative obscurity outside of education, especially in the UK, but performance was not an issue."

Actually, a few performance-related things came up: lack of hardware floating-point support (solved initially using an off-the-shelf coprocessor, finally remedied using a from-scratch coprocessor which was eventually integrated into some CPUs, which meant that for floating point the 48MHz ARM7500FE in an A7000+ was quite possibly faster than the 200+MHz StrongARM in a RiscPC, but still arguably not competitive with other CPU families); and issues with page sizes and address translation tables in systems supporting virtual memory. In addition, the relatively slow iteration period and ARM's refocusing on other customers (notably Apple with their Newton, perhaps along with various internal projects that never made it to market) meant that beyond the ARM3, the competition was able to close the gap.

0
0
Boffin

Re: Yes, but not quite

"But the ARM instructions set is much more efficient (it is RISC after all), so clock for clock, transistor for transistor ARM will normally outperform X86."

I'm not disputing that the ARM instruction set is cleaner and adheres well to the original RISC philosophy, but CPUs supporting x86 dedicate hardware to translating x86 instructions into the RISC-like operations of the underlying core. The AMD Am29000 (http://en.wikipedia.org/wiki/AMD_Am29000) is a good example of where these worlds collide, its design ending up in x86-oriented CPUs.

0
0
Gold badge

Re: Yes, but not quite

Instruction decode is about 2% of a desktop CPU's area. x86 has higher code density than ARM, unless you use Thumb, at which point it is comparable but similarly register-starved. Sorry, but I just don't see the greater "efficiency" of the ARM instruction set.

In any case, clock for clock or transistor for transistor comparisons don't count. You need to consider the whole product. For example, Intel were never able to clock Itanium chips as fast as they did the Pentium 4, and the latter were then out-performed by slower-clocking successors.

1
0
