It's not proof that Apple has plans to produce and sell such a machine, but the Mac maker certainly seems to have been exploring the possibility of offering a MacBook Air laptop based not on an Intel but an ARM processor. An unnamed correspondent of Japanese-language site Macotakara claims to have seen a sample Air based on …
sell, sell, sell
Is it time to sell Intel shares? Lots of support for ARM atm. Windows 8 is said to be ARM friendly, and now Apple is going that way too?
Also, if Apple are putting ARM stuff in their iPhones, iPads etc, this will be great for them.
Re: sell, sell, sell
> Also, if Apple are putting ARM stuff in their iPhones, iPads etc, this will be great for them
iPhones and iPads already do, and have only ever used, ARM-based processors, like 90-odd per cent of the mobile devices out there.
I don't understand why, but there seems to be a great reluctance among device and chip manufacturers to say this.
I can't remember the exact figures, but combined ARM shipments are roughly 100x Intel's, and there are already two ARM devices in use for every person on the planet.
"worked surprisingly well"
Why wouldn't it?
If Atoms can run Windows... I even find build times in Visual Studio somewhat OK-ish, but I run off an SSD.
I have not much fondness for Apple these days but fair play to them if they go ARM in lappies. May mean better ARM chips ultimately. May mean more ARM lappies elsewhere. More toys to play with.
Can't be bad, ultimately, *IF* this rumour is true.
It's the new ipad pro
..complete with physical keyboard, trackpad, connectivity, folding screen and OS X (not iOS).
Seriously though - why would Apple do this? Unless ARM has a new processor due soon which matches (and ideally betters) the existing Intel units in processing power and battery usage, there's no bloody point.
Why Apple would do it.
Mobile devices use SoCs rather than separate processors and coprocessors (such as graphics, signal processing etc.). Intel has a limited selection of SoCs so if they don't offer what you want, you will have to add coprocessors on the side. This increases the cost and power consumption of the system as a whole.
By designing their own SoC around an ARM core, Apple can get exactly the combination of coprocessors they want. And by having exclusive access to this SoC, Apple can prevent cloners and stop people installing Mac OS X on non-Apple hardware, something they generally take umbrage at.
For one, they're keeping their options open. For two, how fast is "fast enough"? Most of the time your computer sits there looking stupid whilst waiting for the meaty thing in front of it to respond...
If it can be made "fast enough" for mobile executives to crank out documents and emails (and of course get onto YouTube to play videos of that 911 with the sports exhaust that they're fancying buying once their bonus comes through), then it'll probably be able to last 14+ hours without a recharge. That's worth a lot to some people. And if they can sell it for more money (on the basis of being able to fly to Bangkok without plugging it in), even with cheaper parts inside, then it's a goer.
All-day computing? In the Intel world that comes with a power adapter...
How does twice the battery life sound? And that is just from changing the CPU from x86 to ARM. Once you've made the switch and designed a good system-on-a-chip, you can free up masses of motherboard space and probably fit an even larger battery in there. Three times the battery life of an existing laptop may be possible!
Not to mention rapid sleep and wake, much faster than x86.
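The "twice the battery life" claim above is just arithmetic on battery capacity versus average platform draw. Here's a minimal sketch; every figure is an invented assumption for illustration, not a measurement of any real Air or ARM SoC:

```python
# Back-of-envelope battery-life maths. Every number here is an assumption
# chosen for illustration, not a measured value for any real machine.
BATTERY_WH = 50.0  # assumed battery capacity in watt-hours

def runtime_hours(avg_draw_watts: float) -> float:
    """Runtime in hours for a given average system power draw."""
    return BATTERY_WH / avg_draw_watts

x86_draw = 10.0  # assumed average draw of an x86 laptop platform
arm_draw = 5.0   # assumed draw after swapping in a lower-power ARM SoC

print(runtime_hours(x86_draw))  # 5.0
print(runtime_hours(arm_draw))  # 10.0 -- double, before any bigger battery
```

Halving the average draw doubles runtime; reclaiming board space for a larger battery then grows the capacity term as well, which is where a "three times" figure could plausibly come from.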
Consumer electronics companies love being able to fab their own chips with exactly what they want, they can't do that with Intel. They have to resort to putting certain chipsets on the mobo and hoping they work well together.
Atom is just too power hungry and inflexible in this modern gadget/appliance age.
I agree, but...
Maybe they made it as a present for the Intel CEO?
Ummm, not ARM and not necessarily better
ARM makes a processor core. To be more accurate, they produce a reference design for a processor core which executes what has become a de facto standard, the ARM instruction set. If Intel were to take ARM seriously, it could license the ARM instruction set, replace the x86 instruction-set decoder with an ARM one, and come to market very quickly with a screaming fast Core-based ARM chip. Remember that the Core processor doesn't actually run the x86 instruction set directly; instead it translates x86 instructions into its internal native instruction path, which is more like a RISC architecture, and can retire several operations in a single clock so long as they don't contend for individual registers.
So, let's for the moment say that the ARM instruction set is the key point of interest here, not the reference design itself. After all, NVidia, Samsung and other companies are working rapidly to integrate their technologies directly into the ARM core to produce high-performance versions of ARM chips even now. I'm sure the reference ALU will be completely replaced, since the ARM ALU, while power efficient, is not particularly fast. Simply replacing the multiplier with an adaptive pyramid multiplier would be enough to increase execution performance of the ARM core substantially, but it would use so much silicon that it would pummel the battery life of handheld devices. Also, using a far more advanced instruction-reordering architecture (such as that found in the Core series) would improve performance even further. In fact, given that the ARM instruction set has a much larger set of registers, if a compiler were to target ARM and maximize register allocation by fanning values out as much as possible, then a proper instruction-reordering system could easily parallelize a much larger amount of code, so long as the core itself implemented enough arithmetic units to process it.
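The register "fanning out" point can be made concrete with a toy sketch (this is not any real compiler pass): model each instruction as (destination, sources) and measure the depth of the dependency chain. With more registers available, partial results can live in separate registers, shortening the critical path that even a perfect out-of-order core must execute serially:

```python
# Toy illustration: with more registers a compiler can "fan out" values so
# fewer instructions depend on each other. We model an instruction as
# (dest, srcs) and compute the critical-path depth of the dependency graph,
# i.e. the minimum number of cycles assuming unlimited parallel ALUs.
def critical_path(instrs):
    ready = {}  # register -> earliest cycle its value is available
    longest = 0
    for dest, srcs in instrs:
        cycle = 1 + max((ready.get(s, 0) for s in srcs), default=0)
        ready[dest] = cycle
        longest = max(longest, cycle)
    return longest

# Few registers: everything accumulates through r0, fully serial.
serial = [("r0", ["a", "b"]), ("r0", ["r0", "c"]), ("r0", ["r0", "d"])]
# More registers: partial sums in r0 and r1, combined at the end.
fanned = [("r0", ["a", "b"]), ("r1", ["c", "d"]), ("r2", ["r0", "r1"])]

print(critical_path(serial))  # 3
print(critical_path(fanned))  # 2
```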
I visited Apple a long time ago, during the birth of OS X, just before the formal release of OS X 10.0. While I was there, I saw OS X running on x86, SPARC and PowerPC. The Darwin kernel is just another UNIX kernel and ports really well to any platform which GCC (or now LLVM) targets. Since Apple is the key developer on LLVM (or at least Clang), they are able to target and optimize for a new platform internally with little issue. At worst, they can fall back on the GCC back end to LLVM to generate code while they implement their optimizations. With only a little additional work, they can skip generating architecture-specific code from within Xcode and use the LLVM JIT to target the local platform at runtime. As with MS CIL, this technology is no longer the half-assed crap sold by Sun in Java. It is the real deal. Therefore, instead of forcing developers around the world to recompile fatter and fatter executables, they can ship a relatively streamlined LLVM IL executable. Thanks to the nature of OS X, which is based on bundles and more or less hidden/transparent subsystems (a technology which stems from OpenStep, which did this on many platforms), the transition to this way of distribution would be relatively trivial. I simply would not be at all surprised if they have already done this, but I haven't checked.
Also, let's not make the mistake of trying to differentiate OS X from iOS on a technical level. It would be better to look at iOS as OS X with a different "shell". iOS and OS X share the exact same Darwin (Mach) kernel. They share most of the exact same Cocoa (NS) system APIs. They share almost the exact same additional frameworks such as QTKit, WebKit, CoreAudio, etc. They both support OpenGL (though limited to ES on iOS, to support older iPhones/iPods). They both support OpenCL (through LLVM). In fact, with the exception of the system shell and the application packaging system, the two platforms use virtually identical code on every level.
So, as long as iOS runs on ARM with optimized drivers for the A5 support hardware, the only thing which really needs to be optimized for the A5 is the system shell. The system shell, however, is nothing more than an application itself. It almost certainly doesn't use any assembly language native to x86 or PowerPC, so an LLVM IL-compiled version of it should in theory run entirely unmodified on the A5 (or any other Darwin-supported platform).
The only issue Apple will need to deal with at this point is likely implementing Rosetta again. This could prove problematic, since this time instead of translating the PowerPC instruction set it would need to translate the x86 instruction set, which Intel may take issue with. Technically it should be much easier this time around, as there are no endian-related issues, so translating code from x86 to ARM would be straightforward (though with major performance penalties around unaligned memory accesses, which x86 excels at); but there would almost certainly be horrible licensing issues involved. Therefore Apple might instead opt for an alternative method, which is to simply say "ask your software vendor to recompile their application using the latest version of Xcode", and the problem will be solved.
Then the only problem is with software shipped from vendors who are now defunct, or who make heavy use of x86 assembly language for optimization.
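As a caricature of what a Rosetta-style translator does, here is a toy that maps a handful of textual x86-ish mnemonics to ARM-ish ones. A real binary translator works on machine code and must also handle register mapping, condition flags, memory ordering and those unaligned accesses, none of which this sketch attempts:

```python
# Grossly simplified sketch of instruction-for-instruction rewriting, the
# core idea behind a Rosetta-style translator. Mnemonics and operands are
# plain strings here; this is an illustration, not a working translator.
X86_TO_ARM = {
    "mov": "mov",   # register move exists on both
    "add": "add",
    "imul": "mul",  # x86 signed multiply -> ARM mul (flags ignored here)
    "jmp": "b",     # unconditional jump -> branch
}

def translate(block):
    """Translate a list of (mnemonic, operands) pseudo-instructions."""
    out = []
    for op, operands in block:
        if op not in X86_TO_ARM:
            raise NotImplementedError(f"no mapping for {op}")
        out.append((X86_TO_ARM[op], operands))
    return out

print(translate([("imul", "r0, r1"), ("jmp", "done")]))
```

The hard cases are exactly the ones the table can't express: one x86 instruction often becomes several ARM instructions, and flag side-effects have to be reconstructed, which is where most translator complexity (and Intel's licensing leverage) lives.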
Let's address battery usage really quickly. Battery usage is far less about the instruction set and far more about how each instruction is implemented and how the operating system handles power management. OS X already has fantastic power management, but desktop applications aren't power-management aware and therefore often use more resources than are "battery friendly". An instruction implemented in 5 million transistors uses far less power than the same instruction implemented in 50 million transistors; but the 50-million-transistor implementation might be 20 or even 100 times as fast. This is particularly the case with multiple-path single-clock dividers, which employ tremendous gate depths, as opposed to iterative dividers, which can take hundreds of clocks to handle a simple division. Software written for handheld devices which chooses to divide instead of multiplying or bit-shifting uses MUCH more battery than the alternative. Thus far, since x86 hasn't been used in a handheld environment, there hasn't been a lot of development of operating systems or software which is battery optimized. Applications generally ignore sleep events or even "screen dimmer" events from the operating system. Intel hasn't actually been trying all this time to make a low-power chip that is as limited as the ARM (with regard to things like multiplier performance); instead it has been trying to make a low-power chip that runs the existing application base at acceptable speeds while using as little power as the ARM.
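The divide-versus-shift point is easy to demonstrate: for non-negative integers, dividing by a power of two gives the same result as a right shift, and on a core with only an iterative divider the shift is dramatically cheaper. A minimal sketch:

```python
# For non-negative integers, floor division by a power of two is
# equivalent to a right shift. On a core with an iterative divider the
# divide can burn many cycles (and hence battery); the shift is one cycle.
def div_by_8_slow(x: int) -> int:
    return x // 8   # may invoke a hardware divide

def div_by_8_fast(x: int) -> int:
    return x >> 3   # single shift, same result for x >= 0

for x in (0, 7, 8, 1000):
    assert div_by_8_slow(x) == div_by_8_fast(x)
```

Compilers do this strength reduction automatically when the divisor is a known power of two, but code that divides by runtime values (or by non-powers of two) still ends up in the divider, which is the class of code the paragraph above is complaining about.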
ARM has had the advantage that it started as a platform for low power consumption (ignoring Acorn, which I don't really consider the start of ARM, as opposed to EPOC). It grew into a higher-performance system from roots where application developers didn't actually count clock cycles, since if they needed to do that they were coding for the wrong platform to begin with. Intel started from the other end of the spectrum. So while it's only in the last two years (of high-end ARM platforms like iPhone) that high-end applications have been coming to the architecture, it's equally only recently that low-power-consumption application development has come to x86. And as a good note on this, try playing Need for Speed or another game of that type on a phone and watch the battery drain at incredible speed. It's not just x86 that sucks at power management.
So... in reality, ARM has absolutely nothing to do with this. Apple already has a chip and they already have OS X optimized for it. Their only struggle will be getting all the app developers to target LLVM IL. And that's less of a problem than ever, since:
1) The Air has no DVD drive, therefore software is typically installed from the internet
2) Mac OS X provides pretty much all "Apple Approved" applications through their store and "If it's not in the store, then Apple gives a shit less about it"
3) ALL Mac App Store applications are compiled using a VERY recent version of XCode
4) Using a lot of unapproved APIs for OS X will make getting an app into the App Store a problem.
5) It is VERY likely that MOST if not all apps approved for the App Store are in fact LLVM IL binaries and NOT native x86.
6) Apple probably has made a list of all applications in the app store which are in fact x86 specific and will contact the vendors to help them "upgrade" when the time comes.
7) The biggest limitation of iOS compared to straight OS X is OpenGL ES. But while the Samsung processor used in the 3G and 3GS didn't support full OpenGL, the GPU in the A5 DOES. Therefore, since this would be the first Apple ARM chip used in Apple's ARM laptops, the lowest common denominator of hardware does in fact support full OpenGL, and it IS NOT a limitation.
Now... add one more major factor to this....
A MacBook Air with ARM really needs to be nothing more than an iPad 2 with a keyboard and trackpad. In reality, it would be 100% reasonable to suggest that Apple could in fact distribute full OS X for iPad 2 with a Bluetooth keyboard and trackpad. In fact, they could even employ a proper flip-top tablet design which would run the OS X shell when the screen faces the keyboard and the iOS shell when the screen faces away from the keyboard.
Don't think for a second that the machine has to be faster or better on battery life or anything like that. An iPad 2 with keyboard and mouse would be more than enough for the casual user. As for the power user running Final Cut Pro, or the hardcore gamer playing Portal 2: that's a different story, and... well, a different computer too.
Apple are not allowed to do this! I swore I would never buy an Apple product; then the MacBook Air came out, and I was tempted, but I remained strong... and now a MacBook Air with an ARM chip... that may be a bit too much to resist!
The price tag might be a bit of a cold shower.
Even I'm tempted at the moment.
But who will think of...
.... the poor Hackintoshes!!!
A for Apple, for ARM, or for Atom?
Could still be Sandy Bridge.
I'm putting my money on Apple reviving the "iBook" moniker for a netbook type device running iOS. Probably priced a little cheaper than the iPad is now by losing the touch-screen and using a multi-touch trackpad instead.
So this will be an Apple netbook? Why would anyone need one of these when they could just buy a "magical and revolutionary" iPad?
re Warashi, Netbook
I get a real sense of déjà vu from your comment. Think back to Jan 2010. How many people were saying (slightly tongue in cheek) 'Why would anyone need an iPad when they can have this magical netbook'?
There is a space in the market for a very low power consumption laptop. If it is as thin & light as the Air and can run 12+ hours doing useful work then yes I'd be interested in such a machine. I don't want it to play games. I want a device that makes it easy to write. The tablet format does not do that. Granted that there are some 'crossover' devices. Laptops that can turn into tablets with touch screens running Windows 7 (ugh!) or tablets that can turn into a laptop by adding a base station.
I don't have a tablet and with my smartphone able to access the internet then I really don't have a need for a tablet/fondleslab. Whereas, I would be interested in an ultra lightweight laptop with a decent quality screen that I can use to write.
My current MacBook has been used to write two novels, with a third almost complete. And no, geeks who read sites like this would not be interested in my stories. They are in the sub-genre called 'bodice rippers'.
It makes a real change from writing code for a living.
Now off to a foreign planet where there are no Man U fanbois allowed for the weekend.
What to do? The Mrs spilt coffee all over my four-year-old MacBook and successfully claimed on insurance. Must replace it soon, but now what? Gah.
Asus Eee Pad Transformer. And get biblically drunk for a month on the savings. Sorted.
(Yeah, I know, let the down-voting commence....)
Performance V Cost
Just remember why Apple switched to Intel in the first place: performance. An ARM A5 versus a mobile i7? No contest. Will people really pay over the odds for a laptop whose performance sucks but which has a few extra hours of battery life? In fact, never mind the i7; even a Core 2 leaves ARM in the dust.
I for one...
..don't welcome our new icon overlords.
Where's the Troll icon gone? Where's evil Bill and saintly Jobs?
<= who designed this scary amorphous blob hand slap abomination?!
....oh, wait, I've just seen the new "troll" icon..... it looks, erm
Back to iOS
I think their ultimate goal is to do with the laptop/netbook what they have done with the iPhone/iPad: get it running iOS so they can create a closed system and force their customers into purchasing software through the Apple App Store.
Maybe I missed something here, but isn't there a slight problem with applications not being compiled for the ARM core? If the likes of Adobe and pals don't compile their apps for ARM, then an ARM-cored MacBook will be little more than an iPad with a keyboard.
It's like Power Mac fun all over again.
Cool. Let's hack it and put RISC OS on it!
@ Brian 6
No, they switched because IBM had no interest in producing a low-power PowerPC for laptops, i.e. battery life.
intel or arm?
What about both?
There isn't much innovation in the Intel mobo space; HDMI plus a gazillion USB ports is about as fun as it gets.
We've already had seamless (kinda) switching from low-power GPU to dedicated GPU in laptops (I'd like it in my desktop too, please!). I'd like to see the same for the main processor: cool and quiet when "always on" doing server-type things, and fire up the i7 (and its fan) only when it's required.
Or maybe there's a way to clock down an Intel CPU to a few hundred MHz.