breaking news, emulation incurs overhead...
ARM has rolled out a battery of test results that fire two shots across the bow of Intel's x86 dreadnought now sailing into Android waters. The first set of results addresses the fact that when running native apps that haven't been recompiled to run on Intel-based Android devices, those apps need to be emulated using "binary …
Even if Intel says it doesn't, which, I think, is the point that was being made.
Re: Update! Update!
To be honest I think the performance is quite impressive when you think of how binary emulation used to be. But I also think it is very ironic for Intel to be promoting exactly the kind of technology they lambasted Transmeta for!
When it comes to native vs. native: well, x86 clearly wins on single-core tasks. But ARM can give you more cores for the money which makes task-switching seem so much faster.
But the bottom line seems to be: Intel, stop shouting about the hardware and get busy dishing out the software so that cross-compiling is automatic.
And if they require binary translation, they are not native. AMD are being a little disingenuous there.
That said, it looks like Google and Intel have work to do. I read a review of an Atom Z2760-based tablet running Android, and its benchmarks were dreadful, putting it behind most of the current crop of Android tablets. The SunSpider result caught my eye: around 1400ms, which sounded way too slow. So I ran the same test on my Windows 8 tablet with the same processor, and it came in at around 600ms in IE; heck, even Firefox crept in at under 1000ms under Windows.
bleh, that should have been ARM, not AMD...
Re: Update! Update!
The issues that are caused by rubbish like HTC Sense/Google Play Services/Touchwiz far outweigh any caused by the Intel stuff.
Re: Update! Update!
"when you think of how binary emulation used to be. "
Not always. There was the Apple emulator built for the Amiga, which was reported to run Apple software faster than native Apple hardware.
Part of the secret of that was that the CPUs were compatible.
The other half was the Amiga's proprietary hardware, which used a specialised chipset to offload part of the computing load from the CPU to daughter chips.
The Amiga chipset was the predecessor to sound cards and video cards, back when everyone else was still doing it all with the main CPU.
but at least they can boast AMD64 compatibility for their main line of chips
I trust ARM more than
Intel anyway especially after the WINTEL monopoly we've had to endure
Correct, yet the end result is that Intel (and AMD) based hardware is really more GNU/Linux-friendly. Their CPUs and video cards are almost always well supported, almost always with free drivers. Many ARM SoC manufacturers, by contrast, write ad-hoc, non-portable, messy and often proprietary code. Free video drivers for ARM SoCs are almost non-existent, whereas Intel and AMD opened almost all of their video drivers a long time ago.
As a result a proper GNU/Linux is better supported on Intel and AMD based phones and tablets, and is a problem on good, capable, specially-made-for-Android ARM hardware. I guess the penguin is still happier on x86 hardware than on any of the ARM SoCs.
Re: I trust ARM more than
This little bit of apples-and-oranges makes me wonder whether ARM are starting down the corporate slippy-slide of dishonesty. My level of trust in ARM just slipped a notch.
Re: a paradox
User-friendly for a given set of values of "user-friendly".
While the hardware is definitely more user-friendly, it also supports Secure Boot from the start, and has it enabled from the start. It is only a step from there to the point where it will be impossible to disable.
Re: I trust ARM more than
Is an Android/Qualcomm one any better?
Re: a paradox
Since there are at least ten ARMs running Linux for every x86, the little penguin better get used to it.
I work a lot with embedded Linux, mainly on ARM, and there are far more people working on ARM than on x86 too.
The Linux ARM SoC world is improving by leaps and bounds - many/most SoCs are not using common cores for peripherals for which the drivers are stabilising. With device trees, a board config is almost getting to the point where just doing some device tree work is enough.
Agreeing with that too, yet for x86 there are only two architectures: i386 and x86_64. Every ARM SoC is incompatible with every other, plus the drivers headache. Despite the number of developers, the mess that Linus and other core Linux developers have been talking about exists almost exclusively within the ARM realm.
So, if I get a powerful ARM tablet with good specs running Android, how hard is it to install a flavour or two of GNU/Linux, even if it is not locked down and is rooted? With the exception of a few Nexus, Galaxy and other main-brand models, it's pretty damn hard. Even the latter require "porting". How likely is it that you can't install GNU/Linux on an x86 tablet?
Re: a paradox
"many/most SoCs are not using common cores for peripherals "
Should that be "are NOW using common cores" ?
..which Translate rather less than helpfully renders as "Love to engage in machine."
nominative determinism in action
Gotta love that the power consumption metrics were carried out by a guy called Watt. Any relation to James Watt, I wonder?
It cannot be the point of x86 to run Android
The point of x86 is that you have the "IBM PC" architecture, which includes not just the instruction set and core, but also how graphics and mass storage work and how to discover hardware. That means I can run an image of every OS written for that platform and it'll work, no matter what company I buy my computer from. It'll also work on a decade old computers or on computers in a decade.
ARM is still working on it, and it's likely that the SoC manufacturers will oppose/boycott it. This is the opportunity for x86. Now they could act to make mobile devices as useful as laptops are. Instead of having non-upgradable devices, you could have actual computers which you could install a new operating system on after the support for the old one runs out.
Re: It cannot be the point of x86 to run Android
"That means I can run an image of every OS ... work on a decade old computers or on computers in a decade."
Not unless you are deliberately omitting Windows, MacOS, and (most?) Linux distros from "Every OS".
Pretty much all of those have current versions that just won't run on older machines, even if you plump up the RAM. If you meant "Well, _some_ version of these OSes, e.g. the one that was current when the machine was made, will still run", well, yeah. And my PDP-11 will still run RT-11, too.
Re: It cannot be the point of x86 to run Android
My x86 desktop will too .... SIMH for the win!
The ARM fellow doing power tests is quite appropriately named Watt. Looking for one Mr. Mflop in the ARM team though.
At least it wasn't "Armstrong". I would've blinked and called BS.
"The performance is not as high as we're seeing in the Intel device," Watt admitted, "but [...] we're getting much lower power."
So this isn't really a meaningful comparison. If you underclock the Intel device so that its performance matches the ARM device, how do they compare? Alternatively, if underclocking is impractical, how about comparing energy consumption for a given task instead?
The article pretends to be concerned with battery life, but that's measured in mA-hours at a rated voltage, which is energy rather than instantaneous power. I'm sure anyone capable of making the measurements described in the article is aware of this, so I'm afraid to say I find the comparisons rather dishonest.
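To put some arithmetic behind that point: converting a battery's mAh rating at its nominal voltage into actual energy is a one-liner. A minimal sketch in Python (the 2000 mAh / 3.8 V figures here are made-up illustrations, not the devices in the article):

```python
# Convert a battery's rated capacity (mAh at a nominal voltage)
# into energy in watt-hours and joules.

def battery_energy(capacity_mah, nominal_voltage):
    """Return (watt_hours, joules) for a given battery rating."""
    watt_hours = capacity_mah / 1000.0 * nominal_voltage
    joules = watt_hours * 3600.0  # 1 Wh = 3600 J
    return watt_hours, joules

# Illustrative figures only: a typical ~3.8 V phone cell rated at 2000 mAh.
wh, j = battery_energy(2000, 3.8)
print(f"{wh:.1f} Wh = {j:.0f} J")
```

Which is why quoting instantaneous watts alone tells you nothing about how long the battery lasts.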
Re: Ken Hagan
Unfortunately for Watt, this type of rear-guard action by ARM supporters is all too familiar. We heard it from Cyrix and AMD - "we're not as fast but we use less power" - and how did that work out for them? Being low-power in single-purpose devices like simple phones is good, but the history of personal devices tells us that the memory, CPU and graphics requirements of apps only ever grow, even with Android. The apps I run on my Android 4 phone already simply won't run on older Android 2 phones. Tablets and smartphones are only going to get more powerful as their users demand more capability, so Intel being ahead in performance and able to run multiple OSes will draw developers to x86.
Re: Ken Hagan
>we're not as fast but we use less power" - and how did that work out for them
Yeah, look at worldwide shipments to see how that is working out for ARM. High-margin, power-hungry electronics are quickly becoming niche, which is why Intel is starting to panic.
Intel has an ARM licence
They didn't flog ALL the ARM chips to Marvell either.
Why don't they swallow some ego and use their world-beating production engineering, design skills etc. to make a better ARM SoC?
Or could it be they think that they can charge a higher margin on x86 chips if they manage to dominate a market?
Really, we don't need a narrowly focused US company that dominates the PC CPU arena to be successful in mobiles, set-top boxes and tablets, overcharging the consumer, doing dodgy deals with OEMs, crippling SW development with poor architectures, and concentrating more on process shrinking than architectural innovation.
ARM's licence model encourages diversity and innovation. Intel's sales model is the opposite.
Re: Intel has an ARM licence
>Why don't they swallow some ego and use their world beating production engineering, design skills etc to make a better ARM SoC.
I think they have concluded that that would be a race to the bottom. Intel has never been good at high-volume, low-margin production. That cutting-edge technology takes lots of R&D dollars which a lot of their slimmer competitors don't need and don't spend. Overhead is a killer for Intel as they currently exist.
This just seems like an advert to me. Without The Reg doing a little work to see for themselves, they are just taking this whole thing at face value. I'm sure there's truth in it, but how much has Watt bent the truth to make a point? This whole article is full of doubt!
Yeah nobody will ever confuse El Reg with Tomshardware.
Not Cortex-A15 ...
> The Qualcomm Snapdragon, based on an ARM Cortex-A15,
It's not based on Cortex-A15. Krait is Qualcomm's own CPU implementation (instruction set compatible, different microarchitecture).
"cripple SW development with poor architectures"
Ho ho. Coz we all still write our apps in assembly language, right?
Seriously, the ISA wars ended when Intel introduced out-of-order execution, over twenty years ago. Instruction decode is unimportant, whether in terms of execution time or die area, and x86 has been an orthogonal instruction set since the 386 so compiler writers don't actually care. There are probably people posting here who were born after the issues you raise were important. Perhaps you are one of them.
Re: Ken Hagan
".....There are probably people posting here who were born after the issues you raise were important. Perhaps you are one of them." Made me smile!
A topic that comes up frequently at these talks I give is precision. It's a fairly central topic to my audience (people in engineering and manufacturing) but it's applicable here as well.
People have a tendency to give too much weight to technical specs and they lose sight of the application itself: (a) has 7x greater precision than (b), kind of thing. It often leads them into making poor choices and spending far too much money to meet their goals.
In my example I have a maker of fine furniture (properly called a cabinetmaker) and a precision machinist. The machinist will work down to .000001 or finer, and he uses tools with resolution at least as high as the tolerances of his work. His completed work will go into a controlled environment where everything else is of a similarly high tolerance.
The cabinetmaker will work down to 1/16th, perhaps 1/32, and uses tools compatible with that level of precision. His work will go into a largely wild environment where every aspect of the things around it is as much a function of pure randomness as of its purpose.
The 'modern' cabinetmaker, however (this is true in real life), has begun to purchase and use the measuring and layout tools of the machinist. A really nice 6" square for cabinetmakers might cost $100, but he has chosen to spend $7,500 on the high-precision square of a machinist. That's cool and all, but now none of his other tools appear straight. Compared side by side, the cabinetmaker's square looks positively bent as it shoots off to one side of the machinist's square.
The only thing for that is to replace all his tools and equipment with the high-precision tools of the machinist. At great expense he does just that. Then he notices that the work he has done with the extremely precise and extraordinarily expensive machinist's tools is no better than the work he was turning out using tools that haven't changed in centuries, some of which are, in fact, actually centuries old.
The cabinetmaker got caught up in a specs arms race and lost sight of the fact that not only does the medium he works in (wood) exhibit environmentally induced instabilities greater than 90% of the resolution of his tools, the very pencil or marking knife he lays out his work with can never reach the levels of precision provided by his machinist's tools.
Furthermore, because he has had to fudge the actual cutting and shaping of the wood to reflect the disparities between his tools he has had to sacrifice proven construction techniques that take into account environmental instabilities. The fine chest of drawers he spent two months making is (stupidly) placed over an HVAC register and a year later has split and warped.
The moral of the story is that no matter how fabulous any given specification might be, and regardless of the comparative advantage of that spec over the spec currently in use, it makes no difference if everything else isn't capable of utilizing that improvement. It's entirely possible that attempting to utilize that spec to its fullest will lead to an overall degradation of the end product. Even then, if you can't control the environment the 'thing' is used in, then all but rather large improvements will be lost in ambient noise.
Is one of these processors 'better' than the other? Don't know, it's not my field. But I do know that anytime a single spec or particular feat is paraded around it is worth your time to assess its applied value in whatever environment it will be used in and if you don't control that environment then it's highly unlikely the end user will see any change. End user perception is the benchmark, if that isn't noticeably improved you're wasting everyone's time and money.
Re: Invisible Precision
" End user perception is the benchmark, if that isn't noticeably improved you're wasting everyone's time and money."
There's a whole slice of the market that buy on spec, with little regard for anything else. Think of PC tinkerers, who always need the graphics card with the highest FLOPS benchmark. Think iPhone buyers who upgrade to the new model with Pavlovian reliability. Think camera enthusiasts who buy the latest <insert Nikon name here> because it has a few more pixels or whatever....
In many instances these upgrades have no noticeable effect for end users. But does that matter? Without early adopters there is no market, and although they can't see a real difference in the new product, they've got the spec they've just paid for, and they'll believe they can see one.
I take my hat off to these people. Thank you for helping encourage innovation for the sake of innovation. Thank you for paying extra to reduce the cost of the technology by the time I will buy it. Thank you for taking the risk on new products that simply may not work. Thank you for taking the risk on standards that may never gain market acceptance. Heroes to a man!
Re: Invisible Precision
Let's take another example.
I'm hardly a cabinetmaker, just a wood botcher. I started in the days of handsaws. Then I moved on to electric jigsaws and circular saws. Now I have a circular saw that uses thin carbide-tipped blades, and which cuts more precisely, with better finish, and less effort than ever before. It also needs less power to operate, which means it can run from my little Honda generator. The first time I used it, I was simply astounded that sheet cutting could be this simple.
Tools for the job are evolving constantly. The trick is to use appropriate ones. Currently ARM is the carbide-tipped thin saw of mobile devices and the performance difference is real and visible.
Re: Invisible Precision
Says me. My staff. My previous employers. But most of all, my customers.
Customers who buy on spec are doing what all that marketing spend tells them to do. It's playing to the fact that end users of most mass-produced consumer electronics don't know nearly as much as they think they know. All you have to do is tell them whether a given spec is supposed to have a higher or lower value than the thing you don't want them to buy. It's great. They assign the magnitude of variance between two or more things based on what they've been told. That's fine. The fact that temps of greater than 55°C for four or more hours reduce that variance to a negative value in the same comparison isn't going to matter, 'because they can tell'. That's all that matters to them.
But the product designer, engineer or manufacturer who buys on spec is failing at their job, unless their job is making throwaway products. If spec on the sheet and spec in the real world were the same, I could cut several million a year in salaries to my engineers. It gets a lot better, too: the further away you get from catalog components and materials, the less reliable the published specs get. That shit never does what it says on the tin.
That fact is the runaway technical cost leader in advanced systems like HPCs, aircraft, satellites, spacecraft and really big, ultra-specialized equipment like the new large-mirror fabrication facility we are beginning to test. The only parts of that project that have stayed within published specs are the titanium bits and the motion controllers, and we developed those specs, the casting processes and even the flux on the boards in-house. Everything else has been tested for years as we've searched for things that actually do what they say.
It's fun. You should try it. Actually building something and not playing match the item# is where it's really at.
The trouble is that tablets, phablets and smartphones have just been too dashed good and, as all El Reg readers will know, they (the tablets, phablets and smartphones that is) have impacted on PC sales.
Oh woe! The PC makers, assemblers, component makers/assemblers/designers cry as they do depend upon such for income, livelihood and everything really.
will apps now start to look less effective?
will the sharp clarity of a mobile device start to look blotchy and pixellated, as in days of yore?
will mobility now have to, by policy, governance and execution, be second best because the personal-computer world needs to be much, much better than its mobile counterpart?
I suppose what I am trying to say less prosaically is: will stuff for mobiles now be knobbled for sake of improving sales of desktop computers?
"I suppose what I am trying to say less prosaically is: will stuff for mobiles now be knobbled for sake of improving sales of desktop computers?"
No, because the volume Chinese makers who are now snapping at the heels of Samsung and Apple don't give a stuff about desktop computers. If they disappeared tomorrow, companies like Huawei and Oppo would cheer.
And then there's Google's Chromebooks...Google wouldn't be unhappy.
Desktops will survive because there will always be heavy lifting to be done; they may just get a bit more expensive due to lack of volume, but the users won't notice because they will still be cheaper than anything else for a given processing power.
Measuring the wrong thing
The analysis confuses watts (instantaneous energy use) with Ah (energy use to complete the task). The second is the more interesting number when the CPUs have different processing rates. I could readily believe that Atom uses more watts, but also that it finishes the task sooner and can return to a quiescent state sooner than the ARM. So the question is: does Atom's behaviour use more amp-hours than ARM on various workloads. Or more concretely: which will exhaust the battery sooner?
Re: Measuring the wrong thing
Agreed, the article claims to be measuring efficiency, but simply reports power consumption without the performance part.
Give us total energy, performance/power (for games for example, where workload changes with performance, it makes more sense to report fps/W), or EDP, then we'll have some metric of "efficiency" to talk about.
Re: Measuring the wrong thing
The commentard confuses amp-hours (charge) with watt-seconds (energy). I agree that the amount of energy to complete a task is a far more useful metric than the peak power. But you need to multiply voltage by current by time to get energy, so why not just use the correct SI unit, which is the joule?
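To make the "energy per task" point concrete, here's a minimal race-to-idle sketch in Python. All wattages and timings below are invented for illustration, not measurements of any real Atom or ARM part:

```python
# Race-to-idle sketch: a faster chip drawing more power can still use
# less energy per task, because it returns to idle sooner.
# Joules = watts x seconds, summed over the active and idle phases.

def task_energy(active_watts, idle_watts, task_seconds, window_seconds):
    """Joules consumed over a fixed window: run the task, then idle."""
    idle_seconds = window_seconds - task_seconds
    return active_watts * task_seconds + idle_watts * idle_seconds

# Hypothetical: chip A is faster but hungrier; chip B slower but frugal.
a = task_energy(active_watts=3.0, idle_watts=0.1, task_seconds=4, window_seconds=10)
b = task_energy(active_watts=2.0, idle_watts=0.1, task_seconds=8, window_seconds=10)
print(a, b)  # A: 3x4 + 0.1x6 J; B: 2x8 + 0.1x2 J -- A wins despite higher peak watts
```

With these made-up numbers the higher-wattage chip finishes the window having spent fewer joules, which is exactly why peak-power comparisons alone are misleading.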
Interesting that later apps have *less* native x86 and more VM code
Logic suggests as time goes on more apps would get round to doing an x86 build.
But apparently not.
Noting unverified statements should be consistent.
It is quite poignant that the Register is quick and determined to point out that " these are Watt's numbers, not something The Register has confirmed".
This particular media company, as a close and supportive Microsoft partner, has never drawn such a distinction when Microsoft makes claims that are not verified by any third-party, independent (objective) source.
Some consistency in supposed objective reporting would be preferable to constantly reading a significant proportion of technology articles in the Register written by Microsoft shills whose prose is nothing more than Microsoft propaganda, with little or no factual technical backing.
Interesting What If?
Now, what if the king and queen of borging (okay = Google) were to introduce something like a PC?
Would it look like an iMac (iHopeSo) or be a beefed up tablet looking a bit like a laptop.
Come on Google skunk works spill the beans?
EDIT: ARM, can't you give them a hint, a gentle nudge? Maybe a Google Archimedes or Google Aristotle would be good for Greece, investors, ARM and the purchasing public in general, and help Google take a more commanding role in the evolution of mains-powered computers, leaving iWin8 somewhere in the lurch?
Devs not using "Native"
I'd suggest that a worthwhile question to ask would be addressed to the Developers:
Do you avoid compiling for the native x86 environment because of dislike, or because that process requires you to make changes to the app in order to get satisfactory results?
This is the real crux of the matter.
If it's just personal dislike, there's really not much that can be done.
If it's a Compatibility issue, M$ has some 'Splaining to do. And some Fixing to do, also.
So, on Windows, same benchmarks, Bay Trail vs Snapdragon 800 - Snapdragon uses 30% less power but Mr Watt then admits that Bay Trail is faster without going into specifics?
But that's the crucial thing, the crux of the argument! If Snapdragon also does 30% less work, then they are essentially equal - the argument moves to idle power draw and OS power management.
I suspect that Snapdragon actually does less than 70% of the work of Bay Trail, making it less performance-per-watt efficient - and it's not just a wild guess; there's some old data to base it on:
SunSpider: Snapdragon 800 has 8% less performance
Kraken: Snapdragon 800 has 46% less performance
Octane: Snapdragon 800 has 52% less performance
These are Anandtech numbers from September 2013 - it'd be good if someone found some newer ones.
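Plugging the figures above into a quick perf-per-watt check makes the argument explicit. A Python sketch (the 30% power saving and the three benchmark deficits are the numbers cited in this thread; the formula is just the obvious ratio, and real efficiency would of course depend on actual measured power per benchmark):

```python
# Relative perf/W: if Snapdragon draws 30% less power, it only wins on
# efficiency when its performance deficit is also under 30%.

def relative_perf_per_watt(perf_deficit, power_saving):
    """Snapdragon perf/W relative to Bay Trail (1.0 = equal)."""
    return (1.0 - perf_deficit) / (1.0 - power_saving)

# Benchmark deficits as quoted above (Anandtech, September 2013).
for bench, deficit in [("SunSpider", 0.08), ("Kraken", 0.46), ("Octane", 0.52)]:
    ratio = relative_perf_per_watt(deficit, 0.30)
    verdict = "more efficient" if ratio > 1 else "less efficient"
    print(f"{bench}: {ratio:.2f}x -> {verdict}")
```

On those numbers only SunSpider comes out in Snapdragon's favour; Kraken and Octane both put Bay Trail ahead on perf/W, which is the point being made.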