back to article Surprise! Intel smartphone trounces ARM in power trials

The industry analysts at ABI Research pitted a Lenovo smartphone based on Intel's Atom-based Clover Trail+ platform against a quartet of ARM-based systems, and Chipzilla's system not only kept pace with the best of them, but did so using less power. "The benchmarks were impressive but the real surprise was the current …

COMMENTS

This topic is closed for new posts.
  1. Nate Amsden

    first AMD now Intel

Those people that hate x86 must be fuming. First AMD dramatically extended the life of x86 with x86-64 a decade or so ago (and almost single-handedly quashed Itanium in the process), and now Intel seems to be extending it even further with this stuff.

    interesting to see, assuming the tests were balanced.

    1. Daniel B.
      Boffin

      True

Indeed, x86 should've died long ago. Though these tests might have had some "special sauce" tweaking so that the Intel chips would show up as better, given the "ABI Research provided no details on the content and construction of their benchmarks" part.

      1. h3

        Re: True

        The Orange San Diego seems to have better battery life than any Android smartphone I have seen / used.

        (Can get 3 days with light use out of it.)

These results are not at all surprising. If Intel starts building Android with their superior compiler then they will get significant improvements (or gets the optimized functions from eglibc / uclibc). It would be much harder for ARM to do the same thing with RealView.

        1. Anonymous Coward
          Anonymous Coward

          Re: True

The Reg's recent review of an Asus Padfone thingy was a perfect illustration of how you need to compare like with like.

          The Padfone has a much bigger battery than any smartphone available right now. More comparable with an iPad.

      2. CheesyTheClown
        Facepalm

        Re: True

x86 is definitely not the ideal instruction set, but RISC vs. CISC or VLIW was never really what it was made out to be. There were just as many disadvantages to RISC as there were to CISC. Code size on RISC was huge. Then we ended up with the bastard stepchild of RISC being Thumb, which was somewhere in-between.

        These days, the instruction set means nothing in reality. It's all about efficiency in processing itself. It's about things like how the CPU handles cache coherency, how the CPU manages passing code between cores, how to handle multiple ring-0 contexts... effectively making Ring 0 the new ring 0.5. It's about handling SLATs. These are all things which matter. Then of course what matters is the ability to power down major parts of the chip. This is something which doesn't work well in a single die environment where 99% of the chip is synthesized from a common VHDL/Verilog code base which doesn't allow for the analog nature needed between units.

Intel's chips make use of the x86 and x64 instruction sets, but no decent processor today will use them directly when executing code. Now the next generation of Atom is also doing away with x86 and x64 in the core, replacing it with an instruction-set-agnostic architecture. The CPU will instead attempt to recompile the code as it receives it in order to handle tasks out of order. In fact, to a certain extent, the nature of x86 and x64 lends itself better to this design, since RISC groups everything into a single instruction wherever possible. Intel's instructions are relatively granular and will be easier to recompile on the way in and manage dependencies for. I can very easily design algorithms in my head for out-of-order execution of x86 instructions, where ARM instructions require a second phase altogether to manage the instruction dependencies... though that's not particularly difficult either... it just takes more transistors.

        If you also give me a chip with AVX2 instructions, then I'm really happy. AVX2 is just damn sexy in everything regarding mobile phones. It would allow me to vectorize my code and make use of two-in-one-out instructions. If they make a new set which allows a single instruction for a 16x16 16-bit hardware transpose operation or an extra flag to access registers vertically instead of horizontally, I'd be in love. At the moment, a 16x16 transpose is the last missing instruction in AVX2 in my opinion.
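As a rough illustration of the operation the poster is asking for — a toy scalar sketch in Python with made-up values; real AVX2 code would have to compose the transpose from unpack/permute intrinsics, which is exactly the overhead a dedicated instruction would remove:

```python
# A 16x16 transpose swaps rows and columns: element (r, c) of the
# output is element (c, r) of the input. Done in scalar code it is
# 256 individual moves; a hardware instruction would do it in-register.

N = 16
block = [[r * N + c for c in range(N)] for r in range(N)]  # sample 16x16 matrix

transposed = [[block[c][r] for c in range(N)] for r in range(N)]

# "Vertical access": reading a column of the input is the same as
# reading a row of the transposed result.
column_5 = [block[r][5] for r in range(N)]
assert column_5 == transposed[5]
```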

    2. asdf

      Re: first AMD now Intel

If true, then Intel succeeded big time in spite of x86, not because of it. It would be hard to come up with a worse instruction set for milliwatt mobile computing than x86. Intel had to throw in a whole lot of money, engineering talent, and, most important of all, a generation's lead in fab technology to compete. That is assuming this is all true in general usage. A big question not answered by this article is how much more the Intel part costs than the multi-sourced, mass-produced ARM part.

      1. Charles Manning

        The waste of talent

"If true, then Intel succeeded big time in spite of x86"

        Too true. Imagine what amazing chips Intel could make if they took all that engineering resource and put it into making their flavour of an ARM chip? The x86 must be giving them a 2x penalty. If they used a reasonable architecture they could knock other players out of the water.

Intel: why, oh why, did you sell off PXA to Marvell?

        1. Mark .

          Re: The waste of talent

          But I doubt that's true - if they'd be better off making an ARM or even brand new CPU type, why wouldn't they do that? Why wouldn't they take the chance to "knock other players out of the water" - do you know better than the people making the decisions at Intel?

          The advantage of x86 compatibility isn't that much for mobile (there's no Windows compatibility to make use of, and in fact x86 rather than ARM harms them due to lesser compatibility for native Android software). And I don't think it would help give them an advantage either against ARM (since even if x86 does well in mobile, Android will still support ARM for a long time yet).

          No, the reason they do x86 is likely because that's what they do well - their engineers, their software, their manufacturing, is all geared up for it, and you can't just magically turn that into "making an ARM chip". This situation comes up all the time, I know it does in software - "If only I wrote this old stuff from scratch, I'd do a much better job", but the problem is the effort in doing something new is more than simply improving or even hacking the old stuff.

        2. Ken Hagan Gold badge

          Re: The waste of talent

          "The x86 must be giving them a 2x penalty."

          Uh? It's 2013! Out of order execution and micro-ops mean that the ISA's only impact on performance is instruction decode, and instruction decode is about 1% of die area. On desktop-sized chips, ISA hasn't been relevant since the last century. It would be fair to assume that it hasn't mattered on mobile-sized chips for quite a few years either.

          What matters is where you choose to invest your development budget. Intel are now putting theirs into mobile.

          1. Anonymous Coward
            Anonymous Coward

            Re: The waste of talent

            " the ISA's only impact on performance is instruction decode, and instruction decode is about 1% of die area"

            Careful Ken. You don't think the code density of the ISA has any impact?

            I don't know enough about x86 to comment about x86 vs ARM in this respect, but when comparing ARM vs classic RISC, ARM code tends to be smaller. This means more of the application fits in ARM cache, and you get more performance per MB/s of instruction memory (main or cache) bandwidth. And for a given memory size (ie cost) you can fit more "stuff" in on ARM. More performance per MB/s means a slower clock on ARM gets the same performance as a faster clock on a classic RISC, which in turn may mean cheaper batteries or longer battery life or...

            The reasons ARM code is smaller include the Thumb instruction subset and the general predicated instruction capability.

            ISA *does* matter in cost/size/power-constrained embedded systems, and it may well matter more than you seem to think in mobile phones.

            The Coremark benchmark (sourcecode freely available) might shed some light on stuff like this. Couldn't see any recent Intel results last time I looked. Anyone else seen any?

            "What matters is where you choose to invest your development budget. Intel are now putting theirs into mobile."

Intel have repeatedly put their development budget outside the world of x86 IT for many years, and repeatedly failed. Maybe they'll find a winner this time, after all it is closer to their traditional x86 comfort zone than (say) iAPX432, i860, I2O, IA64, embedded graphics, WiMAX (add your own to the list of Intel's non-x86 flops).

            1. Wilco 1
              Thumb Up

              Re: The waste of talent

              For Coremark the most recent certified scores are 9.36 Coremark/MHz for a dual A15 and 6.61 for a dual core / 4 thread Atom N2800.

              In addition to what you said, check the Atom die size: http://chip-architect.com/news/2013_core_sizes_768.jpg It seems to me Atom is a bit more than 1% larger than A15.

              That's the "ISA doesn't matter" myth debunked once and for all.

          2. dmcq

            Re: The waste of talent

That's not quite right. The x86 architecture needs a lot of memory interlocks, like checking for writes into the code being executed, and has much stronger coherency requirements, besides being saddled with all sorts of strange operations. AMD's general manager of the server business unit said it took them more than ten times the money and twice as long to design an x86 chip as an ARM one. That's down to all the messing around, and it will tell now that the business of designing is getting more standard while the actual designs are getting more complicated.

Those figures are interesting, though as others said one can't say anything definite as no real details are provided. The one that struck me most was them saying the Intel chip had four times better memory performance, if I read it right. That would probably explain most anything else, and I'd really like to know how it was achieved.

I'm fairly sure Intel will be able to cream off some of the high-end market when they get their new mobile chips out, whatever the truth of these figures one way or the other. Personally though, I think the more worrisome Intel strategy as far as ARM is concerned is that Intel is trying to get a better presence in the chip foundry business. If they could knock out the high end of the other foundries they could then start causing real trouble for high-end competitors and properly protect that market, together with their high margins, which all this sort of work doesn't really do.

        3. Dave Lawton
          FAIL

          Re: The waste of talent

          Maybe because they were incapable of doing it.

Go on, google the performance issues of Intel's StrongARM replacement - the first XScales - and the other issues they couldn't fix with the later revisions.

          As far as that table is concerned, it's meaningless without the test schedules, so that someone else can independently verify the results.

          Besides Intel themselves have said that they can't beat ARM with current fabrications, just get close.

      2. RikC

        Re: first AMD now Intel

If I'm correct, Intel is not only a generation ahead in the incremental sense of the word, but has a fabrication technology that allows transistors to scale down power consumption linearly (I don't remember the exact details or source; I wonder if it was on The Reg). This is the big reason behind this achievement, and it makes concepts such as big.LITTLE unnecessary. Which keeps me wondering: what would ARM processors be like with that production process? I guess even better. So unless the competition cannot replicate it, it's just a matter of time before ARM reigns supreme again.

    3. Voland's right hand Silver badge
      Devil

      Re: first AMD now Intel

      How did they measure the results? Internal phone "power draw" measurement as used in Android for the "what is using my battery" stats? That is waaaaaaay buggy and off.

I will believe this once I see the battery taken out, a current meter inserted, and the current measured and recorded - with pictures demoing how they did it, as some of the devices in question have a soldered battery. While at it - all phones running the same Cyanogen build, to ensure that it is a CPU benchmark and not a "how much bloatware did I stick in the build" benchmark.

In any case, we can expect major suckage in a few years' time. Not to worry. This Intel phone has a proper Imagination Tech GPU. Watch the show when it gets an Intel one.

      1. Dave 126 Silver badge

        Re: first AMD now Intel

        >How did they measure the results? Internal phone "power draw" measurement as used in Android for the "what is using my battery" stats? That is waaaaaaay buggy and off.

        >I will believe this once I see the battery taken out, current meter inserted, the current measured and recorded.

        You want multimeter readings? From six months ago:

        http://www.tomshardware.com/reviews/atom-z2760-power-consumption-arm,3387-5.html

        "...tore down tablets and identified critical points where microsoldering leads to a fancy version of a Fluke multimeter yields power consumption data for specific SoC and platform subsystems.

        "Our own benchmark data, extrapolated, is consistent with Intel's. At idle, Nvidia's Tegra 3 imposes similar draw as the Atom. But as workloads become more demanding, Intel's lead increases.

"I encourage you to do the same arithmetic we just did when it comes time to compare platforms. In the meantime, seeing how Intel does its power consumption measurements by soldering wires under a stereo microscope has given me an idea."

    4. John Sanders
      Linux

      Re: first AMD now Intel

      Surprised?

      The real world is about "Practical" and "Pragmatic", with a dash of luck and sense of opportunity.

    5. Anonymous Coward
      Anonymous Coward

      Re: first AMD now Intel

      Point is, we would all be using Itanium if Intel could design a decent CPU properly.

To continue to tweak x86 is obviously much easier. You can do a before/after comparison of whether the change made any difference.

      So rather than say x86 is good and well done Intel, we should be questioning why it is that nobody can design a new processor family that is miles ahead of both?

  2. Sam Haine
    FAIL

    "ABI Research provided no details on the content and construction of their benchmarks"

    Whatever happened to the 'death of the Reg' icon with the tombstone?

    1. Tom 35

      Just one question...

      Is that Chocolate or Maple Fudge?

    2. Rampant Spaniel

I thought a more appropriate question might be: who paid for it? It could be completely genuine and they did it for giggles or publicity. They could also do a lot of work for Intel, or want to impress Intel. It's a very interesting set of results, especially if it's legit. Intel could spur ARM on even more, which is good for all of us. Then again, if there was nothing to hide they should really have given more details on their methodology and any links to the companies involved.

  3. Shagbag
    Thumb Up

    One swallow doesn't make a summer

    While these tests are ostensibly impressive, I think we need to see a few more independent trials before we can say Intel has cracked it.

    As long as Intel don't get a monopoly in smartphone CPUs like they did with desktops, as a consumer, I'm glad both institutions (Intel and ARM) are trying to out-compete each other's technology.

    The last thing we want is another couple of decades of a Microsoft-like era where real innovation was stifled by pure economic (monopolistic) power.

    1. Anonymous Coward
      Anonymous Coward

      Re: One swallow doesn't make a summer

      Intel's track record of flying straight isn't good. For US businesses it isn't about competing, it is about destroying the competition.

  4. P0l0nium

    "which means that we, the consumers, will be the ultimate victors ..."

    Don't think so .... We're about to enter an era where "The competition" isn't competitive enough and that will have 2 effects:

1) Price of tier 1 mobile devices will rise as Intel extracts its "pound of flesh" in order to recoup its zillion-dollar capital investment.

    2) The EU bureaucrats will declare Intel a monopolist citing their dominant market share as evidence and will issue fines for some yet-to-be-devised infraction of competition rules.

Strange old world: I'm off to build a 22nm fab in my back yard ... Oh wait ... you can't! It's really hard.

Oh, never mind - I'll just build my superfast quad-core Krait thing on TSMC's new 16nm process.

    Nope , can't do that either because it doesn't work.

    How about Samsung .... Nope, lost the plot 2 years ago.

And you're left with ....... an x86 chip in your iPhone.

    1. ThomH

      The competition is embeddable cores. Samsung, Apple, et al pick the ARM core, the GPU and everything else, lay out the silicon (or let a computer do it) and hit print. It's very mix and match. As a result, competition is healthier than it has been in years. ARM is likely to persevere both on momentum and because you don't have to go begging cap in hand any time you want a custom fabrication.

  5. Schultz
    Boffin

    What about standby power consumption?

    Phones spend a lot of time heating the pockets of their owners.

    1. Paul Shirley

      Re: What about standby power consumption?

      While I'm surprised at this test (and frankly want proof before I believe), the power saving changes we already know about should have fixed standby performance, if it needed fixing.

Another aspect is how significant CPU power is in overall consumption. If you're gaming all day or playing video it's going to be important, but the screen is still likely to top power use. My elderly and not too efficient phone with its ancient Qualcomm MSM8255 Snapdragon shows battery usage of 43% screen, 19% WiFi, 23% standby+idle. That's just 15% of the power used for the couple of hours it was actually working hard (gaming+browsing). As screens get bigger, an efficient CPU becomes even less important.

      1. Dave 126 Silver badge

        Re: What about standby power consumption?

        > If you're gaming all day or playing video it's going to be important but the screen is still likely to top power use.

This Atom uses PowerVR-designed graphics like many of its ARM competitors, so playing video might not be the area where the biggest differences are seen.

  6. Wilco 1
    FAIL

    Intel sponsored "research"?

    The benchmarks appear to show that a dual Atom can beat a quad A15 on CPU performance. That's quite a feat considering Atom is a 5-year old 2-way in-order CPU while the A15 is a modern 3-way aggressive out-of-order CPU!

    However independent benchmarks show a completely different picture: a Galaxy S4 leaves the K900 in the dust as you'd expect from the microarchitecture comparison: http://browser.primatelabs.com/geekbench2/compare/1979365/1970335

    So that suggests something is going on with the chosen benchmarks. There are a million ways to cheat with benchmarketing. If the quad A15's somehow have to do more work then it is no surprise they burn more power doing so...

    Also the results only show current, which means nothing. For a power efficiency comparison you'd have to measure Watts, and even more importantly Joules (ideally just the CPU, not the whole phone as in this case). Total energy to complete a given task is what matters.
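The point that joules per task, not amps, is the figure of merit can be sketched with invented numbers (both chips and all figures below are hypothetical):

```python
# Energy = average power x time. A chip that draws more current but
# finishes sooner can still use less energy for the same task.

def energy_joules(amps, volts, seconds):
    """Average power (W = A x V) times duration (s) gives joules."""
    return amps * volts * seconds

# Hypothetical chip A: higher draw, finishes the task in 10 s.
chip_a = energy_joules(amps=0.8, volts=3.8, seconds=10)   # 30.4 J
# Hypothetical chip B: lower draw, but takes 25 s.
chip_b = energy_joules(amps=0.5, volts=3.7, seconds=25)   # 46.25 J

# The "lower current" chip spends more joules on the task, which is
# why a bare current table proves nothing about efficiency.
print(chip_a, chip_b)
```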

    1. Mark .

      Re: Intel sponsored "research"?

      I have no opinion on the rest of your comment, but your point about age is incorrect - Clover Trail was introduced in 2012, not five years ago (sure, the original Atom is a lot older, but that's like saying ARM is even older).

      1. Wilco 1

        Re: Intel sponsored "research"?

Clover Trail is a 2012 SoC indeed; however, all Atoms are based on the Bonnell microarchitecture, which apart from a few minor tweaks is essentially unchanged since 2008.

    2. C 7

      P=I*E

And assuming they tapped in between battery and phone to measure I, E should be 3.7 V for all of them, as that's standard for a Li-Ion phone battery. Granted, that's making a few assumptions given the sketchy details of the testing, but they'd have to be real amateurs (or shysters) to use I as their benchmark if E wasn't consistent across the board.

      1. annodomini2

        Re: P=I*E

        Lies, damn lies and statistics.

      2. Wilco 1
        Boffin

        Re: P=I*V

No, 3.7 V is not standard. There are different kinds of Li-Ion batteries, and commonly used ones vary from 3.6 V to 3.8 V nominal. It's important to understand what nominal voltage means - it is simply the average between the minimum and maximum voltage. Actual voltage varies from ~4.2 V when full to ~3 V when empty. Battery age, temperature and current draw also affect the voltage.

        So no, one cannot just measure the current and assume voltage remains a constant. I'd say showing just current is admitting you're an amateur. To measure power consumption accurately you need thousands of samples per second.
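A sketch of why sampling matters (all traces below are invented for illustration; a real rig would log calibrated voltage and current together, but the arithmetic is the same):

```python
# Battery voltage sags from ~4.2 V (full) toward ~3.0 V (empty), so
# multiplying measured current by a nominal 3.7 V misestimates energy.

RATE = 1000                      # samples per second
n = 60 * RATE                    # one simulated minute

voltage = [4.2 - 1.2 * i / n for i in range(n)]             # linear sag
current = [0.5 + 0.3 * ((i // 100) % 2) for i in range(n)]  # bursty load

dt = 1.0 / RATE
sampled = sum(v * a * dt for v, a in zip(voltage, current))  # true joules
nominal = sum(3.7 * a * dt for a in current)                 # naive joules

print(f"sampled: {sampled:.1f} J, nominal: {nominal:.1f} J")
```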

    3. Dave 126 Silver badge

      Re: Intel sponsored "research"?

      >Total energy to complete a given task is what matters.

      That is the methodology that Intel have been pushing:

      http://www.tomshardware.com/reviews/atom-z2760-power-consumption-arm,3387-5.html

  7. Anonymous Coward
    Anonymous Coward

    no mention of the elephant in the room

On Android, Intel chips are poorly supported for apps built using the NDK.

    1. Adam 1

      Re: no mention of the elephant in the room

      According to the NDK website

      " These requirements mean you can use native libraries produced with the NDK in applications that are deployable to ARM-based devices running Android 1.5 or later. If you are deploying native libraries to x86 and MIPS-based devices, your application must target Android 2.3 or later."

      That doesn't seem like a show stopper in 2013.

    2. BXL
      Happy

      Re: no mention of the elephant in the room

      Not any more. Our Marmalade SDK is now supporting native x86 C++ apps on Android. It was relatively painless to convert from using the ARM tool-chain to the x86 ones.

  8. This post has been deleted by its author

    1. 0_Flybert_0
      Boffin

      Re: All well and good...

firstly .. ARM chip makers can't use Intel's tri-gate / 3D FinFET tech .. which is why power draw is low .. so 20 - 22nm might be hard to achieve .. despite IBM lab examples .. most new ARM and GPU chip processes are 32nm today and have greater voltage leakage .. especially when cranked up to 1.5GHz or more

secondly .. TSMC is having difficulty in scaling up 28nm production .. GlobalFoundries has barely ramped its 28nm line .. both had claimed last year to be going to 20nm by the end of this year .. but that seems unlikely in quantity until well into 2014 .. Intel .. which has shown itself a *bit* more reliable on process roadmaps .. will be at 14nm in quantity by mid 2014

those that think intel won't make significant inroads in the SoC business are just not paying attention to history .. Intel does not enter a market unless it sees a profitable future in it ..

perhaps intel likes to control the architecture of its chips ... not license and therefore be dependent .. or felt it couldn't compete with a profit against established players .. Qualcomm .. Samsung .. Xscale .. Apple .. TI .. Intel .. if you are paying attention .. doesn't bother competing in GPU either .. except as integrated in the chip die .. why compete with nVidia .. when they can co-operate and keep their common competitors .. like AMD .. suppressed together ?

      1. An ominous cow heard

        Re: All well and good...

        "Intel does not enter a market unless it sees a profitable future in it .."

        Maybe so, but the future is hard for even Intel to predict, whereas Intel's historical track record outside their x86 comfort zone speaks for itself - a lengthy list of failures, including those I just listed in my reply to Ken Hagan.

        Is an x86 "SoC that isn't an SoC" in their comfort zone?

      2. Phill 3

        Re: All well and good...

        "Intel does not enter a market unless it sees a profitable future in it"

        Not quite true - Intel does occasionally try things for which there isn't even a predictable market let alone profit.

Look at the Larrabee / many-core / Phi path - the latest Chinese supercomputer is looking good, but who would have known all those years ago when Intel started down that path?

        Also don't rule out the value of the bigger branding issue. Even a relatively poor product with their name on it can influence buyers for their main product line. E.g. Integrated graphics. No-one would say there's a profitable market for feeble graphics but it did keep Intel's low-end CPUs selling more than AMD's in the not too distant past.

    2. larokus
      Stop

      Re: All well and good...

How is that relevant? The Intel chip in question is a 32nm part. Intel will be releasing 22nm mobile chips in the near future as well, so if you want to see what happens at 22nm, let's see what happens at 22nm for both platforms.

  9. Sil

    No surprise there

    Sure these facts need to be confirmed by other testers but it's not like it's a real surprise.

Anandtech came to basically the same conclusions a few months ago.

Also, while not the same product, most tests of Haswell laptops have surprised independent testers with outstanding battery life (such as the MacBook Air).

And the ways to improve processing power, such as increasing MHz or implementing out-of-order execution, are known to take a big toll energy-wise.

So while it's not difficult to make a super-slow, low-power processor, the faster you want it to become, the harder it is to be energy efficient.

    1. Matt Bryant Silver badge
      Thumb Up

      Re: No surprise there

      ".....So while it's not difficult to make a super slow low power processor the faster you want it to become the harder it is to be energy efficient." Anyone else remember how slow and much less power-hungry x86 used to be? I have an old 386 desktop in the cupboard that has a 40W PSU, whereas my current desktop has an 800W one! ARM may have started out as the low-power champion, but adding cores and speed has fattened it up, just as it has done with the x86 desktop CPUs. I just don't see why people are so blinkered by hatred of Intel not to take note of the fact that Intel have spent decades shoe-horning x86 into laptops, and they have experience from their own phone CPUs as well as experience with their own ARM designs. The inevitable growth in power requirements for ARM has allowed Intel to be competitive.

      1. stuff and nonesense

        Re: No surprise there

There is no arguing that intel have done a good job getting performance up and power down in its x86 processors.

        Some of us remember that intel got the contracts (ratified) to build the x86 stuff for PCs on the backs of others. IBM demanded a second source.

Since the 386, when intel believed they could "go it alone", they have tried to sue the competition out of the business (486 time), used anti-competitive practices (buying the competition out of the market by bribing system manufacturers), and used skewed benchmarks (read anandtech and toms hardware around the time of the athlon / p3).

        Anything can be proven using the "right" benchmarks.

        As for hating intel, naa, not really.. I just don't buy from convicted monopolists.

    2. Dave 126 Silver badge

      Re: No surprise there

      >Sure these facts need to be confirmed by other testers but it's not like it's a real surprise.

>Anandtech came to basically the same conclusions a few months ago.

Thank you Sil, I'm glad someone has been paying attention to recent developments. "ARM is more power efficient" has become near dogma, when the reality is actually more interesting. Another benchmark-heavy site, Tom's Hardware, has been looking at this too.

I don't care what my next phone is built around, and I'm not saying "Go Intel": I'm saying let's have more data.

  10. Anonymous Coward
    Anonymous Coward

Knowing what benchmark test they used would be helpful. The test could have been rigged to make the Atom look better. Also, why did they list amps? Watts is a better measure than amps, given how chips turn components off and change their voltage. Some of the Atom processors range from 0.75 to 1.1 V.

    There has to be a reason why they listed Amps.

    1. frank ly

Testing a range of mobile phones using software benchmarks, you can probably only measure battery current, unless you do some tricky attachment of current probes onto very fine PCB tracks or chip pins. As far as the user is concerned, the only thing that matters is battery current, which determines life before a recharge is needed.

      Actual chip power measurements would be useful for high power desktop and server applications where heat production (and dumping) is a very important factor.

      1. Robert Heffernan
        Boffin

        Easy To Measure

If you are doing this kind of testing on consumer electronics, soldering flying leads to the battery terminals is no big deal. Wire the battery in instead of inserting it into the compartment, and splice your current measuring devices into the cable. Easy.

    2. Neil Barnes Silver badge

      Amps...

      is useless without knowing the voltage; that'll give you the wattage aka the power dissipation.

      Although if your battery is specified in amp-hours and you care about battery life, then the current is a useful measure - but it's not measuring processor efficiency. Example: a five volt supply might require an amp; I can make that half an amp by using a ten volt supply - but I haven't helped the efficiency (except perhaps marginally in the voltage converters).
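The example above in numbers (trivial, but it makes the unit error concrete):

```python
# Same power, different current: halving the amps by doubling the
# volts changes the meter reading but not the power dissipated.
def power_w(volts, amps):
    return volts * amps

five_volt_rail = power_w(5.0, 1.0)   # 1 A at 5 V   -> 5 W
ten_volt_rail = power_w(10.0, 0.5)   # 0.5 A at 10 V -> 5 W
print(five_volt_rail, ten_volt_rail)
```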

  11. h3

Intel still aren't really even trying - 32nm is years-old technology.

  12. johnparchem

Intel chips have always beaten ARM in power efficiency tests (that is, comparing work per watt). ARM's low power really only shines when the chip is idle, and Intel chips and systems have always had high idle power. With the move toward tablets and smartphones, devices are not as idle as they used to be. Thus ARM is moving into Intel's world. Good luck.

    1. Dave 126 Silver badge

      It says a lot that Johnparchem has been heavily downvoted, when what he says is supported by Tomshardware:

      "In general, our analysis suggests that the ARM-based CPU core is excellent at doing nothing, but starts to require considerably more power during computationally-intensive workloads... In this scenario, the CPU cores aren't cranking away, but the graphics core is still refreshing the screen and reading from memory. This constant reading taxes the memory controller, and is one reason why the Atom maintains low power consumption. Under heavier loads, we saw the Tegra 3 take a double hit as CPU power use ramped up quickly, along with the memory controller's draw.

      Even though manufacturing technology is one of Intel's obvious strengths, the efficiency of its memory controller also becomes quite apparent in the company's power measurements. Intel and AMD have both pointed out the challenges facing ARM as it moves to 64-bit out-of-order execution, since both companies took years to refine and perfect their own implementations. Memory control is just another one of those areas Intel and AMD dedicate a lot of R&D to optimizing."

      -http://www.tomshardware.com/reviews/atom-z2760-power-consumption-arm,3387-5.html

  13. Charles Manning

    One good power number is not enough

    Well, first off, well done Intel for getting such low power... even if it isn't an apples-to-apples comparison, using undocumented benchmarks and different software. Pity, though, that it isn't enough to move the needle.

    The biggest challenge for Intel in this space is that they do not have any hardware partners. Intel makes the whole SoC end-to-end. As such they can only afford to make a limited set of SoCs. ARM is way different. There are many different companies producing ARM SoCs, each slightly different - covering a wide range of different markets. They compete furiously with each other, upping the game and reducing prices.

    Next, Intel have a terrible track record in the embedded industry. Embedded design people HATE Intel with a passion. Intel has pumped up the market with new offerings, then chosen to dump that business unit and leave designers high and dry. What happened to 8051? 80251? i960? NOR and NAND flash? Various USB chipsets? StrongARM? ... Once bitten, twice shy and all that... With Intel it is about ten times bitten - 11th time shy.

    Thirdly, many of these gains are likely from playing process hopscotch. The ARMs coming down the pipe will soon be as good.

    Sorry, Intel, but one good result with power numbers is not enough to grab the tiniest percentage of marketshare.

  14. Christian Berger

    That's actually not the point why you'd want to have x86

    The power of x86 lies within the IBM PC, a fairly open and standardized platform with common hardware(-abstraction) and good ways to boot any operating system you want. (unless you have EFI "Secure" boot)

    That's the power of it. Suddenly you can create, for example, a secure cryptophone, just by taking a minimal Debian, and adding OpenVPN and VoIP to it. And it would run on many phones without modification.... in a way just like people are doing now with PCs. You can easily turn your PC into a video disk recorder, just install the proper Linux distro. And no, it doesn't need to be ported like Cyanogenmod, it'll just run on your hardware even though the developer may never have seen it.

    1. Anonymous Coward
      Anonymous Coward

      Re: That's actually not the point why you'd want to have x86

      "The power of x86 lies within the IBM PC, a fairly open and standardized platform with common hardware(-abstraction)"

      OK. But x86 phones are IBM-compatible PCs now? PC-compatible display+keyboard, PC-compatible storage, PC-compatible (w)LAN and USB, PC-compatible BIOS, etc...? References most welcome. Demonstrations of Windows installing on such a phone also welcome.

      If that truly is the case, I'd be surprised. Sometimes I'm surprised.

      Even if true, it still leaves the small matter of what else needs to be on a *real* SoC, and which design+build partners put it there, but let's ignore that for now.

      And if that isn't the case, it's back to doing a hardware layer. Maybe not a complete port to a different instruction set, but then nor are two different ARM SoCs a complete port.

      " You can easily turn your PC into a video disk recorder, just install the proper Linux distro."

      And you won't be able to do that on ARM because?

      Meanwhile, a year or two ago Intel declined to provide Linux support on some members of its SoC family. What's the current state of play of that game?

    2. Charles Manning

      Nobody wants x86

      People never want features, they want benefits.

      For example, nobody wants 64GBytes of flash on their phone. Nor do they actually want to even store stuff on their phone. What they really want is to access all their stuff while they are on the move. If you could find a different way to provide their stuff while on the move - without other penalties, they will accept that solution too.

      The huge justification for x86 has always been: people are familiar with Windows and want the software that runs on Windows (eg. Office). They therefore buy Windows, which needs an x86 processor. That monopoly has created a huge market for commoditised PCs, all of which run x86.

      The MS monopoly pretty much gave Intel a monopoly on a plate.

      We have seen far better software and PC-like hardware (eg. Acorn RISC OS) being stifled by the MS/Intel monopoly.

    3. fritsd
      Linux

      Re: That's actually not the point why you'd want to have x86

      The power of x86 (...)

      (...) just by taking a minimal Debian, (...)

      Debian runs on 18 different architectures, if I counted right (not all of them official ports with the full infrastructure). Just saying...

      source: http://www.debian.org/ports/

  15. John Smith 19 Gold badge
    Meh

    "secret" benchmark *proves* Intel is superior.

    I don't think so.

    Exact screen sizes make a difference for a start, backlight brightness as well (and whether it auto-adjusts).

    And it's a good point about amps. AFAIK all mobile batteries are proprietary to their phones. So the mfg can set the voltage to whatever they like.

    So 1/2 the amps at 2x the voltage = same power level.

    Not really a level playing field, is it?

    1. Charles 9

      Re: "secret" benchmark *proves* Intel is superior.

      The batteries themselves, yes, but haven't most phone batteries settled on a common voltage of ~3.6V?

  16. WatAWorld

    Who paid for the testing?

    Who paid for the testing?

  17. Dick Pountain
    Holmes

    Stable Door?

    A company with Intel's resources ought to be able to match ARM's performance eventually. The point is that ARM now has the same code-compatibility grip on the smartphone sector that the IBM PC gave Intel in the PC sector. Overcoming code inertia may be beyond even Intel, unless they pay vendors' rewrite bills for them (they probably have enough cash).

    1. Mark .

      Re: Stable Door?

      Code compatibility? The dominant smartphone platform with 75% share runs the majority of apps on a virtual machine, and already supports Intel. WP uses a virtual machine also. And all the feature phones run various Java based stuff.

      So there'll be a 10% niche of phones stuck on ARM only, but I think 90% is more than a big enough market for Intel!

  18. Alan Johnson

    Lies, damn lies and benchmarks

    Benchmarks are notoriously 'fixable' and a benchmark that is not even described is a joke.

    Intel may or may not have overtaken ARM, but this benchmark suggests that Intel is inferior to ARM: if they were better, why do this dodgy benchmark? Why not do a real one and publish the details?

  19. P. Lee

    Wrong measurement?

    We already know atom beats arm when it comes to work/watt. The issue is watts/idle.

    Actually, it isn't even that - price makes a big difference, as does controlling your own corporate destiny. If everyone does ARM, everyone has a chance to tweak things for competitive advantage or a different market slant. If everyone takes a single model CPU, things get very boring very quickly.

  20. Alan Brown Silver badge

    ARM has been resting on its laurels.

    This should be a good wakeup call.

    1. Destroy All Monsters Silver badge
      Headmaster

      Re: ARM has been resting on its laurels.

      ARM just licenses its stuff to fabbers. This should be good information:

      http://en.wikipedia.org/wiki/ARM_architecture#ARM_licensees

  21. boatsman
    FAIL

    W = V x I

    since the V is not known, we do not know the watts, so the article is meaningless.

    basic physics, did you guys forget ?

    1. Charles 9

      Re: W = V x I

      Don't most of these operate around 3.6V?

      1. Danny 14

        Re: W = V x I

        Which bit of the device? Screen? Cpu? Gpu? What test was run? What brightness etc

        not enough information

    2. Charlie Clark Silver badge

      Re: W = V x I

      Don't forget to add time to that. Even knowing the power draw is of limited value if you don't know how long the processor is doing anything. Spinning up and spinning down are important, too.

      I'm sceptical about these results. AFAIK x86 beats the pants off ARM for rendering web pages but is itself soundly whipped when the GPU gets involved, as on the iPhone. This is why the SoC with the right silicon for the right task is so important, and why big.LITTLE will only start to make sense when the compiler and scheduler have had a few generations to get it right. Intel does not do the heterogeneous computing environment of modern mobile devices anything like as well as ARM or even AMD.
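      To make the "add time to that" point concrete, here's a sketch with made-up numbers (purely illustrative, not measured from any real chip): a part that draws more watts while active can still use fewer joules per task if it finishes quickly and races back to idle.

      ```python
      # Energy per task (joules), not instantaneous power (watts), is what
      # drains the battery: E = P_active * t_active + P_idle * t_idle.
      def task_energy(active_watts, active_secs, idle_watts, idle_secs):
          return active_watts * active_secs + idle_watts * idle_secs

      # "Fast" chip: 2 W burst for 1 s, then 9 s idling at 0.05 W
      fast = task_energy(2.0, 1.0, 0.05, 9.0)   # 2.0 + 0.45 = 2.45 J
      # "Slow" chip: 0.8 W for 4 s, then 6 s idling at 0.02 W
      slow = task_energy(0.8, 4.0, 0.02, 6.0)   # 3.2 + 0.12 = 3.32 J

      # Higher peak power, lower energy for the same 10-second window.
      assert fast < slow
      ```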

      1. Anonymous Coward
        Anonymous Coward

        Re: W = V x I

        You can avoid the time taken getting to the right compiler optimisations for big.LITTLE if you choose the right scheduling scheme. It appears the current stuff in Linux is not as good as it could be (in-kernel scheduling - can't remember the exact name), and is improved by simply upgrading the scheduler so it can move stuff from core to core quickly.

        Although that upgrade to the scheduler is quite difficult: as standard, SMP scheduling currently assumes all cores are equal, which of course they are not in big.LITTLE.

  22. John Smith 19 Gold badge
    Meh

    Intels problem is they are used to living on processor pricing for Server/desktop

    So how do you sell an instruction set compatible processor (because that's the core feature people buy Intel for) at 1/10 (or less) than what you pay for a desktop/laptop/server processor without people feeling they are being ripped off?

    As a designer, with an Intel processor you cannot:

    a) Change foundries. Don't like the shipping delays for your order? SFW.

    b) Insert other chips into their processor carrier. Get your own.

    c) Add ASIC functions to their silicon. You are joking.

    So you pay for more board space, more devices, more assembly and test costs.

  23. Anonymous Coward
    Anonymous Coward

    Okay...

    @Article

    "ABI Research provided no details on the content and construction of their benchmarks, but the comparative results show the Z2850-equipped Lenovo K900 to be more than merely competitive with the ARM-based phones."

    I stopped reading after that!

  24. Mikel
    Go

    I will believe it

    When I have the thing in my hand and it performs as advertised. We've had press-release mobile engineering from this one for many years.

  25. Adam Foxton
    Joke

    what they're not telling you

    Is that the Atom was running at 24 volts...

  26. mark l 2 Silver badge

    Intel may now be as power-efficient as the current generation of ARM SoCs, but they are still a premium brand. So even if more manufacturers start to produce Atom-based phones, they will still be competing against cheaper ARM SoCs, which means Intel will never get their costs down enough to get into the £100 budget phones that now come with dual-core ARM CPUs, which for a lot of people are powerful enough.

  27. talk_is_cheap
    FAIL

    So a 10" arm based tablet pulls far more current than a 5.5" phone - go figure.

    The only thing I can tell from those benchmarks is that currently the Intel based chip needs far more memory bandwidth than ARM based systems to give about the same performance as the S4 phone.

  28. Paul 141

    Alas, can't be used in Europe

    Designed in Israel, so I read.

  29. zannfox
    Meh

    Can you see the LIE

    Can you see the LIE, they have not listed the Voltage

    Intel Z2580 3D Score 6664 Peak I = 0.61

    Qualcomm APQ8064T 3D Score 6628 Peak I = 1.404

    Power (watts) = V (volts) * I (amps)

    if you do this they are using the same amount of POWER

    E.g.

    Intel Z2580 6.905 Volts * 0.610 Amps = 4.212 Watts of POWER

    Qualcomm APQ8064T 3.000 Volts * 1.404 Amps = 4.212 Watts of POWER
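    Working through zannfox's arithmetic (the 6.905 V figure isn't measured anywhere; it's just the voltage that would force the two power draws to come out equal):

    ```python
    # zannfox's numbers: the Intel voltage is back-solved, not measured.
    qualcomm_power = 3.000 * 1.404           # 4.212 W at an assumed 3 V
    # Voltage the Intel part would need, at 0.610 A, to match that power
    implied_intel_volts = qualcomm_power / 0.610

    assert abs(qualcomm_power - 4.212) < 1e-9
    assert abs(implied_intel_volts - 6.905) < 1e-3   # ~6.905 V
    ```

    Whether a mobile SoC would ever actually be fed ~6.9 V is the question theblackhand answers below the fold: the part's documented operating range makes it very unlikely.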

    1. Charles 9

      Re: Can you see the LIE

      Well, before we yell "conspiracy theory," can we get any evidence that these devices were operating at anything other than 3.6V, which is pretty much the standard these days?

    2. theblackhand

      Re: Can you see the LIE

      While that's possible, the specs would suggest it is unlikely given an operating range of 0.3V-1.2V.

      http://ark.intel.com/products/70100/Intel-Atom-Processor-Z2580-1MB-Cache-2_00-GHz

      Of course they may have massively bumped the voltage....

      And I believe that the tests probably weren't rigged to favour Intel - performance and power usage aren't the things Intel struggles to compete on when facing ARM. ARM SoCs can provide most of the performance and most of the power efficiency in a package that can be tailored to the application at a fraction of the cost.

  30. John 156
    FAIL

    If Intel want to sell their Atom to OEMs, they have to create promotional literature for their salesmen to use against the overwhelming competition that ARM represents in their chosen market. Expect more of this as Intel attempts to gatecrash ARM's strongholds whilst mounting a rearguard action to protect its server base against the ARM v8 chips. This is PR.

  31. Phill 3

    Who were the benchmarks aimed at?

    Consumers want to compare devices for size, weight, price, battery life & functionality. We mostly don't care how the manufacturers achieve it but there is some brand loyalty.

    Device makers want to compare SoCs so they know what other chips or modules they need to buy and find space for within their device. If all other things were equal, a physically smaller SoC would reduce the overall device cost. Device makers are probably a lot more focused on the overall costs than consumers but still aware of the consumers' brand preferences.

    SoC makers want to compare CPU cores so they know what other functionality they could or should cram into the same SoC to make it more attractive to device makers. There are many choices that influence the balance here - a better core could need less memory, or the same amount of cheaper memory. Less memory could allow on-die space for hardware to assist the radios. The flexibility of choices created by the competitive ARM eco-system is the killer feature here.

    I'm only a consumer so I assume that the SoC makers & device makers have better access to more relevant (& accurate) figures to make their decisions. But as with most techies, I would be very curious to see a real like-for-like comparison at each level for shipping products: Intel & ARM cores, Intel and ARM-based SoCs, Intel-based and ARM-based devices with the same price/feature/performance envelope.

    Unfortunately this article therefore seems of no value to device makers, SoC makers or techie consumers like me.

  32. plrndl
    FAIL

    Nexus Phones?

    Since when have the Nexus 7 & 10 been phones?

  33. Henry Wertz 1 Gold badge

    I'm surprised

    Title says it -- I'm surprised. I knew the newer Intel chips were relatively low power, but did not realize they beat out ARM.

    I hope Microsoft can be kept at bay enough to have some netbook-like computers on the market again to run Ubuntu or whatever on (as opposed to the over-priced, over-specced monstrosities like ultrabooks that Microsoft wants in order to run Windows decently). I was waiting for an ARM-based unit, but I really don't care what chip it has in it as long as the result is low power use and decent performance.

This topic is closed for new posts.

Other stories you might like