AMD details aggressive power-efficiency goal: 25X boost by 2020

AMD has announced an ambitious 2020 goal: a 25-times improvement in the power efficiency of its APUs, the company's term for accelerated processing units – on-die mashups of CPU, GPU, video-accelerator and other types of cores. Although AMD has committed itself to that goal, it won't be an easy one to achieve. Efficiency …

COMMENTS

This topic is closed for new posts.
  1. cyke1

    Problem with all this: AMD includes the GPU as part of the CPU's performance, but the issue with that is that NOT ALL programs use gpu properly or even at all – only a few do.

    1. Frumious Bandersnatch

      re: NOT ALL programs use gpu properly

      Or all O/S's. My graphics card runs noticeably hotter when I boot to Linux than when I'm in Windows. So much so that I had to install an extra fan to keep the machine from freezing randomly (pretty sure the north bridge was failing because of extra heat rising off the graphics card into that general area). Of course, if AMD wanted to prioritise power savings, they could totally help out the guys making the free drivers by providing a patch (or sufficient documentation) to fix this.

      1. Anonymous Coward
        Linux

        Re: re: NOT ALL programs use gpu properly

        I have a little AMD-based lappie that, according to Powertop, uses just on 5 W when running the XFCE desktop (OpenSUSE, but it's about the same on Xubuntu) at 80% screen brightness. This machine replaced a "low power" Atom that used nearly four times as much at rest. This may sound like a simplistic anecdote, but we're off grid, and wind and solar power is hard won, while the battery capacity to store it is, of course, limited. So I really appreciate efforts at power saving. The laptop uses an AMD E1-2500 APU (as reported by /proc/cpuinfo). What I'd love is a mini-ITX version of this to replace the Atom-based server which runs constantly and accounts for 5% or more of the battery capacity over 24 hours.

        Not sure what that anecdote has to do with industry-led concern over global energy consumption, but the effort has an effect in this household.

        1. h4rm0ny
          Headmaster

          Re: re: NOT ALL programs use gpu properly

          >>"I have a little AMD-based lappie"

          "Lappie" is not an abbreviation of "laptop". It's the same number of letters and the same number of syllables. All it accomplishes is to make you sound American or that you really wish you could have a pet but aren't allowed.

          1. Anonymous Coward
            Anonymous Coward

            Re: re: NOT ALL programs use gpu properly

            You are the only one who said it's an abbreviation; he never stated that it was. You also presume he is not "American", but he has not stated his country of origin – that is something you have made up.

            So loose the pedant icon and replace with a prejudice icon, it would be much more appropriate.

            1. h4rm0ny
              Headmaster

              Re: re: NOT ALL programs use gpu properly

              >>"So loose the pedant icon and replace with a prejudice icon, it would be much more appropriate."

              I was looking for a prejudice icon but I couldn't find one that was appropriate, so I went with the grammar nazi as it was close to English Snobbery which was what I wanted to convey. I dislike the Americanisation of the English language.

              Also, the word you want is "lose". Unless of course you think I'm going to set some icons free.

          2. Anonymous Coward
            Anonymous Coward

            Re: re: NOT ALL programs use gpu properly

            Sorry you feel that way, and I do hope your day/life improves soon. But oh, the speculation about my origin - that's a low blow.... and I do have a rather sweet westie. ;-)

  2. Anonymous Coward
    Anonymous Coward

    The future is APUs

    For the majority of applications APUs are the future, and the future is here with Kaveri. Lower power consumption is always a good thing, but it matters most for mobile and enterprise. AMD continues to deliver high-quality, class-leading APUs, which will continue to take market share from discrete CPU/GPU combos in all market segments.

    1. Rol

      Re: The future is APUs

      I agree. I built my current desktop around an AMD A10-5800K and, while gaming wasn't my first goal, it makes a good fist of it. It runs many modern games at full tilt while meeting my number one criterion, quietness, which not having a discrete, noisy GPU card certainly helps with.

      As it is, I have one huge chunk of heatsink on the chip with a low-noise fan and, but for the PSU fan, that's it. The exhaust fan is still waiting to be fitted if the temps go up, but that hasn't happened.

      I would highly recommend APU chips for any general purpose home build.

      1. phil dude
        Boffin

        Re: The future is APUs

        I would agree, as I have the same chip in the HP, and I was genuinely interested in seeing what FP performance I could get out of the APU.

        Only it is nobbled by HP, so I cannot do that and use an Nvidia card as well.

        I'm a big fan of all technologies that increase computational density!!

        Yours wobbling molecules,

        P.

  3. Frumious Bandersnatch

    power arbitration

    They're free to steal my idea of a "power arbitration bus" that I mentioned in another thread about a year and a half ago.

    (also: gating is cool. Just came across this idea when reading the Broadcom docs for the RPi GPU today)

    1. Adam 1

      Re: power arbitration

      I am sure when processor designers are short on ideas that the forums on el reg are one of the first stops.

      1. Rampant Spaniel

        Re: power arbitration

        You may mock, but AMD designers stole my idea of running their processors so hot they could be used to heat up Holland's pies.

  4. Anonymous Coward
    Anonymous Coward

    "I am sure when processor designers are short on ideas that the forums on el reg is one of the first stops."

    Excellent! I'd like to open source my cheese-based diode technology that could supplant silicon within three years. Tests of single-plane cheese on toast suggest that its binary "there one moment, gone the next" attributes match silicon switching, but the sticky, entangling threads of hot cheese indicate it can also offer quantum computing capabilities. A cubit of cheese on a stick with a piece of pineapple can also transport the holder back forty years, opening all manner of reverse processing possibilities.

    CPU, GPU, APU, all on their way out, to be replaced by the CheePU.

    1. Steve Davies 3 Silver badge

      Nice but...

      you forgot the Worcester Sauce taste attenuator. Makes using cheapo cheese possible.

  5. P0l0nium

    It must be "Funding Season" ....

    Methinks AMD must be up for another round of fund-raising.

    Oh wait... so they are:-)

    http://www.techpowerup.com/202125/amd-announces-extension-of-tender-offer-for-its-8-125-notes.html

    AMD always generate hopelessly optimistic slide decks whenever they need more "Stupid Capital" to fund next month's paychecks.

    1. Don Jefe

      Re: It must be "Funding Season" ....

      Show me a company raising capital by talking up mediocre goals based on 'last year's' engineering and I'll show you a company we'll buy out next year. Fundraising is supposed to be based on ambitious plans with goals a little bit further out than you can reach at this very moment. That's what the money is for, you know...

  6. BlueGreen

    Maybe answering the wrong question

    First thing I ask when given something to speed up (a common requirement for me) is,

    * why? Is it doing useful work? Need it be done at all?

    and subsequently these

    * is the slowness in the hardware (so buy more of it), or in crappily written software typically using hideous O(n >> 1) algorithms (which do you think is most common? Go on, take a punt)

    * Do we need to process all of x or just a representative subset?

    and so on.

    We really need to ask why we need these huge datacentres, why people have to have everything 'on demand' etc.

    I guess this is one of those 'state the bleedin obvious' posts, but I guess it needs stating - yet again. (Bitcoin seems the perfect example of everything that's wrong)

    1. John Savard

      Re: Maybe answering the wrong question

      You can't control what people do with computers. You can't control what software they use to do it. All you can do is make better computers than the other guy, so that people will buy your computers and your company will make money. So AMD has the right priorities.

      We, who buy the really fast and/or energy-efficient computers from them, are responsible for getting as much useful work out of them as possible. If anyone but ourselves should be held accountable, look at Microsoft, not AMD or even Intel.

    2. Anonymous Coward
      Anonymous Coward

      Re: Maybe answering the wrong question@ BlueGreen

      "We really need to ask why we need these huge datacentres, why people have to have everything 'on demand' etc."

      I have to say, the idea hadn't crossed my mind, so maybe you should repost that, substituting "I" for "we". Unless you're Charleyfarley, and that's the royal "we"?

      But it's a fine idea, and it needs somebody to take it forward. Perhaps you could establish a people's soviet committee, who could prepare a list of "approved & permitted" purposes for computing. Anything not on the list would of course be bourgeois profligacy contributing to climate change, and by definition would be unapproved and not permitted.

      On second thoughts, no, I don't like that idea. If you were to emigrate to North Korea, you needn't be troubled by the thought of fellow citizens indulging in frivolous use of computing power for trivial self gratification?

      1. BlueGreen

        Re: Maybe answering the wrong question@ BlueGreen

        @John Savard

        "So AMD has the right priorities". Yes. I'm not criticising AMD. My point was aimed at the (ab)users of the cpus, not the maker.

        @Ledswinger

        > On second thoughts, no, I don't like that idea. If you were to emigrate to North Korea, you needn't be troubled by the thought of fellow citizens indulging in frivolous use of computing power for trivial self gratification?

        Leaving aside bitcoin, it's not about the 'right' uses for a cpu, just the most effective use of it. Again, the point was aimed at the users. I've made seriously huge speedups in some applications (like, 3 or 4 orders of magnitude). This has in one case arguably saved a company which would otherwise have gone bankrupt. In another case I saved a small company many tens of thousands of pounds they couldn't afford by avoiding the need to migrate to a different DB (not bad for a couple of days' work). In another, their front-end querying is fast enough for real time when fed off a 100GB back-end DB. Point is, know your tools to get the best out of them. Brain + experience + knowledge -> usually much, much better than just throwing new hardware at problem.

        As for bitcoin, I'm afraid I do feel it is an abuse of resources, less so the actual energy cost so much as the lost opportunity to do something better with all that computation. We'll just have to agree to disagree I'm afraid.

        1. Destroy All Monsters Silver badge
          Holmes

          Re: Maybe answering the wrong question@ BlueGreen

          I've made seriously huge speedups in some applications...

          Yeah, well done. You know, we express the heuristic about how some need shall be economically solved using the MONEY metric. If there is MONEY to pay Mr BlueGreen to optimize some application, swell. If there is not enough MONEY to gainfully employ Mr BlueGreen but there is enough MONEY to buy another server instead, swell too. If the machine on which Mr BlueGreen might have optimized an application doesn't exist because someone decided that MONEY shall be spent transforming sandpeople into carbonized hunks of meat, tough luck.

          Brain + experience + knowledge -> usually much, much better than just throwing new hardware at problem

          Only true in the world of no economic limitations where "brain + experience + knowledge" can be had at lower prices than more hardware. Recently, this has not been the case in this dimension. As an aside, people tend to bitch and moan once the "brain + experience + knowledge" counterpart expressed in MONEY comes down. I wonder why. Yeah, we liked it in those caves.

          the lost opportunity to do something better with all that computation

          As I said, it's MONEY. Buy the computation, do something better with it. Maybe run SETI@Home, who knows.

          1. BlueGreen

            Re: Maybe answering the wrong question@ BlueGreen

            @Destroy All Monsters

            > MONEY MONEY MONEY MONEY MONEY ...

            What did you think I was talking about? It's all about money. If we'd bought enough hardware to run our unoptimised stuff off, we'd be dead, so we optimised it because it was CHEAPER.

            "Brain + experience + knowledge -> usually much, much better than just throwing new hardware at problem" - that's what the 'usually' was there for; sometimes you need more boxes.

            meh

            @the spectacularly refined chap

            You know perfectly well what I was trying to say, but ok, I got the notation wrong. Please show me how to express, for an input of n, a time/space growth as an arbitrary but greater-than-n function f(n).

            Just as a trivial aside, one algo I got maintained a sorted list by inserting each new item into an array (growing it by one for every item inserted, which is sum(1 to n) = n(n+1)/2 copies and, taking the dominating term, O(n^2)), then quicksorting it. For large lists this started to take minutes because, as I'm sure you know, naive quicksort degenerates to O(n^2) as the input becomes progressively more sorted. So I replaced it with sensible array growth and deferred the (unnecessary) sort to when it was needed, to get O(n log n) overall. I'm sure I don't know as much as you but I know a bit. Or maybe I should acknowledge my ignorance and just buy more hardware instead.
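
            A minimal sketch of that kind of fix, in Python – purely illustrative, since the original code isn't shown and DeferredSortedList and its methods are made-up names:

                # Naive version (roughly the pattern described): re-sort the whole
                # list on every insert – quadratic (or worse) overall across n inserts.
                def insert_and_resort(items, new_item):
                    items.append(new_item)
                    items.sort()          # sorting on every insert is the killer
                    return items

                # The fix: append cheaply and defer the sort until the sorted order
                # is actually needed, giving O(n log n) overall.
                class DeferredSortedList:
                    def __init__(self):
                        self._items = []
                        self._dirty = False

                    def insert(self, item):
                        self._items.append(item)   # amortised O(1) growth
                        self._dirty = True

                    def sorted_items(self):
                        if self._dirty:
                            self._items.sort()     # Timsort, O(n log n), once per batch
                            self._dirty = False
                        return self._items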

            I don't know what field you two work in but a lot of my optimisations are of this trivial type. It's rather sad, actually.

          2. DropBear
            Stop

            Re: Maybe answering the wrong question@ BlueGreen

            Only true in the world of no economic limitations where "brain + experience + knowledge" can be had at lower prices than more hardware. Recently, this has not been the case in this dimension.

            Original point aside, I'm frankly not amused by the implication that having 1000x faster hardware than, say, a decade ago but 100x slower software is a-ok, because building the machines was cheaper than writing the software properly - and hey, it runs 10x faster, woohoo! Guess what - it's nowhere in the same bloody universe as "ok".

      2. Anonymous Coward
        Anonymous Coward

        Re: Maybe answering the wrong question@ BlueGreen

        your self-gratification may be trivial, but it takes me hours to set up all the belts, pulleys, oranges and hamster tubes...

        1. BlueGreen

          Re: Maybe answering the wrong question@ BlueGreen

          > belts, pulleys, oranges and hamster tubes

          Hardware. I don't do hardware[*], sorry.

          [*] obligatory fnarr

    3. the spectacularly refined chap

      Re: Maybe answering the wrong question

      crappily written software typically using hideous O(n >> 1) algorithms

      Go away and learn what that actually means; it's clear that you don't. When you know what you are talking about you may be worth paying some attention to. Depending on the exact intent of (n >> 1) (much larger, or a right shift) you end up with either constant or linear time behaviour. Both are generally considered "fast", and well under even the theoretical minimum complexity of many tasks.
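
      For what it's worth, a quick Python illustration of the shift reading (purely illustrative, not from the original posts):

          n = 1_000_000

          # ">>" as a bit shift: n >> 1 is just n // 2.
          assert n >> 1 == n // 2

          # A loop over n >> 1 items still touches n/2 elements, i.e. it is
          # linear in n – O(n), not something exotic or hideous.
          total = 0
          for i in range(n >> 1):
              total += i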

  7. defiler

    When does it stop being the GPU?

    I find myself wondering, if so many intensive tasks are so much better on a GPU, at what point will it take over dealing with the daily chores of the CPU? At that point, will the CPU just be a boot device?

    1. BlueGreen

      Re: When does it stop being the GPU?

      I believe GPUs are a very different architecture to standard cpus. They are essentially SIMD machines (<https://en.wikipedia.org/wiki/SIMD>) and allegedly a bit of a bugger to program efficiently, even assuming the problem is suited to them at all. They complement rather than compete with conventional CPUs.

      For example, you couldn't practically run Java on them, and while you can use them to sort data you have to pick a sort algorithm that matches their design (<http://http.developer.nvidia.com/GPUGems2/gpugems2_chapter46.html>).

      Disclaimer: I'm not an expert in these.

    2. Richard 12 Silver badge

      Re: When does it stop being the GPU?

      A GPU is optimised for "Do one identical action to thousands of independent datasets"

      A CPU is optimised for "Do thousands of different actions"

      It's the difference between very large numbers of rather simple processors (GPU) and small numbers of very powerful processors (CPU).

      Many common tasks would be ungodly slow on a GPU, and many are ungodly slow on a CPU.

      There will always be a need for a mix of technologies.
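
      A rough way to picture that split, using Python with NumPy standing in for the "same action on thousands of values" style of work (illustrative only – nothing here actually runs on a GPU):

          import numpy as np

          values = np.random.rand(1_000_000)

          # GPU-ish workload: one identical arithmetic operation applied to every
          # element at once – this maps naturally onto wide SIMD/GPU hardware.
          scaled = values * 2.0 + 1.0

          # CPU-ish workload: a different, branchy decision per element. Divergent
          # branches like this are what simple, very wide processors handle badly.
          def classify(x):
              if x < 0.25:
                  return "low"
              elif x < 0.75:
                  return "mid"
              return "high"

          labels = [classify(x) for x in values]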

      1. dan1980

        Re: When does it stop being the GPU?

        A CPU is a general-purpose device, designed to handle pretty much whatever you throw at it. A GPU achieves its amazing performance by being far simpler.

        You wouldn't be able to build a useful general-purpose computer with only GPUs.

  8. luis river

    My prophecy

    AMD will always be the "poor relative" compared with Intel; it only has a future if another solvent company buys it.
