AMD to nibble the ankles of Nvidia this summer with 14nm FinFET GPUs

AMD says it will ship graphics chips using its next-generation "Polaris" architecture from mid-2016. Crucially, these processors will use 14nm FinFETs, which means they should have better performance-per-watt figures than today's 28nm GPUs. Let's be clear: today's announcement is timed to catch the hype building around the CES …

  1. This post has been deleted by its author

    1. Paul Crawford Silver badge

      <- this

      In the past decade or so the only major trouble I have ever had when installing or updating systems has been crappy video drivers. Both Linux and Windows.

      A pox on them all!

      1. sgp

        Not to mention the crap update and 'enhancement' bloatware that comes with them.

        1. Les Matthew

          "Not to mention the crap update and 'enhancement' bloatware that comes with them"

          A custom install can sort that for you (on Nvidia, that is).

    2. Mikel

      Look up Vulkan and the Khronos Group. Apparently they have you covered.

    3. Bronek Kozicki

      I am convinced that AMD will keep its new AMDGPU driver in the Linux kernel updated for the new chip, including open-source register include files.

      1. pyite

        AMD GPL driver status?

        Bronek -- what do you think about the state of AMD's GPL driver currently? I love Intel's thorough commitment to open source drivers, and combining this with AMD's hardware would be wonderful.

        I have been wanting to try AMD again but I'm holding out until they have a GPL version of VDPAU or equivalent. VA-API is not GPL and not nearly as good.

        Thanks!

        Mark

  2. Paul Shirley

    power consumption

    Hard to believe 14nm FinFET won't make a difference to power use, but AMD have failed to deliver promised power improvements for so many years now that I'm struggling to believe it will actually happen. I fear I'll be struggling on with my old GPU for a considerable time.

    If forced to upgrade for work I'm likely to end up back on nVidia despite their shitty attitude to bug fixes (both hardware and drivers) and the many times they just disable broken features instead of fixing them. My power bill will thank me and I won't need AC to run the PC at full speed next summer.

    1. h4rm0ny

      Re: power consumption

      I've never understood why someone who spends £300+ on a graphics card cares about it using an extra £10/15 per annum in electricity. I mean if you extrapolate it for a decade, maybe it starts to accumulate to the point you notice it, but you're probably going to upgrade the card after a couple of years anyway, if you're part of that market.

      I only recall performance per watt becoming the big talking point after NVIDIA suddenly stole a march and got ahead of AMD in this area; suddenly it became the big differentiator in any online discussion of graphics cards. If the advance were used to reduce heat so you could ramp up the frequency, that would be more of an argument, but it's mainly used to reduce power consumption.

      If we were talking laptops, I'd get it. But when the same thing is applied to desktops, I just don't see why it's such a big deal.

      1. Infernoz Bronze badge

        Re: power consumption

        It is a big deal because, compared to Nvidia, excessive idle/peak power usage (with GPU-damaging heat), a more expensive PSU, and worse performance all cost time and money.

      2. Anonymous Coward
        Anonymous Coward

        Re: power consumption

        "I've never understood why someone who spends £300+ on a graphics card cares about it using an extra £10/15 per annum in electricity."

        It's more about selecting a card that's (for example) £40 cheaper because it's cheaper, but that over its life will be more expensive once you factor in increased power usage.

        1. h4rm0ny

          Re: power consumption

          >>"It's more about selecting a card that's (for example) £40 cheaper because it's cheaper, but that over its life will be more expensive once you factor in increased power usage."

          Let's run the numbers, and let's use current technology. Here is power consumption at idle and at load. I'm going to use the Fury, which retails for around £455, and the GTX 980, which retails for around £410, so there's your "£40 cheaper". At idle there's almost nothing in it (about 2W). At full load, the difference is about 100W. Source:

          http://www.anandtech.com/show/9421/the-amd-radeon-r9-fury-review-feat-sapphire-asus/17

          Let's assume about 12p per kWh. So an extra 100W for 8 hours per day is going to cost you around £2.88 a month. A whole year? £35.04.

          So there you have it. Run your card at load for 8 hours every day, Mon-Sun, all year round, and you still won't make back your £40. And quite frankly, in that scenario you have bigger problems with your life than a small bump on your annual electricity bill.

          That's why I call bullshit on this "power savings" lark. As I said, it only became the Big Talking Point when Nvidia suddenly found themselves able to beat AMD at it. If we're talking laptops, that's fine. But we're not: people keep using this to argue about desktop GPUs. I don't know anyone who would buy a high-end GPU and then freak out because it cost them £20 extra at the end of two years (a more realistic scenario). In fact, by that point such a person is probably itching to buy the latest new GPU.
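
          For anyone who wants to replay those sums, here is a minimal Python sketch. It assumes the 12p/kWh tariff and the roughly 100W load delta quoted above; the hours-per-day figures and the 365-day year are just illustrative inputs, not claims about anyone's actual usage.

          ```python
          # Back-of-the-envelope electricity cost of an extra 100 W under load,
          # using the 12p/kWh tariff assumed in the comment above.
          # All inputs are illustrative, not measured figures.

          def annual_cost_gbp(extra_watts: float, hours_per_day: float,
                              pence_per_kwh: float = 12.0, days: int = 365) -> float:
              """Yearly cost in pounds of drawing `extra_watts` for `hours_per_day`."""
              kwh_per_year = (extra_watts / 1000.0) * hours_per_day * days
              return kwh_per_year * pence_per_kwh / 100.0

          print(f"£{annual_cost_gbp(100, 8):.2f} per year at 8 h/day under load")      # ~£35.04
          print(f"£{annual_cost_gbp(100, 2):.2f} per year at a more typical 2 h/day")  # ~£8.76
          ```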

          1. N13L5

            Re: power consumption

            It seems you're refusing to see that keeping 100W of power from being converted to heat inside your computer is going to make for a less noisy, less annoying system.

            Not to mention that all your components will live longer if your GPU isn't causing a 10 or 20 degree temperature rise for all other components.

            Lastly, a LOT of people would be extremely happy if they didn't need a large metal box to play computer games or farm bitcoins anymore, but could do it on a reasonably sized laptop, like Razer's Blade 14. The GTX 870 in mine wasn't bad; GTA5 ran well on it, and it hasn't baked itself into oblivion yet. But I could easily see much better than this.

            Nvidia has hinted that, with Pascal, laptops may not require 'm' versions of their GPUs anymore.

            Even if the power savings mean nothing to you, portability and noise do mean something to me.

            Also, your calculation is flawed, because you're assuming everybody keeps their GPU for only one year. I keep mine for 3 years, so I'd be well ahead on the FinFET-adorned GPU that cost 40 bucks more.

            1. h4rm0ny

              Re: power consumption

              >>"It seems you're refusing to see that keeping 100W of power from being converted to heat inside your computer is going to make for a less noisy, less annoying system."

              I'm not "refusing" to see anything. The poster I replied to talked about cost savings. That was the argument I was addressing.

              >>"Also, your calculation is flawed, because you're assuming everybody keeps their GPU for only one year"

              No I'm not. My post explicitly referred to a two-year lifespan and stated that the sort of person who buys a top-of-the-line GPU is typically looking for the next latest and greatest within two years. Someone interested in long-term value nearly always goes for mid-range, where the depreciation is far, far less in absolute terms.

              >>"Even if the power savings mean nothing to you, portability and noise do mean something to me."

              Then you'll presumably love AMD's new 14nm chips, which are going to be available a long way in advance of NVIDIA's and already look to be far ahead when it comes to like-for-like power saving.

      3. P. Lee

        Re: power consumption

        >I've never understood why someone who spends £300+ on a graphics card cares about it using an extra £10/15 per annum in electricity.

        Probably isn't the cost of electricity, but the reduced noise from running cooler, or increased power from cramming more in, for a desktop system.

        I'd hazard a guess that the desktop is also the proving ground for mobile, where power consumption is important.

        1. h4rm0ny

          Re: power consumption

          >>"Probably isn't the cost of electricity, but the reduced noise from running cooler, or increased power from cramming more in, for a desktop system."

          Agreed. Noise is an issue. But that's not what I'm counter-arguing. It's when people start talking about the cost savings, like the person I replied to.

          >>"I'd hazard a guess that the desktop is also the proving ground for mobile, where power consumption is important."

          Yes, in mobile it matters.

      4. Mikel

        Re: power consumption

        Globally, computers account for about 10% of electricity generation. An eye on efficiency is good for the planet. Also, thermals limit density, which is critical in servers and HPC. And, as another poster mentioned, noise.

        1. HamsterNet

          Re: power consumption

          Performance per watt is a measure of the possible total performance, as graphics cards are limited by the heat they generate. Hence why AMD stuck a water-cooling loop on the Fury X.

          With AMD claiming 2x performance per watt and Nvidia already showing 10x performance, I'm suspecting it's not going to be a good year for AMD.

          (Source: Nvidia just launched a midrange 14nm core in its latest driving computer and it's pulling more teraflops than the current Titan X, and that's still on GDDR5 and not HBM.)
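
          To make that point concrete, here is a rough Python sketch: under a fixed board-power (thermal) budget, achievable throughput is roughly performance per watt times the power the cooler can dissipate. The efficiency and board-power numbers below are made-up placeholders for illustration, not measured specs.

          ```python
          # Under a fixed thermal/power budget, sustained performance is capped by
          # perf-per-watt times the power the card can actually dissipate.
          # The figures below are illustrative assumptions, not real card specs.

          def max_performance_gflops(perf_per_watt_gflops: float, board_power_w: float) -> float:
              """Upper bound on sustained throughput for a given power budget."""
              return perf_per_watt_gflops * board_power_w

          # Two hypothetical cards sharing the same 275 W board-power limit:
          current_gen = max_performance_gflops(perf_per_watt_gflops=20, board_power_w=275)
          doubled_eff = max_performance_gflops(perf_per_watt_gflops=40, board_power_w=275)

          print(f"Current efficiency: ~{current_gen:.0f} GFLOPS at 275 W")  # ~5500
          print(f"2x perf-per-watt:   ~{doubled_eff:.0f} GFLOPS at 275 W")  # ~11000
          ```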

  3. Infernoz Bronze badge
    Meh

    H/W vapour and too damned late, probably still with poor drivers.

    Getting this 1/2 size shrink to work well enough is probably going to be hard and I can't wait another year or more!

    I did like AMD/ATI, but my new main machine will have not just an Intel CPU but a high-end Nvidia GPU too (probably a 980 Ti) to properly drive 4K monitors, both with reasonable power consumption.

    1. Frumious Bandersnatch

      Re: H/W vapour and too damned late, probably still with poor drivers.

      Getting this 1/2 size shrink

      <pedant>halving the side of a square means one quarter of the area, not a half</pedant>

  4. h4rm0ny

    AMD

    The sub-title to this article seems a little unfair. AMD produce good cards and are usually an excellent choice on the price-performance scale. Their high-end cards are also actually better for 4K. They got held back by the hold-up in getting to 14nm, which messed up their release schedule badly. I'll be really glad to see them start hitting their stride again.

    I'm particularly interested in their new architecture to see if they have modified it much for HBM. Memory bandwidth is THE key thing you build a graphics architecture around; if you have much higher memory bandwidth, you would want a considerably different design. So the two questions I'm most interested in are whether the new line-up will be focused on HBM, with lower cards just being rebrands of older models, and if so, how much the architecture has really changed to make use of the new technology.
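
    As a hedged aside on that bandwidth point, the roofline model captures it in one line: attainable throughput is the lesser of the compute peak and memory bandwidth times arithmetic intensity. The peak, bandwidth, and intensity figures in this Python sketch are illustrative assumptions only, not real card numbers.

    ```python
    # Roofline-style sketch of why memory bandwidth shapes a GPU design:
    # attainable throughput is capped by whichever runs out first, raw compute
    # or the rate at which memory can feed the shaders. Figures are illustrative.

    def attainable_gflops(peak_gflops: float, bandwidth_gbs: float,
                          flops_per_byte: float) -> float:
        """Roofline bound: min(compute peak, bandwidth x arithmetic intensity)."""
        return min(peak_gflops, bandwidth_gbs * flops_per_byte)

    # Same hypothetical 8 TFLOPS chip, fed by GDDR5-class vs HBM-class bandwidth,
    # running a bandwidth-hungry workload (4 FLOPs per byte fetched):
    print(attainable_gflops(peak_gflops=8000, bandwidth_gbs=320, flops_per_byte=4))  # 1280 -> bandwidth-bound
    print(attainable_gflops(peak_gflops=8000, bandwidth_gbs=512, flops_per_byte=4))  # 2048 -> still bandwidth-bound
    ```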

    1. Infernoz Bronze badge
      Meh

      Re: AMD

      BS, an EVGA Nvidia 980 Ti is loads better for good 4K at a reasonable price, has better power usage and better drivers. No more AMD/ATI cards for me until I see real competition with Nvidia!

      1. Sorry that handle is already taken. Silver badge

        Re: AMD

        an EVGA Nvidia 980 Ti is loads better for good 4K at a reasonable price

        For the price of one GTX 980 Ti you could have two GTX 970s or R9 390s.

        1. HamsterNet

          Re: AMD

          One 980 Ti will outperform both the 2x970 and the 2x390 setups. SLI/Crossfire doesn't offer double the performance, and it also causes greater variance in frame rates.

    2. DrTechnical
      Alert

      Re: AMD

      AMD, and then AMD/ATI, is the sole reason that PCs are so cheap today. Period. If Intel were the sole provider of x86 chips, there would never have been the vast numbers of cheap machines everywhere!

  5. phuzz Silver badge
    Flame

    Even if you prefer nVidia, you should still be hoping that AMD can create some great graphics cards this year, because that will spur both companies on, and will mean cheaper graphics for everyone (hopefully).

  6. h4rm0ny

    The Features...

    As it gets harder and harder to push the boundaries of silicon, I think we're going to see a lot more emphasis on, and interest in, being clever with what we have. Nvidia and AMD both got blindsided by the failure of the shrink to 20nm; they both had plans they had to put on ice. Nvidia seems to have handled the crisis better.

    But now we're moving again, there are a lot of interesting little details in this new architecture. They've further improved the compression they introduced in the last architecture, which eases the pressure on memory bandwidth a lot. They've also significantly improved the ability to work out what doesn't need to be rendered, apparently. That's a big deal, because the problem AMD have had is the inability to keep their SPs working flat out; this helps feed them faster by passing down only what is actually needed for the end result. They've improved the hardware scheduler (same benefit: it lets the card get more done for the same amount of work) and updated the video encoding and decoding (important to some) with H.265 in hardware.

    I'm honestly pretty enthusiastic about this and looking forward to seeing it.

    1. Bronek Kozicki

      Re: The Features...

      I like this too, but there is one small detail that irks me: they appear to be starting at the low-power end. I understand they do not want to inflict the Osborne effect on Fiji sales, so it only makes sense. But I would still appreciate it if they were a little more explicit about it. If I buy a Fury part this year, I'd like to know whether or not it's going to stay near the top of AMD's cards, performance-wise, at least for the rest of this year.

  7. marees

    HBM(2) might be used only in high-end GPUs (manufactured at TSMC)

    The laptop-class and low-end GPUs (replacements for the 360, 370, 380) shown initially, which are supposed to be manufactured at GloFo, will ship with GDDR5, which is more than enough and suits the purpose for this class of GPU.

    The higher class of GPUs (replacements for the 390 and Fury), which follow later, would get the costlier and more powerful HBM2, and presumably be manufactured on TSMC's proven high-performance process.

    The Zen APUs expected to ship in volume next year (2017) might ship with first-gen HBM (expected to be affordable by then) as L4 cache/dedicated GPU memory, in addition to supporting plug-in DDR4 for the CPU.

    And a special version of Zen for consoles might be manufactured with SSDs built into the SoC itself, I guess.

  8. Emo

    Missed opportunity?

    I can't help thinking that AMD and nVidia missed a trick by not incorporating silicon that could do bitcoin/litecoin etc. calculations. Think of all the coin farms they could have sold GPUs to.

    Or even a separate line of low-watt, high-power *coin chips to really worry the competition.

    GPUs lost out to FPGAs and ASICs on performance per watt a long time ago. If they could nail the low power and offer massive giga/terahash products, people would be queuing up to buy them.

    A sideline of their engineers working on this could surely pull off something amazing.

    I guess it's too niche.

    1. HamsterNet

      Re: Missed opportunity?

      WAY too late. ASICs came out for bitcoin in 2013, and ASICs for the other coins came out shortly afterwards.

      Until that point AMD sales were driven by having the best mining cards, but ASICs killed that overnight.

  9. Anonymous Coward
    Anonymous Coward

    nibbling at the ankles

    "AMD to nibble the ankles of Nvidia this summer with 14nm FinFET GPUs"

    So they are only half-byting at their feet?

    I'm only 2 months late with that joke.
