AMD claims 'world's fastest GPU' title

AMD has unveiled its first graphics card based on its Graphics Core Next architecture, which The Reg told you about in excruciating detail this summer. According to AMD, the card – the Radeon HD 7970 – is also the only GPU to be built using a 28-nanometer process. "This graphics card represents a revolution in the graphics …

COMMENTS

This topic is closed for new posts.
  1. Oninoshiko
    Joke

    "an improvement of over 150 per cent in performance/sq mm over the prior generation."

    Bah! What happened to our sensible metrics like "MHz" and "transistor count"?

    1. Voland's right hand Silver badge
      Devil

      They are meaningless

      As a comparison to the previous GPU they are meaningless - different arch altogether.

      Abandoning VLIW is interesting though. This leaves Itanic as the last survivor in the Jurassic forest.

      1. Anonymous Coward
        Anonymous Coward

        Yup....

        ...even with a bloody great joke alert icon, some people still miss the whole point of the post.

  2. LaeMing
    Thumb Up

    Interesting times ahead (hopefully)

    Well, AMD did it to Intel back in the '90s, and the whole industry (including Intel's offerings, which I presently use) is a good deal better as a result. It will be fun to see if they can kick the graphics industry up a notch or two in the same way!

    1. Greg J Preece

      They've been doing that for a while. When they came back fighting with the HD4870, it was hilarious watching nVidia shit bricks.

    2. Ammaross Danan
      Coat

      GPU

      Based on the cursory overview provided, it actually sounds like a nice GPU. Granted, with the improvement quoted in "performance/sq mm", it leads me to think there's only a small speed bump over the last-gen 6970 (perhaps 110 per cent of its performance). Even so, the feature set will be nice. If their ZeroCore works as one would hope, perhaps we'll have a GPU that fares better than 120W at "idle" (rendering desktop only). Incorporating Turbo Boost is an interesting ploy too, as it automates what OCers may have been doing manually/semi-automatically for some time: cranking the GPU into OC mode during game play, and reverting to normal/underclock for desktop use (especially considering that the GPU in some cases sucks more watts than the rest of the computer combined). Wasn't that the point of leaving the Intel HD gfx core on Sandy Bridge strapped to your monitor with a bit of Virtu magic to (hopefully, though it didn't work very well) put your gfx card into idle outside of games?
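
      Purely as a toy illustration of that last point (this is nothing to do with AMD's actual implementation; the thresholds and clock values below are invented), the "automatic OC/underclock" idea boils down to a governor loop along these lines:

          # Hypothetical sketch of load-based GPU clock switching (Python).
          # Real hardware does this in firmware against power/thermal limits,
          # not just utilisation; all numbers here are invented.
          import time

          CLOCK_PROFILES_MHZ = {
              "idle":  150,   # desktop rendering only
              "base":  925,   # stock 3D clock
              "boost": 1000,  # short-lived "turbo" when there is power headroom
          }

          def pick_clock(load_percent, headroom_watts):
              """Choose a clock profile from current load and power headroom."""
              if load_percent < 5:
                  return CLOCK_PROFILES_MHZ["idle"]
              if load_percent > 90 and headroom_watts > 20:
                  return CLOCK_PROFILES_MHZ["boost"]
              return CLOCK_PROFILES_MHZ["base"]

          def governor(read_load, read_headroom, apply_clock, period_s=0.1):
              """Re-evaluate the clock every period_s seconds."""
              while True:
                  apply_clock(pick_clock(read_load(), read_headroom()))
                  time.sleep(period_s)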

  3. Andy 70
    Meh

    Meh, same old, same old.

    AMD did it to Intel in the '90s...

    Well, if history repeats itself in that regard, with AMD being AMD and Nvidia being Intel, AMD will hold the lead for a while and massively publicise it, then Nvidia will take the lead and romp off into the distance and, if Nvidia continues to follow Intel's current trend, actually scale back their release schedule to let the competition catch back up.

    But I don't think so. AMD have the lead, then Nvidia will overtake, then AMD will retake the lead.

    Ad infinitum. Not that this is a bad thing; I just think the usual marketing guff of "a new era in graphics processing" is a little bit of a stretch.

    So it does what we do now a bit faster. Whoopy-doo. How about showing us the way to do stuff in the future? That's the difference between evolutionary and revolutionary.

    I'm sorry, but AMD has been on the back foot since getting to 1GHz first and showing us how to do 64-bit nicely.

    Wasn't the ATI/AMD "greengrass" (or whatever it was called) graphics arch supposed to stomp all over the competition? But it didn't?

    Whatever. A short-lived hollow victory. Well done.

    1. Anonymous Coward
      Anonymous Coward

      You're missing the point...

      ....AMD (again) have realised they can't win on how many processors you can cram into a space or how fast you can clock something, so they are doing something they are good at: looking at the whole PC and asking, right, how can we rework this without breaking everything? (As Intel have such a huge advantage over AMD, they usually tend to try and force their technologies through; USB3 vs Thunderbolt, anyone?)

      So as with the Athlon, the x64s and now Fusion, they have gone: OK, let's see how we can improve the whole machine. I know, let's get the CPU and GPU working together nicely, each doing what it's best at.

    2. scrubber
      Flame

      Intel AMD

      The only reason AMD didn't romp off into the distance for a couple of years while Intel pursued NetBurst is that Intel illegally forced Dell etc. to use their chips and not AMD's.

      The CPU market would be in much better shape now had AMD actually built up a decent cash pile to invest in R&D to compete with Intel's Core architecture.

  4. Anonymous Coward
    Anonymous Coward

    More good news for consumers and AMD

    Works for me. Bring on the next gen GPUs for all to enjoy.

  5. Anonymous Coward
    Anonymous Coward

    Let's just hope this new chip isn't such a disaster as the over-hyped and under-performing Bulldozer architecture. The industry needs real competition for the sake of everyone.

    1. Piro Silver badge

      Haha, no

      There are already benchmarks at AnandTech.

      Don't even think the GPU division is similar to the CPU division.

      AMD's CPUs have been fairly uncompetitive for a while, and Bulldozer is absolutely terrible.

      AMD's GPUs, on the other hand, are absolutely fantastic, and represent high performance at price points that often make NVIDIA look like a bad deal. The best thing to do for a year now has been to get a 6950 2GB, unlock the shaders to 6970 levels, and enjoy the absolute sweet spot in the GPU market for high-end gaming.

      1. Wallyb132
        Happy

        The bulldozer design...

        The Bulldozer design itself isn't bad; it's the socket that AMD needs to improve... The Bulldozer Opterons perform far better than the desktop versions.

        I have always applauded AMD for sticking with the 940-pin AM2/3+ socket design to promote upgradability. However, not this time: they need to dump that design and go with something bigger. Since they like to promote long-term product evolution, they need to design a new desktop socket with lots and lots of headroom. It's OK to build a socket with, say, 1944 pins and only use 3/4 of them for now; the materials to make these sockets are cheap, but the future upgradability is invaluable. Hell, they could even make an expanded socket that could accept the current AM3+ CPUs, i.e. make the centre of the socket pin-compatible and have the socket extend out beyond the sides of the chip to accommodate a larger chip, or something along those lines.

        Also, as for the Bulldozer design, when looking at this product release, I think there are great things in store for future Fusion products.

        Long live DAAMIT, err, AMD/ATI...

  6. Bronek Kozicki
    Thumb Up

    won't be surprised to see some poor benchmarks

    not because I would expect the new architecture to perform badly, but rather because it usually takes a while to iron out programming for new paradigms and in new ways. If this is as revolutionary as claimed, it will be a while before driver writers learn to use the new instruction set properly.

    One other thing I'd like to know: how will this compete with CUDA?

    1. Anonymous Coward
      Anonymous Coward

      It's not that revolutionary

      The BRCM2727/2763 mobile device GPU uses a similar vector scheme and that tech has been around for years.

    2. Gordon 10
      Meh

      Indeed

      It will take a while for the graphics improvements to trickle down to mere-mortal levels.

      I'm also guessing that the market for HPC co-processing is starting to eclipse the high-end gamer market in terms of sales.

      Therefore, unless it can outdo Nvidia in the HPC space, it's all a bit meh.

  7. h4rm0ny

    What do we use it for?

    I don't want to sound negative about this card. I actually get a warm glow reading about it. But I'm not really a gamer (I've played about two games in the last ten years), and I get the impression that games developers write primarily for the consoles and then port across an equivalent-ish version to the PC. And consoles are less powerful than a high-end PC + Graphics Card, so are games really making use of the power available in these cards? Correct me if I'm wrong.

    1. Richie 1

      Re: What do we use it for?

      Consoles only have to shift enough pixels for a telly (not even an HD one, in the case of the Wii). PC games need to push around enough pixels for a monitor (or two, or three), which can be a lot more than that.

      Also, the main point of AMD's architecture reworking is that you can offload more general computing tasks to the GPU (which is especially important if you have a Bulldozer CPU!). I expect that this sort of card will be lapped up by OpenCL users.
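
      For anyone who hasn't tried it, offloading work that way looks something like this minimal sketch (using the PyOpenCL bindings; the kernel just adds two vectors, a working OpenCL driver is assumed, and the names and sizes are illustrative):

          # Minimal OpenCL offload sketch: add two float vectors on the GPU.
          # Error handling omitted; requires numpy and pyopencl.
          import numpy as np
          import pyopencl as cl

          KERNEL_SRC = """
          __kernel void vec_add(__global const float *a,
                                __global const float *b,
                                __global float *out)
          {
              int i = get_global_id(0);
              out[i] = a[i] + b[i];
          }
          """

          def gpu_vector_add(a, b):
              ctx = cl.create_some_context()   # pick an OpenCL device (GPU if present)
              queue = cl.CommandQueue(ctx)
              mf = cl.mem_flags
              a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
              b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
              out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)
              prog = cl.Program(ctx, KERNEL_SRC).build()
              prog.vec_add(queue, a.shape, None, a_buf, b_buf, out_buf)
              out = np.empty_like(a)
              cl.enqueue_copy(queue, out, out_buf)
              return out

          a = np.random.rand(1000000).astype(np.float32)
          b = np.random.rand(1000000).astype(np.float32)
          print(np.allclose(gpu_vector_add(a, b), a + b))  # True if the device agrees with the CPU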

    2. Cuddles

      "I get the impression that games developers write primarily for the consoles and then port across an equivalent-ish version to the PC. And consoles are less powerful than a high-end PC + Graphics Card, so are games really making use of the power available in these cards? Correct me if I'm wrong."

      Yep, you're wrong. It seems to be a popular myth that PC gaming is dying, but there are still plenty of games developed specifically for PC, along with an awful lot more that are developed for all platforms at the same time rather than just being ported later. Plus, even the worst ports usually have much, much better graphics on the PC version, even though their interface often ends up sucking donkey balls.

      It's also worth bearing in mind that the current generation of consoles is obsolete and will likely be replaced within a couple of years (sooner for the Wii, but it's not really worth talking about hardware for that). Not only will that mean PCs need to keep working to stay ahead, but this sort of new technology is exactly the sort of thing that future consoles will be built out of. Remember, pretty much all improvements in computing have been made incrementally, and without the constant push for more powerful PCs, consoles would never be able to improve either.

      Slightly more on topic - from the Nvidia GTX580 website linked:

      "Swift. Stealthy. Deadly."

      I think my money will go to AMD, since at least their hardware doesn't seem to be threatening to kill me.

    3. brainbone

      GPU to eventually replace CPU's SIMD/FPU

      I would expect Bulldozer's successors to eliminate their native vector/SIMD/FPU abilities and instead translate and offload that work to the GPU.

  8. Anonymous Coward
    Anonymous Coward

    However, does it still need either a bespoke game patch or a driver patch for every game released?

    So often on GameFAQs or Steam forums you see: "weird shizz happens to game" "what graphics card you running?" "ATI" "problem found."

    Then after a week the game will get patched or ATI will release a new driver that makes it work again.

    1. Rob
      Go

      Currently experiencing that...

      ... with Star Wars: The Old Republic. It worked fine in beta; the release version though is terrible, and the best frame rate I can get is 18 on an ATI chip. Playable, but I'm waiting for the driver/game update.

    2. Anonymous Coward
      Anonymous Coward

      sage

      >ati

      >drivers

      saged for old news.

  9. tmTM

    But will they ship it?

    Going by their current production mess, I'm going to say no.

  10. Anonymous Coward
    Anonymous Coward

    Same performance as a GTX 580 but costs more?

    Despite its price tag and the claim to be "The World's Fastest GPU", it still falls just short of the nVidia GTX 580 on some benchmarks. The performance tests show that there isn't much difference between the two cards; they appear to be almost evenly matched, and yet the GTX 580 is only £300 - £400 whereas this retails for £400 - £500.

    It also falls well below the nVidia GTX 590, and in some tests fell below the 6990. Admittedly those are dual-GPU cards, but that then raises the question of why the price tag is so high when it isn't anywhere near as powerful.

    I hope this trend doesn't keep up because if it does, there is nothing "next generation" about this card when for a cheaper price I can get better performance from what is supposed to now be an obsolete card.

    1. No, I will not fix your computer
      Stop

      Re: Same performance as a GTX 580 but costs more?

      The benchmarks that it doesn't win tend to be game-specific (e.g. Metro); it sometimes takes a while for the drivers to catch up, and there are often patches specifically to get the best performance in particular games.

      Generally, the overall difference between a 7970 and a 580 is similar to the difference between the 6970 and the 580, thus justifying the launch price. Note that you're comparing an old card's "settled" price with a new card's launch price (which is only 10% more than the 580's launch price); expect them (both) to come down.

      >>there is nothing "next generation" about this card

      It's called bleeding edge! I think it's the only card that supports DX11.1, and it will be some time before we see the under-the-covers benefits of that (large cbuffers, for example), and it scales much better in CrossFire than the 580 does in SLI.

    2. Anonymous Coward
      Anonymous Coward

      @ AC

      None of the published reviews agree with your assertion.

      The 7970 is approx 20% faster than the 580 on all tests. Only some of the super-OC'd 580s can match it, and they cost MORE than the 7970.

      The 590 and the 6990 are both dual GPU cards, and not a meaningful comparison at all. You can expect the 7990 to blow them both away as well.

      Your post appears to be green team FUD.

    3. Phage

      For clarity

      The 7970 is faster than the 580 by a decent margin and is launching at a price that reflects its performance, i.e. about what the 580 launched at.

      http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/49646-amd-radeon-hd-7970-3gb-review-20.html

  11. Anonymous Coward
    Anonymous Coward

    Sounds good, but is it powerful enough to perform a 90 degree rotation on a subset of pixels in the framebuffer, and thus allow a PLP (portrait-landscape-portrait) Eyefinity setup? Cos by all accounts that's still an insurmountable problem for both ATI and Nvidia...

  12. JDX Gold badge

    re: what do we use it for

    Games like Crysis, with brand new cutting-edge graphics engines, are the area where PCs still lead the way.

  13. Ken Hagan Gold badge
    Mushroom

    That new metric

    "an improvement of over 150 per cent in performance/sq mm over the prior generation."

    As and when the size of my PC is limited by the size of the chip rather than the cooling system required to stop it doing this -> {see icon}, that will be a useful figure of merit.

    1. Hungry Sean
      Holmes

      is not a new metric

      Perf per area directly translates into "bang per buck." All else being equal, if you can squeeze a given amount of oomph into a smaller area, you're going to improve both your yield and your dies per wafer. Of course hardware people have tracked this for years; it just normally isn't presented as a marketing term because it doesn't necessarily translate to a change in sales price (it might be an improvement in margins for the manufacturer instead). If the price doesn't move, frankly, consumers don't care about this, but management clearly does.
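
      As a back-of-the-envelope illustration (this is the usual rough dies-per-wafer approximation, ignoring scribe lines, defect density and yield, and the die sizes below are purely illustrative):

          # Rough "bang per buck" arithmetic: candidate dies per wafer for a given die area.
          # Approximation: wafer area / die area, minus an edge-loss term.
          import math

          def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
              wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
              edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
              return int(wafer_area / die_area_mm2 - edge_loss)

          # Squeeze the same amount of oomph into a smaller die and you get more
          # candidate dies from each 300 mm wafer (illustrative figures only).
          print(dies_per_wafer(389))   # e.g. a previous-generation-sized die
          print(dies_per_wafer(365))   # e.g. a smaller new die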

  14. Giles Jones Gold badge

    Do they still ship the "World's crappest drivers" with them though?

    1. stuff and nonesense
  15. Dick Emery
    Joke

    The mothership has landed

    "AMD Radeon HD 7970 graphics card (click to enlarge)"

    Isn't it big enough?

  16. Anonymous Coward
    Anonymous Coward

    While it is a fast card for gaming, it is slower than the GTX 580 in non-gaming-related programs, from what I have seen in the reviews.

    I am now just waiting for the next Nvidia card, and then we will see a true comparison, because at the moment there are no Nvidia PCI-E 3.0 cards to compare it to, so we cannot see how well this really performs.

    Compared to the GTX 580, though, for games this is a smart choice for people who need or want to upgrade at the moment.

This topic is closed for new posts.
