AMD ATI Radeon HD 5970 two-GPU graphics card

Cast your eye over our news piece on AMD's ATI Radeon HD 5970 and our review of the HD 5870 and you’ll have the essential information at your fingertips. AMD has, for some unknown reason, changed its naming convention, so this two-chip HD 5870-based graphics card has been named HD 5970 instead of the more predictable HD 5870 X2 …

COMMENTS

This topic is closed for new posts.
  1. Anton Ivanov
    Flame

    If I need a heater in a room, I may look at it

    This thing draws enough power to heat up a small living room. No thanks...

    1. Anonymous Coward
      Flame

      Heater

      I know someone who bought a 'Low Power' 300W 'Laser Heater' (I think Laser was just a brand name, not real lasers!). Anyway, it was frigging useless... (as you'd expect from 300W).

      And it scored 0/10 for entertainment too... at least these 'heaters' are good for something.

      (If you don't want the heat, stick a water block on it and put a radiator outside your window.)

      1. MrPatrick
        Flame

        The title is required, and must contain letters and/or digits.

        A radiator outside is all very well until it's the middle of summer. Then you need the radiator next to an air-con unit :/

  2. Greg J Preece

    Pricey, but for me there is an advantage

    Ever since I built this machine years ago, when it was running two nVidia 7900GTXs (and heating my room at the same time), I've had a Matrox TripleHead2Go monitor splitter to give me three-screen gaming (and loads of screen space for programming IDEs). Not every game ran well with it - in fact, while a lot of games list the resolution as available in the options screen, actually selecting it completely knackers the perspective and makes the game unplayable. Escape from Butcher Bay is a good example.

    With Eyefinity, I could dump my extortionately-priced and annoyingly analogue first-generation external splitter box on eBay, buy a couple of adaptors for my existing monitors, and use the money to soup the machine up even further. Say, with a huge, expensive, completely over the top graphics card? Hell yeah...

    Definitely something I'll be looking into. So long as it doesn't require the same number of wires and fecking about as my existing setup - and considering this thing already has 3 output connectors without any external boxes at all, that's quite likely - I would be very interested.

    1. Anonymous Coward
      WTF?

      Puzzled

      Greg, there are many dual-headed graphics cards around; almost all recent decent cards are dual-headed. There are likewise many SLI/CrossFire mobos which will take two cards.

      It's been easy to run four monitors for many years now... I'm puzzled as to why you are struggling with splitters.

      Actually, contact Shaun 2 (posting below) - he's selling two dual-headed cards!

      You could buy another and do this: http://www.cdrinfo.com/images/uploaded/Nvidia_GeForce9800GX2.jpg (yes, that's six DVI sockets!)

      1. Greg J Preece

        It's simple

        I could easily have had as many monitors as I wanted, but at the time I built the machine I wanted tri-screen gaming, which needs (or needed) Windows to recognise all three monitors as if they were one. And if you want to run SLI, you can only output from one card. So I hooked two SLI'd GTX cards into a TripleHead2Go, which meant the three screens appeared to Windows as one huge 3840x1024 monitor. Job done. Played Half-Life 2 at 3840x1024. Was very much fun.

    2. Greg J Preece

      Though I would like to add

      Having peripheral vision in games isn't always an advantage.... Think F.E.A.R., in surround sound, at 2am...

  3. Anonymous Coward
    WTF?

    Two groups of GDDR5 memory that total 2GB

    Always wondered why they use 1GB per chip. It really makes it a 1GB card (with 1GB available to the game), although it will be sold as 2GB!

    I'm guessing this card is actually two cards on one board, with twice the data in the buffer... loading all the textures twice...

    Surely if both chips are working on one scene there is only one set of textures... Wouldn't both GPUs sharing the majority of the 2GB of RAM for textures (for the whole scene) be a better way to do it (making 2GB available to the game), with a small reserve each for dedicated use where it is required?

    Or is it not possible to have two CPUs/GPUs sharing RAM? (In which case, how do dual/quad cores work?)

    Or is this improvement reserved in order to guarantee future extortionate sales?

  4. Shaun 2
    Unhappy

    If only they were available..........

    I've had an XFX 5970 Black Edition on order for a month now - the availability date keeps moving.

    Currently looking at the 16th December for delivery - will just have to wait and see.

    Looking forward to ditching my two 9800 GX2s, which draw close to 300W each at idle!

  5. Andrew Kemp

    3 outputs

    To answer the question posed by an AC above regarding why you'd get this or a TH2Go unit over two cards: you can make the three screens appear as one to an application, something you can't do with each monitor on its own separate output.

    Two twin-output cards are not going to allow you to play a game across three monitors. The Eyefinity function of the ATI 5-series cards or a Matrox TH2Go will...

    1. Anonymous Coward
      Thumb Up

      Ahh

      Ahh, OK, got ya! Not multi-display, but a single display across multiple screens... Yeah, any 5-series ATI card then, not just this very expensive one!

  6. Peter Kay

    SoftTH, triple monitors, etc

    What Andrew Kemp said. The Matrox TH2Go is extremely limited in the resolutions it supports, and all three monitors must (not very surprisingly) use the same resolution.

    This is not necessarily the case with the ATI cards - or at least technically, it shouldn't be necessary when you're not running games.

    SoftTH is an option for triple-headed gameplay if you don't have a new 5xxx-series ATI card, but as it's a non-driver-level software solution it has occasional problems. It works well though, as I can attest, using two Nvidia cards across four monitors.

    Personally, my priorities are driver stability (something ATI are slowly nailing, but haven't got right yet), more stability, solid 2D support, quiet operation, exhausting heat outside the case and fast 3D support. Power consumed is not as important as noise and heat. I've got an 8800GTX which is a tad hungry, but also lovely and quiet and not very difficult to keep cool enough not to affect other components.

    Eyefinity is a very welcome feature, although it does require at least one monitor with a DisplayPort connector. A bit annoying really, considering the cost of active DisplayPort converters.

    1. Greg J Preece

      Apparently...

      ...if you're connecting just one monitor via DP you can use a passive adaptor, according to ATI's tech sheet on the matter. Should make things cheaper.

      As for the heating point, I agree, and that's why I rarely buy stock ATI cards. There's always a third party out there with a superior cooling rig for not a hell of a lot more.

  7. Stuart 14
    Happy

    But

    will it play Crysis?

  8. Hugh_Pym

    I always worry...

    ... that if I watch late-night TV and get bombarded by the inevitable 'chatline' adverts aimed at the sad and lonely, I have actually become part of that target audience simply by being there.

    By the same token if I bought this I would become part of the group of buyers likely to be swayed by pictures of sci-fi body armoured young women on hi-tech equipment. I'm not sure which group is sadder.

  9. Raddimus Voracious
    Paris Hilton

    That's hot!

    I have always said there aren't enough computer-rendered female Rastafarian models in the graphics card industry.

  10. Martin Nicholls
    FAIL

    Review makes no sense..

    The price is totally irrelevant, and it's still cheaper than an nVidia card that costs more and isn't as powerful.

    So how did it get a 65% score exactly?

    Anybody who buys a card this powerful (me included) isn't going to give a damn about the price... except in the knowledge that you're going to get more bang for your buck than with nVidia.

    Seriously, this review is just wrong... No, really.

    You review what it is - arguably the most powerful single card money can buy - and if you could get something comparable for cheaper (if nVidia could actually do cards even close to this powerful), /then/ you start knocking points off...

    Put two GTX 295s in your PC and it's not going to be as powerful, it's going to use up at least four slots in your case, and it's going to cost you 800 quid before you even get started with the 15TW PSU you're going to need.

    Seriously - what's the deal? I mean, really, where is the nVidia comparison anyway?

  11. Martin Nicholls
    Paris Hilton

    Also...

    While I'm ranting..

    "so these figures might be sustainable provided we could stand having the cooling fan running at full tilt during a gaming session"

    Firstly, most people are going to water-cool it, and secondly... yes, if you play games without sound it's going to get annoying, but who does that?

    If we're going to talk about the sound it makes - what's it like at idle? If it's quiet in a normal environment when it's not being pounded, that's all that matters.
