If I need a heater in a room I may look at it
This thing draws enough power to heat up a small living room. No thanks...
Cast your eye over our news piece on AMD's ATI Radeon HD 5970 and our review of the HD 5870 and you’ll have the essential information at your fingertips. AMD has, for some unknown reason, changed its naming convention, so this two-chip HD 5870-based graphics card has been named HD 5970 instead of the more predictable HD 5870 X2 …
Ever since I built this machine years ago, when it was running two nVidia 7900GTXs (and heating my room at the same time), I've had a Matrox Triplehead2Go monitor splitter to give me three-screen gaming (and loads of screen space for programming IDEs). Not every game ran well with it - in fact, while a lot of games list the resolution as available in the options screen, actually selecting it completely knackers the perspective and makes the game unplayable. Escape From Butcher Bay is a good example.
With Eyefinity, I could dump my extortionately-priced and annoyingly analogue first-generation external splitter box on eBay, buy a couple of adaptors for my existing monitors, and use the money to soup the machine up even further. Say, with a huge, expensive, completely over the top graphics card? Hell yeah...
Definitely something I'll be looking into. So long as it doesn't require the same number of wires and fecking about as my existing setup - and considering this thing already has 3 output connectors without any external boxes at all, that's quite likely - I would be very interested.
Always wondered why they use 1GB per chip. It really makes it a 1GB card (with 1GB available to the game), although it will be sold as 2GB!!
I'm guessing this card is actually two cards on one board, with two copies of the data in the buffer... loading all the textures twice...
Surely, if both chips are working on one scene, there is only one set of textures. Both GPUs sharing the majority of the 2GB of RAM for textures (for the whole scene) would be a better way to do it (making 2GB available to the game), with a small reserve each for dedicated use where it is required.
Or is it not possible to have two GPUs sharing RAM? (In which case, how do dual/quad-core CPUs work??)
Or is this improvement being held back in order to guarantee future extortionate sales?
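For what it's worth, the usual explanation is that dual-GPU cards render alternate frames (AFR), so each GPU needs its own complete copy of the textures - the memory is mirrored, not pooled. A rough sketch of the arithmetic (the function and numbers are illustrative, not any vendor's API):

```python
# Illustrative sketch of why a "2GB" dual-GPU card behaves like a 1GB one
# under alternate-frame rendering (AFR): each GPU mirrors the full working
# set, so memory does not add up. The "shared" pooled design is hypothetical.

def effective_vram(per_gpu_mb, num_gpus, shared=False):
    """Memory visible to a game, in MB."""
    if shared:
        # Hypothetical pooled design: one copy of the data serves all GPUs.
        return per_gpu_mb * num_gpus
    # AFR in practice: every GPU holds its own copy of the same data.
    return per_gpu_mb

print(effective_vram(1024, 2))               # mirrored: 1024 MB usable
print(effective_vram(1024, 2, shared=True))  # pooled (hypothetical): 2048 MB
```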
Greg, there are many dual-headed graphics cards around; almost all of the recent decent cards are dual-headed. There are likewise many SLI/CrossFire mobos which will take two cards.
It's been easy to run four monitors for many years now... I'm puzzled as to why you are struggling with splitters.
Actually, contact Shaun2 (posting below) - he's selling two dual-headed cards!
You could buy another and do this: http://www.cdrinfo.com/images/uploaded/Nvidia_GeForce9800GX2.jpg (yes, that's six DVI sockets!)
I know someone who bought a 'low-power' 300W 'Laser Heater' (I think Laser was just a brand name, not real lasers!). Anyway, it was friggin' useless... (as you'd expect from 300W).
And it scored 0/10 for entertainment too... at least these 'heaters' are good for something.
(If you don't want the heat, stick a water block on it and a radiator outside your window.)
I could easily have had as many monitors as I wanted, but at the time I built the machine I wanted tri-screen gaming, which needs (or needed) Windows to recognise all three monitors as if they were one. And if you want to run SLI, you can only output from one card. So I hooked two SLI'd GTX cards into a Triplehead2Go, which meant the three screens identified to Windows as one huge 3840x1024 monitor. Job done. Played Half Life 2 at 3840x1024. Was very much fun.
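As a side note, the 3840x1024 figure is just three standard 1280x1024 panels side by side - that's all the splitter's combined mode is. A quick sketch of the arithmetic (assuming three identical SXGA panels, which the quoted resolution implies):

```python
# A Matrox TH2Go presents three identical panels to Windows as one wide
# display: combined width = 3 x panel width, height unchanged.
panel_w, panel_h = 1280, 1024
combined = (panel_w * 3, panel_h)
print(combined)  # (3840, 1024) - the "one huge monitor" the game renders to
```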
I've had an XFX 5970 Black edition on order for a month now - The availability date keeps moving.
Currently looking at the 16th December for delivery - will just have to wait and see.
Looking forward to ditching my two 9800 GX2's which draw close to 300W each at idle!
To answer the question posed by an AC above: the reason to get this or a TH2Go unit over two cards is that you can make the three screens appear as one to an application, something you can't do with separate outputs spread across two cards.
Two twin-output cards are not going to allow you to play a game across three monitors. The Eyefinity function of the ATI 5-series cards, or a Matrox TH2Go, will...
What Andrew Kemp said. The Matrox TH2Go is extremely limited in the resolutions it supports, and all three monitors must (not very surprisingly) use the same resolution.
This is not necessarily the case with the ATI cards - or at least technically, it shouldn't be necessary when you're not running games.
SoftTH is an option for triple-headed gameplay if you don't have a new 5xxx-series ATI card, but as it's a software solution rather than a driver-level one it has occasional problems. It works well though, as I can attest, using two NVidia cards across four monitors.
Personally, my priorities are driver stability (something ATI are slowly nailing, but haven't got right yet), more stability, solid 2D support, quietness, exhausting heat outside the case, and fast 3D support. Power consumption is not as important as noise and heat. I've got an 8800GTX which is a tad hungry, but also lovely and quiet, and not very difficult to keep cool enough not to affect other components.
Eyefinity is a very welcome feature, although it does require at least one monitor with a Displayport connector. Bit annoying really, considering the cost of active Displayport converters.
...if you're connecting just one monitor via DP you can use a passive adaptor, according to ATI's tech sheet on the matter. Should make things cheaper.
As for the heating point, I agree, and that's why I rarely buy stock ATI cards. There's always a third party out there with a superior cooling rig for not a hell of a lot more.
... that if I watch late night TV and I get bombarded by the inevitable 'chatline' adverts aimed at the sad and lonely I have actually become part of that target audience simply by being there.
By the same token if I bought this I would become part of the group of buyers likely to be swayed by pictures of sci-fi body armoured young women on hi-tech equipment. I'm not sure which group is sadder.
The price is totally irrelevant - and it's still cheaper than an nVidia card that isn't as powerful.
So how did it get a 65% score exactly?
Anybody who buys a card this powerful (me included) isn't going to give a damn about the price... except in the knowledge that you're going to get more bang for your buck than with nVidia...
Seriously this review is just wrong.. No really.
You review what it is - arguably the most powerful single card money can buy - and if you can get something comparable for cheaper (if nVidia could actually make cards even close to this powerful), /then/ you start knocking points off...
Put two GTX 295s in your PC: it won't be as powerful, it will use up at least four slots in your case, and it will cost you 800 quid before you even get started on the 15TW PSU you're going to need.
Seriously, what's the deal? I mean, really, where is the nVidia comparison anyway?
While I'm ranting..
"so these figures might be sustainable provided we could stand having the cooling fan running at full tilt during a gaming session"
Firstly, most people are going to water-cool it, and secondly... yes, if you play games without sound the noise is going to get annoying, but who does that?
If we're going to talk about the sound it makes, what's it like at idle? If it's quiet in a normal environment, when it's not being pounded, that's all that matters.
Biting the hand that feeds IT © 1998–2019