AMD has unveiled a dual-GPU, easily overclockable, 3D-capable, DirectX 11–supporting consumer graphics card that it claims is "the fastest graphics card in the world". "Packing more raw performance than any consumer graphics card ever created, the AMD Radeon HD 6990 provides the latest for the ultimate gaming advantage," crowed …
That's a hefty cooler... so what happens in a year when the squirrel cage fan is full of dust and craps out?
The stock Intel CPU coolers seem to do quite well in terms of reliability, and aftermarket replacements are readily available. Replacements for graphics card coolers are few and far between, and in my experience are the first part of a system to die. Passively cooled graphics cards are a better option for lower-end systems.
Anyway, with this level of processing power, I hope to see more software that can take advantage of offloading computations onto the GPU.
What happens when dust craps out the fan?
Why, then everyone knows you're the muppet who would spring $700 for a penis extension GPU but is too cheap to spend $20 on a PC vacuum and too lazy to spend two minutes every six months using it.
Oh, and you get your PC's picture put on El Reg for free next time there's a Disgustingly Grotty Computer compo.
I have a pair of watercooled 4870x2s in Crossfire, Crysis on full, and it makes less noise than my old air-cooled 8800s (and all second-hand for less than £250).
Watercooling isn't quite mainstream yet, but it's getting there. I remember the day when I had to get a fan for my CPU... a 486DX 50MHz. Heatpipes have been a good stopgap, but modular, generic watercooling may be the next step?
I have a watercooled 4850x2 :)
and that was *after* I hacked my own BIOS together to lower the fan speeds.
To the OP: keep your PC clean and it will look after you.
Mine has dust filters on all the intakes - it doesn't eliminate cleaning, but it does mean you don't have to clean internally as often. If you could blow $700 on a GPU, I'd expect the rest of the system to be up to spec too.
Doing It Wrong
"Your system will scream" should not be literal. Yes, I know 85dB is where hearing loss starts, but this is absurd. How far does it rev for a Blu-ray movie?
Bear in mind dB is a logarithmic unit
So 85dB is roughly 32 times more intense than 70dB, the higher figure mentioned in the article, or roughly 316 times more intense than the lower figure of 60dB. Your graphics card will melt long before your eardrums do.
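The arithmetic above is easy to check: sound intensity level is defined as 10·log10 of the intensity ratio, so a difference of d dB corresponds to a ratio of 10^(d/10). A minimal sketch (the function name is just for illustration):

```python
def intensity_ratio(db_difference):
    """Convert a difference in decibels to a ratio of sound intensities.

    Level is defined as L = 10 * log10(I / I0), so a gap of d dB
    corresponds to an intensity ratio of 10 ** (d / 10).
    """
    return 10 ** (db_difference / 10)

# 85 dB vs 70 dB: a 15 dB gap is roughly a 32x intensity difference
print(round(intensity_ratio(85 - 70), 1))  # -> 31.6

# 85 dB vs 60 dB: a 25 dB gap is roughly a 316x intensity difference
print(round(intensity_ratio(85 - 60), 1))  # -> 316.2
```

Note this is an *intensity* ratio; perceived loudness roughly doubles per 10 dB, so the subjective difference is smaller than these numbers suggest.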
not quite fair...
I see where you're coming from, but you're being a little misleading in your use of figures there. In any case, the issue is that the corresponding sounds in the room have to go up to compensate for the extra noise.
Say you were at 40dB SPL with your computer fans, and playing Battlefield with your speakers set to a maximum of 75dB SPL. That'll let you hear footsteps in the game which are 35dB lower than the gunshots, say, which are the loudest sounds.
Up your fan noise by adding an industrial-heating-element-cum-graphics-card and you're at a 70dB SPL noise floor, which means you can only barely hear the gunshots anymore, and the footsteps are long gone... so you keep getting knifed. In order to hear the footsteps again, you turn up your speakers by 30dB to get the footsteps above the noise floor, bringing the gunshots to 105dB SPL...
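The headroom sum in that scenario is simple addition in dB: the gunshots must sit at least the game's dynamic range above the room's noise floor for the quietest sounds to survive. A quick sketch of that arithmetic (function and parameter names are illustrative, not from any real audio API):

```python
def required_gunshot_level(noise_floor_db, dynamic_range_db, margin_db=0):
    """Speaker level (dB SPL) needed so the quietest in-game sound clears the floor.

    The footsteps sit dynamic_range_db below the gunshots, so the gunshots
    must reach at least noise_floor + dynamic_range (plus any safety margin)
    for the footsteps to remain audible.
    """
    return noise_floor_db + dynamic_range_db + margin_db

# Quiet fans: 40 dB floor, footsteps 35 dB below the gunshots
print(required_gunshot_level(40, 35))  # -> 75 dB SPL, the original setup

# Add the dual-GPU space heater: 70 dB floor
print(required_gunshot_level(70, 35))  # -> 105 dB SPL, earplug territory
```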
Realistic, some might say, lol... but even soldiers want to wear earplugs at those kinds of volumes!
Also, people don't realise how loud a sustained 85dB actually is (that's a louder sustained volume than a lot of drumkits) - it's well above the level of hearing loss, whatever H&S documents say... it's a volume you have to really shout over. If you ever go into a small studio room with a level meter and turn up the speakers until you hit 85dB, you will find it incredibly uncomfortable. It causes 24-hour temporary hearing loss after only a minute or two, and if you work at that level regularly you will permanently lose high-frequency hearing, even if it doesn't show on the crappy hearing tests, which generally stop at 8kHz.

The only engineers who work close to 85dB and above are those who have already lost a chunk of their hearing... most of us work lower than this most of the time - frankly, I'd do a lot of proper work at 70dB SPL, lol (though sadly as I get older I'm having to go louder... :-/ )
The grenade just seems appropriate...
"Fastest in the world"
... until the Radeon HD 6995, of course...
"If so, it appears that the Radeon HD 6990, formerly known as Antilles, may be an effective weapon."
Looks like this is merely the thin end of the Wedge.
Shame about the Drivers
My Radeon still can't manage to output a clean 1080p signal over HDMI to a 1080p TV. The crappy ATI/AMD drivers insist on scaling the image up (badly) and then scaling it down again, making the pixels look like they've been through a train wreck. Latest drivers. Nothing's changed.
Try with a VGA lead. I have found with a variety of laptops and desktops that they always look like crap over HDMI. I always thought my no-name TV just had a cheap HDMI/digital system, but having tested both a high-end Sony and a Toshiba, I'll stick with VGA.
@HMB - Word of advice
Check the native resolution of the display panel in the TV (it should be printed on the box or in the instruction manual). Too many TVs (and not always just the cheap ones) on the market today claim to accept, and will accept, a 1080p signal, but will then downscale it to 1366x768 or 1680x1050 or whatever their native display panel can do.
I had a big argument with a major on-line retailer about this when the published resolution for a TV I bought from them was wrong on their website, and they were extremely slow to accept the fault. Even then, I needed to go through their onerous RMA process, which takes about 2 weeks, before they would refund.
I actually went through forcing the driver to override the EDID value read from the TV to prove the case, and at the end of the day concluded that using the VGA port rather than the HDMI or DVI port was far more flexible and gave more control.
It's fastest in the world...
...until the next card from nVidia or ATI/AMD arrives.
Do they have good Linux drivers yet? What? I can't hear you! My computer's a bit loud... What??? Oh, what sort of content would I need this for on Linux??? ...Doh!
...any old shitty card can run white text on a black background.
Let me see now....
.....if you install this you need earplugs, several 14cm case fans and an industrial-grade PSU, and you can also heat the office with it. As far as the private market goes, we are of course talking about the financially well-heeled obsessive gamer who still lives in the cellar at his mum's, are we not?
Do people actually live in their mum's cellar?
So it can heat the house and has alternative uses? Stick a bigger fan on it to make it quieter and it sounds like a good deal for the family home. Just need someone to play on the PC for a bit on those cold winter evenings.
I wonder if AMD's next offering will come with a stove hotplate as its cooling.
.......and probably reads magazines like.....
.........."Topless Benchmarks for Men".
Not absurd mate.
It's not absurd, it's disgusting; we need proper control of these resources. It's time for the capitalists to fall.
Fastest in the world...
Until the GTX590 in 6-8 weeks which should knock its socks off.
60-70 dBA is the same sound level as most good cars on the motorway. While it seems quiet there, beside your desk it's a different story.
I still opt for nVidia
due to CUDA, and Linux support. We have a lot of code for 3D visualization that I also run at home.
This card is pointless for games
$699 to play DX9 console ports: what's the point, other than bragging, world-record hunting in the overclocker scene, or playing a hamstrung DX9 console port across six screens? OK, there is software emerging that lets one leverage all that power for cracking cryptographic systems, so there may be some point to the card after all, but it is certainly not gaming.
The hype surrounding Vista and DX10 convinced PC gamers that Vista was a worthy upgrade in order to play all those awesome DX10 PC games that would be released. Almost five years have passed, and the number of PC games truly taking advantage of DX10 or higher that are not DX9 console ports with a DX10-or-higher code path tacked on as an afterthought must be in single figures.
The PC gaming industry has been killed not by piracy, but by empty promises, poor-quality console ports and ridiculous DRM. Why invest money in a high-powered gaming rig when one can get the same results, with the exception of precise character control, on a cheap console?
Grow a brain
The gaming industry aren't going to release games that require DX10/11 until the vast majority of users can run them. Until then, they have to develop games for DX9 and add extra features for DX10/11. It's simple economic sense.
Surround gaming and futureproofing - that's what
If someone is buying a 6990, they'll have one or more monitors to go with it. It's one of the few cards that can run Crysis on maximum detail, at full 30" TFT resolution, and never drop below 30fps (much cheaper cards can generally match the speed, but their minimum frame rate can drop below 20fps).
AMD has been strongly pushing Eyefinity, and at particularly high resolutions using beefy games, a top end graphics card will be required.
It's overkill for the rest of us, though. For most people buying an AMD card the only reasons to go beyond a 6850/6870 are either a) exhausting heat outside the case (this is why I went for a 6950) or b) double precision floating point support
Wish I had your skills at casting aspersions on a person's intelligence.
"The gaming industry aren't going to release games that require DX10/11 until the vast majority of users can run them. Until then, they have to develop games for DX9 and add extra features for DX10/11. It's simple economic sense."
That's your opinion, and not an entirely invalid one, I may add.
However I see it more like this
The gaming industry aren't going to release games that require DX10/11 until the cash cow that is the console market has the hardware to provide a DX10/DX11-level experience. Until then they will continue to develop for the console market and pass shoddy ports of six-hour-long games with awful control schemes, made for those of low attention span, on to PC gamers at full price. It's simple economic sense motivated by greed and a blatant disregard for the PC gamer.
Now, whilst it may make economic sense for developers to create console games and port them to the PC, it doesn't make economic sense to purchase a card such as this to play those ports. This card is overkill for gaming unless one wants to play across six screens, because there are, and will be, no games created to take advantage of it until consoles support DX10/11. By then, cards from AMD and Nvidia will be another magnitude higher in performance terms, and performance such as that provided by the 6990 will be considered entry level. Considering MS develops both the DX API and the Xbox, I would be surprised if MS is in a hurry to take the DX API to the next level.
Now if you can reply without insulting me I might just read what you write.
why no HDMI's - FAIL
OK, so it has four DisplayPorts and a DVI... are any monitors made that have DisplayPorts (apart from overpriced Apple offerings)? That was well considered. Well done, AMD!!!
From what I've read elsewhere, AMD will require DVI and HDMI adapters to be bundled with the card.
Apart from Apple, I know Dell use Displayport on some of their PCs and monitors.
Beautiful hardware, terrible software
AMD/ATI makes great hardware, but sadly the drivers are terrible for legacy games and terrible in general. While NVIDIA makes a fair bit of junk drivers too, I have consistently returned five generations of ATI cards, even a 6950, simply due to all the issues with legacy games and the drivers.
It's too bad they don't work hard to make the ATI drivers work in all games, not just the ones released in the last few years.
can keep both their CPUs and gfx cards. ANY techie knows Radeon drivers are amongst the biggest heaps of shite code available (lessons from Apple??) and their processors just can't cut it against the latest Chipzilla silicon....
In fact, as someone who has seen AMD come from their humble PR166/200 CPUs and seen them burn, fall over and display massive incompatibilities, I stand by what I said 20 years ago. Want a proper CPU? Intel, and only Intel...
...you're not biased at all are you?
Classic flame, that... you start by dissing ATI's GPUs with no evidence and a sweeping assumption about techies, nor do you provide a comparison to their competitors (Nvidia are hardly a shining light of benchmark fairness or driver stability), then start recommending Intel CPUs????
Do you even know the difference between a GPU and a CPU? Why would you talk about CPUs in a GPU thread? If you were even remotely a techie you'd know that Intel GPUs are total mince at the high end.
Ahh, said the hidden voice.
Nope, not at all. 16 years of using IBMs/clones and installing thousands of graphics cards taught me one thing: ATI software is shit. THAT'S all the evidence I need.
And any techie from the same era will agree.
Nvidia? Who mentioned Nvidia? You did. Whilst I agree that Nvidia aren't exactly bastions of quality, their software works more often OOTB than ATI's.
GPU and CPU, seriously mate, be sensible....
One last thing: who mentioned Intel GPUs??? You again. Twice you've decided to bring something to the conversation I didn't mention. I said the Intel Core i CPUs (a few of which have *onboard GPUs*) will trounce any AMD CPU.
Before you go blowing out of both arseholes, read what people write.
The hidden voice speaks
As a tech from the same era, I actually do disagree.
I've had more Nvidia returns than ATI, and those that were driver issues required a driver re-install. But the Nvidia ones were likely to need a full wipe, whereas the ATI drivers could usually be installed right over the top. Oh, and on the subject of hardware-related returns, it's the later-era Nvidia cards I'm getting returned in greater numbers than ATI. And I only sell 'brand name' units.
This is why I mention nvidia - because in GPU land there really are only 2 players at the high end, and how can you talk about one and not the other?
My question still stands - why are you talking about CPUs in a GPU thread? WTF do Intel CPUs have to do with ATI GPUs?
At least my arsehole(s) knows how to stay on topic.