Asus ENGTX285 TOP overclocked graphics card

Asus' ENGTX285 TOP graphics card is based on Nvidia's latest graphics chip, the GeForce GTX 285. This is the GT200b core, a 65nm-to-55nm die-shrink of the GT200 that was the basis for last summer's big Nvidia release, the GeForce GTX 280. Asus' ENGTX285 TOP: factory overclocked GTX 285 …

COMMENTS

This topic is closed for new posts.
  1. Lionel Baden
    Thumb Up

    drool

    I want one in SLI (with a small reactor to power it)

    nothing more to say

  2. Jan
    Flame

    so the 4870 is still better?

    If I get this right (from the graphs), a "single" 4870 X2 for ~£370 will be consistently faster than a single 285 retailing for £350, but gets eaten by dual 280s or 285s, which cost £600 to £700, correct?

    Sooo... where's the big deal with a 285?

    Flames, since that's what I'll earn for my post.

  3. Ash
    Thumb Down

    Waste of money.

    I bought Dead Space three days ago (completed it; good game, not as scary as Doom III, but a unique storyline that was very interesting). Ran it on top detail at 1920x1200 with no slowdown (including the enormous final boss) on a single stock 8800 GTX and Q6600. Incidentally, it's one of the few games I've played recently that supports task switching without crashing. That was a MOST welcome feature!

    These new cards are pointless headline-grabbers. In two years' time (as long as I've had my 8800 GTX now) they'll still handle the loads they did at release, nothing new technology-wise will bog them down, and they'll be under half the price. I *MIGHT* get a second 8800 GTX for SLI, but like I said, I'm not feeling the pinch at the moment anyway. Even Crysis was playable at 1920x1200 on medium settings (the meme "Will it play Crysis?" being answered by me: "Yes, at 30fps until the end boss.")

    I'll hold onto the cash, thanks.

  4. Stu
    Unhappy

    PC hardware manufacturers - change the paradigm.

    It's an age-old argument that holds true, now more so than ever: "Go out and buy a PS3 or X360!!"

    For less than the price of this card alone you get, ultimately, a much more capable system than your typical PC of the same price, because games developers write titles that are guaranteed to run well (though not necessarily play well) on every single PS3 or X360 out there.

    And now for our extra wonga we see what? A whole 3fps improvement! Wow, thanks. And as games become more complicated and visually 'rich' (read: slow), the percentage gain will shrink further still with each new iteration of gfx card.

    The feature gap is closing as PC-style casual functionality is continually added to these consoles: MP3s, video, pictures, web surfing (is this on Xbox yet?), email, etc. A large proportion of people require nowt more.

    Of course there will always still be room for PC hardware of many different configurations, just not top end gaming monsters that cost a small fortune.

    ...and this is coming from a person originally into PC gaming too!
    And no, I don't work for Sony/Microsoft gaming. I'm just saddened by this continual push that will ultimately kill the PC as a games machine. If anything, I think there is a definite need for a radical slow-down in performance improvement efforts, and in the sheer number of new gfx chipsets and new cards out there. Perhaps a few announcements every few years, to give the games developers time to write titles that look and run well on modestly priced hardware. Plus, new hardware will be received well because it will be bigger news to the enthusiast.

    If anything, such a move would benefit Nvidia and ATI, because they could price one spec of gfx hardware at a good set price and lower their production costs for that same spec over time, similar to how Sony and MS do things.

    @El Reg editorial - can you commission me to write a full article on this problem with the industry!? I won't charge much for it! ;-)

  5. Daniel
    Stop

    500W

    "Adding a second GTX 280 in SLI made the performance jump by 80 per cent which is impressively efficient and it raised the power draw to 500W under load."

    500W??? Does anybody here other than me think it's bizarre in these 'green' times to have a PC graphics array which consumes HALF A FRICKIN' KILOWATT of power?

    Anybody? No?

    Guess it's just me then.
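    [Ed: the "impressively efficient" line from the review is about frame-rate scaling, not power, but the two figures can be sanity-checked together. A quick sketch, assuming a ~310W single-card system load for illustration; only the 80 per cent scaling and the 500W SLI load come from the review.]

    ```python
    # Sanity-check of the SLI figures quoted in the review excerpt above.
    # Assumed: a 100-unit normalised baseline and ~310 W single-card load.
    single_perf = 100.0            # normalised single GTX 280 score
    sli_perf = single_perf * 1.80  # "performance jump by 80 per cent"

    single_watts = 310.0           # assumed single-card system load
    sli_watts = 500.0              # quoted SLI system load

    print(f"perf/W single: {single_perf / single_watts:.3f}")  # 0.323
    print(f"perf/W SLI:    {sli_perf / sli_watts:.3f}")        # 0.360
    ```

    Under these assumptions, performance per watt actually goes up with the second card, even though the absolute draw is half a kilowatt.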

  6. Dan
    Thumb Up

    @Stu

    Couldn’t agree more.

    I bought a PS3 for exactly those reasons. Sure I miss the mouse and keyboard controls. But I got sick of spending hundreds on a card that required a good system in the first place.

    The second I can use a mouse on console FPS or RTS games I won’t ever need anything more powerful than a Pentium 3 laptop.

    With consoles offering email and the net, it can only be a matter of time before WASD is an option.

  7. Francis Boyle

    @ Stu

    Maybe El Reg should commission an article from you; at least then I might have some chance of working out what you're on about. But since that's probably not going to happen, I'll do my best to respond to what you appear to be saying.

    re: part one

    I've never bought a new graphics card to get an increase in frame rates (why would I want that? Is someone overclocking my brain?). I buy new graphics cards to run games that are more "visually 'rich'" at the same frame rate. That's the way I like it.

    re: part two

    Oddly enough, we have arrived at a situation where most games work well on modestly priced hardware. My Radeon 4850 is the most capable card I've ever owned relative to the games current at the time of purchase, yet it cost me around half what I had become used to paying. The top end is largely irrelevant to the majority, who have no interest in bragging rights, save as a preview of what we'll be buying in six months' time. As for Nvidia and ATI copying the Sony/MS way of doing things: no, you've got me there. I can't think of a single argument against it.

  8. Stu
    Unhappy

    @Francis Boyle

    re: part one -

    Okay, point taken, save perhaps for the fact that I would have liked to have seen a decent frame rate in the first place! I went out quite recently and bought a nice shiny new quad-core Intel (Q9450 or something) with 2GB of RAM, and coupled that with what I considered a good compromise gfx card, the 8800GTS 512MB. Now why, then, am I not able to run Crysis above lower-medium quality and a sucky screen res? The PC cost me, all in, over £500.

    Now compare that with the all-in cost of a PS3 playing, say, Wipeout HD: 1920x1080 progressive, a great frame rate out of the box, all-in price around £300, plus £12 for Wipeout HD off the PS Store!

    See my point?

    re: part two -

    I think you've proven my point: a preview of hardware in six months' time is all well and good, except that, as far as I can tell, the PC game development industry seems to be writing games that run well only on PCs six months in the future, i.e. they are too far ahead of the game. Actually, more like three years in the future, as Crysis isn't what I'd call a brand-new title any more, so why is my relatively new PC unable to run it at full-whack quality settings? Think back to your Doom/Quake/Half-Life days: give it six months to a year, and pretty much any mid-priced complete system could run them well.

    It's not like I'm new to building custom PC setups, either.

  9. Ross Fleming
    Black Helicopters

    @Stu

    Stu, agree with the highlights. All valid points, and in utopia your suggestions would work well: homogenize the graphics market and you give game designers the chance to get the best out of the hardware. Right now they're crippled by the APIs and never get time to discover the tweaks that would unlock their capabilities. Exhibit A: the marked improvement in graphics quality from the first PS2 games to the latest (and extend to exhibits B-Z for every other console in history). It also goes against the "IBM-compatible" model, where you can have any conceivable combination of hardware.

    However, it would crush innovation in the market. ATI and nVidia don't make radical leaps very often; they work by gradually improving, tweaking and nurturing their products, taking notes from each development to come up with new reference models (their big leaps), but again this is learnt from the gradual progression. Designing a new board costs millions, while making the next one off the line costs pennies, so they flog them. Similarly, the dodgy ones that come off the production lines are underclocked, have their faulty stream processors disabled, and are flogged at a cheaper price.

    Never forget that without this development, the Xbox 360 and PS3 wouldn't exist in their current form; they absolutely benefit from this progression by grabbing the highest performer available (trading off against cost, of course). Also, consider how the profit margins shift. NVidia and ATI can only recoup their costs through hardware sales. Sony and MS can take a loss (and historically do) on the hardware and recoup it on the games. So you get powerful consoles costing relatively little money (compared with an equivalent PC) but more expensive games. Conversely, PC games come in at a healthier ~£30.

    And then we'd need to see ATI and nVidia agree to slow down. It would work for a minute before one of them broke ranks and flogged a 1% improvement model, starting the cycle all over again. :-)

  10. Brandon
    Heart

    ATI still a better deal...

    I'm running Far Cry 2 on all very-high settings on a measly ATI 4850 with a 3.8GHz E8500 supporting it, and it's great! (And I'm very particular about good framerates.) My point is, if you're looking to play all the best games at more than playable framerates, but budget matters to you, then ATI cards are where it's at! They win on every "bang for your buck" scale I've seen. I'm no fanboi either... I actually like Nvidia better (stereoscopic support, CUDA, and other reasons), but budget wins in 2009 :)

  11. Francis Boyle

    re: @Francis Boyle

    I can only second Brandon's comments: you're on the wrong side, mate! The 4850 is a brilliant card, capable of handling anything Crysis can throw at it and more (and that's at 1600x1200 with the quality settings turned up to the max). The 48xx series killed the "but can it run Crysis?" joke stone dead. Sure, it took a while (well, seven months by my calculations), but then Crysis is hardly an average game. Face it, we've never had it so good. Sounds like you've contracted a bad case of nostalgia there.

  12. KB
    Stop

    Is there any need for this much power?

    My Q6600 / 2 x 8800GT 512MB SLI box is coming up to a year old now and I'm still playing pretty much everything with all the settings maxed.

    Even Crysis worked very well indeed (30+ fps) with everything on Very High, albeit at 1280 x 1024 resolution. I just knocked it down to 1024 x 768 for the final boss.

    Thing is, an 8800GT SLI setup costs a fraction of one of these cards and yet it really isn't that far behind when it comes to real-life gaming. My point is, it really doesn't have to cost the earth to play the latest PC games, and while I have a PS3 as well as my gaming PC, I wouldn't want to be without either. WipEout isn't really comparable to Crysis, is it?

