Nvidia launches GTX 200 series GPUs

Nvidia today rolled out its next-generation 200-series GPUs, the GeForce GTX 260 and GTX 280. The series name is at last a concession that the naming scheme Nvidia has played with across its vast array of GeForce graphics chips had become bewildering. That is, starting now: customers will still wade through the convoluted model numbers with …

COMMENTS


GeForce10 series already??

FFS, they just came out with the 9 series. Why bother having a high-end rig any more when every three months it's time for another upgrade because you're not at the top any more? I love nVidia, but damn, this is just overkill.


236W on a graphics card?

Green computing campaigners are going to LOVE that.


What a bad design. 180-260 Watts? That's insane!

And it's a real waste of money to buy the first GPUs of a new family of products anyway. In 4-6 months' time Nvidia will surely start selling far cheaper GPUs based on the new architecture that consume 40-80 Watts.

The first-generation products are only for rich people who don't have anything better to do than waste money playing games.


Power crazy!

How long before we start seeing PC cases with dual power supplies just to feed Nvidia's ever-increasing hunger for wattage?


The bleeding edge of GFX development....

Can it run Crysis yet?

Ash

No problem

Just means that three days later I'll get a second 8800GTX, stick it in SLI with my other one, and I'll have a setup 50% faster (by their maths) than one of these brand-new cards.

I can play Age of Conan at max settings, 1980x1400, at 30fps right now. SLI should keep me going for at least another year.

Hell, by that time I'll get a console and an Eee PC instead.


Whereas the second-generation products

are for whom? Slightly less rich people who still don't have anything better to do than waste money playing games? The entire discrete graphics card industry depends on those who don't have anything better to do than, etc...


Can you take the heat??

Well, at least you will not need to turn the central heating on again; although you had better upgrade the air conditioning.

I can feel the heat radiating from my HD3870; fcuk knows what these cards will feel like.

Mine's the asbestos-lined one with the foundry gloves in the pocket.


@Power Crazy

They've been selling 'em for the last few months:-

http://www.overclockers.co.uk/showproduct.php?prodid=CA-072-AN


Wasted

All those gazillions of megamonkeys of processing power, and Windows still looks like My First Operating System, ugly and cobbled together.


Obligatory Monday moan

Hi-spec PC gaming has been killed off by the Crysis fiasco, rampant piracy, and shiny new consoles that are cheaper than the cards themselves.

I just upgraded last year, and feel like I've wasted my money already; my box is now a noisier room-heater that has hugely increased my electricity bills.

Even though Crysis is playable (20fps), I just can't be arsed any more; Spore and StarCraft 2 are the only games worth having for a long while anyway.

The PS3 will keep me going for the next few years; maybe I'll be back then.


Obligatory Vista Joke

And the Vista overall performance score moved to 5.9 from 5.7, and it still takes forever for Aero to switch desktops!


In other news

You can get an 8800GT for ~GBP100 now. Good value from where I'm sitting. (Railgun ahoy!)


But will it

render the Solitaire cards any better?

I hear Paris likes it hot...


RE: GeForce10 series already??

The 9 series wasn't really a new series, just a con/stopgap.

They shrank the G80 chip from the original 8-series cards (GTX, Ultra, GTS 320/640MB) and called it the G92, then used the G92 in the 8800GT and the GTS 512MB.

Then they rebranded the G92 as the 9 series.

This really is a new generation of card, though.


@Joerg

"Don't have anything better to do than play games...."

Ummm.... That's like saying "some people don't have anything better to do than surf" or "some people don't have anything better to do than go to the gym"

So everyone should conform to your hobbies? Anyone else is wasting their time? I work hard and I spend quite a bit of money on my toys.

I'll be looking at one of these cards, or alternatively I'll go the same way as Ash and buy a second 8800GTX, given that the GeForce 280 will cost nearly as much as my surfboard.

Oh, and as for 40-80 Watts? Do you know anything about graphics cards? Even mid-range cards struggle to draw that little. Go check what an 8800GTX draws, you muppet.

@James O'Brien

I suspect it's because the 9s have been an utter failure. They're still using the G92 core, and last time I looked they couldn't get either the 9800GTX or the 9800GX2 to out-benchmark the 8800GTX!

Although that's supposed to be driver-related, it's still piss-poor.


Available now...

Already available on ebuyer today, for an eyewatering price!!

http://www.ebuyer.com/cat/Graphics-Cards-Nvidia/subcat/nVIDIA-260---280-range

Ho hum, glad I stopped at the 9800GTX.

Anonymous Coward

Power management in Vista

I'm guessing Aero counts as graphics-intensive? So when can the card power down? During POST?

I held off buying a new system since I knew these cards were coming out; now I might wait a bit longer, at least until I see some benchmarks. The price/performance/power station requirements seem to be a little off the sweet spot for me atm.


Fanless non-gamer

I don't buy them until they become fanless and cost less than £50. Fans on GFX cards tend to be notoriously noisy. To go without a fan, a card needs to come down to around 35W, but that's mainly because the memory is clocked lower, which makes it remarkably slower.

I only run Linux, so the most GFX-demanding thing I use is probably Google Earth, so even my fanless GF7300 does the job.

Anonymous Coward

Price?

£563.99 for the top-o-the-range one, according to Overclockers.

You can buy an entire PC (or 2 and a half Aspire Ones) for less than that.

Anonymous Coward

@Power

The ATI X1950s from a while back needed 30A @ 12V, or 360 Watts peak.

Or so it said on the box, and indeed my cheapo 500W supply had to be replaced to get a stable system.

It really does heat the room up, and it's not even high end any more.


Nice Pins...

What really gets my goat is this new 6-pin+8-pin power requirement: so you're supposed to blow hundreds of pounds on a top-end graphics card, and also £150+ for the PSU that happens to have the correct connections...

What about all the people that already have a very good power supply that has a standard 6+6pin SLI output? There's definitely market collusion going on here to force people into unnecessary upgrades!

Why I ought to.....


Why...

...is the bottom of a graphics card always shinier / better finished than the top?

Or why is it flashy at all? Shiny reflectivity is only going to be appreciated by those with a case window, but even they would be disappointed that the fancy part faces the bottom, hidden, while the PCB and do-not-remove stickers spoil the aesthetics on top.

Using the upside-down shots for promo pics is almost as bad as the ads for hard disks showing the platters...


Crysis? Smysis?

For all you non-gamers out there, the "Crysis" gamers frequently refer to is a particularly hardware-thrashing game that actually delivers on all the wattage being consumed.

For the complete lowdown, go here: http://en.wikipedia.org/wiki/Crysis

But for a brief flavour of how hard it thrashes your hardware: most hardware reviewers post frame rates in the high 90s for most games of today (from Half-Life 2 onwards) at a reasonable output of 1600x1200 with all the graphical bells and whistles turned on.

Crysis barely manages 30fps, and that's with the output toned down to 1280x1024. Rack the resolution up to 1600x1200 and we start seeing <10fps.

BTW, 30 frames per second is seen as the bare minimum for a first-person shooter.

So it'll be interesting to see how this new card, and ATI's too, fare against the "Crysis" experience.
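
(For the non-gamers keeping score: fps and the per-frame time budget are just reciprocals. A quick Python sketch, using illustrative figures in the same ballpark as the numbers above, not actual benchmarks:)

```python
# Frame rate vs per-frame time budget: fps and milliseconds per frame
# are reciprocals. The figures below are illustrative, not measured.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for label, fps in [("typical game at 1600x1200", 95),
                   ("Crysis at 1280x1024", 30),
                   ("Crysis at 1600x1200", 10)]:
    print(f"{label}: {fps} fps = {frame_time_ms(fps):.1f} ms per frame")
```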

Anonymous Coward

Alternative options?

"You can get an 8800GT for ~GBP100 now."

So what's better, 2x 8800GT or one new 260?


@No Problem, Ash

"Just means that 3 days later i'll get a second 8800GTX, stick it in SLI with my other, and i'll have a card 50% (by their maths) faster than one of these brand new cards."

The current Custom PC mag has a review of graphics cards, and basically says that SLI/CrossFire hardly works. In some cases, two cards can be worse than one.

Paris, 'cause I'd settle for one of her...
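
A back-of-the-envelope sketch of why "50% faster by their maths" and "hardly works" can both be true; the efficiency figures here are made up for illustration, not taken from the mag:

```python
# Toy model of SLI scaling: a second card adds some fraction of its
# theoretical throughput. Efficiency values are illustrative only.
def sli_fps(single_fps: float, second_card_efficiency: float) -> float:
    return single_fps * (1.0 + second_card_efficiency)

base = 30.0  # the Age of Conan figure quoted earlier in the thread
for eff in (0.9, 0.5, 0.0, -0.1):
    print(f"2nd card at {eff:+.0%} efficiency: {sli_fps(base, eff):.0f} fps")
```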


I've seen the future. It's mediocre.

Bleeding edge aside, technology is becoming cheap enough to treat a desktop PC as a short-term purchase rather than a multi-thousand-pound investment.

Rather than paying through the nose to build myself a cutting-edge gaming system every couple of years, I've started spending £300 on a low-spec pre-built PC, filling it with RAM and bunging in a half-decent graphics card.

Sure, it'll be obsolete within a year, but it's cheap enough to just throw away and replace with another to tide me over. And these disposable desktops even come with a warranty, should the unthinkable happen!


Crippled cards too

Don't forget they removed video mirroring, because of DRM, on all 8xxx-series cards and above. Thanks, Nvidia. You really know how to fuck over your customers.


@IIsRT

Actually, the TOP of the card is the shiny/fancy bit. PCI was designed to make you insert the card UPSIDE-DOWN in an attempt to stop punters from putting cards into the old ISA slots. AGP and PCIe just followed suit for no real reason.


@Ryan

Shame on you. That's eco-terrorism!


Nothing better to do.

Funny how the manufacturers of CPUs, power supplies, hard drives, RAM etc. all manage to decrease power requirements while increasing performance. But the "green team"? Not so much.


@ Nice Pins... By Christopher Reeve's Horse

"What about all the people that already have a very good power supply that has a standard 6+6pin SLI output? There's definitely market collusion going on here to force people into unnecessary upgrades!"

The power pinout of the cards is dictated by the fact that the connector pins are rated for a specific number of amps per conductor, so the engineers' hands were probably tied. You can violate the spec yourself by plugging in through an adapter, but better is to cut the wires off at the supply and run your own appropriately gauged conductors to the required connector. Just don't wire 12 volts to a ground pin, as smoke and fire quickly follow.
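
For what it's worth, the connector arithmetic does explain the 6+8 requirement; a sketch using the nominal PCIe limits (75W from the slot, 75W per 6-pin, 150W per 8-pin):

```python
# Nominal PCIe power budget per source, in watts. These are the spec
# ceilings, not what any particular card actually draws.
SOURCE_WATTS = {"slot": 75, "6-pin": 75, "8-pin": 150}

def budget(*aux_connectors: str) -> int:
    """Total watts available from the slot plus auxiliary connectors."""
    return SOURCE_WATTS["slot"] + sum(SOURCE_WATTS[c] for c in aux_connectors)

print("6+6 pin:", budget("6-pin", "6-pin"), "W")  # 225 W, under the GTX 280's 236 W TDP
print("6+8 pin:", budget("6-pin", "8-pin"), "W")  # 300 W, enough headroom
```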


This is what annoys me about all graphics card reviews/introductions/etc.

None of you reporters ever gives the truly important requirement, and that is the AMPERAGE!!! You all go on about the wattage, and fair enough, that is important, but with PSUs reaching 1000+ watts, anyone who buys a high-end GFX card and a shitty bottom-end bollocks PSU only has themselves to blame. And it's not just reporters who are to blame: PSU manufacturers are just as shit for not displaying the amperage rating of their PSUs on their websites, and so are the retailers that do the same. Y'all shit!!! The only sites I can find when buying a new PSU that display the amperage on the 12V rail are all foreign!!! We English get ripped off enough without having to pay import tax!!!

Y'all sicken me *shakes penis in your face*

Excuse me whilst I cry golden tears in your eyes!


@Daniel Hobson

Umm, unless your PSU is made in England, you're paying import tax on it already.


Nice hardware, could we have some drivers please?

Great, more Nvidia hardware.

Perhaps they could spend a couple of dollars more on the stability of their drivers? The latest 175-series drivers are woeful, leading to all manner of amusing BSODs. Even the latest Nvidia Linux drivers (well, the 64-bit ones) won't recognise my nearly-new 8800GTS!


@Daniel Hobson

If you know the wattage of the card, then some basic maths will give you the amps:

Watts = Volts x Amps

so

Amps = Watts / Volts

19.67A = 236W / 12V
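
Or as a reusable one-liner (a sketch that assumes the whole load sits on the 12V rail, which is roughly but not exactly true):

```python
# Current drawn for a given load, assuming the whole load comes off
# the 12V rail (an approximation; some power arrives on other rails).
def amps(watts: float, volts: float = 12.0) -> float:
    return watts / volts

print(f"{amps(236):.2f} A")  # 19.67 A for the GTX 280's quoted 236 W
```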

Mine's the one with the pocket protector and slide rule.


@frank gobbo

Which is why, whenever I buy from abroad, I make sure I enter my full first name.

"Gift for" being a very popular first name round my neck of the woods.


@Daniel Hobson

Funny that. All the decent products I've ever looked at seem to have a spec sheet on the manufacturer's website showing total amps available and the max draw on each rail. That even goes down as far as the not-a-well-known-brand unit currently doing its stuff in my rig.

For the cheap ones, who gives a toss? If you buy an el cheapo PSU you should be damned grateful if the volts are anywhere near spec, never mind the current. Anyone not quoting a full spec probably has a damned good reason for keeping quiet on the subject.


Big screens benefit the most

Hmmm, from some Inquiring sites, the early numbers seem to suggest that it's the big screens that get the biggest benefit. So if you are still on your 19" monitor with a maximum resolution of 1280x1024, the upgrade would hardly be worth it.

Then again, the people that can afford the 30" screens that benefit the most are the ones most likely to be able to afford one of these cards anyway!

I wonder if ATI will do the same as they did at the last launch and concentrate on volume as opposed to Top Dog. The Top Dog card typically costs Top Dollar, and as I said, it's the people with the big house, with the large room for the large monitors, who buy the Top Dog card. So in terms of earnings, I think the lesser cards are what make the company coffers less empty.

However, the cynic in me still thinks that if ATI could have shot for Top Dog they would have....

PS: I'm a PC gaming fan with a 24" monitor and a self-specced Crysis-killer machine (heh, my machine lost to the beast that is Crysis!), but the whole console-and-HD thang is most definitely going to be my next purchasing focus.
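
The pixel arithmetic backs this up; a rough sketch, assuming common native resolutions for the screen sizes mentioned (the panels are assumptions, not quoted specs):

```python
# Pixels pushed per frame at native resolutions typical for the screen
# sizes discussed above. Panel resolutions are assumed for illustration.
RESOLUTIONS = {
    '19in (1280x1024)': (1280, 1024),
    '24in (1920x1200)': (1920, 1200),
    '30in (2560x1600)': (2560, 1600),
}

base = 1280 * 1024  # the 19in monitor's pixel count
for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f'{name}: {pixels / 1e6:.2f} Mpx, {pixels / base:.1f}x the 19in load')
```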

This topic is closed for new posts.