Nvidia 'nveils nForce 7 chipsets

Nvidia has rolled out its first nForce 7-series chipset, this one targeting Intel processors and supporting the GPU maker's new three-way SLI technology. [Image: Nvidia's nForce 780i SLI] The nForce 780i SLI supports Intel's one-, two- and four-core desktop processors on a frontside bus (FSB) clocked at between …

COMMENTS

Ash

3 x 8800 GTX?

Hell, I have ONE at home and I can't find a game that I can't play on max settings with 60+ fps.

By the time three 8800 GTX cards are needed, something better will do it cheaper. Complete overkill, and only for bragging rights between nerds.


PSU?

Three overclocked 8800 GTXs would be awesome!

Is there a PSU and a fan large enough in the world to keep this powered and cool?

I needed a 500W PSU for one 8800 GTX on a dual-core comp, and overclocked it ran pretty hot! Three of them and a quad-core CPU might just melt a hole in the earth through to Australia.

I want one, but I think I'll need to build a nuclear power station next door to my house before I can get the power to run one.

At least I'd save on central heating costs.
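
As a rough back-of-envelope (every wattage below is my own ballpark assumption, not an official spec):

# Rough PSU sizing sketch -- all figures are assumptions, not specs
gpu_watts = 170      # assumed load draw per 8800 GTX
cpu_watts = 130      # assumed overclocked quad-core draw
other_watts = 100    # assumed motherboard, drives, fans, RAM

total = 3 * gpu_watts + cpu_watts + other_watts
suggested = total * 1.3   # ~30% headroom so the PSU isn't flat out

print(f"Estimated draw: {total}W, suggested PSU: {suggested:.0f}W")
# Estimated draw: 740W, suggested PSU: 962W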


good to see...

...that while server retailers/builders try to swindle us with their 'green computing' initiative, desktop manufacturers offer us a more honest 'f**k the world, look at this 1.5kW desktop machine!' approach.

bravo. </sarcasm>


Re: 3 x 8800 GTX

Ash, try running Crysis at 2560x1600, maximum detail.
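
For a sense of scale, that resolution pushes just over three times the pixels of a typical 19in panel (a quick sketch; the ratio is raw pixel count only, and real-world performance scaling is my assumption):

# Pixel counts at two gaming resolutions
res_19in = 1280 * 1024   # 1,310,720 pixels
res_30in = 2560 * 1600   # 4,096,000 pixels

print(f"{res_30in / res_19in:.2f}x the pixels to shade per frame")
# 3.12x the pixels to shade per frame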

Chris, you need an 1100W PSU.

CrossfireX is much better.


Ah, 3 GPUs...

... as found in most households; however, they tend to refer to it as a "kettle".

Did PowerGen sponsor nVidia to develop this, perhaps?

As for the comment that it's unneeded: perhaps these people are running dual 30" screen rigs, in which case they'd need the extra muscle. After all, if you can afford three high-end graphics cards, a couple of large screens is clearly nothing.


Razor blades?

Is it me, or are these cards going the way of razors? First one, then two blades. Then three. Then, in a fit of amazing creativity, four blades.

How about coming up with something new and cool rather than just stringing together existing stuff?


all this just to run Windows Vista!

Nuff said!


SMRT that spells smert

3 PCIe channels, only 2 of which are 2.0? What the hell kind of dumbassed committee management approved that?

A cookie to the first person who can tell me what the point of that was.


Forget Vista for gaming

I have SLI and Vista didn't recognise it for six months. By the time this is mainstream, Vista might catch up. It chews up your RAM, leaving the game wanting more.

But seriously, at that power and that rating you will need liquid cooling and serious fans to keep the thing cool. Never mind your electricity bill.

There isn't a game out there that needs that kind of graphics (Crysis can run at those specs if you don't use Vista, have 4GB of RAM and normal SLI like me), and dual screens are not common in games yet either. I can think of one (Supreme Commander), which is a bag of sh*te anyway.

...Give it a few years for tri-SLI to be warranted.

Either way, I am a sad gaming techie nerd and will most likely get it when I upgrade in a few years.


@boldman

Windows Vista? Have you not heard of computa games?

Or are you in the Apple Mac/Linux cult? Still waiting for the cult mass suicide...


2 PCIe 2.0 and 1 PCIe 1.1

Taken from http://www.bit-tech.net/hardware/2007/12/17/first_look_nvidia_nforce_780i_sli/1

"The third x16 slot is actually full x16 this time around but is only PCI-Express Gen-1.1. While this offers an unbalanced bandwidth and latency difference (because it's connected to the south bridge) at best for 3-way SLI, Nvidia doesn't seem too concerned about this because most of the data in 3-way is passed over the new 3-way SLI connector. When asked, Nvidia also said that PCI-Express 2.0 is just an incremental update and provides only a one or two percent performance difference at best. Most users that do require the third x16 link will only need it for more mundane things like hardware RAID controllers etc, and for that job, it's perfectly suited."


Right....

Good luck keeping that lot cool...

Never mind the fact that there still isn't a stable set of Vista drivers for two cards; Christ knows what happens if you try three...


power consumption

I predict that in the next five years you will need an electrician to come out and hook up your computer. I can just see it now: "Whoa, you've got what in that computer? That requires a 100-amp hook-up on its own breaker."
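
For what it's worth, the sums aren't quite there yet. A quick check, assuming a hypothetical 1.5kW rig and ordinary mains voltages:

# Current draw for a hypothetical 1.5kW rig: I = P / V
watts = 1500

for volts in (230, 120):   # UK/EU vs US mains
    print(f"{watts}W at {volts}V: {watts / volts:.1f}A")
# 1500W at 230V: 6.5A
# 1500W at 120V: 12.5A

Nowhere near a 100-amp feed, then, although at US mains voltage a 1.5kW box is already most of a standard 15A circuit.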


SLI = bad

As far as I am aware, with SLI you cannot run dual-head to get independent displays on two screens with games such as Supreme Commander. I will never go SLI until they resolve this major failing of the platform.


@Neil

Four blades got here before three:

http://www.slizone.com/object/slizone_quadsli.html

As for the insanity of GPU counts, I blame all those people so spoilt by graphics that they cannot for the life of them visualise what they're reading while playing a text-based game.

I once lamented this before and I'll do it again (strangely, I made the lament on my blog after reading one of El Reg's April Fools articles earlier this year about how CPUs are actually getting slower). Why is it that Karateka ran on an 8086 with 256KB of RAM and a CGA graphics card and still managed to have great graphics, great sound and great physics, while Ghost Recon Advanced Warfighter needs dual-core CPUs, loads of RAM, PhysX, SLI and X-Fi and still manages to be a game that cannot hold my attention?


umm is it me

or does it look like those three behemoths are covering the RAM and IDE connectors?

Ash

@Mark Rendle

2560x1600? That's a seriously big monitor.

Having said that, though, if you can afford a monitor that size you're probably already running the quad-SLI rig suggested further up.

This topic is closed for new posts.