Intel has released two new top-of-the-line, six-core "Sandy Bridge E" processors, the first to bring four-channel memory to the desktop. Before this Monday, the high end of Intel's desktop line was capped by the four-core 3.5GHz i7-2700K. Now that spot has been taken over by the six-core 3.3GHz i7-3960X Extreme Edition, which …
"RAGE uses a very compute-intensive real-time process to transcode texture data ... blah blah"
Yes, but it really isn't a terribly good game for all that fantastic technology. It's *ok*, but it's not quite in the same ballpark as, say, Portal 2, or even Half-Life 2, is it?
And, heck, let's face it, Minecraft is more fun than most visually stunning, hardware-humbling blockbusters.
I managed 30 minutes of Rage and would love to sell the damn game; a waste of £25. My loyalty to id Software, already sorely tested after Doom 3, finally left me.
It's not what you've got, it's what you *do* with it - and if game developers expect me to continue to fork out wads of cash *just* to play their latest PC game "the way it should be played", sorry, not interested.
Whatever happened to the old-school programmers who would eke the most incredible wonders out of minuscule processing power and disk space?
old school programmers
Why, they're still at it, of course. With their C64s and whatnot. I don't dare say I wish they'd upgrade, but cycles are so cheap these days that nobody bothers not to waste them; throwing more hardware at a developer is far cheaper, even if it requires all your customers to upgrade too. That's not a cost the developing company has to bear, and that it's globally suboptimal is not their problem. You can still find some of that eking out in the console market, where hardware upgrade cycles are far slower. Doesn't change that it'd be nice to make better use of the hardware in peecee land too.
I love the way that Intel constantly undermines the ability of users to upgrade their PCs by changing the socket each time. If you're sensible and build a machine with a CPU a couple of places back from the line-topper, you'll find that when you come to do an upgrade, the only thing that fits is that line-topper. LGA 1156 and LGA 1366 were the last ones; now we have LGA 1155 and LGA 2011. It's taking the piss, pure and simple. Especially in the case of 1155 (or H2). One pin fewer? Bastards.
IS THIS OVERKILL...
Tech for tech's sake, or is there a legitimate necessity for it?
Get your facts right
Intel has been shipping six-core processors for ages in the form of the i7-990X, so before this Monday the high end of Intel's desktop line was NOT capped by the i7-2700K.
Yes, but id are going to license their tech... someone else can then create a masterpiece from it.
RE: @Matt 89
Uh, no, it's highly unlikely. FYI, nobody has used id engines since Quake III (even Quake Wars, though an id title, was developed by Splash Damage).
Really, Carmack, in 2011?
Aside from the fact that, despite its "uniquely textured world", Rage doesn't even look as good as, say, Battlefield 3 (and that GPU rendering is probably much more of a bottleneck than this texture (de)compression), id makes the most boring corridor-style shooters in the world; it's better to think of their games as tech demos for their engines...
...so I guess it shouldn't come as a surprise that I can't recall any third party licensing one of their engines for at least 5-6 years now. They are usually somewhat unfinished, poorly balanced game engines: always with nice, shiny parts, but always at the price of some serious tradeoff.
This showed clearly on their balance sheets when, a couple of years ago, financial pressures forced the sale of the company (id) to the owners of Bethesda (of Fallout and Elder Scrolls fame), as I recall.
Intel really needs to leave 2000 behind; it's 2011. Someone needs to tell their PR department that people stopped playing Q3 a loooong time ago, and Rage won't win awards with its piss-poor sales on PC either.
Sandy Bridge-E getting panned by most reviewers
Sandy Bridge-E and the X79 chipset are mostly hype and hack. The platform is being touted as an enthusiast platform, but in fact it's a hacked server platform that does not function well for either market segment. The only thing extreme is the CPU and mobo pricing. While there is definitely some performance to be gained in certain apps, which have nothing to do with "enthusiasts", the price is disproportionately high for the modest performance gains. So for now, SB-E is a hacked mess not desired by many.
With a stock watercooler...
What's the power consumption on that thing... 1.21GW?!?
Title is optional?
Temperature too high
At 3.8GHz the power draw is extremely high, motherboard temperatures rise with it, and cooling becomes an important issue.
For me, it's all about RAM addressing.
I've several Core i7 laptops, but they can only hold 16GB of RAM, which isn't much use.
It would be more important for me, as a hobbyist data-architecture researcher, to have a quad-core box that could map 1TB of RAM than to have 32GB of RAM and 12 cores.
Of course, not everyone has the same interests.
Why are you bothering with laptops?
The i7 can take what, 24GB max? Yet you're stuck at 16 because you're letting yourself be hobbled by the form factor. If you wished, you could easily go way beyond that.
A quick search shows system boards that go up to 288GB (2x Xeon), no, make that 512GB (4x Opteron). Not cheap, of course, but if you really need to focus on mapping RAM, why bother with more than one laptop as a fancy interface to your very own server (or cluster of servers)? A couple of boxes in the basement and a bit of networking should go a long way.
Or buy into the cloud and move data around from a continent away.
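On the "mapping 1TB of RAM" point, it's worth separating address space from installed memory. A minimal sketch (a hypothetical example, not from the thread): on a 64-bit OS, `mmap` lets a process address a file far larger than physical RAM, because pages are only faulted in when touched, so a quad-core box can "map" datasets well beyond its DIMM capacity.

```python
import mmap
import os
import tempfile

# Hypothetical demonstration: map a 32GB sparse file -- more than the 16GB
# those laptops can hold -- on a 64-bit system. Only the pages actually
# touched ever occupy physical memory.
size = 32 * 1024**3  # 32GB of address space
path = os.path.join(tempfile.mkdtemp(), "big.dat")
with open(path, "wb") as f:
    f.truncate(size)  # sparse file: no real blocks allocated yet

with open(path, "r+b") as f:
    m = mmap.mmap(f.fileno(), size)
    m[size - 1] = 1          # touch one byte 32GB into the mapping
    last_byte = m[size - 1]  # read it back; only one page was faulted in
    m.close()
os.remove(path)
```

Whether that beats simply buying a board with more DIMM slots depends on the access pattern: demand paging is cheap for sparse or sequential scans, painful for random access across the whole map.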
re:old school programmers
People were saying the same thing a decade ago, and probably a decade before that... "wasting cycles" because the Pentium is so fast.
It hasn't exactly improved in the meantime, now has it?
You can run a very decent workstation on five-year-old hardware as long as you use some free unix*, even the latest releases, but not if you try to use Redmondian software. There is also a very noticeable performance gap between, oh, Opera and Firefox. And even the faster software could likely do better still.
But instead it's standing practice to give developers the fastest hardware available to keep them from twiddling thumbs while their projects compile. Testing on slower boxes easily falls by the wayside. So "everyone" just sighs and buys new hardware. Which is a pity because it means that though we do get some benefit from the increase in hardware capabilities, we're not seeing all of it. Despite our advances in algorithms and such that should actually give us even more performance, over and above the "simple" hardware-offered increase. And yes, this has been going on for ages.
* Currently following video lectures and running matrix problems with Octave on a 1.5GHz VIA C7.