Intel's Sandy Bridge welcomes discrete graphics

The on-chip graphics of Intel's Sandy Bridge processor may be measurably ahead of Chipzilla's previous integrated graphics, but it's not intended to replace discrete graphics for high-end users and dedicated gamers. "I don't see high-end discrete graphics cards going away," said Tom Piazza, Intel Fellow and graphics- …

COMMENTS

This topic is closed for new posts.
  1. Anonymous Coward
    Jobs Horns

    LGZS Platform Controller?

    The current Intel processors apparently need a separate chip called an "LGZS Platform Controller" in order to connect a discrete GPU. Some have speculated (most notably in an article at Ars Technica) that this is why Apple doesn't offer a 13" MacBook Pro with an i5 processor: there simply wouldn't be enough room on the motherboard for the rather large LGZS thingy, the rather large processor, and a similarly large discrete GPU. Apple doesn't want to give up high-performance discrete graphics, so it has to suffer along with the older Core 2 Duo.

    Now my vague understanding is that Sandy Bridge does a better job of integrating things, so perhaps the functions of the LGZS Platform Controller no longer require a separate chip. Does anyone know if this is the case?

    Also, for what it's worth: I've driven across a bridge over the Sandy River east of Portland, Oregon, in the United States. Intel has a habit of assigning code names inspired by Oregon geography, so it seems plausible this bridge is the source of the name. Maybe some chip design team executive commutes across it every day. The bridge itself is a green steel truss bridge in a very scenic area near the Columbia River Gorge.

    Evil Steve because I can't prove he isn't sticking with a C2D in the 13" MacBook Pro just to annoy me, as revenge for me not buying an iPhone.

  2. Trevor_Pott Gold badge

    Show me the 3DMarks.

    Regardless, it still won't play Crysis.

    1. dogged
      Boffin

      not relevant

      nobody actually PLAYS Crysis, Trevor. It's just used for benchmarking.

      1. Trevor_Pott Gold badge
        Unhappy

        but...but...

        ...I like Crysis.

    2. Matt Bryant Silver badge
      Boffin

      RE: Show me the 3DMarks.

      "Regardless, it still won't play crysis." True, it probably won't even if you turn off half the eye candy, but then many of the graphics cards on the market today struggle with Crysis. The point is there are many "light" gamers that won't need that level of graphics capability, and for them it might be a reasonable option. Those that do need that high level of capability are the type that build their own rigs with top-of-the-line graphics cards anyway, and they already have the mindset that says they need to buy a discrete graphics card rather than use onboard (all part of that bragging about your fps in CZ or the like).

      Besides, Crysis is so old that we need a new and hip OTT-demanding game to measure against. DNF, maybe?

  3. Anonymous Coward
    Anonymous Coward

    FAIL!

    The performance benefit of integrating the two is a one-off gain; after that, nVidia will release a faster card to compensate and Sandy Bridge will just be another shit onboard graphics option, unless Intel can somehow keep ahead.

    Anyone with half a brain knows that graphics performance is king in the gaming world, and making it something you can't choose or easily and cost-effectively upgrade is just stupid. In the non-gaming world, very few care about graphics performance.

    1. Chris Beach

      Fail for Intel Maybe

      This might fail for Intel, but for consumers it could mean we finally get what's been missing for a while: decent laptop graphics capability!

      If Sandy Bridge is 'good enough' to make the current AMD and Nvidia offerings too expensive for the performance, then I'd imagine one or both of them will release updated products that outperform the Intel offering. And then we might actually have laptops that can run most games at the native res of the monitor, rather than relying on horrible scaling or running everything on low.

      But having said all that, gaming laptops are a niche. Even if OSes and other products start using the GPU to offload more work, the general consumer probably doesn't care, and Sandy Bridge will be a cheap all-in-one solution for most OEMs.

    2. Matt Bryant Silver badge
      Boffin

      RE: FAIL!

      ".....after that, nVidia will release a faster card to compensate and the sandy bridge will just be another shit onboard graphics....." I have four PCs at home with integrated graphics chips on their mobos, of which only one hasn't had a discrete graphics card added. Two of the PCs with add-in graphics have nVidia chipset mobos with onboard graphics, the other two are Intel chipset. In nVidia's case, they actually don't want to make onboard graphics too good because it would eat into their graphic card market, and a whole mobo usually sells for less than even a middling graphics card. In intel's case it has been because Intel was primarily fixated on the office desktop, which required minimal graphics. Intel has the most to gain by actually making integrated graphics good enough to make a hole in graphic card sales as they don;t really have a graphics card bizz to worry about, which is why nVidia and AMD/ATi better hope Intel don't develop Sandy Bridge into something really good.

  4. TeeCee Gold badge
    FAIL

    "nor do I see Formula One race cars going away just because we built Priuses."

    Gosh, Intel, I hate to piss on your picnic, but even if you were building Bugatti Veyrons rather than Priuses, Formula One cars *still* wouldn't go away.

    Is that your bottom lip quivering?

    1. zimblade

      I am unhappy with intel

      Intel seems to think that the world will keep upgrading to their new platforms every two years. When is enough enough? Computers do a damn fine job now, so why bother with all this integrated graphics business when only a small portion of users will ever use this platform? I personally started using AMD a couple of years ago because of Intel's overpriced, second-rate CPUs, haven't looked back, and have saved a lot of loot in the meantime.
