Apple turns to Intel for low-end laptop graphics

Apple is said to be gearing up to use the next generation of Intel graphics in its MacBook and - probably - MacBook Air laptops, in place of the Nvidia technology they currently use. So claims CNet, citing unnamed moles - whether at Apple or, more likely we'd say, at Intel isn't made clear. Intel's new graphics tech will be …

COMMENTS

This topic is closed for new posts.
  1. Charlie Clark Silver badge
    Jobs Horns

    Wait and see

    Probably makes more sense to use AMD's integrated offering now that the low-power versions are available, and to continue to differentiate at the high end with discrete graphics. Having AMD, Intel and nVidia as suppliers should help with price negotiations as well: Apple doesn't need Intel's engineering expertise as much as it did a few years ago.

    Of course, there is also the possibility of using the PowerVR stuff from the iPhone for graphics. That really would put the cat amongst the pigeons!

  2. Anonymous Coward
    Anonymous Coward

    Old MacBook Air

    The old MacBook Air used to overheat and underclock itself for thermal reasons; that's why the new ones are faster for CPU-bound tasks. Graphics chip performance plays a very small role in overall computer performance unless you are doing 3D gaming. Sure, interface elements in OS X (and now Vista) are drawn as texture-mapped polygons, but you are only going to be looking at maybe a couple of hundred of those on screen at any given time, and any graphics chip made in the last ten years can handle that with no problem.

    Actually, if you look at the Hackintosh community, it is fairly common for people not to be able to get 3D acceleration working, in which case the OS rasterizes everything in software. Animations like Dashboard fading in/out and Exposé are noticeably slower, but otherwise the difference is barely perceptible.

    So the importance of graphics performance is completely overblown, with exceptions for gaming and, potentially, using the GPU as a general-purpose compute engine... not sure if any mainstream software does that yet though.
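
    For what it's worth, Apple already ships a framework for exactly that: OpenCL, which arrived with Snow Leopard. Below is a rough sketch of what "GPU as a compute engine" looks like in practice - a made-up vector-add kernel in C, with names, sizes and error handling chosen purely for illustration rather than taken from any real application:

    /* Minimal OpenCL sketch: add two float arrays on the GPU.
       Purely illustrative - most error checking and cleanup omitted. */
    #include <stdio.h>
    #ifdef __APPLE__
    #include <OpenCL/opencl.h>
    #else
    #include <CL/cl.h>
    #endif

    /* Kernel source: each work-item adds one pair of elements. */
    static const char *src =
        "__kernel void vadd(__global const float *a,\n"
        "                   __global const float *b,\n"
        "                   __global float *c) {\n"
        "    size_t i = get_global_id(0);\n"
        "    c[i] = a[i] + b[i];\n"
        "}\n";

    int main(void) {
        enum { N = 1024 };
        float a[N], b[N], c[N];
        for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

        /* Pick the first GPU device on the first platform. */
        cl_platform_id plat; cl_device_id dev;
        clGetPlatformIDs(1, &plat, NULL);
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

        /* Build the kernel from source at runtime. */
        cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
        clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
        cl_kernel k = clCreateKernel(prog, "vadd", NULL);

        /* Copy inputs to the device, allocate the output buffer. */
        cl_mem ba = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof a, a, NULL);
        cl_mem bb = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof b, b, NULL);
        cl_mem bc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);

        clSetKernelArg(k, 0, sizeof(cl_mem), &ba);
        clSetKernelArg(k, 1, sizeof(cl_mem), &bb);
        clSetKernelArg(k, 2, sizeof(cl_mem), &bc);

        /* Run N work-items, then read the result back (blocking). */
        size_t global = N;
        clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
        clEnqueueReadBuffer(q, bc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);

        printf("c[42] = %f\n", c[42]); /* expect 126.0 */
        return 0;
    }

    The point being: whether code like that runs well depends a lot more on the GPU than drawing a few hundred textured window elements ever will, which is where an integrated Intel part might actually start to matter.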

  3. Goat Jam
    FAIL

    A big leap in GFX performance

    Firstly, I have heard that before from intel and it never amounted to much.

    Secondly, when you are coming from such a low base (as intel are), a "huge leap" doesn't necessarily amount to "great", "good" or even "acceptable" performance when compared to others.

    Thirdly, it is going to take more than putting the CPU and GPU on the same die to make intel a contender in the graphics market.

    Honestly, if I were intel, I would be seriously looking at purchasing nVidia* and dropping the pretense of building sub-par graphics chips in-house.

    They are simply not capable of doing much other than massaging their ancient x86 platform. Almost everything they have tried outside that sphere has been pants.

    Itanic anyone?

    * Not that I personally want that to happen. I like nVidia cards and I would hate to have to move to AMD/ATi just to avoid purchasing stuff from intel.

    1. paulf
      Pint

      "huge leap" / "great" / "good" / "acceptable"

      IIRC Intel reckon their integrated graphics are "good enough" (at least according to some guy at AMD!):

      http://www.theregister.co.uk/2010/11/10/amd_talks_intel_at_analysts_day/

      The most amusing Intel-bashing came from Rick Bergman, senior vice president and general manager of AMD's products group, who took issue with what he claimed was Intel's identification of its graphics performance as "good enough".

      If they're already "good enough" why bother with anything like a "huge leap", eh, Chipzilla?

      I suppose people are just spending all that money on nVidia and ATI graphics on the off chance Intel graphics aren't "good enough".

      Pint - it's Friday

  4. Peter Kay

    Intel graphics chipsets aren't that bad

    Whilst Intel occasionally make claims about gaming performance, that has never been their real focus. Their graphics chipsets provide decent 2D acceleration and video decoding with low power usage.

    For laptops and general office productivity their chipsets are ideal. I have Intel chipsets in my work systems and my laptops and, with few exceptions, don't want or need more power. My home desktop, which plays games as well as doing productivity work, is of course a different matter.

This topic is closed for new posts.
