Move over Radeon, GeForce – Intel has a new graphics brand: Iris

Intel is getting increasingly serious about integrated-graphics performance, and to prove it they've done what any self-respecting marketeer would demand: they've rebranded their top-performing parts. Meet "Intel Iris Graphics", slated to appear in the top-performing parts of Chipzilla's soon-to-be-released "Haswell" chippery …

COMMENTS

This topic is closed for new posts.
  1. Anonymous Coward
    Anonymous Coward

    No thanks. Still don't have a reasonable linux driver for the original one

    Doubt they'll get round to a decent linux driver for the new ones for at least two or three years.

    1. Anonymous Custard

      Re: No thanks. Still don't have a reasonable linux driver for the original one

      Would be nice to have decent Windows drivers too for most of their silicon - at least some that support OpenGL and similar stuff.

      And where are the actual specs for these new chips, not just meaningless graphs without any context, detail or background? So proper comparisons can be done, rather than this all-singing fashion show of dressing up old stuff in new clothes?

    2. Jamie Jones Silver badge
      WTF?

      Re: No thanks. Still don't have a reasonable linux driver for the original one

      The Sandy Bridge 3D drivers on Xorg on FreeBSD work fine

    3. Alan Brown Silver badge

      Re: No thanks. Still don't have a reasonable linux driver for the original one

      It gets worse: newer Intel drivers have explicitly removed support for older chipsets and will continue to do so on a rolling basis.

      That's one way to avoid fixing long-standing driver issues...

  2. Sandtitz Silver badge
    FAIL

    Promises, promises

    Ever since the i740, Intel has promised "amazing oomph" and each time failed to even come close to the competition in anything related to 3D (gaming or design), due to both poor performance and poor driver quality.

    Their products are perfectly fine for regular desktop usage, and I'm typing this on a machine with Intel graphics.

    Intel has a workforce many times bigger than AMD and Nvidia combined; they're working on bleeding-edge CPUs, NAND, lithography and so forth, yet they are constantly years behind AMD/NV products if you measure the gap in pure performance. Why is this?

    1. Anonymous Coward
      Anonymous Coward

      Re: Promises, promises

      Why? Simple: they don't have to be better to win, just good enough. More cash comes from the large majority that just want a display, not to play video games. Also, don't forget that their competition is aiming a wicked eye at them by pushing GPU cycles, cores, and math. If Intel sits still, they could lose the desktop/mobile market once and for all, and we all could be whisked away to CUDA-ville.

      No matter. If the "Intel HDA" driver or its proprietary cousins actually performed in a non-ass way, I'd actually care about this news. However, given that the driver might as well come shipped blacklisted, Intel is wasting their time.

    2. Skoorb

      Re: Promises, promises

      What I am going to be interested in is the bang per buck. How much would an equivalently performing Intel or Nvidia/AMD combo cost for a laptop? If the Intel chip is cheaper, then it becomes very interesting for users that don't do any serious 3D gaming or major CAD work.

      If it is more expensive though, or years behind, then I agree that something is amiss.

    3. Yet Another Anonymous coward Silver badge

      Re: Promises, promises

      > Why is this?

      Because the drivers are written by the guy sweeping up after those working on bleeding-edge CPUs, NAND, lithography and so forth

      1. Anonymous Custard

        Re: Promises, promises

        And good luck finding a CrystalWell product for the desktop. Cos according to all sources I've seen about them so far, basically there won't be any - it's mobile (ultrabook mainly) only on the roadmap.

        1. Epobirs

          Re: Promises, promises

          Why would they bother? Desktop users with graphic performance as a primary concern have plenty to choose from in video cards with Nvidia and AMD parts.

          Intel is much more interested in design wins where power and physical volume are driving factors. The return on investment is far better for enabling better graphics performance with decent battery life in a notebook than for doing anything other than cutting video on the desktop. And as long as the corporate sector is satisfied with Intel's latest, which is still an improvement over the Ivy Bridge GPUs, they will continue to own more desktops than AMD and Nvidia combined by a huge margin. If a cubicle drone can get Skyrim to play decently on his Intel-only box, bonus!

  3. P. Lee

    Won't SGI have something to say about this?

    or am I just showing my age...

    1. Dave 126

      Re: Won't SGI have something to say about this?

      Well, I don't wish to be impertinent.... :D SGI are all about storage, data centres and HPC. I would imagine Silicon Graphics, Inc. would never have had the revenues that the consumer-centric nVidia and ATi had.

    2. Rampant Spaniel

      Re: Won't SGI have something to say about this?

      Silicon Graphics, commonly abbreviated to SGI for as long as I can remember, may have a case there. Given those chips will be featuring alongside Apple's Retina display, expect Rackable's baby to get sued by Apple any time now for their previously infringing workstations.

  4. toughluck
    FAIL

    Wow! 75 times faster than... whaaat?

    Seriously, who are they kidding? Why not claim they are seventy-five HUNDRED (7500) times faster than the ViRGE, the 3D decelerator? While they're at it, why not remember that their Core i7 CPUs are several hundred thousand times faster than the 8088?

    75×rubbish is just rubbish, but more of it. Their drivers are bad, and the performance is in the basement compared to integrated GPUs from AMD. While they could be on to something with Iris, the competition would have needed to stand still for the last five years. Wake-up call, Intel! You are NOT competing with a 2006 chipset-integrated Radeon or GeForce! You're going to compete with 2014 APUs which are going to include hUMA (which for most users will mean PS4-like GDDR5 system memory). Your GPU may well be 75 times faster than in 2006, but AMD's GPUs have made more improvement in the last 7 years and you are not going to fool anyone.

    1. Dave 126

      Re: Wow! 75 times faster than... whaaat?

      > you are not going to fool anyone.

      Er, who do you think they are trying to fool? HD 4000 is already plenty good enough for anything other than more recent games and CAD work, handles transcoding quickly enough, and is happy to run a few monitors and decode some full HD video.

      Gamers and CAD users know their own needs, and will usually buy a machine with discrete graphics hardware - after having researched benchmarks, game frame rates and any reports of driver issues.

    2. Voland's right hand Silver badge
      Devil

      Re: Wow! 75 times faster than... whaaat?

      Seconded.

      In any case, AMD checkmated everyone in the GPU integration game by making it cache-coherent in the announcement of their next GPU. That is not just "faster", it is differently faster - GPU ops no longer have the latency associated with them, and the GPU becomes one enormous co-processor.

      Everything else (including what Intel does) is bundling and bill of materials savings. 75 times faster snail is still a snail.
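      To illustrate the point (a toy sketch of my own, not anything from AMD's announcement): with a discrete GPU, every offloaded job pays for staging copies across the bus before and after the kernel runs, while a cache-coherent GPU operates on the very memory the CPU already sees.

```python
# Toy model of why coherence matters. Lists stand in for memory;
# the copies stand in for PCIe transfers a discrete GPU must make.

def discrete_gpu(data):
    device = list(data)               # copy host -> device (bus latency)
    device = [x + 1 for x in device]  # "kernel" runs on the device copy
    return list(device)               # copy device -> host (bus latency)

def coherent_gpu(data):
    # Same "kernel", but it works in place on shared, coherent memory:
    # no staging copies, so small offload jobs stop being a net loss.
    for i, x in enumerate(data):
        data[i] = x + 1
    return data

print(discrete_gpu([1, 2, 3]))  # [2, 3, 4]
print(coherent_gpu([1, 2, 3]))  # [2, 3, 4]
```

      Both produce the same answer; the difference in a real system is the two bus trips the discrete path makes for every kernel launch.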

      1. Nigel 11
        Headmaster

        75 times faster snail is still a snail.

        Err ... no.

        Say a snail can do 1 cm/second. 75 times that may be a funereal walking pace, but is still far beyond any land-dwelling mollusc.
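        For what it's worth, the arithmetic holds up -- a quick back-of-the-envelope check, assuming the 1 cm/second snail above:

```python
# Sanity check on the snail arithmetic.
snail_speed_cm_s = 1.0           # assumed garden-snail pace: 1 cm/s
sped_up = 75 * snail_speed_cm_s  # the claimed 75x speed-up
kmh = sped_up * 3600 / 100_000   # cm/s -> km/h (3600 s/h, 100,000 cm/km)

print(sped_up)  # 75.0 cm/s
print(kmh)      # 2.7 km/h -- a funereal walking pace, but no mollusc
```

        2.7 km/h is well short of a typical ~5 km/h walking speed, hence "funereal" -- but several orders of magnitude beyond any snail.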

    3. Steve.T

      Re: Wow! 75 times faster than... whaaat?

      A simple glance at the diagram shows that they're comparing it to the 2006 series, which was the first Core 2 Duo (code-name Conroe). Reading graphs is not rocket science :/

      What is amazing, though, is that the units have decreased TDP yet increased overall performance, meaning more performance per watt. From the demos I've seen, these processors are serious about gaming and any 3D application (they play BF3 and MOH just fine). You can watch ~20x 720p videos without breaking a sweat (under 30% CPU load) (if your SSDs can push the data through). Given that Intel didn't buy ATI like AMD did, they have a steeper learning curve...

      In any case it will be interesting to see just how well they fit into tablet markets, where you don't need high-end workstations but a combination of on-demand performance and long battery life.

      1. toughluck
        WTF?

        Re: Wow! 75 times faster than... whaaat?

        @Steve.T: Reading comprehension, man. It was obviously irony. Should I have used HTML5-compliant <sarcasm> tags?

        They can be used for light gaming, assuming you're happy with 1366×768 resolution at the absolute lowest settings (some games provide an Intel-specific setting, which offers quality even below the basic one).

        Oh, and funny you should mention AMD bought ATi. Remember the Intel740? Thought not. Intel bought Real3D and released their GPU in 1998 -- eight years before AMD bought ATi. They had EIGHT MORE YEARS to develop the (admittedly rubbish at the time) solution into a solid product. When AMD bought ATi, ATi were struggling with their lineup, slowly recovering from the 2000 series debacle with the notably improved 3000 series, but they weren't well entrenched until releasing the 4000 series and Evergreen. Integrated GPUs from ATi were already vastly better than Intel's at that time, and the GPUs were excellent without much prior support from AMD. Intel's GPUs continued to lag behind AMD's, and when AMD integrated them into APUs, Intel was again outstripped.

        Between 1998 and 2006, Intel had time to improve their GPUs. They failed. They had eight years of opportunity to integrate the CPU and GPU within the hardware, even when the GPU resided in the northbridge, but they didn't care about it. Since 2006 they have slowly improved, with each generation roughly doubling the performance, but it was still way behind the curve. Seeing Intel's lack of initiative, I have to call bullshit on this 'Iris'. Maybe Haswell is not going to bring anything new to the table in terms of graphics (aside from increased clocks), and Iris is just a way to counter the lack of performance by doubling the number of GPUs.

        As for playing video streams -- Intel's CPUs DO NOT use the GPU portion for decoding the stream. The CPU has a dedicated processing unit for this. And although it is impressive in its own right, it is supposed to play high numbers of video streams without breaking a sweat.

        And your last paragraph -- as long as Intel is trying to stick x86 into everyone's face, they will continue to fail. And it's funny how Intel continuously claims that their target is ahead of them. When the i740 was released, high-performance GPUs were their future. They failed. Then they said their goal was the best integrated graphics. They failed. Then they were supposed to release Larrabee, which was supposed to introduce Intel to the enthusiast GPU market. When that failed, they said Larrabee had been intended for heterogeneous computing all along and that they never intended it to be a GPU. Now you are saying their goal is best performance in tablets? Ain't gonna happen. Iris isn't going to convince anyone, either.

  5. Anonymous Coward
    Anonymous Coward

    Not exactly...

    All Intel has done is rename their poor HD 4000 graphics. The performance gains are minimal and not even close to what they need to be competitive in APUs.

    1. Anonymous Coward
      Anonymous Coward

      Re: Not exactly...

      If you look at product sales, it would seem that Intel is a bit more than just competitive.

    2. JDX Gold badge

      Re: Not exactly...

      The HD4000 is pretty good, a massive leap from the previous generation. It runs full D3D10/11 and our 3D software runs on it happily. Obviously it's not going to be used in a gaming rig, but it will run games.

      1. Nick Ryan Silver badge

        Re: Not exactly...

        Agreed. They are leaps ahead of what they were before. It's no longer a case of "argh, integrated Intel chipset with an obscure set of digits that'll take a day to track down" but rather "that's not so bad, it works OK now".

        They're not speed demons, but at least now they are perfectly adequate for the majority of computer users' needs. i.e. Using a computer in place of a typewriter and browsing the web a bit.

  6. Anonymous Coward
    Anonymous Coward

    Has this product come out of the remnants of the ill-fated graphics project that Intel shut down a couple of years ago? I can't recall the name of that project, but it got a lot of press at the time. Unfortunately the project did not come to fruition.

    1. Anonymous Coward
      Anonymous Coward

      You're thinking of Larrabee. Basically 16+ general purpose x86 CPUs on one chip. Would have been pretty cool for general purpose computing but presumably wasn't that great for game graphics.

      http://en.wikipedia.org/wiki/Larrabee_(microarchitecture)

      1. TeeCee Gold badge

        Would have been pretty cool for general purpose computing...

        But nowhere near as much bang for buck as GPU compute with conventional GPUs from AMD and nVidia. Presumably that's what caused it to be stillborn: everyone else moved the goalposts.

        1. Anonymous Coward
          Anonymous Coward

          "But nowhere near as much bang for buck as GPU compute with conventional GPUs from AMD and nVidia."

          Depends on workload. There are a lot of non-vector compute things you still can't do with those GPU cores. There's a reason you're not using one to run Windows 8 (or whatever).
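          As a hypothetical illustration of that point (workloads and names are mine, not from the thread): a uniform data-parallel loop maps neatly onto thousands of GPU lanes, while a data-dependent pointer chase serialises completely and would leave SIMD hardware idle.

```python
# Two toy workloads: one GPU-friendly, one GPU-hostile.

def data_parallel(xs):
    # Every element gets the same independent operation --
    # trivially vectorisable, exactly what GPU lanes are built for.
    return [x * x for x in xs]

def branchy_walk(next_node, start, limit):
    # A linked-list walk: each step depends on the previous one,
    # so no two iterations can run in parallel, and the
    # data-dependent exit condition diverges across lanes.
    node, steps = start, 0
    while node is not None and steps < limit:
        node = next_node.get(node)
        steps += 1
    return steps

print(data_parallel([1, 2, 3]))           # [1, 4, 9]
print(branchy_walk({0: 1, 1: 2}, 0, 10))  # 3
```

          The first function is the kind of job GPU compute devours; the second is the kind of "non-vector" work that still wants a general-purpose core.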

  7. Anonymous Coward
    Joke

    IRIS

    Naming it after my (blind) Great Grandmother says it all.

    Will it also smell of Lavender????

    1. Anonymous Blowhard

      Re: IRIS

      No, silly, it's because the iris is in front of the retina; next generation will be called the "Cornea"

      1. Martin Maloney
        Coat

        Re: IRIS

        Oh, no! Are the jokes around here getting, erm, cornea?

      2. Patrick R
        Joke

        Re: IRIS drivers

        Update to the latest Cataract.

        1. Martin Maloney
          Coat

          Re: IRIS drivers

          Lens be careful. Some folks might take a dim view of cataract jokes.

          1. wowfood
            Trollface

            Re: IRIS drivers

            I see what you did there. But really, we shouldn't be so short-sighted about all this. Intel have managed to perform miracles before. Now I don't advise blind faith, but we should still keep an eye on how this develops.

  8. jonathan keith

    Yeah, right

    Have they hired some new coders to replace the bunch who write their legendarily awful drivers?

    1. Anonymous Coward
      Anonymous Coward

      Re: Yeah, right

      Seem to work fine with OS X.

  9. Wang N Staines
    Happy

    If MS is not allowed to bundle IE with Windows then Intel shouldn't be allowed to bundle their graphics with the CPU.

    ;-)

  10. Dropper
    FAIL

    Huh

    Good job Intel, but 1987 called to ask for their Blitter back.
