AMD's Fusion details break from containment

Since AMD announced plans for the CPU/GPU 'Fusion' processor on the back of its 2006 acquisition of ATI, the company has been extremely guarded with details. But folks at TGDaily say they've unearthed news on Fusion chips from unnamed industry sources. According to the publication, the first Fusion processor, code- …

COMMENTS

This topic is closed for new posts.
  1. Anonymous Coward
    Anonymous Coward

    Hmm, sounds like the cell.

    A CPU that also has the qualities of a GPU.

    The only difference, of course, is that coming from AMD/ATi it's bound to be crap.

  2. Danny
    Paris Hilton

    Is it me

    or does this seem like a mad thing to do? Upgradeability will be poor. I doubt Crossfire will work to extend the life of a part. What about multi-core? Heat? Are we sure these aren't just going to be a low-end integrated graphics killer rather than high-end parts?

    Paris, she likes high end parts.

  3. Ru
    Thumb Down

    Re: Hmm, sounds like the cell.

    Only it will have a graphics core instead of little vector processing things. So you might be able to use it in a Cell-like way (depending on what sort of access you get to the graphics core), but you couldn't use a Cell like a Fusion chip, because the Cell doesn't have a graphics core on it.

    So if you actually meant 'it's like a Cell because it's a processor' or 'it's not really very much like a Cell' then, yes, I agree.

  4. Anonymous Coward
    Anonymous Coward

    RE: Hmm, sounds like the cell.

    It sounds NOTHING like a Cell; it's a completely different type of CPU with a completely different purpose...

    Basically, it seems to me that what Intel and AMD are doing is taking basic graphics out of the chipsets (which will simplify chipset lines, as only the non-graphics variants would be necessary) and putting it on the CPU package. Simpler for OEMs: the same chipset for all SKUs, from the basic models with fused CPUs up to the higher end with a standard CPU-only package and a graphics card in a PCI-E slot, and no wasted silicon (and no wasted few £/$) in the high-end models where the chipset's integrated graphics wouldn't be used.

    It could also simplify upgrades, and mean that a CPU is always accompanied by a matching level of basic graphics. System design can be simpler too: just buy one CPU that is certified for HD video, for example, rather than having to buy a CPU that's powerful enough and then check that your graphics card can do it as well...

    Doesn't seem a bad idea at all...

  5. Eric Van Haesendonck
    Thumb Up

    Good idea

    I think it's a very good idea, as this removes the need for a graphics core in the chipset or an external GPU (except for the hardcore gamers, of course).

    This would also allow for better "integrated" graphics than when it's integrated on the chipset, since the CPU die is cooled by the CPU fan (while the chipset usually has passive cooling), so we could see machines with decent graphics performance and only two fans: one for the CPU and one for the PSU.

    Also, this could lead to some savings on the motherboard side, since with the GPU and memory controller integrated only one support chip would probably be needed (no separate north and south bridges). In the case of compact machines or laptops this could be a major advantage.

  6. Anonymous Coward
    Anonymous Coward

    Can compete with discrete parts?

    I am wondering if this CPU+GPU approach will be able to perform as well as a discrete GPU plus a CPU. Sure, no more PCI bottlenecks with this approach, but current discrete GPUs draw a lot of power, far more than current CPUs, so I guess that with this new concept the GPU will have to be more power efficient.

    Anyway, it seems a good idea for laptops, notebooks and desktops for non-hardcore gamers (so, for 99% or so of all computers).

  7. Anonymous Coward
    Heart

    Integrated chips

    It looks like they are going for laptop and business PCs, where Intel chippery dominates.

    It makes sense; just look at the reviews of the 780G chipset. This is just another logical step. It will bring down prices without affecting the 90% of users who just want to surf the web, watch videos, type letters and edit photos...

  8. Jonathan

    The nice thing about this is...

    For those of us who want discrete graphics processors (like me), it means you have a free vector processor to use for physics calculations. With both ATI and Nvidia looking into physics on GPUs, it's not much of a stretch to see that if you had a Fusion CPU and a discrete GPU, your Fusion GPU could do the physics while your discrete GPU does the graphics. Sounds like a match made in gaming heaven to me.
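
    A toy sketch of how that split might look each frame (the Device class and its submit() method below are entirely made up for illustration; a real implementation would go through a vendor compute API, not this):

        # Hypothetical per-frame work split between an on-package Fusion GPU and a
        # discrete card. Nothing here touches real hardware; it only illustrates
        # the scheduling idea.

        class Device:
            def __init__(self, name):
                self.name = name

            def submit(self, job, work_items):
                # Stand-in for dispatching a compute or render job to this device.
                print(f"{self.name}: {job} ({work_items} work items)")

        fusion_gpu = Device("integrated Fusion GPU")  # assumed on-die shader array
        discrete_gpu = Device("discrete PCI-E GPU")   # assumed add-in card

        for frame in range(3):
            # Physics runs on the integrated part, close to the CPU and system RAM...
            fusion_gpu.submit("rigid-body physics step", 10000)
            # ...while the discrete card keeps all its bandwidth for rendering.
            discrete_gpu.submit("scene rendering", 2000)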

    @AC

    Yeah, everything that ATI/AMD makes is crap; that's why the 4850 and 4870 are the new cards to have, forcing Nvidia to drop its prices by 25%. Maybe you would rather there was no AMD and Nvidia could charge whatever they liked?

  9. Anonymous Coward
    Stop

    RE: Hmm, sounds like the cell.

    Smells like a Sony Playstation 3 fanboi.

  10. Charles
    Thumb Up

    @Jonathan

    I was just thinking of that angle. Of course, the built-in GPU needs to be pretty decent to start with. I can tell you now that if it's pretty basic (say, like an nVidia 8400GS), there's not much benefit over a modern multi-core CPU. OTOH, if it's closer to, say, the 8800GT (which is capable of running games at a pretty solid clip), then you can do some serious stuff with it, with or without the added boost of a new GPU.

  11. Dave

    Integrate to survive

    I think it's great that Intel has a competitor and all.

    But AMD's advancement strategy seems to be about integrating traditional mobo components to give its processors the edge. First the memory controller, and now the GPU...

    Perhaps there is more to this? Perhaps this is going to be an architectural paradigm shift in the same vein as Larrabee.

    But perhaps this is just a desperate slog towards a system-on-a-chip, to scrounge some reduced latency and market it as a processor improvement.

    I'll look forward to the details, but if AMD want to stimulate market interest, they are going to have to give us more than this.

  12. Peyton
    Happy

    @sounds like the cell

    Oh please god no! Give me something I can actually program for without losing my sanity!! @_@

  13. benn gold
    Thumb Up

    back to the future

    Am I mistaken, or didn't Intel have a Fusion-type processor way back when with the i860? The difficulty in getting the most bang for the buck (as a supercomputer-type part) was in actually getting to use the graphics instructions effectively. I think the VLIW instructions were an effort to move in that direction.

  14. Maxwell Starr
    Thumb Up

    Think about it...

    An RV800 graphics chip clocked at full CPU speed... If the HD 4850 has 800 shaders at 625MHz, think of what 100+ shaders with no PCI bottlenecks could do at four times the clock speed. That'll be some decent integrated graphics.
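
    Rough numbers for that comparison (assuming 100 shaders at four times the 625MHz clock, and counting only peak multiply-add throughput while ignoring memory bandwidth entirely):

        # Peak shader throughput = shaders x clock x 2 FLOPs (one multiply-add per cycle).

        def peak_gflops(shaders, clock_mhz, flops_per_clock=2):
            return shaders * clock_mhz * flops_per_clock / 1000.0

        hd4850 = peak_gflops(800, 625)            # discrete HD 4850: 800 shaders @ 625MHz -> 1000 GFLOPS
        fusion_guess = peak_gflops(100, 4 * 625)  # hypothetical on-die part: 100 shaders @ 2.5GHz -> 500 GFLOPS

        print(f"HD 4850 peak:            {hd4850:.0f} GFLOPS")
        print(f"Hypothetical Fusion GPU: {fusion_guess:.0f} GFLOPS")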

  15. Charles

    @benn

    According to what I've read, the i860 suffered because it tried to put too much of the pipelining and scheduling work on the programmer and compiler (similar to the Itanium). It proved to be too difficult to program efficiently.

    Whereas with Fusion, they're integrating two well-established designs: an x64 CPU and a modern ATI GPU, both of which have a long real-world history.

  16. Kurt Guntheroth

    $1k cpu

    Integrated graphics isn't about what you want. It's about what Intel wants.

    The makers still dream about the days when they got $1,000 for a chip. Well, nowadays it's $200 a chip. CPUs are getting smaller and cheaper. The only way to get the price they want is to stuff more onto the chip. They'll integrate RAM, graphics, sound, anything.

    Upgradeability sucks, but they want you to upgrade to a new PC every year or two, like with your cell phone, only for more bucks. In fact, they want your PC to become a single sealed unit, just like a cell phone. It's cheaper to make, it will be lighter and thinner, and it will be oh so disposable.

    For the makers, the alternative is unthinkable: chips getting cheaper every year until you pay more for the steel in the case than for the magic in the chips. PCs MUST be expensive and difficult to upgrade.

  17. Doug Lynn

    AMD still rules!

    Hi, AMD is not crap. Intel still does not have a native quad-core chip. AMD is innovative, and they still have the best gaming CPU out there for Unreal Engine 3 in the Phenom.

  18. Jonathan Tate
    Alert

    Larrabee. Larrabee! *LARRABEE!!!*

    Seeing as I'm turning out to be the resident Larrabee fanboy, I just figured it'd be appropriate.

    Exclamation point because LARRABEE!! LARRABEEEE!!!

  19. Steven Knox
    Dead Vulture

    code-named "Shrike"

    Named after the bird, or the horribly beweaponed metallic assassin from the Hyperion series?

    <--- a shrike after crossing the Shrike

  20. b166er

    Hardcore gamers?

    Please try to recall that there are many people, other than gamers, who appreciate decent graphics processing.

    Architects, graphic designers and scientists, to name a few.

    I thought these next-generation solutions were in part designed to do simple hand-offs to discrete solutions when more grunt is required.

    A much greener solution - less silicon and manufacturing, less power required in the majority of systems, with the ability to ramp up as required. Win-win.

  21. Zmodem

    I'm lost

    A vector is a few triangles.

    N/S bridges were on boards long before the nForce bottleneck.

  22. joshua kidd

    old news really

    AMD roadmaps have been showing these for almost a year. They will start with low-end parts this year and high-end parts at the end of 2009, with the best ones coming in early 2010 with the drop in process. Putting a GPU on the CPU can result in massive gains in performance, as the CPU-to-GPU pathway is direct, which is one of the bottlenecks in the system today. The system memory pathway is also direct, since the memory controller is on the chip (most likely outside the main cores), so the GPU will have direct access to it. And with the memory controller's ability to address more than one bank of RAM, and even different types per bank, it's possible we might see GDDR sticks sold separately, and motherboards with separate banks for system memory and graphics memory.

    Of course, most of this comes from articles already written, which are themselves speculation based on what AMD's current crop of CPUs can already do but which just isn't being used, except on multi-processor Opteron boards.

    This tech could significantly change PC architecture, or it may flop or end up a niche market for laptops and business PCs; compatibility and standards will determine its ultimate fate.

  23. Anonymous Coward
    Anonymous Coward

    Cheap PCs...

    It'll be crap for gamers: current high-end graphics cards have about 512MB of extra-fast memory on an extra-wide bus. Looks like this'll use system RAM.

    Microsoft's latest OS (Vista) still doesn't have cheap machines that can run it properly.

    Businesses will eventually have to switch to it, so there is demand for cheap-as-chips machines with reasonable graphics.

    So, stick everything on one chip. If they could eliminate the PCI bus completely, the pin count would be even lower and the whole caboodle would fit in a nice small cheap box.

    If you're making a mass-produced discount system that businesses will buy by the thousand, then this is the kind of stuff you want.

  24. Anonymous Coward
    Anonymous Coward

    /yawn

    Not interested in any integrated stuff that gets switched off in the BIOS as soon as I install a card that does it better. In fact, cache and core type are what I look at when buying.

    Low power consumption is what they should be working on; I won't be impressed till I see a chip that can power itself from the heat generated by running Vista and DX10 simultaneously.
