AMD's 'Revolution' will be televised ... if its CPU-GPU frankenchip Kaveri is a hit

AMD has released its long-awaited Kaveri processor, the first accelerated processing unit (APU) to incorporate on-die CPU and GPU cores in a heterogeneous system architecture (HSA) with shared memory.

[Article image: AMD Kaveri overview. Caption: Behold Kaveri, in which CPUs and GPUs are equal 'compute unit' partners]

COMMENTS

This topic is closed for new posts.
  1. phil dude
    Linux

    linpack....

    or it didn't happen....

    I trawled high and low at SC13 trying to find AMD benchmarks; NVIDIA and Intel have been far more aggressive in this area.

    This is a potentially exciting area for those of us who wobble molecules, and perhaps even add more physics to games ;-)

    P.

    1. Captain Scarlet Silver badge
      Childcatcher

      Re: linpack....

      Definitely would like to see how it would fold compared to a normal machine

    2. Anonymous Coward
      Anonymous Coward

      Re: linpack....

      So by "wobbling molecules" you mean more realistic jubbly bounce on Lara Croft in the next Tomb Raider?

  2. Semtex451
    Mushroom

    Miners will Decide

    We know how well Hawaii churns out cryptocoins.

    If the experience of AMD's discrete GPUs is anything to go by, they'll be unable to bake enough of these to meet the demand.

    Kerrrching

    1. Steve Todd

      Re: Miners will Decide

      Mining with GPUs still? How very old-fashioned (and unlikely to make your costs back now that you have to compete with FPGA and ASIC rigs).

      1. Semtex451

        Re: Miners will Decide

        Alas yes - Litecoin in particular has caused a shortage of many of AMD's GPUs, not just Hawaii

      2. Nick Ryan Silver badge

        Re: Miners will Decide (@Steve Todd)

        While ASICs are the only way ahead (largely forget FPGAs as well as GPUs), this is currently only true for Bitcoin or very similar coins. Other algorithmic digital currencies have different requirements that don't suit ASICs as well, because they were deliberately designed that way from the outset.

  3. James 51

    Will this need new code or recompilation to take advantage of all the goodies?

    1. Destroy All Monsters Silver badge

      Recompilation ain't never gonna push your code to the GPUs, mon.

    2. ThomH

      In most cases it'll mean recoding and recompiling to take full advantage. Probably only OpenCL apps will just work more quickly, and theoretically DirectCompute apps too, but Microsoft's inexplicable decision to bury that in DirectX makes it somewhat obtuse and hence obscure.

      Right now Nvidia seems to dominate the market for compute languages with its proprietary CUDA, which isn't going to work on an AMD product, but CUDA and OpenCL aren't even as different as, say, C++ and Java; they're built on fundamentally the same concepts. Think more like mid-'90s C++ and Ada95.
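
      To see how close the two models are, here is a toy C++ snippet (not from any real app; the vector-add kernel is made up for illustration) that embeds the same trivial kernel written for CUDA and for OpenCL. The only real differences are the qualifier keywords and how a work-item finds its index.

      ```cpp
      #include <cstdio>

      // Toy illustration only: the two kernel sources below do the same job.
      // CUDA version (compiled by nvcc, runs only on Nvidia hardware):
      static const char* cuda_kernel = R"(
      __global__ void vadd(const float* a, const float* b, float* c, int n) {
          int i = blockIdx.x * blockDim.x + threadIdx.x;  // work-item index
          if (i < n) c[i] = a[i] + b[i];
      }
      )";

      // OpenCL version (compiled at runtime by any vendor's OpenCL driver):
      static const char* opencl_kernel = R"(
      __kernel void vadd(__global const float* a, __global const float* b,
                         __global float* c, int n) {
          int i = get_global_id(0);                       // work-item index
          if (i < n) c[i] = a[i] + b[i];
      }
      )";

      int main() {
          // Nothing is launched here; this just prints the two sources side by side.
          printf("CUDA:%s\nOpenCL:%s", cuda_kernel, opencl_kernel);
          return 0;
      }
      ```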

      1. Dave 126 Silver badge

        >Right now Nvidia seems to dominate the market for compute languages with its proprietary CUDA, which isn't going to work on an AMD product

        True at the moment, but ever since Apple announced the new (AMD-powered) Mac Pro, some software developers have been shifting their wares to work with OpenCL. Speaking naively, the consumer benefits since in time they will no longer be tied to one GPU vendor - nVidia gear can do OpenCL too, the clue is in the name.

        1. Dave 126 Silver badge

          >Litecoin in particular has caused a shortage of many of AMD's GPUs, not just Hawaii

          Ironic really, since Litecoin was supposed to be based on the scrypt proof-of-work, which deliberately imposes RAM demands so as not to hand an advantage to GPUs over CPUs. Alas, they didn't implement scrypt properly, so GPUs still give roughly a 10x advantage over CPUs.
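
          For the curious, the memory-hard trick behind scrypt can be sketched in a few lines: build a large table with a chained mixing function, then read it back in a data-dependent order, so going fast means keeping the whole table in RAM. The sketch below uses a non-cryptographic mixer and made-up sizes purely to show the shape of the idea; it is nothing like the real scrypt parameters.

          ```cpp
          #include <cstdint>
          #include <cstdio>
          #include <vector>

          // splitmix64-style mixer, standing in for a real hash purely for illustration.
          static uint64_t mix(uint64_t x) {
              x += 0x9E3779B97F4A7C15ULL;
              x = (x ^ (x >> 30)) * 0xBF58476D1CE4E5B9ULL;
              x = (x ^ (x >> 27)) * 0x94D049BB133111EBULL;
              return x ^ (x >> 31);
          }

          // Toy memory-hard "proof of work": the second loop's read index depends on
          // the data, so the whole table has to stay resident. Hardware with little
          // fast memory per thread gains far less here than on plain hashing.
          static uint64_t toy_memory_hard(uint64_t seed, size_t table_size) {
              std::vector<uint64_t> table(table_size);
              uint64_t v = seed;
              for (size_t i = 0; i < table_size; ++i) {  // 1. fill the table sequentially
                  v = mix(v);
                  table[i] = v;
              }
              for (size_t i = 0; i < table_size; ++i) {  // 2. data-dependent random reads
                  v = mix(v ^ table[v % table_size]);
              }
              return v;
          }

          int main() {
              printf("digest: %016llx\n",
                     (unsigned long long)toy_memory_hard(42, 1 << 20));  // ~8MB table
              return 0;
          }
          ```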

  4. John Savard

    If the GPU can be used for computation other than video (which has indeed been done for a while now), it might be asked whether the VCE, the UVD and the audio thingy couldn't also be pressed into service for any maths that fits their capabilities!

    But since the regular CPU is what runs most code, just calling the GPU cores 'cores' is going to be perceived as confusing, or even deceptive, marketing. They really shouldn't have gone there.

  5. Smudged

    Bought one this morning

    It is going to be part of my son's late Christmas present of a (hopefully) mid-range gaming PC. We've just been waiting on the Kaveris being released.

    If I can mine a bitcoin or two on it, brilliant.

    1. Nick Ryan Silver badge

      Re: Bought one this morning

      You won't be able to mine bitcoins on it. OK, technically you could, but your hash rate will be orders of magnitude below even the current ASICs, let alone those due to come online soon (assuming the new ASICs aren't vapourware, of course).

      The website https://en.bitcoin.it/wiki/Mining_hardware_comparison has a definitive comparison of CPU vs GPU vs ASIC bitcoin hash rates. Average GPUs are 10x faster than the best CPUs, current cheaply available ASICs are 10x faster than the best GPUs and the new wave of ASICs that are promised will be 100x-1000x faster still.
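
      As a rough sense of scale (the figures below are illustrative placeholders, not taken from that wiki page), the expected time for one rig to find a block on its own is approximately difficulty x 2^32 / hash rate, which is why solo mining on a CPU or GPU is hopeless:

      ```cpp
      #include <cstdio>

      // Back-of-envelope only: expected hashes to find a Bitcoin block is roughly
      // difficulty * 2^32, so expected solo time = difficulty * 2^32 / hash_rate.
      // The difficulty and hash rates below are illustrative, not measured figures.
      int main() {
          const double difficulty = 1.4e9;       // hypothetical network difficulty
          const struct { const char* name; double hps; } rigs[] = {
              {"decent CPU", 10e6},              // ~10 MH/s (illustrative)
              {"decent GPU", 500e6},             // ~500 MH/s (illustrative)
              {"cheap ASIC", 5e9},               // ~5 GH/s (illustrative)
          };
          const double expected_hashes = difficulty * 4294967296.0;  // 2^32
          for (const auto& r : rigs) {
              const double years = expected_hashes / r.hps / (3600.0 * 24 * 365);
              printf("%-10s: ~%.0f years per block, on average\n", r.name, years);
          }
          return 0;
      }
      ```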

  6. Gary F

    A load of marketing hot air

    What if I order some new servers for our data centre with AMD "compute cores" and discover post-installation that they only have 4 cores, not 8, that can be used productively by our web and SQL software? I'd have 4 "compute cores" per socket doing absolutely nothing. I would feel very short-changed and annoyed with this silly notion of no longer discriminating between CPU and GPU.

    1. Destroy All Monsters Silver badge
      Holmes

      Re: A load of marketing hot air

      I hope you are not actually in charge of procurement.

    2. Scott Pedigo
      Headmaster

      Re: A load of marketing hot air

      The article implies that in the future there could be various combinations of CPU and GPU cores, for example a chip with 8 CPU and 4 GPU. They'll be oriented toward specific needs, so if you only want CPU cores there will presumably be a part like that. Nobody will limit your selection; you will just have to choose the correct product for your needs.

  7. Guillermo Lo Coco

    Does the VCE lack VP8 & VP9 support?

    1. Anonymous Coward
      Anonymous Coward

      > Does the VCE lack VP8 & VP9 support?

      Yeah, that was my thought looking through the codec list.

      Could they not add an equivalent of an on-board FPGA for this function and effectively program it via firmware?

  8. 1Rafayal

    I stopped reading when....

    ....I saw it didn't have an entire PS4 on-die.

  9. Frumious Bandersnatch

    It's a poor sort of memory that only works one way

    By which I mean sticking with the one-dimensional layout. As you up the number of cores and (as they're doing here) introduce more speculative pre-fetching on either side of a branch, you put more and more strain on memory bandwidth. It's all well and good scaling your compute cores up to 12, but the memory bandwidth just isn't keeping pace. Wouldn't it make more sense to look at going 2D or some other arrangement (maybe even use a projective plane like the Fano plane and let apps build custom topologies on top of it)? Even the PS3 had two ring-shaped memory buses (though main memory itself wasn't laid out like that), so it's not beyond the realm of possibility to get novel memory buses into consumer/general-purpose machines.

    Maybe this is something we can expect to come along eventually thanks to the SeaMicro purchase? Or is that purely for inter-system connections?

  10. James Wheeler

    Actually a very important development

    AMD seems to be pitching these chips at consumer gaming devices, which I guess makes sense if you're showing at CES. But I'm much more excited about the potential for this architecture to solve the fatal flaw in today's GPU model: the need to copy data onto and off the GPU in order to take advantage of its vector architecture. When the GPU and main CPUs are on the same die, with direct access to the same memory (and the same on-chip cache?), the potential uses for vector-assisted number crunching expand from just big scientific/maths tasks to things like BI and business analytics. I couldn't care less about having this in a game console, but put the architecture in a data warehouse and analytics server and it could be a very big deal.
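
    For readers who haven't written GPU code, here is a minimal OpenCL host-side sketch of the copy-on/copy-off round trip being complained about (assuming an OpenCL 1.x runtime and headers; error handling omitted, and the 'scale' kernel is just a placeholder). Steps (1) and (3) are exactly the traffic that a shared-memory HSA part is meant to eliminate.

    ```cpp
    #include <CL/cl.h>
    #include <cstdio>
    #include <vector>

    // The classic "copy on, compute, copy off" pattern for a discrete GPU.
    // Every call should be error-checked in real code.
    static const char* kSrc =
        "__kernel void scale(__global float* data, float factor) {\n"
        "    size_t i = get_global_id(0);\n"
        "    data[i] *= factor;\n"
        "}\n";

    int main() {
        const size_t n = 1 << 20;
        std::vector<float> host(n, 1.0f);

        cl_platform_id plat;  clGetPlatformIDs(1, &plat, nullptr);
        cl_device_id dev;     clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, nullptr);
        cl_context ctx = clCreateContext(nullptr, 1, &dev, nullptr, nullptr, nullptr);
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, nullptr);
        cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE, n * sizeof(float),
                                    nullptr, nullptr);

        // (1) Copy the data across the bus into the GPU's own memory.
        clEnqueueWriteBuffer(q, buf, CL_TRUE, 0, n * sizeof(float), host.data(),
                             0, nullptr, nullptr);

        cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, nullptr, nullptr);
        clBuildProgram(prog, 1, &dev, nullptr, nullptr, nullptr);
        cl_kernel k = clCreateKernel(prog, "scale", nullptr);
        float factor = 2.0f;
        clSetKernelArg(k, 0, sizeof(cl_mem), &buf);
        clSetKernelArg(k, 1, sizeof(float), &factor);

        // (2) Run the kernel against the device-side copy.
        clEnqueueNDRangeKernel(q, k, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);

        // (3) Copy the results back over the bus before the CPU can touch them.
        clEnqueueReadBuffer(q, buf, CL_TRUE, 0, n * sizeof(float), host.data(),
                            0, nullptr, nullptr);

        printf("host[0] = %f\n", host[0]);  // expect 2.0
        return 0;
    }
    ```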

  11. Zacherynuk

    I like the idea. I only hope it can throw 18Gbps out of an HDMI port without the rest of the bus falling over.

  12. Anonymous Coward
    Anonymous Coward

    This is a clean architecture, with the clear potential to get better for virtualization and running varied workloads. Right now AMD is unique in their ability to integrate CPUs and compute-capable GPUs, and they have used that advantage well.

    Let's look at what others are doing.

    Nvidia is ATI/AMD's traditional competitor, and has a multi-year lead in language, application, and ecosystem development. OpenCL is a poor second to CUDA the language, and CUDA has grown to be far broader than a language extension. Nvidia didn't have a CPU to integrate, and thus concentrated development on mitigating the programmer's pain in dealing with two distinct memory spaces. They currently have "unified addressing", which essentially page-faults from the other address space, rather than seamless, cache-coherent virtual memory. But being stuck on the wrong side of the PCIe bus isn't all bad: in exchange for not having cache coherence, the GPU gets massive bandwidth to local memory, optimized for its access pattern.

    ARM Holdings is the other competitor, albeit indirectly. The Mali 600-series GPUs are compute-capable, and are expected to become cache-coherent with the upcoming ARMv8 cores. But thus far they have proven difficult to evaluate, with only a few quirky implementations (e.g. the first Exynos 5) and OpenCL support made available to the public more than a year after the benchmark results were filed.

  13. Anonymous Coward
    Anonymous Coward

    AMD done good

    Hats off to the engineers at AMD for developing and delivering a wholesale change to the x86 architecture that brings a huge performance boost to PCs of all kinds. Intel will need to copy AMD's tech because it's so good the entire x86 industry is supporting it. Consumers and AMD are the winners, at least for a few years until Intel figures out how to copy the tech. AMD's Mantle bumps performance even more, so it's all good.

    1. larokus

      Re: AMD done good

      While it all looks great on paper and at first glance more efficient, I will leave it to the benchmarks to decide just how effective this proves. I'm reminded of the new Opteron cores sharing cache and memory around 2008, which gave AMD a short-lived lead. Ultimately, if Haswell plus a discrete card dusts Kaveri, even at a marginal increase in power consumption, are you going to buy because, in a whiny school kid on the playground voice, you can shout 'but this is more efficient!!!'? Tell it to my fps lead when I no-scope your dome. I miss AMD kicking ass; they just haven't in the consumer space since the original Athlon. Here's to hoping.

  14. psychonaut

    graphics? buy a graphics card

    Don't get it. Want graphics? Buy a card on your budget, anywhere from 16 quid to 1,000 quid. If you don't need graphics over and above HD from an on-motherboard GPU, why bother? The systems I build use a 16-quid Nvidia 210 1GB card. It can do HD, and you can hook up three monitors if you need to, if you include the motherboard output. Why would you want to bake it into the CPU? Unless you are using a laptop, but then who cares? If you are using a laptop for games then there's something wrong with you. If it's for consoles then I couldn't care less. Fuck me. Take your money and build a miles better PC.

    1. Destroy All Monsters Silver badge
      Coat

      Re: graphics? buy a graphics card

      Your wifebeater, sir!

    2. Ian 55

      Re: graphics? buy a graphics card

      I will admit to being surprised, but AMD's APU graphics capabilities are the sort of thing a good mid-range discrete card could do a couple of years ago. This means a lot of games are genuinely playable on it.

      Obviously, those wanting to play AAA FPS titles 'need' the latest and greatest card, but more and more people don't need any discrete card at all. The low power the APUs draw compared to many graphics cards is another very pleasant surprise.

    3. Killraven

      Re: graphics? buy a graphics card

      " If you are using a laptop for games then theres something wring with you."

      Must be something wrong with the millions of students who use their laptops for games instead of trying to cram a couple more desktops into their dorm rooms. Not to mention limited budgets, or people who have to travel a lot and aren't quite willing to haul their desktop along with them on flights.

      How about the cost of power, straight juice or air-conditioning? My latest home build PC uses an AMD APU that allows me to play games quite happily at higher graphics settings and decent frame rates, but only requires a 250 watt power supply. It runs nice and cool, rather than heating up the room.

    4. Anonymous Coward
      Anonymous Coward

      Re: graphics? buy a graphics card

      Are you that far removed from computer technology? They do not put GPUs on motherboard chipsets anymore. Duh... that connector on the motherboard these days leads to the one on the CPU socket. Even Intel's IGPs are all baked into the CPU now. Who makes these chipsets? Let's see: Intel and AMD design and sell chipsets, so instead of having to design an IGP into the chipset it no longer takes up any extra space or chips on the motherboard. And I guess you didn't read the article, because it was basically all about HSA, where the GPU cores can now be used as compute cores, even if you are using a discrete GPU for rendering video.

      1. Anonymous Coward
        Anonymous Coward

        Re: graphics? buy a graphics card

        > where the GPU cores can now be used as compute cores, even if you are using a discrete GPU for rendering video.

        Could anyone comment on whether a game could usefully use the on-die GPU cores for compute in addition to a fast discrete graphics adaptor?
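
        For what it's worth, OpenCL already exposes each GPU as a separate device, so in principle an engine could keep rendering on the discrete card while queueing physics or other compute kernels on the APU's GPU; whether that is a net win comes down to scheduling and memory bandwidth. A minimal enumeration sketch, assuming an OpenCL runtime is installed (error handling omitted):

        ```cpp
        #include <CL/cl.h>
        #include <cstdio>
        #include <vector>

        // List every OpenCL GPU device in the system. On an APU box with a discrete
        // card fitted you would expect to see both the on-die GPU and the add-in
        // board, and could create a separate context/queue on each.
        int main() {
            cl_uint nplat = 0;
            clGetPlatformIDs(0, nullptr, &nplat);
            std::vector<cl_platform_id> plats(nplat);
            clGetPlatformIDs(nplat, plats.data(), nullptr);

            for (cl_platform_id p : plats) {
                cl_uint ndev = 0;
                if (clGetDeviceIDs(p, CL_DEVICE_TYPE_GPU, 0, nullptr, &ndev) != CL_SUCCESS)
                    continue;
                std::vector<cl_device_id> devs(ndev);
                clGetDeviceIDs(p, CL_DEVICE_TYPE_GPU, ndev, devs.data(), nullptr);

                for (cl_device_id d : devs) {
                    char name[256] = {0};
                    cl_bool unified = CL_FALSE;  // integrated parts tend to report this
                    clGetDeviceInfo(d, CL_DEVICE_NAME, sizeof(name), name, nullptr);
                    clGetDeviceInfo(d, CL_DEVICE_HOST_UNIFIED_MEMORY, sizeof(unified),
                                    &unified, nullptr);
                    printf("%s (host-unified memory: %s)\n", name, unified ? "yes" : "no");
                }
            }
            return 0;
        }
        ```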

      2. Killraven

        Re: graphics? buy a graphics card

        >>> Are you that far removed from computer technology? They do not put GPUs on motherboard chipsets anymore. Duh... <<<

        At this moment, NewEgg offers 39 motherboards with motherboard-based GPUs. Granted a bunch of those are models that are more than a year old, but it's far from a dead market.

  15. Kunari

    Wonder how close these new APUs are to the custom AMD chips in the PS4 and XBOne.

  16. BinkyTheMagicPaperclip Silver badge

    Dull and slow

    It's still way slower than a 4770K unless you're using the built-in graphics and playing games. There's no AMD-provided software to show the potential of this supposedly new architecture. Frankly it looks like another acknowledgement of the fact that they can't match Intel in CPU power, even when they overclock their chips and peg them at a 220W TDP.

    I would like to give AMD a try, really, but unless the pricing is keen and the intention is a lower-end gaming desktop I can't see the point. If you move to something like an FX-8350, currently the only advantage is that it can support ECC somewhat more cheaply than an Intel platform.

    With an offering like this, Intel has no incentive to increase core counts or decrease prices for their chips, especially the consumer level ones.

    1. jason 7

      Re: Dull and slow

      And for general usage could you tell the difference?

      1. EPurpl3

        Re: Dull and slow

        "And for general usage could you tell the difference?"

        Everything uses the GPU these days, so I guess there will be an improvement. Even surfing the internet: browsers support hardware acceleration, and YouTube does too. I am very optimistic about this new design, but I bet Apple has already patented it :))

      2. BinkyTheMagicPaperclip Silver badge

        Re: Dull and slow

        For basic productivity: probably not. For browsing: quite possibly. For recent games: definitely (a moot point, as I don't play much that's bang up to date).

        For virtualisation and compiling: yes. I've looked at the benchmarks. AMD's single-threaded performance versus Intel is embarrassing, and it doesn't fare well in compilation benchmarks either. The difference is most pronounced under Windows, but there's still a notable gap under Linux.

        I'd quite like to buy an AMD system, but the facts don't support what I want to be the case. There's some blather on the Gentoo forums about the improvement with the latest GCC etc., but until I see verification I'm putting this down to someone wanting to brag about their system.

    2. HamsterNet

      Re: Dull and slow

      When was the last time you actually used 100% of your CPU power? I bet it's close to 10 years ago. CPU power isn't the bottleneck; HDD access was, until SSDs solved that.

      Even when playing games it's my GPU that's maxed out, whilst my CPU isn't even at 50%.

      1. Anonymous Coward
        Anonymous Coward

        Re: Dull and slow

        "When was the last time you actually used 100% CPU power? I bet its close to 10 years ago. "

        Until recently I'd have agreed with you.

        Then a little while back I started OCRing a few scanned-to-PDF but not OCRed manuals, using the free OCR built into the free PDF reader PDF-XChange.

        That can use 100% on a sustained basis (at least on one CPU; I can't remember if it's multicore-capable), and it works better than any free OCR I've been able to find in recent years (I used to have paid OCR too, but gave up long before discovering PDF-XChange).

        It's a bit of a special case though.

      2. hazydave

        Re: Dull and slow

        When was the last time I used 100% CPU? Last night... and that was on my i7-3930K. I also ran the i7 at 80% while my AMD HD 6970 was reading 50%.

        And that sort of illustrates what AMD is onto here in the long haul. While a GPU these days can certainly help with video rendering, the need to copy data and the loose coupling between CPU and GPU add dead air: time wasted on communications and architectural overhead.

        Now, this example won't come close to replacing my i7; I have four 64-bit memory channels on that, in parallel with the GPU's own wide bus and the 16 PCIe links. But these aren't always effectively pipelined by software, because that makes for complex software. It appears that AMD is adding hardware to address this, too.

        The other obvious thing here is AMD pushing to mainstream OpenCL, by delivering a GPU style compute engine never intended for graphics. So for gaming, you'd use the system GPU for the usual graphics ops, and this for physics, video processing, whatever.

  17. earplugs

    Can't wait for PS5

    16x PS4s on-die; it can pay for itself in Bitcoin.
