Intel shells out $1.5bn for Nvidia tech

Intel and Nvidia are burying the hatchet, and it looks like it will be stuck in AMD's skull. And, perhaps equally importantly, Nvidia will be $1.5bn richer. With "eye candy" – snazzy media processing – being the most important aspect of personal computing these days, Intel can ill afford to be at war with graphics chip maker …

COMMENTS

This topic is closed for new posts.
  1. Anonymous Coward
    Flame

    Cor. Intel done something nearly sensible.

    First time in living memory.

    Carry on with this kind of common sense and they'll be announcing the EOL of IA64 soon.

    Mind you, I'm not sure what kind of "snazzy media processing" folk really need these days. If you can get decent HD out of a mobile phone, which allegedly you can, why does HD on a PC need a whacking great stupidly hot graphics card with a fan and an even bigger CPU with an even bigger noisier fan to do the same job as the fanless battery-powered phone for HD content? Oh I forgot, the phone's got an ARM in it while the PC is traditionally armless. And the phone has to do something useful whereas the desktop has to do something useful AND run Windows. Yep, that would account for it.

    1. Ammaross Danan
      FAIL

      Logic Fail

      "Mind you, I'm not sure what kind of "snazzy media processing" folk really need these days"

      Need I remind you that your desktop/laptop is what converts original "HD" content into something your phone can play? And, unless I'm mistaken, your phone doesn't have a 1920x1080 display. It likely doesn't even have a 720p (1280x720) display. So your "HD" content is likely only "HD" compared to the old 4:3 stuff. But transcoding video/audio to a decent-quality, lower-res, higher-compression file (read: stuffing a 6GB DVD movie into a 1GB-or-less file next to the mp3/mp4s on your phone) takes a fair bit of compute power. Even the new Sandy Bridge can only do 220fps, which for a 24fps vid is only about 9 seconds of video per second. Fanless ARM chips can manage playback without draining your battery (much) thanks to in-chip video decoding, just like the recent crop of CPUs/GPUs have. Next time you want to foam at the mouth that your ARM (iPhone, likely?) CPU is better than nVidia/Intel/AMD hardware, try transcoding Iron Man 2 on your phone (if you can) and see how long it takes. Be sure to plug it in and put it in the icebox first.

      /needed to be said.
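
      As a rough check of the arithmetic above, here is a minimal sketch; the 220fps transcode rate and 24fps source rate come from the comment, while the two-hour film length is an assumption for illustration:

      ```python
      # Back-of-the-envelope transcode-time arithmetic for the figures quoted above.
      # Assumptions, not measurements: 220 fps hardware-assisted transcode rate,
      # 24 fps source material, a roughly two-hour (7200 s) film.

      transcode_fps = 220          # claimed Sandy Bridge transcode throughput
      source_fps = 24              # typical film frame rate
      film_seconds = 2 * 60 * 60   # assumed two-hour movie

      speedup = transcode_fps / source_fps   # seconds of video per wall-clock second
      wall_clock = film_seconds / speedup    # wall-clock seconds to transcode the film

      print(f"{speedup:.1f}x realtime")           # ~9.2x, the "9 sec of video per second" above
      print(f"~{wall_clock / 60:.0f} minutes")    # ~13 minutes for the whole film
      ```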

      1. Anonymous Coward
        Flame

        And when the phone has an HDMI connector...

        All the trendy ones will have HDMI connectors Real Soon Now. Just like all the decent cameras and DV recorders.

        Wtf would I want to transcode something from PC format to phone format when the phone can play the PC format anyway, and has plenty of space on the MicroSD for exactly that?

        What's an iPhone?

  2. Eddy Ito
    FAIL

    Damn!

    I knew I should have bitten the bullet and sold off something last week, but no, I was waiting to capture one last quarterly dividend transfer in my Roth IRA before starting serious scoops of NVDA. I was mildly concerned at the not-so-modest CES bump, but this has me kicking myself. Oh well, I'll get some street cred points over at marketwatch that I can use, along with $2.50, to buy a cup of coffee in the morning.

    Fail because, well... I deserve it.

  3. Mark C Casey
    Grenade

    Anonymoron

    If you knew anything about PCs, you'd know they've been able to output video at 1080p for years, which is essentially what the mobile phone makers are claiming.

    A mobile phone does NOT have the processing capacity to decode, for example, a full 1080p Blu-ray H.264 video. A mobile phone can, however, decode a very low bitrate 1080p video that doesn't use some of the more processor-intensive features of H.264.
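
    To put rough numbers on the bitrate gap being described, a small sketch of file size as a function of bitrate; the 25 Mbit/s and 8 Mbit/s figures are illustrative assumptions, not Blu-ray or phone spec limits:

    ```python
    # File size grows linearly with bitrate: bits = bitrate * duration.
    # Assumed, illustrative bitrates: ~25 Mbit/s for a Blu-ray-class H.264 stream,
    # ~8 Mbit/s for the kind of low-bitrate 1080p file a phone can decode.

    def size_gib(bitrate_mbps: float, hours: float) -> float:
        """Approximate size in GiB of a constant-bitrate stream."""
        bits = bitrate_mbps * 1e6 * hours * 3600
        return bits / 8 / 2**30

    print(f"Blu-ray-class 1080p: {size_gib(25, 2):.1f} GiB for 2 hours")   # ~21 GiB
    print(f"Low-bitrate 1080p:   {size_gib(8, 2):.1f} GiB for 2 hours")    # ~6.7 GiB
    ```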

    An AMD 4350 graphics card (a low-profile, passively cooled GPU) can fully decode a Blu-ray H.264 video without breaking a sweat. As can my current 4550, which is in my media center and which, by the way, can simultaneously decode two Blu-ray 1080p video streams. (Certain Blu-rays have special features like picture-in-picture etc.)

    Modern GPUs and CPUs are designed to do more than display relatively simple graphics and handle light processor usage, unlike mobile phones. (The idea that the graphics chipset in my iPhone is anywhere near the same level as, for example, an AMD 4350 is laughable, as is the idea that an ARM mobile processor could perform anywhere near the level of, say, an AMD 240e, which is the CPU in my media center.)

    Anonymous Coward doesn't know his arse from his elbow.

    1. Keith Smith 1
      WTF?

      You're joking, right?

      You're kidding. Most of the latest Android devices have a built-in hardware decoder for H.264. A bit flaky at present, but give it 6 months. The next round of phones/pads will have dual-core ARM chips to go with improved H.264 decoding hardware. They will use about 1/20th the power of a PC and easily be able to decode 1080p to a 1920x1080 screen flawlessly.

  4. Flybert
    Thumb Up

    it's interesting ..

    that the name "ATI" isn't mentioned ..

    definitely a constraint on AMD, preserves the nVidia name, and should solve Intel's GPU weakness

    1. fatchap
      FAIL

      AMD?

      Why is it strange? AMD bought ATI a few years ago now, and the brand is being deprecated in favour of all-AMD branding.

  5. Hayden Clark Silver badge
    Unhappy

    Bye-Bye nForce.

    Shame.

  6. Anonymous Coward
    Anonymous Coward

    ""A mobile phone does NOT"

    Anonymoron here again :)

    "A mobile phone does NOT have the processing capacity to decode for example a full 1080p blu-ray h264 video. "

    Really? How long before the first one does have the power, d'ya reckon?

    While we wait briefly for that to happen, how about this year's ARM-powered fondleslabs, especially those expected with multicore ARM and a snazzy graphics coprocessor, maybe even on the same chip? How many watts will they use, typical and peak, compared with desktop x86 kit, which has historically been hampered by x86 compatibility and more recently by being designed to look pretty for fatuous and irrelevant stuff like Aero (and arguably less fatuous/irrelevant shoot-em-ups)?

    AMD 240e. Decent desktop/Windowbox processor. They'll be very roughly 50 watts, won't they? What cooling arrangements (size, cost) do they consequently need? Add a bit more for graphics. Battery power? Not really. Passive cooling: if you're lucky, and don't mind hot boxes with consequent short lifetimes.

    And an ARM with sufficient grunt to do the same job? A lot less.

    That's what ARM do, power-efficient computing.

    Until last week, ARM didn't do Windows. If MS are to be believed, that will be fixed soon. ARM already do Linux nicely, thank you. I'm not sure if users will care about Windows though, as perfectly illustrated by http://forums.theregister.co.uk/post/953813

    1. Anton Ivanov
      Thumb Down

      Apples and oranges

      ARM is power-efficient at idle. I would not be so sure about ARM under load.

      Let's take my household app server/workstation as an example. It is an AMD 3200+ with an Nvidia Quadro NVS 290, and it idles at sub-55W with Linux (before any drives are counted in the equation). An equivalent ARM with an equivalent ARM-integrated GPU would probably idle around 20W. It cannot go much lower, because power supply efficiency drops as it goes outside its optimum spec range, and it needs headroom to accommodate the spikes from drives spinning up. £30 per annum on a residential tariff is not that much; it is not really a selling point just yet. It may become a selling point only if you compare it to, let's say, a pre-Core2 Celery which racks up 110W for the same computing power. That, however, is an unfair comparison.
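
      The £30-a-year figure works out as simple arithmetic; a sketch, where the ~10p/kWh residential tariff is an assumption rather than anything stated in the comment:

      ```python
      # Annual cost difference between two always-on idle power draws.
      # The 55 W and 20 W figures come from the comment above; the ~£0.10/kWh
      # tariff is an assumed residential rate of the era, not a quoted one.

      x86_idle_w = 55
      arm_idle_w = 20
      tariff_gbp_per_kwh = 0.10   # assumption
      hours_per_year = 24 * 365

      def annual_cost_gbp(watts: float) -> float:
          return watts / 1000 * hours_per_year * tariff_gbp_per_kwh

      saving = annual_cost_gbp(x86_idle_w) - annual_cost_gbp(arm_idle_w)
      print(f"~£{saving:.0f} per year")   # roughly £31, in line with the "£30 per annum" above
      ```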

      So for a Linux machine which spends most of its life idle, ARM makes some sense. Not a lot (unless leccy goes up in price at least 4-5 times). I would not be so sure about Windows, especially if it suffers from the usual security/antivirus vendor infestation which keeps the load at 10%+ round the clock.

      However, once the machine starts shuffling bits, pixels, texels or whatever else it needs to shuffle, ARM will probably end up being much the same as x86. I see no reason for it not to be. If it were otherwise, Google, Amazon and their ilk would already be running on ARM and not on x86.

  7. noboard
    Joke

    But, but, but

    "As part of the agreement, Intel and Nvidia are settling all outstanding lawsuits"

    What will the lawyers do now *sniff*

    1. Ammaross Danan
      Jobs Halo

      Lawyers

      "What will the lawyers do now?"

      Follow Steve Jobs around.

      /halo, since he's the one paying them....

  8. Arnold Lieberman
    Stop

    Oranges and Apples

    And I'm not talking about either fruit or CE companies...

    It's pretty difficult to compare embedded ARM processors to desktop x86, or desktop anything else really, as the requirements are so different.

    Let's examine ARM power consumption for a start. It's true that they are very efficient when idle, with quiescent currents an order of magnitude lower than Intel or AMD chips, but when a big lump of processing power is required things are rather different - see how the battery life of an iPhone suffers when running Android (2hrs standby time), merely because power management hasn't been implemented properly yet.

    My home server uses a Sempron 140, basically half a full-fat 240, with a TDP of 45W. Sitting idle under Windows 7 with 3 hard drives, a motherboard, gigabit ethernet and some unpowered USB devices attached, the whole PC pulls around 35W. That's not going to be good enough for a laptop, but it's more than acceptable given that it's a 2.7GHz 64-bit out-of-order processor.

    ARM chips able to decode H.264 have dedicated circuitry to do the heavy lifting, whereas any x86 made in the last few years is capable of doing said task using general-purpose instructions. As always, dedicated functions can be realised a lot more efficiently, but they are of no use when the next standard arrives (H.265, anyone?). A PC with sufficient grunt will be able to adapt to the task, or a new processor can be dropped into an existing case - I could swap my weedy Sempron for a 6-core Phenom II, reboot and carry on. Try doing that on an embedded setup.
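
    To make the dedicated-versus-general-purpose trade-off concrete, a sketch of energy per decoded frame; every power and frame-rate number here is an illustrative assumption, not a benchmark of any particular chip:

    ```python
    # Energy per frame = power draw / frames decoded per second.
    # Illustrative assumptions only: a fixed-function decoder block drawing ~0.5 W
    # at 30 fps, versus a general-purpose CPU decoding the same stream in software
    # at 60 fps while drawing ~40 W.

    def mj_per_frame(watts: float, fps: float) -> float:
        """Energy per decoded frame, in millijoules."""
        return watts / fps * 1000

    dedicated = mj_per_frame(0.5, 30)    # ~17 mJ per frame
    software = mj_per_frame(40.0, 60)    # ~667 mJ per frame

    print(f"dedicated block: {dedicated:.0f} mJ/frame")
    print(f"software decode: {software:.0f} mJ/frame")
    # The fixed-function path wins by well over an order of magnitude, but only for
    # the codec it was built for; that is the trade-off described above.
    ```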

    In other words, ARM is going to be as sucky doing Windows as Atom (both are in-order designs); if they want to play at the level that AMD/Intel are at in the general purpose computing space then we're going to see much bigger/hotter chips than present.

    1. Ramazan

      @Arnold Lieberman

      Do you think 35W at idle is OK? Mindspeed's Picasso chip (two ARM cores plus about a dozen Countach 64 DSP processors packed on a single die) processes 672 G.711 voice channels in realtime (i.e. no frame drops when the OS (I mean you, Windoze) decides to swap something in/out). The chip consumes about 1.5W at full load. Code for new codecs can be compiled and loaded onto the DSP processors, given there's enough room left.
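
      Those figures imply a striking per-channel power budget; a sketch using only the numbers quoted in the comment:

      ```python
      # Per-channel power implied by the figures quoted above.
      channels = 672          # claimed G.711 voice channels handled in realtime
      total_power_w = 1.5     # claimed full-load power for the chip

      per_channel_mw = total_power_w / channels * 1000
      print(f"~{per_channel_mw:.1f} mW per voice channel")   # roughly 2.2 mW each
      ```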

  9. Anonymous Coward
    Pint

    "ARM is going to be as sucky doing windows as Atom"

    If it is, it's down to MS.

    Amongst other things, ARM gets more work done per kilobyte of code than any x86 design can. ARM can do this in part because it is a clean-sheet design, which no x86 design can ever be, and it includes features aimed specifically at efficient systems (e.g. predicated execution, the ARM/Thumb subset, and more). Not an option in an x86.

    Fewer kilobytes of code for the same workload means less memory and therefore less power.

    Fewer kilobytes of code for the same workload also conveniently gets the same amount of work done in the same time while running at a lower clock speed.

    What impact does clock speed have on power consumption? Lower clock = lower power.
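
    The clock/power point follows from the usual first-order CMOS dynamic-power relation, P ≈ α·C·V²·f; a sketch with made-up activity, capacitance and voltage numbers purely to show the scaling:

    ```python
    # First-order CMOS dynamic power: P = alpha * C * V^2 * f.
    # All constants below are made up purely to show the scaling; only the
    # proportionality matters for the point above.

    def dynamic_power_w(alpha: float, c_farads: float, volts: float, hz: float) -> float:
        return alpha * c_farads * volts**2 * hz

    baseline = dynamic_power_w(0.2, 1e-9, 1.2, 2.0e9)           # ~0.58 W
    half_clock = dynamic_power_w(0.2, 1e-9, 1.2, 1.0e9)         # half the power
    half_clock_low_v = dynamic_power_w(0.2, 1e-9, 1.0, 1.0e9)   # lower still, since P scales with V^2

    print(f"baseline:            {baseline:.2f} W")
    print(f"half clock:          {half_clock:.2f} W")
    print(f"half clock, lower V: {half_clock_low_v:.2f} W")
    ```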

    Wrt processing power per watt, any ARM is hard to beat. Some very hard.

    Who cares whether it's 64-bit or not? 64-bit is irrelevant to almost anything any non-server box will ever do; if you must have 64-bit, there are plenty to choose from.

    "see how the battery life of an iPhone fails when running Android (2hrs standby time), merely because power management hasn't been implemented properly yet."

    Best talk to Apple about that. Plenty of Android phones manage days on standby. My bargain basement ZTE Blade certainly does.

    "if they want to play at the level that AMD/Intel are at in the general purpose computing space then we're going to see much bigger/hotter chips than present."

    Bigger hotter than present ARMs. Not bigger hotter than present 'low power' x86.

    Thanking you.

  10. tempemeaty

    The bad the good and ATI....

    Bad: more hybrid Intel-nVidia crap graphics chips for your netbooks/notebooks that won't run the nVidia drivers under your Linux OS.

    Good: Intel cannot, and will not ever under any circumstances, be able to make a graphics processor and drivers that could hope to compete with ATI, even with nVidia's help. Intel just can't.

  11. FrankAlphaXII
    FAIL

    ARM = Hype

    I remember this same sort of hype surrounding a certain processor made by Intel called Itanium. It was supposed to be the be-all and end-all processor for everything (much like ARM's being hyped to be). It seems like no one's even considered or thought about Itanic in regard to ARM. I must say that ARM is a bit more of a robust platform, but it's not an x86 killer and it's not intended to be, especially not with x86 going down the road of using APUs to enhance performance.

    Fail because no one remembers history.

    1. Jason Ozolins
      WTF?

      ARM != Hype, but it remains to be seen how high it can reach

      I don't see how ARM can be compared to IA64 as far as hype goes...

      ARM is a decent general purpose chip. It has seen massive success in the embedded space (>10 billion units), in part because from its first implementation as the Acorn RISC Machine, it gave great performance from small chips. Its embedded and low power credentials have been built over 27 years from real advantages like its simplicity and fast interrupt handling.

      ARM has relatively few of the kind of misfeatures that make it hard to crank up performance. I'm not sure if ARM's warts are harder or easier to work around than x86's warts in going for really high performance; a lot of work has gone into making x86 go fast, and who knows how fast ARM could be going by now if it had as much money and person power as Intel brings to bear on x86.

      ARM Holdings was more interested in selling ARM licenses than in beating their chests and promoting their CPU to the general public.

      By contrast, IA64 was:

      - hyped so hard to both the industry and the buying public that it made some system manufacturers (SGI comes to mind) give up hope of competing with their own architectures, instead switching to Itanium

      - a very complex and ambitious architecture

      - big, slow, hot, late and unpopular in its first generally available version

      - never to reach the market volumes predicted by Intel and its sycophantic coterie of industry analysts.

      IA64 still exists pretty much because Intel has commitments to develop chips for HP. By contrast, ARM still exists after 27 years because it has succeeded in key markets. It's not because of hype.

  12. Anonymous Coward
    Thumb Up

    "noone's even considered or thought about Itanic in regard to ARM. "

    What a silly thing to say.

    To add to Jason's entirely reasonable writeup:

    ARM is a free market success. Lots of people are buying lots of different flavours of ARM, of their own free will.

    IA64 is a total disaster. Intel said it was going to be "industry standard 64bit computing". It wasn't, and isn't. The only significant buyer of IA64 is HP, and HP's customers are only buying it because they need something which they can only get with their OS of choice (HP/UX, Tandem NSK, VMS). If their OS of choice were available on other hardware (AMD64 being an obvious choice), then that's what they'd buy. Whether or not it's "IA64 Inside" is largely irrelevant to them.

    IA64 is a big fat ridiculous white elephant, dying on its feet. ARM isn't.

    As Jason said, "ARM still exists after 27 years because it has succeeded in key markets. It's not because of hype."

    The folks who know ARM know how good it is at what it does. Until now, ARM has been focused on particular market segments which are not that visible to Joe Public or even the IT industry media, even though Joe Public probably possesses two or three ARM chips already and an IT geek probably has more (there's a tiny tiny tiny chance Joe Public or IT geek will ever have an IA64 of their own).

    Courtesy of recent developments in the ARM market, and courtesy of CES last week, and even more strangely courtesy of MS at CES last week, ARM is finally starting to get a lot more coverage, even in the general press. And the cluelessness of the people here who are saying "it'll never compete with x86" is starting to show. As is the cluelessness of Intel HQ.

    Looking a little way into the future, what won't you be able to do with next year's follow on to this year's Samsung Galaxy Tab (or whatever) that your x86 will do for you? Other than run x86-specific malware, obviously.

    [Not associated with ARM in any way except as a user since 2000 or so]

  13. Anonymous Coward
    Anonymous Coward

    "using APUs to enhance performance."

    Auxiliary processors were designed into the ARM hardware and software architecture pretty much from day 1 and have been implemented in various ways by various ARM licen[cs]ees. Horses for courses.

    APUs may well be being bolted on to the x86 architecture, but it's a retrofit.

    Intel are in catch-up mode. They missed the boat with 64-bit (IA64 or AMD64?), they missed the boat with system-board buses (HyperTransport or QuickPath) and they missed the boat with APUs (well, other than the 8087, obviously). Finally they're getting there with APUs again.

    It'll be a long time before the x86 or its AMD64 successors vanish. But other than Nvidia and Dell, will anybody care?
