Nvidia boss: Intel suit to 'transform computer industry'

Nvidia CEO Jen Hsun Huang believes the US Federal Trade Commission's lawsuit against Intel could "completely transform the computer industry." On Wednesday, the FTC sued the world's largest chip maker over alleged anticompetitive practices. Among other things, the consumer watchdog accused Intel of illegally attempting to …

COMMENTS

This topic is closed for new posts.
  1. Robert Heffernan
    Coat

    Magic, but wholly useless

    It's all very well to say the PC is becoming magical again, but extremely powerful graphics hardware is practically useless when game developers are dumping the PC platform like yesterday's newspaper!

    It's no wonder nVidia and ATI are starting to focus the extreme number-crunching performance of a GPU on non-game, non-graphical tasks. They see the writing on the wall and have started to branch out into other areas. What I feel they (or possibly I) are missing is the scope of this redirection of focus. Apart from some minor assistance in graphical content creation and encoding, GPU processing isn't really usable for the majority of consumers.

    While diversification is a good thing for business, I would also be trying to slow or reverse the migration of games to consoles. I can imagine that in the console graphics chip world, nVidia and ATI have the margins on each chip negotiated to within an inch of their life, whereas the markup on consumer chips would be much better, so why not try to keep that market alive?

  2. Anonymous Coward
    Happy

    Just shows you

    In life you can deceive people for a short time or a long time, but in most cases you are eventually found out. And then it all collapses like a house of cards.

  3. Neal 5

    I almost choked.

    Wasn't it just 18 months or so ago that nVidia and ATI themselves were embroiled in a price-fixing and anti-competition lawsuit?

    I'm very sorry, I'm going to have to take anything said by any of these pricks with a pinch of salt.

    Memories are short, spin is cheap, blow some more sugar up my arse, Mr nVidia salesman, I'm a CONSUMER, nah, de, nah, de, nah.

  4. Chronos
    FAIL

    JHH

    == batshit crazy. Continue to take anything he says with a truckload of NaCl. The only company this may affect is AMD, and even that could go both ways. Until JHH sorts out Fermi, stops this constant, pitiful, desperate practice of renaming existing cards as new ones and, frankly, shuts the fuck up and stops making an arse of himself, nV will continue to be a laughing stock.

  5. Robert Carnegie Silver badge

    Odd language in the complaint

    A modern GUI exploits a good available GPU. I assume that video decoding also benefits.

    However, if I don't misunderstand the terms, a computer would be magical indeed if its GPU was good enough to exclude the need for a CPU.

    1. Nigel 11

      Silicon merger

      > a computer would be magical indeed if its GPU was good enough to exclude the need for a CPU.

      The future will probably be a CPU and GPU integrated on one chip.

      The question will be whether it's the CPU core or the GPU core that does the heavy-duty number-crunching. Further down that road, the question is whether the two merge into each other, with a bunch of processing pipelines that can be dynamically grabbed by whatever needs them. AMD's recent "modular CPU" architecture rather raises that question.

      AMD owns ATI. I can see why Intel might want to hold this trend back, if for whatever reason they can't buy NVidia. (Could NVidia/VIA be what Intel's paranoia is focussed on?)

      1. frankg778

        Silicon merger

        No, I think Intel would rather control the market without acquiring one of the major GPU vendors. The paranoia is more about their culture; what was it Andy Grove said about paranoia?

        The problem for Intel is maintaining x86 hegemony: once people start to consider alternative CPU/GPU architectures, it has no inherent advantage over other firms. GPUs can do massively parallel work better than Intel can ever dream of with its x86 architecture.

    2. amanfromMars 1 Silver badge

      The Bigger Picture is AJPanese ..... Applied JOINT ProgramMING*

      "However, if I don't misunderstand the terms, a computer would be magical indeed if its GPU was good enough to exclude the need for a CPU." ... Robert Carnegie Posted Thursday 17th December 2009 10:43 GMT

      Robert, All the GPUs needs to do, to control the CPUs, is to run a program/graphic tale which shows their dominance in the market place with ...... Virtual Pilots in Imaginative Business Projects. ....... Drone Enterprises from Stealth Underground Marketeers/Renegade Rogue Pirates into Private Civil Partnerships and JOINT Add AI Venturing. And that is much more a Natural Far Eastern Meme/Gene than anything of an Artificial Western Holywood Style Manufacture.

      * For the Beta Global Management of Perception which Creates AIReality Virtually for BroadBandCasting as a Future Presentation ....... in Alternative Reality Games, for one can fully expect that there will be a Creative Market for both Competing and Complementary Visions with Attractive Transparent Agendas to Entice and Engage with Assets/Customers/Beings.

  6. TeeCee Gold badge
    FAIL

    Intel? Graphics?

    "....accused Intel of illegally attempting to smother the makers of rival graphic chips......"

    Really? Doesn't seem to have achieved much, as the market is still ATI, Nvidia and also-rans. Even netbooks (where it ain't that important) get performance points for having an Nvidia ION rather than some Intel shite. Bit of a black mark here for the usually highly effective Intel Department of Bungs, Lies, Nobbling and Other Devious Practices.

  7. Jacqui
    Linux

    Firebird

    This was a database designed around the premise that large numbers of processors would be available. That has not happened, and other shortcomings in the design have meant it never took off as well as it could have. OK, it's now old tech, but there are other server technologies out there that could easily make use of GPUs for real-world apps.

    The other issue with GPUs is the "inbred" need to keep the APIs secret. Until Nv and co start opening up to the Linux crowd, we server folks will keep trying but expect GPUs to remain "out of reach" in terms of stability and support, even though we are some of the people they should be trying to get on board.

    An example - we develop billing and near-real-time configuration management systems for telcos, and although we are a very small business we often work with pre-release software and are often the technology deciders for large and very large system designs.

    Often a technology supplier simply looks at our turnover and our "non-existent" VAR sales record (our customers are big enough to have their own purchasing and contracts departments, so sales never touch us) and decides we are not worth talking to. I have literally been told by tech vendors to "piss off, you are too small to count", only to have the same vendor call back a year later, having found themselves excluded from a number of large-scale projects, with "so who the f*ck are you people?" Yes, salesmen get angry with anything they don't understand :-/

    Nv and co have been ignoring "alternative" markets for years and will not talk to people like us - the ones who can "sell" their tech (if we/they can prove it is stable)!

  8. frankg778

    Transform computer industry

    GPUs are very interesting technology. Nvidia GPUs with CUDA are being used in many compute-intensive scientific and financial applications that are amenable to parallel execution.

    I could imagine an inversion of the CPU/GPU relationship where the CPU becomes a GPU loader and coordinator, essentially loading code and coordinating parallel processes. A lot of this assumes that software people figure out how to develop for massively parallel systems. I think functional languages like Erlang can help us get there. You basically want a language that discourages implicit data sharing and will implicitly exploit however many cores are there without the developer having to do much.
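    Something like this minimal CUDA sketch captures that loader/coordinator split (purely illustrative; the kernel name, sizes and values are invented, not taken from any real application): the host (CPU) code only allocates, copies and launches, while the GPU threads do the actual number-crunching.

    ```cuda
    // Hypothetical CUDA sketch: the CPU acts as loader and coordinator,
    // the GPU's threads do the data-parallel work.
    #include <cstdio>
    #include <cuda_runtime.h>

    // Each GPU thread scales a single element -- the "heavy lifting".
    __global__ void scale(float *data, float factor, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) data[i] *= factor;
    }

    int main() {
        const int n = 1 << 20;
        float *host = new float[n];
        for (int i = 0; i < n; ++i) host[i] = 1.0f;

        // The CPU's entire role: allocate, copy in, launch, copy out.
        float *dev;
        cudaMalloc(&dev, n * sizeof(float));
        cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);
        scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);
        cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
        cudaFree(dev);

        printf("first element: %f\n", host[0]);  // expect 2.000000
        delete[] host;
        return 0;
    }
    ```

    Compiled with nvcc, everything data-parallel happens on the device; the host never touches the individual elements, which is exactly the coordinator role described above.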

  9. Pascal Monett Silver badge

    "lessened the need for CPUs" - bollocks

    When I look at the latest Steam Hardware Survey data (http://store.steampowered.com/hwsurvey), I see three things that fly in the face of these words:

    1) Quad-core CPU adoption has increased by 13.3% over the last 18 months

    2) Multi-core CPUs now account for over three-quarters of the survey base

    3) Multi-GPU systems still account for less than 2% of the survey base

    The way I read this situation is that GPUs have had next to no impact on multi-core CPU progression. I wonder why? Could it be because GPU drivers have long been less efficient on SLI configurations than on single-card configs? Could it be because a host of problems often plague SLI configurations while single-GPU boxes just game on? Could physics end up having only an incidental effect on gaming? Finally, could it simply be the prohibitive price of multi-GPU setups?

    As a side note about physics: has anyone else noticed that the most visible use of the tech is to add a plethora of additional particles to explosions? Is there anyone else who finds such a stupid use of that tech as annoying as I do?

    Mr. Huang, you are already practically guaranteed one sale per PC. Just because you want to have two more sales per PC doesn't mean you deserve them. You say "great graphics have become one of the most important features for consumer PCs", and I totally agree with that. Unfortunately for all of us, great graphics are not just a matter of buying a good graphics card. Anyone who has followed hardware benchmarking for a while knows very well that a PC is the sum of its parts. A good GPU is useless on a system with a slow CPU, feeble bus speeds, or little memory. Good graphics require a powerful GPU, a powerful (and nowadays multi-core) CPU, high data transfer rates across the board, lots of fast RAM and hard disks that don't get caught on their coffee break every other minute.

    In other words, the Steam survey says exactly the opposite of what you claim: gamers are upgrading their CPUs in order to keep up with your graphics cards' needs, not to smother them.

    To wrap this comment up, I would just like to add one more thing. The action the industry needs, Mr. Huang, is for you to pull your finger out and start making serious progress in GPU technology instead of renaming last year's cards under a fancy new scheme and reselling your existing stock.

    Make something new and wonderful for a change; I can guarantee you it will sell. And Intel won't be able to do anything about it.

  10. Anonymous Coward

    Interesting development...

    Intel has just announced that it will be modifying the Atom to incorporate the GPU... Hmmmm... methinks I hear the muffled cries of Nvidia and ATI as they get smothered by a pillow with Intel embroidered on it.
