Nvidia doesn't want to buy VIA, the graphics chip maker's CEO has claimed. Nvidia is completely focused on being a "visual computing technology company", he said. Well, for the moment, at any rate... Speaking to CNet, Nvidia CEO Jen-Hsun Huang suggested neither Nvidia nor VIA is interested in acquiring the other's business. …
Is there much of a difference?
I am sure there are differences at the micro level, but if you are in the business of producing GPUs it is not too much of a stretch to add CPUs to the line.
Mergers like these are rarely good for the consumer. I rather like both Nvidia and VIA, but I'm not sure I would be as happy to see them bundled into one. Just dallying with the market, I suspect.
Re: CPU GPU
Hardware-wise, the actual fabrication is done by TSMC, so that's irrelevant.
Software-wise, true, you could probably run your OS on an NVidia or ATI core nowadays, but that's not what it's designed for. It would be like rendering your 3D stuff in software on an Intel or AMD CPU. You could do it, but you'd be better off with the NVidia part.
I am not touching VIA with a long stick (think in terms of light-years long), so why would you want to combine that with a good product like nVidia?
VIA is notorious for incompatibilities. They have yet to implement a correctly working, certified (meaning passing ALL mandatory tests) PCI bus. I have video processing boards (as in video signal as opposed to graphics: these boards process DV or HDV video in real time) that absolutely refuse to run on anything VIA (heck, the machine can't even boot when you install such a card; it crashes during the memory test).
Why? These boards use the complete bus mastering capabilities of PCI and PCI Express. VIA is too cheap to implement these advanced features (because they would have to pay licensing fees). The same boards work perfectly on Intel and nVidia chipsets.
Jen-Hsun Huang has a point......
Jen-Hsun Huang may well have a point. Why are AMD and Intel so interested in adding parallel processing to their inherently serial chips?
I think we may well have hit the point where CPUs will become commodity items within the next few years. GPUs/RtPUs, on the other hand, have a very long way to go before they become commodity items.
If you're a CPU vendor it's a scary time. How do you sell a CPU that goes 'twice as fast' as the last one if that speed isn't needed (or easily utilized)?
More CPU cores won't make things better; they will make things worse. It's damn difficult to write good parallel code unless the problem lends itself well to parallelization (3D graphics, physics and video encoding are the only ones that spring to mind in consumer apps).
If each new CPU has a shiny GPU bolted on and it's updated on a regular basis Intel & AMD can just keep going in the way they are used to.
Companies like Valve are starting to produce multi-threading toolkits in the same way as id produced FPS engines, so the difficulty of writing software to handle multiple cores will decrease. The other thing is that the software doesn't have to be explicitly parallelised so long as the OS can marshal different processes onto different cores. Individual tasks don't necessarily get much faster, but you could browse the web (if you don't think that's processor intensive, try a Flash-heavy site on a Mac), encode video, maybe do some Folding@home and whatever else the cool kids are doing, all at once with no slowdown.
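To make the point above concrete, here is a minimal Python sketch (all names hypothetical, not from any toolkit mentioned above) of the "embarrassingly parallel" case where spreading work across cores is easy: each chunk of work is independent, so a worker pool can farm it out and the OS schedules one worker per core.

```python
# Hypothetical illustration: per-frame "encoding" work that splits into
# independent chunks scales across cores with almost no extra code.
from multiprocessing import Pool

def encode_frame(frame: int) -> int:
    # Stand-in for real per-frame work; any pure function of its input
    # parallelises this easily. Shared mutable state is what makes
    # parallel code hard, and there is none here.
    return frame * frame

if __name__ == "__main__":
    frames = range(8)
    # Pool() defaults to one worker process per CPU core.
    with Pool() as pool:
        results = pool.map(encode_frame, frames)
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

Video encoding fits this shape; a web browser's layout engine mostly does not, which is why the commenter's list of consumer apps that benefit is so short.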
Now what would be awesome...
... would be a proper nVidia 9000 series 3D graphics accelerator on VIA's upcoming Mobile ITX system!
Get to it, miniaturisation boffins!