Apple will drop Intel's integrated graphics chipsets from its new family of MacBooks in favour of Nvidia's new mobile platform, according to speculative reports. The company is expected to announce the decision at a notebook-centric product launch tomorrow. If the rumours, reported by AppleInsider, are correct, the 13in …
Not really that surprising
*Some* Nvidia video chips *might* be flaky, but *all* Intel video chips *are* rubbish.
Sourcing specialist graphics chips from Nvidia is one thing. Sourcing the core chipset is a completely different proposition and one that is bound to put Intel's nose well out of joint.
It would be a 'brave' move for Apple given how close they've been to Intel these past few years.
We'll see what tomorrow brings.
Is Apple finally seeing the light? Will this be the end of Apple's sordid flirtation with Intel? Let's hope so.
Of course, if they also happen to announce a programme to improve compatibility of applications with Linux, then I think I just may cream my pants ...
Better the devil you know....
As far as I've read this was a 3rd party fab problem and not strictly an Nvidia one.
I'm sure they've guaranteed against further quality defects. Maybe all the rumours about Nvidia leaving the chipset business will become "Nvidia makes Apple-only chipsets".
Also, since Snow Leopard will be able to offload CPU tasks to the GPU, I'm sure Intel would throw a strop if Apple started buying GPUs from AMD. That would certainly start a few cold sweats at Intel.
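For anyone wondering what "offloading CPU tasks to the GPU" actually means: the model (which Snow Leopard exposes via OpenCL) is to write a small per-element "kernel" and let the runtime run one instance per data element in parallel. A minimal sketch of that idea in plain Python, with no real GPU dispatch; `run_kernel` and `brightness_kernel` are illustrative names, not any real API:

```python
# Sketch of data-parallel "kernel" dispatch, the model OpenCL uses.
# No real GPU here: run_kernel just maps the kernel over the data,
# which is what a GPU would do with one work-item per element.

def brightness_kernel(pixel, gain):
    """Per-element 'kernel': scale one pixel value, clamped to 255."""
    return min(int(pixel * gain), 255)

def run_kernel(kernel, data, *args):
    """Stand-in dispatcher: apply the kernel to every element.

    A GPU runtime would launch len(data) work-items in parallel;
    mapping serially gives the same result.
    """
    return [kernel(x, *args) for x in data]

pixels = [10, 100, 200, 250]
result = run_kernel(brightness_kernel, pixels, 1.5)
print(result)  # [15, 150, 255, 255]
```

The win comes when the data set is millions of pixels rather than four: the kernel has no dependencies between elements, so a GPU can run thousands of them at once.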
Does Kelly work for the Daily Sport?
...disappointing to see that The Ol' Reg has now also jumped on the Mac rumours bandwagon... dear god, can't you wait until tomorrow and report something more newsworthy?
The simple fact of the matter is... Apple don't release details of products until the media event... everything else is just hearsay... ergo, not worth reporting as fact.
Shame on you Kelly:(
Chipset != processor, fyi
Err, hold on a moment...
In the EU all consumer goods have a TWO YEAR warranty anyway. That is an irrevocable legal right, enshrined in law, which most manufacturers would rather you didn't know about.
It covers everything you buy from any retailer.
That's why manufacturers like to say '90 day warranty' in big letters - making you think you're onto a good thing, when in fact your statutory rights are way better!
...who cares other than Macturds.
Not a good move, IMO
Compare Intel's chipset reliability (core chipset, not GPU) in the x86 market over the long term to nVidia's. nVidia might be cheaper, but Intel has had far fewer issues. nVidia has had several broken hardware-firewall issues (broken in silicon, couldn't be fixed) that got swept under the rug, several SATA issues that took a few BIOS revisions to fix, and some stability issues that also took BIOS revisions to fix.
Compare and contrast with Intel. Sure, their integrated video is lousy. However, the MacBook Pro doesn't use integrated video, and the new X4500 from Intel should be good enough for video playback on the MacBook. And Intel has been in the core-logic chipset business far longer, with a better track record for reliability. If I were buying a MacBook Pro right now, I'd wait until the October 14th release and buy one of the previous-gen models on clearance. I'd save enough to at least buy AppleCare (which I'd do on any Mac notebook anyway).
Not really too odd
The only graphics company Intel has a shot at competing with on "performance" grounds in the graphics space is VIA, and I do not say that lightly. Now, does anyone in tech news remember that VIA has integrated video tech through its purchase of S3 (a loooong forgotten time ago)? Fortunately VIA (/Cyrix/S3) doesn't have the clout to throw around to make its tech "appealing", so it continues to silently innovate.

The tech Intel's 3D graphics are based on was very good for about two months after it was released, but back then 3dfx (yes, this is going waaaaay back) also came out with the first GOOD 2D/3D card (I'm only counting the ones that did 2D and 3D on the same graphics processor; I understand others were doing this before, but not for gamers). OK, for the money the Real3D card wasn't too bad, on par with Matrox's add-on 3D card, but again, these technologies were quickly surpassed by 3dfx (a looooooong time ago).

Intel's only selling point for the current iteration of its integrated graphics is that it's so damn cheap they can practically give the stuff away, like most embedded tech until relatively recently (when Nvidia and ATI started getting serious with their embedded tech).

Anyway, long story short, Intel is getting way too much press for something so mediocre. In fact, they have very little that isn't mediocre, or at least very little that makes them stand out against the competition (unless you count bad press *ZING*). Their competitors currently have very full tech portfolios, and most have standout tech that trumps Intel's (VIA's silence is a great example of what to do when you have to compete in the low-end space). That, I believe, is how often Intel should talk about their "side projects": not at all. They seem to be playing the Microsoft press game when in reality they aren't bad at all at tech innovation; it seems they've just forgotten how to compete.
@ Mick F
NVIDIA fanbois, perhaps? (presuming such creatures exist)
Intel being punished?
So Intel was supposed to have new chipsets in Feb. for Apple's last notebook revision right? But dropped the ball.
So Apple is probably punishing them by moving to Nvidia. Now when they choose to punish Nvidia for all those graphics cards that failed, I wonder who they will go to? Seems to take about 6 months to execute a punishment strategy, so in 6 months we'll see.
Re: Re: Yay!
> "Chipset != processor, fyi"
Yes, I know. Now fuck off back to your Windows box, and leave the grown-ups to talk.
@ Mick F
Quick, Mick, maybe you can still get in at Commie-Con or go visit Ballmer. I hear he broke his last sheep and is looking at the flock for a new one.
No Nvidia for me, thanks.
Odd then, I do agree... I was a likely purchaser this time around. No longer if this one's true.
So far as I know, Intel isn't releasing the Core i7 architecture in notebook form for a little while yet. So to me this sounds a little irresponsible.
The biggest problem in general with 3rd-party chipsets has classically been the memory controller. This is because memory management within modern architectures has grown SO advanced and complex that even when Intel communicates the architecture internally, the chipset teams have difficulty getting it right. This is why Core i7, with its integrated memory controller, is so important. It's also why 3rd-party AMD chipsets don't fail too miserably; AMD's memory controller has lived on the CPU die since the Athlon 64, so the chipset never has to touch memory.
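The latency argument for an integrated controller can be made concrete with back-of-the-envelope arithmetic. The numbers below are purely illustrative assumptions, not measurements of any real part: the point is only that an off-die controller adds a front-side-bus round trip to every memory access.

```python
# Illustrative-only latency arithmetic (hypothetical numbers, NOT
# measurements): why moving the memory controller on-die helps.

DRAM_ACCESS_NS = 50.0   # assumed raw DRAM access time
FSB_HOP_NS = 20.0       # assumed CPU <-> northbridge round trip

def off_die_latency():
    """Controller in the northbridge: pay the FSB hop plus DRAM."""
    return FSB_HOP_NS + DRAM_ACCESS_NS

def on_die_latency():
    """Integrated controller (Core i7 / Athlon 64 style): DRAM only."""
    return DRAM_ACCESS_NS

saving = off_die_latency() - on_die_latency()
print(f"off-die: {off_die_latency():.0f} ns, "
      f"on-die: {on_die_latency():.0f} ns, saving: {saving:.0f} ns")
```

Whatever the real figures, the FSB hop is a fixed tax on every single access, which is why both AMD and now Intel pulled the controller onto the CPU die.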
I personally don't feel comfortable using a chipset designed outside of Intel for memory access.
I hope this is a classic case of someone calling a dinghy a yacht. Maybe they'll continue to use Intel chipsets (at least until i7) and simply include an nVidia GPU.
Presumably this story is linked with this - http://www.theregister.co.uk/2008/10/10/graphics_card_wireless_hacking/