Intel's Larrabee many-core CPU-GPU mashup is so far behind schedule it will not be released as a standalone product. An Intel spokesman told The Reg on Friday afternoon that both hardware and software work on the much-anticipated part is tardy, and this means the chip won't be released into the wild as anything other than a …
I won't be mourning
Let's face it, we're (pretty much) stuck with the x86 architecture for our CPUs, but in a graphics card? Folks, it's not 1978 anymore.
I was going to buy many-many Larrabees in 2010
but it looks like I'll have to invest in racks and racks of Nvidia GTX 295s
A great shame; there were some amazing ideas in there. For a while it looked like Intel would wipe the floor with nVidia.
@ Francis, nice to see this 1990s attitude still holds in some quarters! It is funny how the "fundamentally faster" idea of RISC ISAs so profoundly lost to Intel's 1970s ISA, but what Intel proved was that the ISA doesn't really matter. Current x86 *architecture* bears absolutely no relation to 1978 x86 architecture, though. We're not stuck with anything, except a certain bytecode.
It's not like you ever see the ISA of a GPU anyway. Larrabee would have pleased anyone except hardcore x86 haters.
On hold pending something they can copy
Never Fear --- As soon as someone who knows what they are doing releases a product that works, Intel will be able to imitate it.
Cue . . . .
. . . . . sounds of Champagne corks popping at nVidia HQ . . . . .
Can't say I'm disappointed though. Intel have shown absolutely zero ability to produce any products outside of their area of core competency, i.e. CPUs based on their decades-old crappy x86 architecture.
Hell, it even took AMD to properly extend even that to 64 bits!
Yup, one of my favourite giggle moments in the history of CPU design that.
Intel: "Here is the path to the Promised Land of 64 bit computing <trumpet fanfare, drumroll, hype>, the Itanium processor."
AMD: "Here is <kazoo noise, penny in cup, powerpoint slides> a 64 bit version of what you've got now".
World: "Yay! Gimme some of that shit!"
Intel: "Oh fuck. Didn't see that one coming."
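And AMD's win is still baked into the platform's very name: operating systems report the 64-bit extension variously as "AMD64", "amd64" or "x86_64" depending on the vendor. A throwaway Python sketch makes the point (the alias table here is illustrative, not exhaustive):

```python
import platform

# AMD's 64-bit extension shows up under several aliases depending on the OS:
# Linux reports "x86_64", Windows "AMD64", the BSDs "amd64".
# This alias table is illustrative, not exhaustive.
ISA_ALIASES = {
    "x86_64": "AMD64 (x86-64)",
    "amd64": "AMD64 (x86-64)",
    "i386": "32-bit x86",
    "i686": "32-bit x86",
    "arm64": "ARM64",
    "aarch64": "ARM64",
}

def isa_name(machine: str) -> str:
    """Normalise a raw machine string to a friendly ISA name."""
    return ISA_ALIASES.get(machine.lower(), machine)

print(isa_name(platform.machine()))
```

Notice there's no "intel64" in anyone's uname output.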
We are only "stuck" with x86 for as long as we (as an industry) choose to remain chained to MS Windows.
As soon as we all manage to break that addiction we will be able to move on to newer and more useful architectures.
Hence the "pretty much" - I'll be first in line to buy an ARM-based netbook.
Make up your Mind Time
"Never Fear --- As soon as someone who knows what they are doing releases a product that works Intel will be able to imitate it." .... Anonymous Coward Posted Sunday 6th December 2009 00:01 GMT
As soon as someone can Driver and Secure Larrabee, will Intel Realise the Constant Increasing Advantage of Multiple Parallel Source Cored Architectures rather than just Lusting after them as Eunuch Punters at a Professional Orgy ........ which is obviously/probably Intel's Present Difficulty ......... Controlling Output to Intel's Agenda with Intel being Vulnerable and Heavy MetaDataBase Asset Light in the Multi-Core Environment.
"We are only "stuck" with x86 for as long as we (as an industry) choose to remain chained to MS Windows.
As soon as we all manage to break that addiction we will be able to move on to newer and more useful architectures." .... Goat Jam Posted Sunday 6th December 2009 00:01 GMT
Some are not "stuck" with any industry standards or chained to any off the shelf proprietary operating systems, which in common with all popular mass market products, has the usual wealth of opportunities for fine to outrageous tuning and customisation of Base Vehicle.
How Advanced do you Imagine the Present Progress/the State of the Art in the Remote Control of Virtual Operating Systems with Cloud Cover for ITs Command and Control Programs and CyberIntelAIgent Security Power Projects, is? Advanced enough to be Already Actively Relentlessly Stealthily Replacing/Disgracing Old Duff Components with Novel Automatic Supply of Customised Deep Open Source Technology.
surprised it's taken this long to die.
but i would think intel will take the results of this project to improve its cpus.
massively multicore interconnects?
mesh/grid computing on a chip?
intercore cache sharing?
god knows what else they've come up with to make this thing actually work, and work it shall. as a dev platform? bit of a hint there. i would have liked to have seen real time ray tracing at 1920x1200 though. maybe next year.
i'll hazard a guess that neither Nvidia nor AMD/ATi were really that worried about it from the get go.
RE: Cue & RE: @Francis Boyle
".....Intel have shown absolutely zero ability to produce any products outside of their area of core competency ie. CPU's based on their decades old crappy x86 architecture....." Really? Hmmm, I suggest you go do a little reading. For example, did you know that Intel, through their embedded graphics, is the most common PC graphics provider? In terms of shipment volumes, Intel has dwarfed AMD and Nvidia for years.
RE: @Francis Boyle
"...We are only "stuck" with x86 for as long as we (as an industry) choose to remain chained to MS Windows....." So, the fact that the most common environment for Linux is x86 totally escaped your attention, then?
The mis-truths of RE:Cue & @Francis Boyle
The dominoes fall like a house of cards: checkmate.
The fact that x86 is popular only supports the point that x86 is propped up by Windows. Abandon Windows and you can choose another architecture. That would encourage competition.
Stop it, and stop it now.
re Matt Bryant
You realize, of course, that Intel's onboard graphics penetration is why PC gaming is floundering. I hardly call the GMA950 a product. It is the worst graphics card, reinforcing the point that they have zero ability to develop graphics cards. Of course, leveraging their monopoly in CPUs is helping their IGP sales. However, I'm not sure you can call shafting someone with your CPU monopoly a product.
RE: @matt & RE: re Matt Bryant
"The fact that x86 is popular only supports the point that x86 is propped up by Windows. Abandon Windows and you can choose another architecture. That would encourage competition." So, which comes first, the chicken or the egg? To abandon Windows, you will need to get the current users onto something else, and unless it's going to be a big-bang change where everything gets swapped out overnight, that means something on x86. That's the chicken, the vast, cheap x86 estate already out there. To switch to something completely different means fighting economies of scale, which even Apple finally had to admit defeat on. Then the egg is the new OS. Problem is, if it has to go on x86 to start with, then it has to go toe-to-toe with easy to buy Windows. Note - I didn't say easy to run, but then Windows actually is for the 90% of home PC users that never try anything else, let alone all those businesses that are running Windows. So, which do you change first - the egg or the chicken? Either way, you're unlikely to succeed without a major shift in technology that somehow makes the Wintel combination redundant, because 90% of users are just fine with a Wintel PC. And the reason is, even though you scorn Windows and the x86 platform, as a combination they are both effective and popular. And that's coming from a Linux and UNIX user.
RE: re Matt Bryant
"You realize of course, that Intels onboard graphics penetration is why PC gamming is floundering...." You need to realise less than 10% of PC users play top-line PC games. In fact, it's estimated more than 40% (including business users) never run anything more taxing than M$ Office, Explorer, iTunes and a low-level picture editor for their digital camera. For them, the Intel graphics do exactly what they want, at a very nice price. The majority of users don't want to open up their PC and put in a graphics card, so your point merely underscores your limited perspective. Whilst Intel would love to corner the gaming market by killing nVidia at the CPU socket, they have spent the last twenty years hoovering up the vast majority of users with embedded graphics and making a very nice profit from it too. They are quite happy to let nVidia and ATi fight over the power-users as they don't represent enough of an economic incentive for Intel to get in there and fight for them.
And now I'm going to get my coat and go play at killing people on my Windoze PC (with nVidia graphics and an AMD CPU).