Intel has unveiled details of its new "Haswell" microarchitecture, and promises that it will deliver greatly improved compute and graphics performance, drastically lower power requirements, and developer-friendly improvements when chips based on it appear next year, branded as Intel 4th Generation Core Processors. "The great …
Not to throw a damper on this lovefest...
But the Haswell platforms employ Microsoft-patented technologies for video. They are Windows-only: DirectX, with no OpenGL, no Linux, no Android, no BSD able to fully implement the technology in these chips. Intel has finally caved to Redmond and given them their Holy Grail: an Intel platform that only runs Windows well.
Do I have to say it's not for me? OK. It's not.
Never was the icon more appropriate. I was a HUGE Intel fan, but am quickly becoming an apostate.
Intel, if submitting to Microsoft is what I have to do to love you, then you don't love me back.
Re: Not to throw a damper on this lovefest...
Do you have a source you can cite for that? All the articles I've seen state Haswell supports OpenGL 3.2 and OpenCL 1.2.
Re: Not to throw a damper on this lovefest...
Again, have you a link for this? I can't find any mention of it elsewhere.
Other links suggest that this new architecture supports DirectX in addition to OpenGL and OpenCL etc. Can you re-read your source, and just confirm for us that you're not jumping at shadows? Ta.
Intel has it backwards with the GPU scaling??
They say a desktop chip will be delivered with a 'better' GPU version, even though the desktop is precisely where users will continue to replace a mediocre integrated GPU with a dedicated one.
In Ultrabooks, on the other hand, there's not much room, and manufacturers would rather not add cost and lose battery life with an extra chip that needs its own heatpipe.
So, the reality of laptop building and selling would dictate that you need some low power chips with the higher end integrated GPU, that can clock up and down in a wider range than Ivy Bridge.
Haswell's modularity means that Intel can produce chips that are not Windows-only. Of course they'd only do that if there's a market for chips without the MS proprietary stuff, but thinking more widely, Intel own WindRiver (VxWorks...), so there is the next generation of embedded x86 processors to consider, and it would be ridiculous to suggest that Intel will bet against their own OS products to back embedded Windows (whatever it's called, I forget). I doubt, too, that Intel are rowing away from all those promises they made about optimising for Android.
"Wintel" worked very well for Intel when Windows was "it" but that's no longer true. Intel knows that.
Is it just me?
or does anyone else think that one design is unlikely to be appropriate for a single-user tablet and an enterprise server?
Isn't there a good reason we don't use i7 in servers?
Re: Is it just me?
>Isn't there a good reason we don't use i7 in servers?
Yeah, it's called 'market differentiation'. Modern Xeons use the same 'Sandy Bridge' architecture as their i7 brothers, but have different features enabled, such as support for multiple CPUs, ECC RAM and some virtualisation features.
one size fits all
It does sound too good to be true, but they're talking about reusing the processing blocks that go onto a chip, not the chips themselves. I do wonder whether there's enough actual commonality to allow all the resulting chip types to be optimal for their intended purpose.
Another thing I don't know is how well Intel have improved MIPS/watt. OK, they make the point that the goal is to save power everywhere (the spiel about "power just as important in the datacenter as it is on a tablet"), so I can buy the idea that a common design that does that is good for all uses, but ARM are still the ones to beat here, I bet.
So, Off-the-Shelf versus Bespoke? The performance numbers will show whether they've got the whole cost(to them) / benefit(to the customer) curve right.
Anything about HW support for transactional memory?
TM was supposed to be one of the great things about Haswell; I'm curious how it pans out. It would be the first consumer-grade CPU with hardware support for TM, which is quite exciting, assuming that gcc supports it (about) right.