So you've mentioned "2-in-1" no fewer than 10 times in the article, without actually telling us what that means. Googling for the term doesn't help, as the first relevant result is this article itself.
Can anybody enlighten me?
As expected, Intel today took centre stage on the first day of Computex proper to launch its long-awaited lineup of Haswell processors. And Chipzilla heralded the new 2-in-1 form-factor - PCs that can turn into tablets - in which many of these new chips will find themselves. As one of the most widely trailed and hyped chip …
I think they're referring to the newly announced Frankenstein tablets that have both an i7 and an Atom chip on board and run two operating systems. The Atom chip runs Android and the i7 runs Windows 8.
Unless they've found a great way to share data between the two operating systems, I can't see it working very well.
No, I think they're referring to the fact that it has a docking station/detachable keyboard, allowing it to be used also as a tablet. Also, when a second processor is used with an Intel chip, typically the Intel chip is in the docking station or keyboard, and the second processor is an ARM chip (not an Atom), located in the tablet part.
This means that when used as a tablet, the device runs Android, and when plugged into the keyboard/base station, it runs Windows 8.
I can see that combination working well for most people, given the different uses for tablets and desktops.
2-in-1 in this context probably refers to computers that are designed to be easily convertible between tablet and notebook styles of use.
2-in-1 could also refer to a CPU configuration where there are both low power and high performance CPUs in the device. The low power CPU runs until more MIPS are needed, then the high power CPU takes over. But Haswell has low power modes built in so this is unlikely.
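The switching arrangement described above could be sketched as a trivial governor policy; this is a hypothetical illustration (the function name, threshold, and MIPS figures are made up, not any real scheduler's API):

```python
# Hypothetical sketch of the two-CPU hand-off described above: run the
# low-power CPU until demand exceeds what it can deliver, then hand the
# workload to the high-performance CPU. Threshold and names are illustrative.

LOW_POWER_MAX_MIPS = 2_000  # assumed capacity of the low-power CPU

def pick_cpu(demand_mips: int) -> str:
    """Return which CPU should run the workload for a given demand."""
    if demand_mips <= LOW_POWER_MAX_MIPS:
        return "low-power"
    return "high-performance"

# Light load stays on the efficient core; heavy load migrates.
assert pick_cpu(500) == "low-power"
assert pick_cpu(10_000) == "high-performance"
```

As the comment notes, Haswell's built-in low-power states make a separate low-power CPU largely redundant for this purpose.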
It's about 3x more powerful than last-gen Intel graphics, based on some YouTube videos I've seen with benchmarks. Other than that, the same sources are reporting about a 6% increase over Ivy Bridge, with the downside being it gets as hot as Hades. Hotter than Ivy Bridge!
And whose fault is that, then? Intel designs its chips to run the software required of them. If graphics libraries such as DirectX are using poor techniques, then you can hardly fault Intel alone for its graphics chips. The designers of the PlayStation have control over both the hardware and software to deliver realistic visual effects.
So DirectX / OpenGL are the problem and not Intel's hardware, in spite of the fact that both run fine on Nvidia graphics chips, including those baked onto ARM, and likewise with AMD's graphics solution, which runs standalone, is also baked onto AMD chips, and works better than Intel's.
Nope, everything else working well is clearly no evidence at all; it must be the APIs instead.
You've got to compare like with like. Sure, high-end gamers aren't going to be throwing away their NVIDIA/AMD GPUs, but ultra-portables and tablets don't have those anyway. How does it look when we compare to the competition in tablets, i.e., ARM-based GPUs?
Intel HD is pretty good for most people, including for older or less high-end games, and imo the driver quality seems much improved over the older GMA chipsets. It'd be interesting to see what the situation is for the Atom x86 processors, though...
And even the high-end gaming laptops benefit from Intel HD improvements, since things like Optimus will use the Intel graphics most of the time, only using the dedicated GPU when needed.
Power, pure and simple. AMD CPUs draw more power than the Intel equivalents due to the larger fabrication process; they're still several steps behind Intel in that sense.
And yes, I was comparing Intel integrated to AMD integrated, and partly to the ARM integrated parts also, although it's hard to draw a direct comparison with those.
Also I have to figure out why when I post as AC I seem to get more upvotes than when I don't. Really confusing.
The behaviour of the folks punting the 'Apple Invented Everything' edition of computing history shows some striking similarities to that of the Creationist loons. It is a pity that these folks appear to be utterly impervious to information, because they are striving hard to ensure that our descendants have a legacy of pig-headed ignorance forced upon them.
No, Apple didn't invent everything, but they are often the first to produce a popular design/form factor, after which the rest of the market comes piling in with near copies. The Air format was popular, and Intel went piling in with their Ultrabook brand to provide the same format in the Wintel space.
They weren't the first, and they weren't the first to make it popular. The first popular ultra-portable laptops were netbooks, I would say, which appeared around the same time as the Apple Air. I'm not sure I'd say the high-end ultra-portables have ever been mainstream, with most people preferring larger, more powerful laptops, and now using smartphones for when they're on the move.
Intel created the "Ultrabook" trademark in 2011, years after this, so no, it wasn't "piling in". Nor was "Ultrabook" a new format, it was more about marketing - giving a name, so people say "I want to buy an Ultrabook" and then restrict themselves to the choice of laptops that Intel get paid a fee on. (After all, Apple PCs use Intel too, so it's not like Intel were losing money from competition.)
Actually, the first "ultra thin" notebook was the Sony X505, introduced by Sony in 2004.
Google it - that was a good couple of years before the MacBook Air.
Of course, Sony being Sony, they marketed the device for the CEO/CTO types and priced it accordingly (it was well above 3K EUR in Germany). Hence, it was not very successful.
But in terms of actual invention, this was "it". Apple just took a saner approach and priced the Air in the range of an "affordable luxury" item - certainly not cheap, but well within reach of the middle class.
The same flop (typical of Sony) was repeated with the Z series - Sony made the dream machine, which was more powerful than most MacBook Pros (before the 15" Retina) but lighter and actually thinner than the first-gen 13" Air. And it had a Full HD 13" screen since 2010 - something that took Apple quite a bit of time to catch up on. All in all, a perfect notebook - I know, since I owned all the Z models before I switched to a MacBook Retina 15".
Again, thanks to their ridiculous business model and their practice of stuffing in crapware (at some point Sony even had the audacity to ask $50 for a "clean" OS installation), the world will remember the Apple MacBook Air and Retina as the exemplars of ultra-thin and ultra-powerful machines, and not the Sony X and Z series.
However, nothing changes the fact that it was Sony delivering the innovation years before Apple.
"However, nothing changes the fact that it was Sony delivering the innovation years before Apple."
Reading your post, isn't it obvious that Apple "innovated" the correct pricing and marketing strategy for these sorts of devices?
Apple rarely invents (or is responsible for) new hardware technology. The only example I can think of is Jobs asking Corning to make Gorilla Glass. Although they do seem to be designing their own CPUs these days which is an interesting development.
Apple's value add is that they make products with the correct features, qualities, and prices to be attractive to a broad range of people. Lots of companies sit around and say "hey, we could have done that" and yet they didn't...
"'it's going to take lower PC prices, increased availability of touch, the ‘re-launch’ of Windows with 8.1 and more stylish designs to really get the PC market going again'."
I guess that they still haven't realised that current and even recent-generation smartphones are pretty nearly as powerful a computer as most people need, and such people have no use for desktops or even laptops more powerful than the ones they already have - assuming that they even have such in the first place.
I find it funny that all the people saying this are people who own laptops too. At least, correct me if I'm wrong - do you do everything on a phone? Whilst I've no doubt that there are some people using phones instead of laptops (this is Not News - I recall stories as far back as 2005 about this), that does not seem to be most people.
In fact, with "powerful a computer as most people need and such people have no use for more powerful" you've got it backwards - it's laptops that are as powerful as most people need, so there's no longer as much need to buy new laptops as often. But people buy smartphones and tablets more often, because there's still a reason to get a more powerful device.
When phones become as powerful as most people need, then the phone market will have "problems" too.
It's also worth noting that phone sales fell (I believe in 2012), yet I don't see doom and gloom about a dying phone market. (The stats are hidden because the media look at the ill-defined "smart" phone market, so basically, phones marketed with an arbitrary label are selling more - you could do the same trick with an arbitrary subset of PCs and claim they've increased in sales.)
It's obvious that phones will always have larger sales - phones are upgraded more often due to either being cheaper or tied to a contract, and they are also a personal device, whereas laptops and desktops have often been shared. I presume that phone sales have been larger than PC sales for a long time - I don't know why we hear the doom and gloom about PCs only now.
If sales are so bad, could it be that Intel seems to think that all "updated" processors need to have new socket designs and thus new motherboards?
For many years there was some consistency between mobos and many varied processors. AMD did this for the longest time.
Considering that there is little significant change in actual capability (mostly power savings), perhaps Intel would sell more if they brought the Haswell technology to the 1156 socket so more people could use it?
Upgrading for the sake of upgrading is a VERY limited market these days.
Actually, from what I have heard, the new CPUs need special low-power support in the chipset and BIOS, so a new motherboard would be required anyway. Sure, they could make it LGA 1156, but all that would achieve is to allow people to stuff the CPU into boards without the low-power features and BIOS and thereby fry the CPU, so it makes sense, even if it is annoying.
Haswell is actually quite a disappointment from the CPU aspect. The GPU section is a nice performance bump but the CPU gained almost nothing. With the absurd Intel pricing these are going to be a tough sell with AMD's APUs being superior for portable devices at all power and price levels.
The PC market is probably a lot nearer saturation, and my i7 920 Windows 7 64-bit PC should last a while; mine has, and I'm loath to dump 24GB of RAM just to move to a different RAM banking etc.
I don't even need my PC for storage now that I'm using low power mobos to build multi-TB FreeNAS boxen, I have some VMs using the FreeNAS as live storage, and have tablets too, so it'll probably be at least a year or two before I need to replace my main PC innards again.
What would attract me is some seriously beefy 64-bit ARM mobos and 64-bit OS support, because they will undercut expensive Intel and less expensive AMD possibly by a serious margin! Intel graphics is a joke; my low end AMD FM1 CPU in my NAS has better graphics than most Intel CPUs, and was much better value.
Biting the hand that feeds IT © 1998–2019