Would they allow it if...
She named the album Norman There's a Village in Austria Named Fucking Rockwell?
It may seem counterintuitive to use the launch of a new Apple laptop to argue that the company is trying to kill off laptops altogether but that is exactly what's happening. Today, Cupertino announced a revamped version of its beloved MacBook Air – a lightweight laptop designed in 2010 that hasn't been touched since a small …
Indeed. It's pronounced "fooking", apparently, and the good burghers of the town got so sick of Anglophone tourists stealing their roadside signs that a few years ago they resorted to painting the town name onto a large boulder instead.
(This is from memory, so if it's wrong, bite me. I could look it up on Wikipedia but that would be cheating.)
I recall great hilarity when my youth orchestra went to Germany. On the way from Rotterdam to Hitzacker, possibly near the Belgian border with Germany, there were a bunch of signs for Wankum.
For a coach full of teenagers that was fun. The trip is also memorable for a tape of "Monty Python, Live at Drury Lane" that was played more or less continuously.
Pull the other one, it's got bells on!
Your mobile, low-voltage components will be inherently less powerful, slower, & throttled so as to keep the heat to a reasonable level, whereas a full desktop-grade chipset will run at full power until either it or the fans seize up.
Your mobile chipset throttles back, downshifts, & plays "These boots were made for walkin'"; my desktop upshifts, puts the pedal to the floor, & cranks up "I can't drive 55!" so it can sing along.
Relax, we don't know the context of the "as fast as the fastest PC" quote - the exec might have been describing the speed of performing a particular quick task in Photoshop or iMovie or something. Desktop CPUs haven't been getting much faster at single-threaded tasks in recent years anyway - because they've been fast enough, focus has been on improving energy efficiency and/or just adding more cores.
What Apple actually displayed as a bullet point was "Faster than 92% of the notebook computers available last year", which AnandTech finds very plausible:
Apple mentioned during the keynote that the new A12X is more powerful than 92% of the available laptops in the market, which isn’t very surprising given the performance levels we saw on the A12.
@Dave 126 Desktop CPUs haven’t been getting much faster at single threaded tasks in recent years because Intel et al have run out of things they can do to make them faster. Clock speeds have hit limits imposed by the switching speed of silicon vs power consumption. All the optimisations they can think of are implemented (a modern CPU uses a huge number of transistors). Fabrication process improvements have slowed to a crawl. They’re left with stuffing more cores onto a chip and adding custom hardware for specific tasks like video encoding/decoding
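A back-of-the-envelope sketch of that clock-speed-versus-heat trade-off (my own illustration, not Steve's numbers): it uses the standard dynamic-power approximation P ≈ C·V²·f and assumes voltage has to scale roughly linearly with frequency, which is a simplification.

```python
# Sketch: why raw clock speed hit a wall. Dynamic CPU power is roughly
# P = C * V^2 * f, and pushing the clock higher also demands more voltage,
# so power (and heat) grows much faster than the speed gain.
def dynamic_power(freq_ghz, base_freq=3.0, base_voltage=1.0, capacitance=1.0):
    """Relative dynamic power, assuming voltage scales linearly with
    frequency (a simplification; real V/f curves are messier)."""
    voltage = base_voltage * (freq_ghz / base_freq)
    return capacitance * voltage**2 * freq_ghz

p3 = dynamic_power(3.0)
p5 = dynamic_power(5.0)
print(f"5 GHz vs 3 GHz: {p5 / p3:.1f}x the power for {5/3:.2f}x the speed")
# -> 5 GHz vs 3 GHz: 4.6x the power for 1.67x the speed
```

Under this model power scales with the cube of frequency, which is why vendors spend their transistor budget on more cores and fixed-function hardware instead.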
It will probably work fine for most users, but I run rather heavy (and well parallelised) image processing code, like stacking 44 1,000-frame 6 Mpixel monochrome uncompressed videos (250+ GB of data) to create 44 6 Mpixel panes to stitch into a 100+ Mpixel lunar mosaic. I would be very curious to see how much time that takes on the iPad Pro. On my laptop it takes some 12 hours; on a Core i7 desktop it is quite a bit faster.
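For what it's worth, that kind of stacking job is embarrassingly parallel: each video averages down to one pane independently of the others. A minimal sketch, with hypothetical file names and raw (frames, h, w) arrays saved via np.save standing in for the real uncompressed video format:

```python
import numpy as np
from multiprocessing import Pool

def stack_video(path):
    """Average all frames of one monochrome video into a single pane.
    The 'video' here is assumed to be a (frames, h, w) uint16 array
    saved with np.save - a stand-in for the real uncompressed format."""
    frames = np.load(path, mmap_mode="r")       # avoid loading GBs at once
    acc = np.zeros(frames.shape[1:], dtype=np.float64)
    for frame in frames:                        # running sum keeps memory flat
        acc += frame
    return acc / frames.shape[0]

if __name__ == "__main__":
    # Tiny synthetic stand-ins for the real 1,000-frame 6 Mpixel videos.
    paths = []
    for i in range(4):
        path = f"pane_{i:02d}.npy"
        np.save(path, np.random.randint(0, 4096, (10, 8, 8), dtype=np.uint16))
        paths.append(path)
    with Pool() as pool:                        # one video per worker process
        panes = pool.map(stack_video, paths)
    # panes would then go to a stitcher to build the final mosaic
    print(len(panes), panes[0].shape)
```

With one worker per core this saturates a desktop CPU nicely; the open question for an iPad is sustained throughput once thermal throttling kicks in.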
Regarding the MacBook Air: no SD-card slot is a definite deal breaker for me. If I want to transfer a lot of photos from my camera to the laptop, popping the SD card into the laptop is the easiest and fastest way by far.
but I run rather heavy (and well parallellised) image processing code
How much of this can be done by the GPU? I suspect that RAM and bandwidth might be the limiter here, but, yeah, I suspect it will run significantly slower on an iPad than it does on your current hardware, if it runs at all.
I've nothing per se against the iPad route (for many tasks a suitable replacement for a notebook), except that I do want full control of the OS so that I can install my own libraries. Wonder if there will ever be a version of iOS that gives us a terminal and sudo?
@ Steve Todd
Quite right, there have been technical challenges, and Intel's difficulty in moving to a smaller process size has been well reported. My point largely stands though; being 'as fast as the fastest PC' [at some tasks] isn't that high a bar. The amount of money Intel throws at solving their technical issues is determined by the market though, and if gamers are advised that most games don't benefit from anything faster than a Core i5 (since the game engines have evolved with GPUs) then the demand for high end CPUs isn't as high as it might otherwise be.
Intel have a huge budget to work on improved Core and Xeon CPUs (don't forget, they share the same architecture, and business users would snap a significantly improved version up). If they could throw 100% more transistors at the design and make each core even 20% faster then they would do it in a heartbeat. They have AMD breathing down their necks, and AMD are getting close to the same performance per clock cycle. The limiting factor for single threaded performance is clock speed (diminishing returns have long since cut in over improving the design logic), and the max clock speed hasn't changed much over the last three or four process sizes. Yes, you can make the chips go faster, but at the cost of serious heat.
The reason that an i5 is the recommended CPU for gaming is that the single threaded performance doesn't get much faster the higher you go up the range. Most games offload a huge amount of the work to the GPU, but then modern GPUs are very highly parallel so it's not a like-for-like comparison. Game writers work within the limits of what is available, and what their target audience can afford to buy. Most games therefore include a quality configuration setting that adds or removes eye candy, trading against CPU/GPU performance. Try your i5 at 4K extreme settings.
plays "These boots were made for walkin'",
You don't geddit. Wrong song reference. The process you are observing is described in "Summer Wine" by the same songstress. It's just Tim Cook offering you the wine. Listen carefully for a description of where you will end up.
How many VMs can I run on one of these iPad Pros?
While I agree that you do have a point, how many people need to use lots of VMs? I suspect that people who really do need a lot of VMs are probably fairly flexible about their choice of host OS. Anyway, as I'm sure you're aware, VM performance is heavily I/O dependent and I think that's going to be the bottleneck here (along with the performance hit if you need other architectures). But could such a device be okay assuming it comes with a good SSH client to your VM park?
My biggest worry with all this is trying to force people to abandon local file systems. Yes, it's convenient if you do work with multiple devices that the relevant files are easily available on them all, but it's not worth the risk of them not being available because you have a shitty internet connection.
I'm watching all of this with interest, including Chromebooks. My laptop was fast enough for the level of CAD work I was doing years back, provided I didn't mind waiting a few minutes for a render (which I only rarely needed to do). Therefore any new machine has not been a necessity, and I can watch these platforms - iPad Pro, Win 10 on Surface, Chromebooks - develop with disinterest.
Back in the day I had no choice over my platform - CAD software essentially dictated I used an Intel Windows machine.
Features that were available but exotic a few years back - such as good stylus support, high-res displays, external GPUs, an alternative CPU architecture - are now mainstream or are becoming so.
What my mobile workstation may be in a few years time I really don't know. It would be interesting if Photoshop or a similarly polished and featured alternative was available for ChromeOS.
> Autodesk already have a full 3d cad running in the browser.
Exactly. So does OnShape, using a desktop or Android app as a client. This is important because if it works as advertised I'm no longer tied to any platform - a Linux laptop or Chromebook becomes a viable option. Even macOS doesn't have all the main CAD packages available for it, such as SolidWorks (which package a company uses is often determined by its clients and suppliers).
It's worth noting that CAD (unlike video editing with its high IO demands) is suited to being run in the cloud - it's descended from the mainframe/terminal model, multiple engineers may be accessing the same model simultaneously, and models hold commercially sensitive information that's best not held on laptops floating about the place.
Whilst charging customers per month is appealing to CAD vendors, it does open the door to competition who might charge per task, or per hour, which would work out cheaper for smaller companies for whom CAD is only a part of their workflow.
"It's worth noting that CAD (unlike video editing with its high IO demands) is suited to being run in the cloud"
Speaking as an experienced design engineer, there's a lot of IO on big CAD models. You can take my machine with 3 SSDs, 26 cores and 128GB of memory from my cold, dead hands.
> Speaking as an experienced design engineer, there's a lot of IO on big CAD models. You can take my machine with 3 SSDs, 26 cores and 128GB of memory from my cold, dead hands
The model is big, but it can be worked on remotely - the only data travelling between your terminal and the mainframe (or cloud) is user input and video output. Indeed, for bigger models than yours (think of visual effects studios) a render farm of clustered resources is used.
It's not for everybody, but in many circumstances a cloud CAD instance is suitable - plus you can rent more CPU/GPU power as you need it. Then of course there are the collaborative working methods common in many industries - colleagues need to work with changes you've made in real time, so having a model stuck on an individual workstation is not ideal (though there are ways of just uploading your changes rather than the entire model)
By contrast, dumping a day's shoot of 4K footage up to a cloud is impractical over most broadband connections.
Being able to drag around the 3D model of sprocket (fine, three sprockets) in a browser window may be good enough for the 3D-printer-happy "maker" generation but is not something I would call CAD software. Meanwhile out here in the real world even proper, compiled executables running on desktop / laptop class hardware die a thousand deaths as soon as you try dragging a mere component of an assembly in an ever-so-slightly non-trivial design in a parametric CAD.
The open source one I prefer is basically shitting bricks with a single assembly loaded simply because it happens to unluckily fall on the wrong end of the O(n^x) complexity involved in continually re-evaluating moving geometry - badly enough that it managed to get me to try to figure out whether that can be helped at all; and just so nobody gets the wrong idea, as anyone following the construction of the Marble Machine X should well know, its designer hit a brick wall not long ago splatting against the limits of Autodesk Fusion 360 running on real-world hardware.
So 'scuze me if I need to lay down to properly laugh at any suggestions that browser-based CAD running on an ARM-class CPU is "good enough" in any sense of the word.
>Being able to drag around the 3D model of sprocket (fine, three sprockets) in a browser window may be good enough for the 3D-printer-happy "maker" generation but is not something I would call CAD software
@Dropbear. I wasn't for a moment suggesting the CAD software run on ARM, merely that it doesn't matter if the *terminal* runs on ARM. The CAD model itself is running on a load of Xeons in the cloud (or on your local network); the terminal just needs to accept user input and display the workspace.
Back in the nineties we were using CAD software on a UNIX mainframe accessed through X-Windows from a terminal - before it was practical for standalone workstations.
Conceptually this was no different to accessing computer resources on the cloud. The geographical location of where the model is being processed with respect to the user has absolutely *nothing* to do with how many components, sprockets or otherwise, can be in the model.
CAD can be very resource intensive, especially the kinetic simulations you're attempting. That's *exactly* why Fusion 360 and others work with Amazon Web Services - to bring you even more RAM and processing power than you can fit in your desk. The alternative is to buy your own CPU/GPUs, but then you have to use them enough to justify the investment.
Simulations, like renderings, are a good example of a task that doesn't require human input whilst the computer is crunching the numbers. Depending on which package you're using, you might find that it allows you to install clients on your other machines on your local network so that their CPU/GPUs can be harnessed to speed up the calculations.
NOTHING is "suited" to the cloud except websites. That sort of cloud computing is sheer insanity on so many levels.
I'll do MY computing on NON-rental SW, on a computer I own. I'll do backups not needing the cloud. I'll be able to work without internet and even for many hours with no power.
Real cost, privacy, security, availability.
"models hold commercially sensitive information that's best not held on laptops floating about the place"
It's easier to control a laptop's privacy/access and location than a cloud service. I certainly don't trust MS, Amazon. Google etc.
> Real cost, privacy, security, availability
For an engineering company, the issues of cost, security and availability are *exactly* why they would want their CAD models to sit on their own cloud hosted on their premises, with engineers accessing them via a terminal, X Windows, a browser, whatever. This means their IP doesn't have to leave the premises, and isn't stuck on an individual workstation inaccessible to colleagues.
You're a lone wolf, I get it, but most engineering projects involve professionals from various disciplines working together.
"This means their IP doesn't have to leave the premises, and isn't stuck on an individual workstation inaccessible to colleagues."
...and their IP is not on any kind of device where the manufacturer's cloud storage is the operating system default - like an iPad, Chromebook, etc.
There have been rumours for Macs with ARM processor for long time.
I'd say: There _will_ be Macs with Intel processors for a very long time. Current ARM processors are fine compared to a current quad core laptop, but don't come near the high end Macs (18 cores currently).
But if you see that new iPhones and iPads come with half a terabyte or a terabyte of storage, Apple could just let users run MacOS X on an iOS device, possibly as an app. Take an iPhone XR, run the "macOS" app, attach keyboard and monitor.
Many moons ago I was in a Stormfront store in the UK looking at the Apple MacBook Pro and I almost bought one, but shortly after they had a new model and it lacked the things I needed: RJ45 networking, USB 2 peripherals, a built-in DVD drive, etc.
So I decided to buy a mid-range Linux laptop from Entroware instead. Yes, it lacks some key features I might like, but it is mine. It does what I tell it to. And it cost a lot less; the remaining money I spent on loose meals and nice women.
The new Mac Mini line looks to be in the same ball park as a refurb Mac Pro (6 core i7, 32GB, 1TB SSD, $2499), but it supports the latest Mac OS X and has Thunderbolt ports.
With some adaptoring, I think it'll even be able to drive my two Apple 30" monitors. I write code, so video resolution/performance has absolutely no bearing on my needs at all.
The new Mac Pro (next year?) needs to be a significant step up to have me looking away from a high end Mac Mini.
The last Mac Pro was very much tuned for a workflow that wasn't coding - i.e. video editing, and thus shunting a lot of data very quickly and frequently between onboard and redundant off-board storage. That use case is almost the last justification for a workstation, since other workflows such as CAD or animation can just have a GPU farm steered by a modest desktop.
Which is a long way of agreeing with you; it's hard to see what next year's Mac Pro will do for you over a Mac Mini. Tim Cook says it will be modular, but ultimately how much power do you need on your desk before building a render farm out of commodity hardware seems like a good idea?
"other workflows such as CAD or animation can just have a GPU farm steered by a modest desktop."
Animation maybe, but CAD will usually be done on a high end workstation. A server farm introduces complexity and therefore cost, and using someone else's cloud is unsuitable for many tasks in the aerospace industry.
It's more important than ever to be able to demonstrate you have control of your data, and "someone else's cloud" is outside of your control. If the server farm is in your control, it's more expensive than a high end workstation, so why bother?
Can you repair it/replace parts or is it the same glued-up shite as everything else Apple make these days?
To be honest, I never had a need to replace anything on any of the MacBooks I've had. The one I have now is about the first where I have a problem (the trackpad click mechanism seems to have an issue, but as I use a mouse it doesn't bother me much) and as it's approaching 3 years I may update it instead. Not that I don't plan ahead: the reason I use MacBook Pros is because they have multiple connectors so I have a fallback (which is why having simply 4x USB-C is IMHO perfection, hence my interest in updating).
Alternatively, I may just get a Mini and hook it up to the screen/keyboard/mouse combo I use at home (none of which is Apple) and use the savings to set up a local NAS.
Replaced HD with SSD in mine, and it’s on its third battery. All were easy to do. But my MBP is fairly old now.
The thing is, I used to do all of that too. Upgrading parts or changing faulty ones - I used to build my own machines, and that started in the days when you still had to know cylinder and head settings to get a hard drive to work (by way of illustration, I started in the days of 5.25" floppies :) ).
What changed was (a) time, (b) reliability. I've had a few MacBooks now and maybe I've been lucky (knock on wood), but so far, the touchpad starting to play up is about the first problem I have had in some 12 years. Possibly (c) budget will have helped too, I tend to go for the "one-but-best" device in a range as it gives me the same lifespan but at a lower cost.
The last MBP I changed a part in was a 2013 one where I swapped the disk for a hybrid - much faster, but not quite the expense of an SSD. It was then I discovered that these hybrids also make for very fast backup disks, but I'm a bit worried about how well the cache is flushed, although repeated tests have not shown a problem (I'm the sort that doesn't believe a backup exists until I have proven the restore works - burned my fingers often enough :) ).
There's probably a (d) here too: despite the backup tests above, I can no longer be arsed to mess around with hardware. I open the device, it works; I close the device, it locks the screen and sleeps; and once a week it gets a restart to clean out any dead processes. I really don't have the time for anything else. If you do, Apple gear may not be for you. Fair enough.
Let's say you're the kind of young hipster type that Apple would like to appeal to (and, in fact, do appeal to). The sort of person who likes to work in the kind of coffee shop where they can pretend to be cooler than they are. Are these people going to start turning up wheeling a trolley with a Mac Pro strapped to it, or what?
No, no, they're not: they're going to turn up with the same kind of shiny Apple laptop in their backpack they turn up with now, which is why Apple are not going to kill their laptop range.
Lots of older people are now using MacBooks. Many in my Camera Club have moved from Windows and never looked back. These are not techy people but just got really pissed off with Microsoft.
Go into any non-city-centre/town-centre Starbucks/Costa/etc these days and do a random survey. I did that very thing yesterday at the Starbucks at Hickstead on the A23 sarf of London. There were 6 laptops in use at the time I stopped by: 5 were Apple devices and one a Dell, and the age range of the users surprised me. Two of the MacBooks were being used by people well older than me, and I'm 60-ish.
If Tim Cook wants rid of MacBooks then he will piss off an awful lot more than the Hipster brigade.
It's been a surprisingly persistent myth that only hipsters use MacBooks - it might just be that they are the most visible users in coffee bars. Generally the baby boomer generation has more money, and it should be clear it's a demographic group Apple have always aimed at, regardless of their advertising.
For your camera club, the Windows ecosystem was a bit late to support a few important things for photography, though they're now mostly fixed. First, traditionally Macs had colour management, then Windows caught up. Then OS X had better support for high-resolution monitors, then Windows caught up but Adobe dragged its feet (so that Photoshop would have ridiculously small UI elements on high-res displays). Macs had 16:10 displays whereas PCs fell almost universally into 16:9, until the Surface range helped prompt PC vendors into producing laptops with 3:2 screens.
In fairness to Microsoft they evidently now know what is required (the colour accuracy of their Surface Pro is highly praised, stylus support is good, resolution very high, aspect ratio is useful), but in the last decade there have been times when their software and hardware partners have been slow to wake up.
I wasn't really claiming that only hipsters use MacBooks, I was just poking fun at the kind of person who might spend their time writing code in a coffee shop. However that's silly, because I write code in coffee shops sometimes, and I'm way too old to be a hipster.
My point really was that things like iPads and iPhones depend for their success on a copious supply of apps, and those apps don't get written on the devices they target, both because those devices lack things like keyboards which people who write code tend to want, and because they don't in fact host the kind of development environment which you use to write the apps at all.
And then, if Apple laptops go away, that leaves either non-practically-portable machines on which to write code, or assumes that the people who write apps for iPhones and iPads will do so on Windows or Linux laptops.
I don't think the non-portable machine option is something that can be taken seriously: I started my career with machines which sat in rooms you had to go to to use (and those rooms were often pretty cold) and the entire trajectory of computing since then has been to make things more portable. I'm 55 and I quite like working in coffee shops: they're, frankly, often nicer than open plan offices.
I think the development-will-happen-on-non-Apple-laptops option is more possible, but is probably long-term suicidal for Apple: it's the exact opposite of the kind of tie-people-in strategy which has worked so well for them.
There's a final option which is that it all happens in the clown and you access the clown via your iPad. You still need a keyboard, so it will be some kind of iPad that looks a bit like a laptop, an apple version of a chromebook I suppose. That might just fly, but actually issues of latency, connectivity &c are kind of hard to deal with well (and I do a lot of work-based stuff via VNC so I do quite a lot of what is, essentially, this).
So, in summary, the market for machines on which developers write iPad and iPhone apps is relatively small, but it is absolutely critical for the continued existence of those platforms. I don't think Apple are stupid enough to cut off their own air supply like that: they will keep on making portable development machines -- laptops in other words.
(As a note to this: this is one of the things that I think doomed Sun: they owned the university desktop market (the machines were called workstations, of course) in 1990. But enterprise systems were far more profitable and Windows & Linux started eating their workstation market. So they gave up on it, people stopped writing things for Solaris since they no longer had Solaris machines, and Sun in due course ran out of air. The Solaris-desktop market was tiny, but critical.)
"What you might not know is we've sold more iPads in the last year than the entire notebook lineup of all of the biggest notebook manufacturers," he said.
A little mental addition reveals that the sales of the entire notebook lineup of all of the biggest notebook manufacturers comes to about 122 million, versus iPad's 44 million. That's not more. That's less. A lot less.
Had he said "each" instead of "all," that would be more accurate, but not a terribly useful statistic. There's only one maker of iPad, but there are lots of makers of notebooks, and they outsell iPads by almost a factor of three. Just imagine if they weren't all crippled with Windows 10!
"A little mental addition reveals that the sales of the entire notebook lineup of all of the biggest notebook manufacturers comes to about 122 million, versus iPad's 44 million. That's not more. That's less. A lot less."
I came here to say that, and also it's meaningless. The number of iPads is dwarfed by the number of TVs sold, but you wouldn't suggest swapping all your iPads for TVs.
They are different things.
Apple has indeed suffered for years now from what I call Mac malaise. Watching the VP of Mac hardware speak, I got one clue of why that's the case. There are no doubt dozens of others. That this presentation did not include a new Mac Pro is outright abominable. Mac neglect remains profound at Apple. But it's just neglect. There is not-a-sign of Apple being determined to kill off their laptops. Not a one. From my perspective, I can't even consider this article an opinion piece. To have an opinion requires some modicum of indication that the opinion might possibly be realistic. This one is not.
So let's get back on track. Apple has severely neglected the Mac. The arrival of macOS 10.14 Mojave has saved their butts regarding the Mac operating system. 10.13 High Sierra was a half-assed mess that was only squeezed out, IMHO, for the sake of meeting a time deadline. It is best forgotten and never used. The prime reason was that the new APFS file system wasn't finished and wasn't fully documented for developers, bringing suffering and hardship upon its high-end users. High Sierra was shameful. Mojave finishes APFS at long bloody last. It literally took Apple 10 full years to finish the thing; the first beta of this son-of-ZFS project was in 2008.
IOW: The Mac, and most especially MacBook laptops, are here to stay. No hard core professional considers the iPad Pro to be remotely a replacement. Neither does Apple. However, Apple requires some severe slapping for the shame they've brought upon the Mac in recent years. At least the new MB Air and Mini are signs of repentance and revival.
Yeah the whole article reads like clickbait with a side order of troll to get all the usual suspects shrieking and foaming about how they are "content creators" and can never ever ever manage without their dual-disk 17" 16-core 'laptop' that needs a pickup truck to move more than 18 yards.
The more probable explanation is that Apple refuse to involve themselves in the cutthroat world of 'landfill laptops', where a good profit margin is 47c on a $300 item, and are trying to pitch high-end and presumably high-margin iPads as a viable alternative to that segment. Fair enough, can't fault them for trying. Whether that is a practical alternative is up to the shopper to decide.
Apple lost me after iOS 9; in fact my iPhone 6S+ is still running iOS 9, and when that finally fails I won't be buying another. What attracted me to Apple is no longer present. They used to be the one software creator who recognised the value of simplicity, a clear quiet peaceful path through the woods to your goal with no noise to distract and a beautiful UI. And now... well, now nothing works properly and the new versions of iOS are full of annoyances; annoyances that cannot be turned off. It is less intuitive and slower than iOS 6 or 7, and whilst boot time has been cut, the actual time to full functionality has increased. So I will make the shift to Android, because whilst Android is also not perfect, notifications and the option to root and actually own the device win, and I can buy a generic Chinese-brand phone with 8 GB RAM, a dual camera and a high-spec processor for under 250. With an SD card slot. The Apple is definitely rotten.
"They used to be the one software creator who recognised the value of simplicity, a clear quiet peaceful path through the woods to your goal with no noise to distract and a beautiful UI. And now....well, now nothing works properly and the new versions of iOS are full of annoyances; annoyances that cannot be turned off."
Oh, this, THIS! I had always been a Windows user, for the simple tasks that I use a computer for (email, word processing, spreadsheet). I am not an IT person, just running a small retail business and my personal stuff. But, when I needed to use a Square card processor years ago for temporary location sales, Apple was the only system Square played nicely with. So, I got an iPad mini. I started using the Mini for other sorts of stuff, and thought about migrating off Windows (laptop) to Apple. The more I used Apple, the more I came to hate their system!! It's like an overly intrusive date that wants to cut your meat for you! And, one of the first things I did was throttle that bitch, SIRI, who is like a hard-of-hearing old geezer who mishears what you say, then goes off on a long-winded story about some totally off-topic random thing.
Q: "How do you know if someone has an MBA?"
Other than it being the suffix to their email signature name, business card name, linkedIn name, no doubt the numberplate on their leased/PCPd Audi is something like 'AB18 MBA'?
The only other people I've seen do this are junior HR personnel who take a course and suddenly their last name is CIPD, despite the office being full of people with degrees from bachelors to PhDs who think it vulgar to do so.
I just wonder: what is the laptop's share of Apple's business? While the iPhone is about $1K and it is everywhere, how much do they care about laptops? Ah, laptops, those devices with a keyboard you can type and really work on, instead of poking with a finger!
[A side question: what is the etiquette, really, about giving the finger to an Apple device?]
God bless you!
the problem here is that when you measure "user base" according to "new item sales", instead of "what they're using", you get a distorted picture of what the customers want and use.
(the article gets it right, though, from what I read)
much like the PC market, yeah. Slabs seem to be improving enough to sustain the 'new, shiny' demand. For now. But not so much for desktop/laptop sales. People hang onto what they have.
But I disagree with every false point (that is, every point) in your argument. Apple has said it remains committed to the Mac, and will never try to merge mobile and desktop/laptop OSes (despite their considerable shared code heritage), unlike the dubious "success" of other outfits in attempting to do so. I see no reason to disbelieve them on this score as (1) they make more money this way, and (2) they ensure continued goodwill of more users this way. And oh yeah, they also keep introducing new hardware in both product lines.
Sorry, this was plain silly.
I see that the way the Reg is trying to get along better with apple is a whole article advert for the iPad.
You don't do the things the Reg would normally do, like point out that iPad sales have been slowly declining, or that laptops still outsell iPads nearly 4 to 1 according to Apple's own chart (which doesn't include a whole load of minor sellers).
You also follow the Apple line by almost forgetting the other things, like the Mini, that may be of some interest, as the updates to that actually look kind of useful.
"What you might not know is we've sold more iPads in the last year than the entire notebook lineup of all of the biggest notebook manufacturers,"
When will the advertising world realise that we're sick of their shitty spin on words.
36.9+32.1+23.5+13.9+13.1+2.4+0.4+0.3 does NOT equal 44.2
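For anyone who wants to check the commenter's arithmetic, here is a quick sanity check in Python, using the laptop-maker unit figures (in millions) exactly as quoted above against the 44.2 million iPad figure:

```python
# Laptop unit sales (millions) as listed in the comment above
laptop_sales = [36.9, 32.1, 23.5, 13.9, 13.1, 2.4, 0.4, 0.3]
ipad_sales = 44.2

total = round(sum(laptop_sales), 1)
print(total)                          # 122.6 - nowhere near 44.2
print(round(total / ipad_sales, 2))   # 2.77 - laptops outsell iPads nearly 3 to 1
```

So the "we outsold every notebook maker" line only works by comparing iPads against each manufacturer individually, never against the combined total.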
But consider this: when someone in a presentation has to use very carefully worded statistics to create a false impression, it's actually because they are worried. One assumes that a lawyer or two parsed that statement to ensure that there couldn't be SEC comeback.
So who is it for? Not the analysts, that's for sure. It's aimed at Joe Public to persuade them the iPad is a good deal.
Back in 2016 there was a report that Bluetooth headsets had overtaken wired ones in the US. They had, by dollars. But by count, 80% were wired. The report didn't exactly emphasise this and one assumes the intention was to persuade Joe that the lack of a headphone socket on his new phone was because wired headphones were obsolete.
There are lies, damned lies, and sales statistics.
simply the best aid/adaptation for someone with fucked eyesight (like my wife, thanks to MS).
Just the screen size alone.
Manufacturers: remember that customers with accessibility issues are an advance warning of what your mature market will need. If a 40-something with fucked eyesight and mobility problems can use your kit, it's more likely a 70-something will prefer it too.
Design for a wide range of people is called Universal Design, and with the demographics of wealthy nations a company would be stupid to not implement it.
It's a good idea in general, but sometimes the details of such adaptations can conflict.
You know about accessibility ramps, of course. Make it possible for wheelchair users to get in and out of places where elevation change is required. What could be bad about that?
Well, the mother of a person I know well had an above-the-knee prosthetic leg, and while going up the ramp was fine, she couldn't descend it. The knee joint in the prosthetic would bend and she'd end up on the ground (and she was old enough to where that could cause serious damage because of osteoporosis).
Stairs, though, she could go up or down without a problem.
The disability accommodation had made it impossible for her because of her disability. In this case, the people in charge of the place had thought that a ramp is all that one needs, and that it could replace stairs (with the assumption that stairs = abled and abled people can use anything).
There's no one true way to engineer something where it will meet everyone's needs. You can try to lasso as many people as you can into your target market, but there's no such thing as one size fits all, no matter how well engineered.
I had a first-generation iPad, courtesy of my employer, and I concluded, then, that the device was "useful" but completely rubbish at the same time. Back then, the only channel to and from the device was iTunes and the concept of creating any sort of content on it was laughable. Roll forward to this year, when I experienced the iPad Pro (second generation) for the first time, and the situation could not be more different.
Today, my iPad pro and my phone are the only devices that I carry with me when I travel.
I am a software architect and the concept of coding on the iPad is ridiculous but, frankly, I absolutely loathe coding on the run or on the factory floor and so I have no problem with this limitation. Instead, the iPad lets me draw sketches and diagrams and ideas and lob those over the fence in the direction of my dev PC back in the office. That's a far more productive use of my time than hunching over a greasy coffee table, trying to hack out lines on a tiny notebook monitor and inadequate keyboard.
The iPad serves for communication, entertainment and more, too. I would never give up my beefy desktop workstation for it but it has already replaced my laptop and I am happy with the result. It also means that I can leave my Kindle at home without suffering the reading experience of a phone screen.
After the first generation, I thought that the iPad would wither and die if it did not evolve quickly. Apple have successfully effected the necessary evolution.
Disclaimer: my beefy workstation runs Windows and my phone runs Android; don't count me among the ranks of Apple loyalists. Also, don't even get me started on the topics of Windows 10, the Android platform and bloat and Android hardware disparity, diversity and quality.
This is worrying... I produce magazines using Adobe software, and doing that on an iPad sounds nightmarish. If I can't update to another cheap Mac, like the Mini, I would have to migrate to another OS. After 27 years or so with Apple computers! (Yes, before the Mac there was a Linotype Apple system for print design.)
In airport lounges, hotel corridors and occasional bars, can I prop it on my lap? I've had several tablets and the Surface Pro was marginal, all others inadequate and that's before I start cursing the keyboards.
Aren't convertible devices inherently top-heavy? In a real laptop, the guts of the device are in the base, and the screen is just the screen. On top of that, they often don't have (or need) touchscreens, making the lid/screen that much lighter.
On a convertible device, the keyboard is just the keyboard. The guts are all in the screen, and it necessarily has a touchscreen. They could add weights to the keyboard to balance it, but then it gets heavier to lug around.
I have an ultraportable laptop that isn't much thicker than an iPad, and it has the added advantage of having the fragile screen on the inside when it's closed, protected. I've never wished I had a large tablet while using it... of course, part of that may be that I don't like Apple, Google, or since 2015, Microsoft. I used to think them the lesser of the three evils before Windows 10.
I've got a laptop on either end of my travels - but there's a time at the start of a plane journey when they say laptops and large electronic devices must be stowed, phones and tablets may continue to be used provided they are in flight mode. That's when I take off the keyboard and watch Netflix. :-)
One assumes they drop a load of iPads to empirically determine how much force they can withstand before breaking, then calculate how much polyurethane case to enclose an iPad in to meet their specifications.
Otter Box started out making flight cases to house military kit before they used their reputation to flog phone cases.
Despite the word being thrown around endlessly by marketurds (and journos who ought to know much better), there is extremely little true innovation in the IT industry. Like aerospace, it's a business that uses the word constantly but does it basically never. If you stop and consider all the advances of the last 20 years in IT (make it nearer 35 for aero), they are almost entirely incremental improvements. They are not radical or revolutionary inventions. Even the first iPhone was mostly a clever repackaging of stuff already done elsewhere before.
Yes, Apple's design is excellent. Yes, its QA/QC of their oriental slavemaster manufacturers is excellent. Yes, they get green screens built for them by Samsung. Yes, the OS is good, and I very much like their attitude to user privacy.
But the latest iPad is ... just another expensive tablet (and arguably superfluously over-powered for 99% of use cases). The lappie is a marginally improved and expensified version of something you could buy four years ago. The idea that any of this shiny stuff will "change the way we think about computing" is simply risible. It's marketing drivel.
Ultimately, and as the article hints, practical cost-effective decision-making doesn't favour the tablets for serious work. Here's how it goes:
* It's nice, it's powerful, it's pretty and fashionable—but I need a good keyboard here and now. I got work to do!
* Ok, so here's the laptop ... and it's short of ports, has no removable media and I can't swap the battery ... plus it's eye-wateringly expensive
* Oh look over there: way less money buys me a good brand well-built laptop that has all the things I need, at least as much power as the iShiny, lots of expandability and versatility
* So, do I need to work, or pose on a catwalk? If the former, I won't be buying the Apple. Lovely as it looks, it just makes no sense.
And no, I'm not an Apple hater. I think their products are great. But the value for money is ... nonsensical. You're buying humdrum, readily available tech for jewellery prices. It's crazy.
Laptops are so last decade. Tablets are obviously better.
However, it would help if tablets had a proper keyboard and more ports. The keyboard could be attached to the screen by a sort of hinge. Also it would probably help to put the brains in the keyboard part rather than the screen. That way it would be more balanced and more comfortable if, for example, you had it on your lap.
And since you won't be holding it with one hand while using it with the other, the touchscreen would be extremely cumbersome. One's arms get heavy fast trying to hold them out in front of you at the distance people use non-handhelds on a table or their lap. Better add a touchpad (or a pointing stick) to fix the ergonomics, if the keyboard base didn't already have one. And if you do that, you might also get rid of the now superfluous touchscreen.
I've never owned an all in one, but the people who do that I've asked about it say that they seldom if ever use the touchscreen while the keyboard is docked. Laptops, of course, always have the keyboard docked. Eschewing the touchscreen also means you can use an OS and applications that don't have all of the touchscreen compromises baked in (oversize controls, lack of informative hover effects, that sort of thing).
>Also it would probably help to put the brains in the keyboard part rather than the screen
The 'brains' of an iPad weigh very little. The battery probably accounts for a fair fraction though.
The MS Surface Book approach was to place a second battery in the keyboard section for the weight balance reasons you outline.
"But it loses key ports – including the SD card port and the old Thunderbolt port – which people have typically used to transfer video and audio files and attach peripherals."
Do you know anyone who attaches peripherals natively to the old Thunderbolt port, as opposed to via an adapter?
No, nor do I. So you are living the dongle life either way, with the big differences being the availability of multifunction dongles for USB-C, and the fact that native peripherals will doubtless become more available than they ever were for the old port.
Anyway, anyone who thought Apple would do anything else must have been living in a cave for the last 3 years. By far the most interesting change is them removing the proprietary Lightning port from the iPad Pro: since that port collects a license fee from every peripheral made for it, and at the same time ensures that those peripherals remain incompatible with other manufacturers' products, I found Apple adopting an open standard there (even if it is the same as their laptops use) genuinely surprising.
They really should have said that Apple removed the standard USB-A ports, which they did, and people actually use those a lot. I have a Mac from a few years ago with two traditional Thunderbolt ports. I have never connected anything to them. The only thing I could possibly have used is some displays that have that connection.
To get fewer features. Changing I/O ports, no SD slot etc. makes the current model less attractive than what came before (although an 11" MBA has not had an SD slot for at least the last 3 years).
So far as running Photoshop on an iPad... OK, and the rest of the Adobe suite? Hook it up to a calibrated monitor? What if I do a tethered capture? Oh... no port for that. And what if I use Capture One rather than Lightroom? No iOS port of that...
Someday a tablet could be designed with the proper ports. With RAM and a GPU capable of manipulating a >60MP image, able to connect to an external monitor, network and storage system. It would be a bit bulky... maybe you could permanently attach the keyboard so it could also serve as a screen protector.
Then, you could call it a “Laptop”
F’kin idiots at Apple....
So, they really think that you would want to run high-end Adobe apps on iPads? 5 mins in it will be red hot and run at a snail's pace. We already have problems with our iPads out in the field, so have resorted to these, at a mere £200 per iPad. https://www.gps.co.uk/x-naut-active-cooling-mount-for-ipad/p-0-3177/?gclid=google&utm_medium
I'll extend that a bit.
If I want portability, I'll use a smartphone (or old phone if all I want is communication, but probably a smartphone).
If I want raw power, I'll use a desktop.
If I want a large amount of computing power but available in many places, I'll use a laptop. I can bring it with me in a backpack, and can use it on the go from its battery.
I don't need a tablet, because it is as big as a laptop (the slight difference in weight doesn't make much difference) but can't do as much. I don't need something only as powerful as the pocketable thing, yet only as portable as the thing I need a backpack to carry.
I *hate* getting fingerprints on a laptop (or desktop) screen. Yes, a pad is different because it is all touchscreen. Also, on a laptop you have your hands on the keyboard; moving them up to stab at the screen is neither natural nor ergonomic. The extra functionality of the Touch Bar over normal function keys gives more options for less keyboard real estate.
It used to be that Apple computers were the choice of creative professionals. The iPad and iPhone are consumption devices that are difficult to create things on. A mechanical keyboard is far easier to type on than a virtual keyboard on a flat piece of glass. A mouse or tablet is easier to use than a touch pad/screen. A quick look outside does seem to confirm that the vast percentage of the hordes are consumers and create nothing (except for making good targets as they wander around with their heads buried in their phones/tablets).
Tim Cook is a decent manager, but doesn't have the visionary component that Steve Jobs did. Sure, they can take the MBA approach of putting all their efforts into luxury consumption devices that yield the most profit margin, but they are doing it to the detriment of their future. If people migrate to the Windows world to create, they'll likely optimize more for Windows devices such as Surface. That will make Apple have to play compatibility games and be seen as just another device maker competing in the same market as the Surface. Just because you are selling a lot of blades, it doesn't mean that you can stop putting effort into making good handles.
I don't see how making the MacBook Air a bit thinner and a bit lighter makes any difference. I'd rather have a slightly thicker laptop with more mass put into the battery. Thin stuff flexes and breaks. Being able to use a laptop for an entire day on one charge would be awesome. Just fiddling with spreadsheets isn't a big drain but I'm importing and processing photos which does eat up battery life like crazy. The more time unplugged that I have, the more I can get done. I took a dead battery from my old MacBook to create a power interface and used it to shove power from an external battery into the computer. It's ugly and heavy, but it works a treat.
Apple has been disappointing for the past several years. It seems like the bulk of their efforts have gone into creating more and more expensive phones.
44.2 million iPads vs world-wide laptop sales... which add up to 122.6, nearly 3x the iPad sales.
Ford could likewise claim that the Edsel was the most successful automobile of its era because more adults bought one than kids did bicycles, skateboards, or roller skates.
A false comparison.
One is a mainly consumer product mostly for content consumption. The other is product now for people creating content.
You might as well compare Toyota car sales and truck sales.
I have, though, said for years that the Mac has not much of a future. Laptops will not be replaced by iPads, even though MS seems to be trying to kill Windows.
Whatever else Apple is up to, Lana is such an amazingly hot, over-talented, singer who writes her own lyrics and controls her own 50s-styled videos. Jazz? She'd almost make me like jazz.
Like, she's a really good, rainbow-unicorn-level Youtube-launched artist, and really doesn't hurt looking at either.
So this delicate slab of glass can replace a laptop or my home tower PC?
So can it:
- Dual boot into GNU/Linux and Windows, or any hobby OS I can chuck at it?
- Access data I have backed up to optical disc without an adapter?
- Have a FireWire card added to it so I can capture SD or HD video off my MiniDV tapes?
- Connect to a scanner without an adapter.
- Rip any of my DVDs containing movies I have bought on the cheap from charity shops, without an adapter?
- Run a game I used to play on Windows 95 running in a virtual machine?
- Develop applications using an IDE of my choice, in a language of my choice.
- Print stuff to a printer that does not have wifi, without an adapter.
- Keep itself cool while running at full speed all day while I render videos, compile some code and play that game on my VM.
- Connect to a surround sound system using a set of 3.5mm jack plugs or optical cables.
- Access the SD card that is used in almost every DSLR to this day.
- Connect to another iPad via Ethernet cable to transfer user data across a gigabit connection, faster than using the AC wifi that is already overused by other devices?
- Play Crysis?
- Allow me to add a serial RS232 port to it so I can program a PIC microcontroller, using a home-built programmer kit that is still sold on Amazon today?
- Let me use my IBM Model M keyboard for that retro clicky feel?
Yes, that slab of delicate glass containing components designed to overheat or expire after the warranty, that cannot be repaired by myself without me being called a counterfeiter, that will be considered "vintage" by the manufacturer, who will refuse to repair even the simplest of issues for less than the cost of a new machine, blatantly lying to my face in the shop, will replace my 2012 Lenovo laptop that contains an i7 CPU, or my Ryzen 5 six-core PC.
Pull the other one, Apple. My 2012 Lenovo beats your iPad.
I've not tried an iPad Pro but I would love to. Although in that format I'd probably prefer the MS Surface, because it is basically the same thing, at the same price, but it DOES run regular apps.
I do think that of all their products, the iPad is the one Apple does best objectively, i.e. I don't buy into the Apple ecosystem, but if I want a decent tablet I would only buy an iPad (if I don't want a decent one I'll get a Fire HD).
Biting the hand that feeds IT © 1998–2019