I'm sure Silicon Valley is skewing it
Salaries here being absurd and iPhones being abundant.
One of the more surprising things is that car (/auto) insurance seems to be a lot cheaper here in the US _despite_ also needing to cover the medical bills of anyone you hit, given the lack of nationalised healthcare. I doubt that $5m of cover will cost more than a few hundred dollars a month, especially if these cars really are much less likely to have accidents.
Replying to myself on the reliability issue, having tried to source more objective numbers...
Per Squaretrade — http://www.squaretrade.com/htm/pdf/SquareTrade_laptop_reliability_1109.pdf — an insurance company that you might expect to want to amplify total numbers but that probably has no reason to be disingenuous about the manufacturer spread, 17.4% of Apple laptops fail within the first three years. The industry best was Asus at 15.6%, worst was HP at 25.6%.
So the worst big supplier produces machines that are 47% more likely to break than Apple's and 64% more likely than the best's. Apple's are only 11% more likely to break than the best.
Apple is also beaten by Toshiba and Sony but it's objectively amongst the good build quality guys.
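For anyone who wants to check the arithmetic, a quick sketch. The failure rates are the three-year figures from the SquareTrade PDF above; the function just expresses one manufacturer's rate relative to another's, truncated to whole percentage points:

```python
# Three-year laptop failure rates (per cent), per the SquareTrade report.
rates = {'Asus': 15.6, 'Apple': 17.4, 'HP': 25.6}

def relative_risk(a, b):
    """How much more likely a machine from `a` is to fail than one
    from `b`, as a whole-number percentage (truncated)."""
    return int((rates[a] / rates[b] - 1) * 100)

relative_risk('HP', 'Apple')    # → 47  (worst vs Apple)
relative_risk('HP', 'Asus')     # → 64  (worst vs best)
relative_risk('Apple', 'Asus')  # → 11  (Apple vs best)
```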
For certain niche professions, it's a huge benefit that Thunderbolt exposes full-speed PCIe to the outside world, letting you add arbitrary expansion devices when docked. E.g. Sonnet sells a box that lets you attach a regular PCIe card. I'm told it's quite the thing for pro audio.
For the rest of us? The build quality is usually good*, the battery life usually superb.
* no, your three anecdotes plus the times it has actually been newsworthy because the build quality wasn't good don't disprove that assertion. Though my anecdote doesn't exactly prove it either...
It shows up the first time you pair the mouse with the computer. Automatically. No interaction required.
Which, yes, is like those annoying applications that force you through a tutorial because they're insufficiently easy to figure out.
Find the power button by using your eyes, probably only once. No need to read the manual any more than you had to read the manual to figure out where to plug the monitor into your desktop PC.
Ports round the back are an awful idea for USB-like things that are likely to be plugged in and unplugged frequently (you know, versus things like monitors). I feel like Apple has stumbled into the problem — when the ports first headed round the back, the keyboards were still wired and had USB ports directly on them. So no need to touch the main computer at all. Then the wireless keyboard became the default and the fumbling began. Never mind the risk of scratching the machine with the USB device. More design thought required, I think. The sides would have been a good solution but Apple has collapsed those to being about 5mm wide so that option's out.
Clicking on text:
• single click places the cursor;
• double click selects a word;
• triple click selects a paragraph.
If you don't know about triple click then you can stick with manually locating paragraph bounds and clicking and dragging over the whole thing, in much the same way that, whether a Windows or Mac user, you can use File->Save instead of pressing [Control/Command]+S. It's just less convenient.
... to help relieve the RSI that's inevitable from using an Apple mouse.
The 'Magic' Trackpad is pure genius, however.
Would that be the same Apple that was first to ship a computer completely free of legacy ports, creating the first third-party market for USB peripherals?
I thought it was more to fire something trivial off, then disappear for 5–10 seconds?
The device has a feature called 'factory reset' that doesn't reset it to a factory state. That's different from having a feature called 'format the hard drive' which establishes the correct formatting on a hard drive. The first feature doesn't do what it promises, the second does what it promises but is sometimes falsely assumed to do something else as well.
That being said, it sounds like an easy bug to fix. A quick pop-up to explain that if the purpose is to remove confidential information then a full erase should be performed which will take X minutes rather than Y seconds and a couple of buttons would do it. It's such a fringe feature that it's probably not worth investing more time in than that.
I've enjoyed this being the year of the S5 and the 5s.
Indeed. Had the author alleged a strong correlation between use of that file format and intellectual property theft then he'd have been closer to making the point he intended, I think.
I saw the headline of his death and assumed a heart attack or something. I read the article, which said suicide, and immediately thought "oh, so there was nothing wrong with him". Only then did I consciously correct myself.
So I'm thoroughly of the opinion that mental health needs to be taken seriously but apparently still catch even myself not doing so. It's deeply sad that Robin Williams was sick and that his sickness killed him; I hope regressive instincts similar to mine didn't contribute and can be overcome more widely.
Let's not forget the Motorola ROKR, no matter how much Motorola and Apple might like us to. There were quite a few MP3 apps for Series 60 phones too, so you could rehabilitate your old Nokia 7650 if you really wanted. It's a slider though, so I guess something a little later like the 6600 would be more relevant?
Internet Explorer was the default bundled browser with all Macs from 1998 until 2003 or so. It was branded as version 5 but by the time it jumped to OS X had diverged completely from the Windows branch and, at the time, was the most standards-compliant browser (no, really — look up the Tasman layout engine).
I guess its development was tied to the patent cross-licensing deal; in any case Apple seemed to be aware that the agreement wouldn't last forever as Safari was conveniently ready in the wings.
Speaking professionally, even as late as 2009 I'd get extremely lazy web developers trying to justify leaving their half-decade old IE6 code alone despite the company's significant move into iOS development (and, therefore, Mac deployment) because "there's an IE for Mac, right?"
That article doesn't mention OpenGL even once — your conclusion is akin to citing an article that states that ebola is quite deadly as evidence that it is more deadly than the bubonic plague — and the 0.2% of people that care about "high end gaming hardware" are unlikely to play games in their browser.
Depends how you define peak Apple. In terms of peak influence, yeah, we're past that. But iPhone sales continue to go up year-on-year. The broadening of the market just makes Apple an increasingly less singularly notable player.
Yes, that's how Net Applications classifies browser users. Users who "have to be thankful for what they get" get counted as mobile.
Also: users who recently lost a shoe get counted as desktop, users who confuse effect and affect get counted as desktop, users who daydream a little too much get counted as mobile.
The tragedy of Microsoft, from its point of view anyway, is that there are about a hundred different things it could have done to own the mobile market and it succeeded at none of them.
What's certain is that it was a losing strategy to recreate the Windows desktop at 256x160 and expect people to poke at scrollbars barely a couple of millimetres wide with a stylus, then to respond to the launch of the iPhone with laughter and disdain for it as a business product — i.e. if you think a large touchscreen phone would be good for any part of your business then you're not the sort of company Microsoft is willing to treat seriously. Never mind waiting until long after Android had hit its stride finally to release a usable touch interface, then hobbling any potential uptake with a bunch of strict criteria about required buttons that mean that if you're Samsung or whoever you can't just decide whether to put Windows or Android on at the final stage of manufacture.
The market is now Google's to lose.
True, but it does mean that your web page is more likely to be viewed by an Android user than an iOS user. Plan appropriately.
Umm, we don't know that because that isn't true. Chrome, Opera, etc, were counted towards the Android total. It's 44.83% Safari, 21.86% Android Browser; adding in the other options is how Android gets to number one.
In future maybe don't just make things up?
The quotes were meant to show that even the best selling Apple-related press doesn't blindly do "Wow! Apple's revolutionised everything, again! That's the third time this year!" but I actually think the original iPhone was a fantastic deal — don't forget to factor in the unlimited data versus the 18p/megabyte sort of charging that was common at the time. Though you'll recall that Symbian kept the vast majority of the market more or less up until Android came of age, so I don't think it would be accurate to say that the majority of people were sold.
It probably wasn't published in your version of reality but from MacWorld's contemporaneous review of the original iPhone — http://www.macworld.com/article/1058733/iphone_rev.html :
"... the volume buttons are located a bit too close to the switch, and on several occasions I found myself pushing the switch (which won’t budge) in a vain attempt to boost the iPhone’s volume."
"It’s a standard 3.5-millimeter jack—the very same sort used on the iPod—but because it’s recessed many third-party headphones won’t fit, especially if they’ve got a large plug or one that turns at a 90-degree angle. It’s too bad that a clunky add-on accessory will be necessary for aficionados of high-quality headphones"
"On the iPhone’s back face is the tiny lens of its compact, two-megapixel camera. It doesn’t zoom and doesn’t work well in low light"
"... the lack of [copy and paste] to transfer text from one place to another can generally hamper interaction between different iPhone programs."
"The iPhone also lacks a quick-dial feature that you’ll find on many other phones"
"Moreover, the iPhone doesn’t filter mail, nor does it have any built-in spam catcher. That means if you’re relying on a client-side filtering program ... you’ll be stunned at the amount of spam you’ll see on your iPhone."
"Steve Jobs has promoted the Web-browsing experience on the iPhone as one that brings you the “real Internet” ... Apple has brought the iPhone most of the way toward that goal, but it still falls a few notable steps short. ... there are a few limitations that prevent Safari on iPhone from truly showing the real Internet. The biggest is the fact that perhaps the most common browser plug-in in existence, Adobe’s Flash, is nowhere to be found."
"The bad news is that Text can’t send MMS messages [...] Because of this limitation, you can’t send a picture you snap with the iPhone’s camera to another phone via Text. [...] What’s worse, the iPhone has no support for any Internet-based instant-messaging network. "
"... the Notes program is fairly useless. ... And not to get too font-nerdy on you, but the Marker Felt font used in Notes is extremely ugly and, sadly, can’t be changed."
"The only thing missing from the Maps equation is that the iPhone doesn’t know where it is. Not via built-in GPS (it has none), nor by triangulating signal strengths from nearby cellular phone towers."
"I’m aware of numerous complaints from iPhone buyers ... about long, drawn-out issues with activating their phones. Still others have complained about poor customer service on the part of Apple and AT&T during the product’s first days of existence."
You're always prompted and asked whether you want to install a new version of the OS. You opt in, not out.
As a rule of thumb, iOS gets slower with the major releases, faster with the minors. Big performance degradations seem to have followed the release of iOS 5 (which significantly rejigged the Objective-C runtime, to add self-zeroing weak references amongst other things) and iOS 7 (all the rendering changes), with iOS 4's addition of blocks probably adding up to a bit of degradation over time as developers switched to Grand Central Dispatch perhaps a little more quickly than Apple was able to optimise it. Empirically, from optimising via the profiling tools, there were definitely huge changes between 4 and 5 in the logic applied by the OS to thread pooling.
The iPad 1 I still have knocking around on iOS 5 is much, much slower than the iPad 1 kicking around the office with iOS 4.
That scam will be released as soon as Apple starts allowing third parties to post EFI updates via its software update channel.
It can be downloaded direct from the web but either way you need to supply the administrator password to install.
Yesterday I received three 2011 MacBook Airs, having seen them available for a bargain price and decided to help out a charity. I'm not a support person so having multiple of the same computer in my control has never happened before. I therefore may not have proceeded in the most intelligent fashion, criticise as you must.
I decided to run all available software updates on all three, which were otherwise seemingly identically configured — each completely clean with the SSD given a volume name of 'Mavericks 10.9.2', implying a recent wipe and install though I doubt that will have had any effect on the firmware.
All three have now been updated, including to EFI 2.9.1, and work perfectly.
The only oddity was that all of them kept presenting the update as available even after it had been installed. In all three cases I installed the update last night and then again this morning, with software update now finally silenced. In one case I attempted to install twice last night, with the second go downloading and rebooting but not attempting to install anything. In the other two cases I stopped after the first try last night.
So installation still seems to have some issues, but only so as to create extremely minor inconveniences — nothing bricked — and may just have been some sort of propagation problem as 2.9.1 replaces 2.9.
... because there are no other iOS devices. I think the market share is going to continue to contract but not precipitously. If the Mac still makes money for them, the iPhone probably always will.
Conversely, Android manufacturers are fungible, sufficient hardware only gets cheaper and Samsung continues to target Apple in its advertising, providing a textbook answer to what happens if you can't stop fighting the last war. But with its manufacturing and component expertise I doubt it'll wilt quite as quickly as did its predecessors, Motorola and HTC.
Greater diversity is arguably the sign of a more mature market so the upside is: we all win.
My contactless card is a Visa Debit. Not credit. I didn't ask for it; NatWest just sent it when the preceding debit card expired. I use it on the bus infrequently, partly because I don't live or work in London any more but primarily because, even when I spend an entire week or two there, it doesn't aggregate with my Oyster. The Oyster caps you at the travelcard cost; if I were to use the Oyster on the tube and then the debit card on the bus, I'd quite likely end up paying more.
If the option becomes available to consolidate my Oyster entirely into my card then I will do so for the convenience. Also, in London contactless seems to be reasonably popular for things like sandwiches and beers. Anything where there's a benefit to the retailer in having the peak queue move more quickly and corresponding social pressure on individuals to pay and get out of the way.
My main problem with the Samsung advert is that Samsung is the world's most successful phone manufacturer by volume (amongst other things). So what the advert appears to be saying is:
You know those people that didn't buy the same kind of phone as the overwhelming majority of people? That stick with a particular kind despite showing a conscious awareness that it severely limits their hardware options? Well they're just sheep.
A PlayStation 2 does exactly everything I want from a video game console. That doesn't mean I think that people with anything from the XBox 360 or Wii onwards are implicitly facile.
I was once made fun of by a girl for wearing brightly-coloured trainers. Later I saw a different girl praising brightly-coloured trainers. This proves that all women are hypocrites and that the lot of them owe me an apology.
Don't even get me started on Google software. Of the current Android development stack, Android Studio uses "significant energy" even when left completely untouched for a prolonged period; the Android device simulator (with Intel HAXM, admittedly) does not even when in use.
So: text editing is apparently very costly, but running a whole other virtualised OS is quite battery efficient.
Apple's implementation of timer eliding relies on the application switching from saying "I want a timer that fires every X milliseconds" to "I want ... plus or minus Y milliseconds". Applications that don't specify the tolerance continue to receive the old behaviour. Knowing Apple that's likely only a transitional move but is in effect as of v10.9.
Incidentally, some anecdotal observations: an app uses significant battery power by waking often, regardless of the amount of work done per wake. An application I wrote woke regularly 50 times a second but used barely 3% CPU time — it was an emulator, doing the most obvious thing. It got into the hall of shame. After switching to an adaptive timer, it stops being named as a problem only once wakes drop below about 10 a second. Of course, that's empirical without being particularly scientific and I've quite possibly made a dumb mistake elsewhere, etc, etc.
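To illustrate the mechanism, here's a toy model — my own sketch, emphatically not Apple's implementation — of why tolerances matter: once each pending timer advertises an (earliest, latest) window rather than an exact deadline, the scheduler can collapse overlapping windows into a single wake-up.

```python
def coalesce(windows):
    """Given (earliest, latest) fire windows for pending timers, pick one
    wake-up per group of overlapping windows. A zero-tolerance timer has
    earliest == latest and so can never be merged with anything."""
    fires = []
    current = None  # running intersection of the open batch's windows
    for earliest, latest in sorted(windows):
        if current and earliest <= current[1]:
            # Overlaps the open batch: shrink the shared window.
            current = (max(current[0], earliest), min(current[1], latest))
        else:
            if current:
                fires.append(current[1])  # fire as late as allowed
            current = (earliest, latest)
    if current:
        fires.append(current[1])
    return fires

# Three timers due around the same time, each with some tolerance,
# collapse into one wake-up instead of three:
coalesce([(100, 110), (103, 112), (105, 120)])  # → [110]
```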
The Z10 can be had unlocked in the US for $210.72 on Amazon which would put it in the league of this article (as £150 is $256.30) except that in the UK Amazon wants... £187.49.
They've got form: see 2012's FTC settlement over the exploit of a Safari bug — though the fine was more about promising clearly and directly that tracking wouldn't occur, then exploiting a browser bug to track regardless. So it was a false advertising issue more than anything.
Project Zero would presumably just have had a quiet word with Apple.
... it'll still know your broad location, though, as it will still know which cell towers it is near. That being said, the cell towers know when you're near them, so if a government wanted to track a mobile phone user in slightly broad terms, it could do so regardless of the handset.
The article claims that Apple's "year-on-year growth shrank by 1.7 per cent". You can't turn a positive into a negative by shrinking it by a percentage. You can reduce it but you can't negate it. Therefore Apple continued to grow.
It subsequently claims that "Worldwide PC [saw] a year-on-year decline of 1.7 per cent." which appears to state that the market as a whole declined.
Given that Apple's relative share within the US declined, as you observe, and assuming El Reg has subedited properly, I guess what we have to conclude is:
• Apple grew;
• the US market grew more quickly;
• the worldwide market declined.
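The distinction in code form, with made-up illustrative rates (only the 1.7 figures come from the article; the baseline growth is hypothetical):

```python
# "Growth shrank by 1.7 per cent" scales a positive rate downward;
# it cannot flip its sign. A market "declining 1.7 per cent" is a
# negative rate of the shipment numbers themselves.
apple_growth_last_year = 0.05                     # hypothetical 5% growth
apple_growth_now = apple_growth_last_year * (1 - 0.017)
assert apple_growth_now > 0                       # smaller, but still growth

worldwide_change = -0.017                         # shipments actually fell
assert worldwide_change < 0
```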
The real absurdity is that an article that should be about Lenovo's incredible growth seems to want to focus on a point here or there for the fourth-place supplier.
Apple's profits are likely still the best or second best in the industry but this survey is about what consumers are buying, not about whether any particular company is about to have to shutter up. So it's useful to people like software companies. Even if Apple were more profitable than every other manufacturer added together, if they had only 0.1% market share then where would you focus your development resources?
Apple did poorly against Lenovo because Lenovo did spectacularly. Nevertheless, Apple saw growth while the market as a whole declined?
Of course, I wouldn't exactly have described the PC market as "Apple's lunch" in the first place. Compared to shipments of other types of device and of other suppliers I'd have called it maybe "that small snack somewhere around 4PM that Apple eats a very small piece of".
It's the complete absence of PCPs in my area that they cite. So there's no trunk level at which it is economical to install fibre. They'd need either to install PCPs or add fibre direct from the exchange to every individual building. The local fibre company that has jumped on that opportunity charges £20k/building for installation even after they've got sufficient interest in subsequent subscription.
Sadly the phone reception is also quite dodgy in my specific flat (though I don't think that's endemic, just bad luck on my part) so LTE-or-whatever isn't much of a fix, and the local MP has been trying to obtain funds but is a Lib Dem so I suspect BT may just be running out the clock on any effort he's creating for them. Meanwhile Boris has marked his funds for Wifi on the DLR so local government doesn't have the resources.
... and, as I said, I can therefore only imagine what service people in the countryside get, probably with just as many logistical problems but most of them so specific that there's not even an MP on their side to ignore.
Ditto around the place I bought. In London's Zone 2. Apparently because it's the Docklands, with residential properties developed in a very ad hoc fashion, overwhelmingly since the privatisation of BT, they've just sort of thrown the wiring together and don't have the money to fix it. Most properties are copper all the way to the exchange, a few kilometres as the crow flies so I can only speculate what it is in cable length.
I can only imagine the problems in less-dense areas.
You've obviously missed the memo from Silicon Valley. What coders do nowadays is (i) innovate; (ii) disrupt.
Working calmly and rationally on solid products isn't part of the plan.
I think tablets are a much preferable option to laptops for a large number of people. I think that any deceleration is, as the poster suggests, saturation. Tablets are now on a refresh cycle for people that want them.
You can't soundly conclude either (i) that any particular proportion of sales were people buying into the hype (though almost certainly some will have been); or (ii) that laptop sales will climb.
... didn't it once materialise around itself?
On the contrary, it looks like you've assumed I don't have a clue and are therefore suffering confirmation bias whenever you read my comments. Try to be less prejudiced.
The specific issue being discussed is whether "as new cameras come out it's unlikely they'll get any love from Apple". i.e. if Apple never writes another line of code for Aperture, will it be able to import RAW images from newer cameras?
There's little industry consensus. The sensors companies use are almost entirely orthogonal to the file formats they use. You can find the same sensor in cameras from three different companies but to deal with processing the results, you'll have to handle three different RAW file formats. Manufacturers are also very inconsistent, constantly changing formats seemingly arbitrarily. That's why there's already something like 150 of them.
99.99% of them, across the entire range of cameras, utilise Bayer RGGB filters. That will likely be true for a long time to come. It's not great for entropy but there it is.
So actually the thing that would likely determine whether Aperture keeps working is exactly _whether it can get the sensor information from the files_. I'm saying: it will keep working, because the work of getting the raw sensor information out of the files isn't built into Aperture. It's built into the OS.
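For context on what that raw sensor information is: almost every sensor sits behind the same RGGB mosaic, and it's only the container formats that proliferate. A minimal sketch of the mosaic itself — my own illustration, nothing Apple-specific:

```python
def bayer_colour(x, y):
    """Which colour an RGGB-mosaic photosite at (x, y) records.
    Each site records exactly one channel; it's the later demosaic
    step that interpolates full RGB per pixel."""
    if y % 2 == 0:
        return 'R' if x % 2 == 0 else 'G'
    return 'G' if x % 2 == 0 else 'B'

# The repeating 2x2 tile:
#   R G
#   G B
[[bayer_colour(x, y) for x in range(2)] for y in range(2)]
# → [['R', 'G'], ['G', 'B']]
```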
I guess a rough analogue is dealing with printers back in the 1980s.
Everyone here is fully aware that there's complicated, often proprietary mathematics behind getting to a vanilla RGB from raw sensor information, that companies like Adobe have spent a lot on doing that really well and have accumulated a lot of value in doing so.
That's a completely unrelated issue.
There's no technical reason. Apple's RAW API provides the raw sensor information if you want it. It will demosaic only if asked.
Only if you were falsely to assume that it returns RGB pixels would you think there was a problem. That's not the case, I don't believe that's the case and I didn't state or imply that was the case.
There's no reason those programs couldn't use the built-in support other than that Adobe rarely uses OS-native anything. I guess they'd argue cross-platform consistency. They managed to support the Fuji non-Bayer filter a lot more quickly than Apple so it's not necessarily a bad thing. A bit more bloat, arguably, but creating a more competitive market.
But all I meant was that, relevantly to the original poster: (i) Aperture uses it, so it will continue to support new cameras for as long as it remains compatible with the OS; and (ii) any future photography app from Apple will use it, so it'll continue to be updated.
That's not really how it works under OS X — RAW image decoding is a service the OS provides, just as it can open PNGs, JPGs, etc. As long as Apple provides any photography application at all they'll keep updating the RAW support.
That said, it took something like a year to get support for my non-Bayer X-Pro1, which is 'semi-pro' at best, so I can't completely endorse Apple's approach.
I have a Sputnik* which takes stereoscopic photos onto 120 film, i.e. images are captured at an absurdly high definition of 60mm x 60mm per lens, at the cost of only six pairs fitting on a roll of film. On the plus side it's the same format used by hipsters in their Holgas, so film, development and printing are still widely available; also suitable reels are available for most developing tubs, so you can cut out the middleman. It's even easier to develop at home than 35mm because there is no cartridge. It's just literally a roll of film.
Good luck with 110 nowadays. I'm sure someone can handle it but it's going to cost.
* http://camerapedia.wikia.com/wiki/Sputnik — named before and independently of the satellite as the word literally means a thing that goes with a traveller, I think.