You think Apple hired a crowd... of two? They must have bought into the idea of decline.
Then don't use iTunes. It isn't required.
As per jeremy 3: Apple didn't allow apps to support both the full iPhone 5 screen and the ARMv6 instruction set used by pre-3GS devices, so a lot of developers had to drop iOS 4.2 to support the iPhone 5. Similarly, Apple is not going to accept 64-bit builds that also attempt to support iOS 5.
So developers often don't shout about dropping support because there's often quite a lot of coercion involved.
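To give a concrete sense of how mechanical the cut-off is, it roughly lives in a couple of build settings. ARCHS and IPHONEOS_DEPLOYMENT_TARGET are real Xcode settings, but the values below are just my illustrative guess at the arm64-era combination, not taken from any particular project:

```
// Illustrative xcconfig fragment; values are assumptions, not Apple's.
// Adding arm64 to ARCHS is what forces the deployment floor upwards.
ARCHS = armv7 arm64
IPHONEOS_DEPLOYMENT_TARGET = 6.0
```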
A new Apple iOS device usually receives OS updates for three years. Without breaking it down, to quote Ars Technica's review of iOS 7, posted today: "The length of the iOS device support cycle remains about the same as it has been for the last couple of years. If you buy an iOS device when it’s brand new, you can (with some notable exceptions) expect three to four years of software support"
Microsoft didn't support upgrades from Windows Phone 7, which is three years old, to 8. 8 isn't yet three years old.
Taking Android's case at its strongest, Google provided updates for the Nexus One for less than two years — from the January 2010 launch of the phone until the October 2011 release of Android 4.0. It provided updates for the Nexus S for only very slightly longer — from a December 2010 launch through to the November 2012 release of Android 4.2.
The Galaxy Nexus and Nexus 4 have received the latest versions of the OS. The Galaxy Nexus will be two years old in November.
So at the very least, Apple devices are definitely not doing worse than the market average.
Per Berners-Lee and museums worldwide, the NeXT was the platform on which the web was developed. That's pretty much settled.
But why does Jobs get credit for that? If SETI@home finds something then do we need to find out which particular client filtered that bit of data and award Jobs, Gates, Torvalds or whomever with credit for finding extraterrestrial life?
Further to the anecdotes: when I collected my A-Level results in 1999, from my well-to-do upper-middle-class school, exactly four people out of about a hundred had mobile phones which were passed around widely for phoning parents, and only one of those was actually the property of the student.
Starting university later that year, almost everybody ended up getting one within a term or two. They were cheaper than the halls' pay phone besides anything else.
I was wondering whether the semantics of Objective-C might have something to do with it. With only two special cases, all objects are created on the heap. The C primitives are available, but in practice you quite often end up doing things that seem a little ridiculous, like passing around a temporary 32-bit number by putting it on the heap and kicking the pointer around.
On OS X, 64-bit pointers have to be quad-word aligned to be valid, so three quarters of possible pointer values are invalid. Apple has used certain subsets of them so that e.g. 32-bit NSNumbers are encoded directly into the pointer, with nothing stored on the heap. So creating them, accessing them and destroying them is much cheaper. It's heap semantics, but actually the entire object lives inside the pointer value.
If iOS does the same then that presumably would be a way that the move to a 64-bit runtime benefits programs much more than just by providing extra registers.
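If you want to picture the trick, here's a minimal C sketch of the general tagged-pointer idea. The bit layout and helper names are my own invention for illustration, not Apple's actual encoding, and it assumes 64-bit pointers:

```c
/* Toy tagged-pointer scheme: real object pointers are aligned, so their
 * low bit is always zero. A set low bit therefore flags "not a pointer:
 * the payload *is* the value", and no heap allocation ever happens. */
#include <assert.h>
#include <stdint.h>
#include <stdio.h>

#define TAG_BIT 1u

/* Box a 32-bit integer directly into a pointer-sized word. */
static uintptr_t make_tagged_int(int32_t value) {
    return ((uintptr_t)(uint32_t)value << 1) | TAG_BIT;
}

static int is_tagged(uintptr_t p) {
    return (p & TAG_BIT) != 0;
}

static int32_t tagged_value(uintptr_t p) {
    return (int32_t)(uint32_t)(p >> 1);
}

int main(void) {
    uintptr_t n = make_tagged_int(-42);   /* no malloc, no refcounting */
    assert(is_tagged(n));
    printf("%d\n", tagged_value(n));      /* prints -42 */
    return 0;
}
```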
Apple has made it clear that binaries that include 64-bit code must have a minimum deployment target of iOS 6. That's about 95% of active users so it's not so awful but it is indisputably a reduction in legacy support.
I think Apple has solved the iTunes problem by just cutting it out — we're beyond people being grateful it's no longer needed and well into people not specifically remembering the last time they used it.
Here's what worries me: having taught us to ensure charging our phones pretty much every day, the market now thinks it can sell us things like watches that need to be charged every single day.
If you've ever visited the US then the US government already has your fingerprints — all visitors are required to provide them at the border.
If you carry a mobile phone then you almost certainly already allow yourself to be tracked — probably you provided details of your identity to obtain the device but even if not then you can likely be identified by the contents of your communications.
As I've visited the US and carry a contract mobile, I seem already to have sold myself out. Something about locking stable doors jumps to mind. It'd be nice to believe that everybody else has managed to avoid becoming traceable but it sounds unlikely, so while I strongly ideologically support a stand against increasing biometric intrusion, it's likely a token gesture.
I didn't expect the processor to go 64-bit, but Apple is saying it brings performance advantages because you get twice as many registers, rather than just literally having 64-bit addressing where the previous generation had 32-bit. So that shows what I know about the ARM architecture.
I didn't expect OpenGL ES 3.0 to make an appearance, but that's just because Apple has always lagged so much on the desktop. There are still no geometry shaders, and Apple already supported extensions for occlusion queries and compressed textures, so I guess the step forward for developers isn't so great.
Otherwise? If the fingerprint sensor works then it'll be useful, but Motorola's attempt a few years ago was never any good. I don't have a Nike+ FuelBand, but gamification of health is exactly the sort of thing a feeble-minded person like myself would be manipulated by.
The idea that products succeed or fail based on tick lists of features is absurd even if you ignore the motivation behind the list selection criteria.
Here's what most people used a phone for in 2007: calls, texts. Here's what they could do with an iPhone: calls, texts, the web. With a usable interface thanks to multi-touch. And without sending you bankrupt, because Apple strong-armed the carriers into unlimited data.
What is laughable are the web interfaces that preceded it, full of modality, fixed-level zooms and web pages reduced to the system font, and the idea that people would pay 50p/MB for the privilege.
Apple opened the door, Google charged through it.
The move to 4K televisions will be a fantastic thing, because it'll mean that the standard panel resolution becomes 4K, to the massive benefit of every laptop that isn't the Retina MacBook Pro or the Chromebook Pixel.
For TV itself? Films are already that resolution without the hassle of needing to be redigitised, YouTube can stream in 4K, and smartphones have been announced that can record in it. So there'll probably be a pincer movement on production television. I mean, it won't make much visible difference, but you can at least realistically see it happening.
I'm not sure. I think Chrome is a product of trends rather than a producer of them.
In my opinion the development of browsers has been directed by the web's move from the desktop to phones and tablets. That has tightened optimisation requirements and, because there's still a healthy competitive market in mobile, has made everyone much keener to ensure that nobody else has de facto control over the standards.
WebKit alone is possibly the more interesting story. It was always about standards compliance, being first to pass Acid2, and has been a frontrunner on performance. Although it's most often associated with Apple it was powering the Symbian browser as early as 2005 and Google started committing more than Apple in 2010. So it's a pretty good indicator of where the major mobile players of the last few years think priorities lie.
From there Chrome is a gateway and an irony. It makes desktop browsing thoroughly up to date so as to make desktop browsing less relevant.
It's actually a frustratingly easy mistake to make with Apple's APIs — those CFIndexes pop up in quite a few places — and Xcode ships with the implicit signedness conversions warning disabled. It's one of the things I always enable when I'm starting a new project. Just enabling that would probably help them catch stuff like this.
That said, if it's a latent problem in initial table setup then the true diagnosis is probably that whatever is supposed to guarantee the signed value is always positive needs fixing, so they'd probably just have thrown in the explicit cast and forgotten about it.
If I understood the article, it sounds like it's meant to be calculating something related to the string width, and it gets that wrong because, in being passed from one place to another, a glyph count of -1 is reinterpreted as a glyph count of UINT64_MAX.
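For anyone who wants the failure mode in miniature, here's a hedged C sketch. CFIndex genuinely is a signed type in Apple's headers, but total_width and its parameters are made up for illustration:

```c
/* A signed -1 "sentinel" silently becomes a huge count the moment it
 * crosses into an unsigned parameter. Clang's -Wsign-conversion, which
 * Xcode leaves off by default, flags exactly this conversion. */
#include <stdio.h>

typedef long CFIndex;   /* as in CoreFoundation: a signed type */

/* Hypothetical lower layer that takes an unsigned glyph count. */
static double total_width(unsigned long glyph_count, double per_glyph) {
    return (double)glyph_count * per_glyph;
}

int main(void) {
    CFIndex glyphs = -1;   /* e.g. a "not found" result from a lookup */

    /* Implicit conversion: -1 becomes ULONG_MAX (UINT64_MAX on LP64). */
    printf("width = %f\n", total_width(glyphs, 6.5));
    return 0;
}
```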
So if you're asking why floating-point arithmetic is used, it's because fonts are designed with floating-point arithmetic and rendered with floating-point arithmetic. The OS X graphics system uses the same drawing primitives as PDF and PostScript.
When I find an article like this I make sure I click through every advert on the page. Good stuff!
Even if it is just a fluctuation, that's surely news because it destroys the overall narrative of Android doing now what Windows did a couple of decades ago?
It all sounds healthy regardless. I like Bernard's free market interpretation: Apple is managing to respond to competition.
I always thought it was more like one of the later Freescape games — Castle Master, maybe — except that the relatively static movements were for a completely different reason.
When Microsoft achieved monopoly status in the Internet client market, it declared that the browser was then fully evolved and left us with the five-year gap between IE6 and IE7. When its years of work on smartphones didn't translate into sales it said that obviously consumers did not want smartphones. It essentially spent a decade pushing bad products and insisting that it was everyone else that had the problem. Besides being exceedingly late to market with the decent Windows Phone iterations, Microsoft has created for itself a very tough reputation to overcome.
When Nokia had the number-one smartphone OS it gave us extremely contextual user interfaces that only a computer scientist could love, labyrinthine APIs and barely functional tools, and expected developers to rally behind Series 60 despite it omitting such frivolities as being able to display anything other than the system font. Dithering this way and that on where to go next, it was like they heard Apple had made waves and decided to copy them but accidentally copied Apple circa 1993. That's how Samsung, HTC, etc, stole the market from them.
Both companies have shown sufficient penitence since but it's probably too late. A complete merger is probably a smart move but only as a final roll of the dice.
*cough* MiniDiscs were magneto-optical. The magnet aids in the writing process, but the disc is then read optically by laser, and can be read optically even by mechanisms with no magnetic element.
Sony's incredibly restrictive ideas about DRM meant that you weren't permitted to use a computer to manipulate the data directly. Their licensing ideas about compression formats also meant that, by the time they allowed write access over USB, the computer had to transcode, usually from MP3. So they managed to reintroduce generational degradation to digital audio and restricted everyone to their lousy software.
It was just one stupid move after another, really.
Morse code? That doesn't even have the prefix property. Maybe a Huffman coding on the alphabet with frequencies dictated by your language?
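To make the prefix-property point concrete, here's a toy C decoder restricted to three Morse codewords (E, I and S); with no prefix property, even the string "..." decodes four different ways:

```c
/* Morse lacks the prefix property: "." (E) is a prefix of ".." (I),
 * which is a prefix of "..." (S). Without the inter-letter gaps doing
 * the real work, a raw dot stream is ambiguous. */
#include <stdio.h>
#include <string.h>

static const struct { const char *code; char letter; } table[] = {
    { ".", 'E' }, { "..", 'I' }, { "...", 'S' },
};

/* Recursively enumerate every way of splitting s into codewords. */
static void decode(const char *s, char *out, int depth) {
    if (*s == '\0') {
        out[depth] = '\0';
        printf("%s\n", out);
        return;
    }
    for (size_t i = 0; i < sizeof table / sizeof table[0]; i++) {
        size_t len = strlen(table[i].code);
        if (strncmp(s, table[i].code, len) == 0) {
            out[depth] = table[i].letter;
            decode(s + len, out, depth + 1);
        }
    }
}

int main(void) {
    char out[16];
    decode("...", out, 0);   /* prints EEE, EI, IE and S */
    return 0;
}
```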
That's what the MagicCode product mentioned in the article deals with — executing ARM code on MIPS. So they've got the native code problem solved.
Apparently not obviously enough for you.
DrStrangeLug's comment parodies the way that Apple often acts and/or is treated as though it invented things like the MP3 player, smartphone, etc, rather than merely launching commercially and critically successful versions. JDX's comments parody the image of an Apple fan as genuinely believing the invention myth, and of Apple announcing each new iteration of a product as revolutionary when it's merely a minor step forwards.
The clues were: DrStrangeLug's abstract description of a desktop computer, and JDX's singling out of 1997's iMac — the first big Jobs launch that did a lot to create the modern Apple — as the original computer.
For extra amusement, lurker appears to think JDX is being serious and, apparently, so do you. I further found it amusing that, given the level of debate around here, it's unclear whether the down votes are from anti-Apple people not getting the joke or pro-Apple people taking things too reverently.
I haven't specifically checked the schedules but there are probably some sitcoms on BBC3/CBS (delete as per your side of the Atlantic) this week that will cater to those that need this level of explanation.
What do we think on the down votes here? Is it the blindly pro- or the blindly anti-Apple failing to get the joke?
The lens flare wasn't fake, at least in the sense of added on later. It was created by the actual set versus the actual lenses.
... which should probably teach us that the difference between something being immersive and being distracting is not solely related to the extent that digital tools were used.
If you'd used FUNC+K then think how much time you could have saved!
It was always Starship Command for me, and some Repton 2. Oh, and Gauntlet, the Defender clone.
The Slogger has a really neat trick: if the program counter goes beyond D000 (if memory serves) then writes automatically go to the video page. Otherwise they go to the shadow page. Why? Because that's where the ROM routines for graphics output are and the original machine ROM is used unmodified.
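In pseudo-hardware terms the routing logic is tiny. A speculative C sketch, with the screen range and names assumed rather than taken from the real board:

```c
/* Route a write to real video RAM only when the program counter shows
 * the write came from the OS ROM (0xD000 upwards here), so unmodified
 * ROM plot routines hit the screen while user-program writes land in
 * shadow RAM. Addresses are illustrative. */
#include <stdint.h>
#include <stdio.h>

static uint8_t video_ram[0x5000];   /* screen at 0x3000..0x7FFF */
static uint8_t shadow_ram[0x5000];

static void write_byte(uint16_t pc, uint16_t addr, uint8_t value) {
    if (addr >= 0x3000 && addr < 0x8000) {
        if (pc >= 0xD000)
            video_ram[addr - 0x3000] = value;   /* ROM routine: screen */
        else
            shadow_ram[addr - 0x3000] = value;  /* user code: shadow */
    }
}

int main(void) {
    write_byte(0xD123, 0x3000, 0xFF);   /* plotted via the ROM */
    write_byte(0x1900, 0x3000, 0x00);   /* hidden away in shadow RAM */
    printf("video=%02X shadow=%02X\n", video_ram[0], shadow_ram[0]);
    return 0;
}
```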
You take "Apple is preparing to train its shop staff to use Microsoft Windows in a bid to flog more Macs to business users reared on PCs" to mean "users who don't want to use OS X"? That's like saying that if shop staff are trained to demonstrate a browser then that means they're targeting customers who don't want to use desktop applications. It's a false dichotomy.
VirtualBox is available and just as good as elsewhere. Giving everyone the benefit of the doubt and assuming all decisions were made purely on merit, I imagine that Apple might prefer to push Parallels because it does a lot of work tightly integrating the two environments. For example, file associations work in both directions, from the Finder onto the Windows desktop and vice versa, the start menu is added to the dock, your Windows applications appear in /Applications and hence in the Launchpad, etc.
It's actually why I stopped using Parallels and started using VirtualBox. I needed Windows for a particular application only and otherwise to be contained in its box — I actively didn't want Parallels to start messing about with my whole system.
There's really no basis to conclude that Apple is "trying to sell systems to people that don't want to use their operating system". Much more likely they're trying to sell systems to people that do want to use the Apple operating system but falsely believe that they can't because a vital piece of software isn't available in a native port.
Otherwise they'd be demoing Boot Camp rather than Parallels.
I've seen architects with Macs, but they've all been running Boot Camp. I suspect Autodesk noticed the same thing, hence AutoCAD being available in a native Mac port again after a nearly two-decade absence.
This movie hasn't sold. Deadline's summary of the weekend US box office results was: "Oprah’s PR Blitz Helps ‘The Butler’ Open #1 With $25M: Soft Box Office As ‘Kick Ass 2′ Falls, ‘Jobs’ Biopic Dies, ‘Paranoia’ Bombs"
See the original — http://www.deadline.com/2013/08/did-oprah-affect-box-office-the-butler-tops-box-office-friday/ — for the full text, but highlights are:
Flopping in wide release (2,381 theaters) was ... Jobs [which] came in only #7 with a meager $6.7M despite a plethora of TV ad buys. ... Rotten Tomatoes critics only gave it 24% positive reviews because of its superficial made-for-TV depiction of a complex creative and business icon. Still it’s surprising how many Apple devotees stayed away despite the marketing’s psychographic targeting to them.
If they'll be shipping it with only 1GB of RAM, and given that iOS doesn't page process memory out to disk, why would Apple want to transition? The only use under iOS as currently designed would be to allow larger sections of the disk to be memory mapped (a virtual-memory use Apple does permit), which is not exactly a limitation developers often run up against.
I could understand it if the risk were ignoring the next step until it's too late but the 64-bit ARM architecture is ready to go and Apple controls both the tools and the channel of distribution so it can force very quick changes in those.
As Apple doesn't usually implement technology until there's a pressing business need to do so, this rumour doesn't sound all that likely to me.
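As an aside on the memory-mapping use mentioned above, it's just the plain POSIX facility; a minimal sketch, with "data.bin" standing in for whatever asset you'd map:

```c
/* Map a file into the address space instead of reading it into scarce
 * RAM: the kernel faults pages in on demand, so files far larger than
 * physical memory are fine. This is the virtual-memory use iOS allows. */
#include <fcntl.h>
#include <stdio.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

int main(void) {
    int fd = open("data.bin", O_RDONLY);   /* placeholder filename */
    if (fd < 0) { perror("open"); return 1; }

    struct stat st;
    if (fstat(fd, &st) != 0) { perror("fstat"); close(fd); return 1; }

    const char *p = mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    if (p == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

    printf("first byte: %d\n", p[0]);   /* touching a page faults it in */

    munmap((void *)p, st.st_size);
    close(fd);
    return 0;
}
```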
While it's possible Apple's next release will shake things up, years of Intel's next processor always being the one to unseat ARM, the next Windows Phone to be the one that storms the market, next year to be the year of the Linux desktop, etc, I've come to the conclusion that tech forum commenters have a very skewed idea of how quickly these things change.
General trends will probably continue.
It'd be wrongful too, given that we give employees a right to notice. So it wouldn't happen in the UK for at least two reasons.
On the contrary, quite a lot of the fundamental employment legislation — e.g. the Equal Pay Act 1970 — was brought in only either to meet EU obligations or to bring UK law into line with European expectations with an eye towards future membership.
The way EU law works in the UK is that it has to be explicitly enacted by our parliament. They have an accelerated process for a lot of the piecemeal regulations but whenever something big comes along they often write a full-blown domestic bill for it.
They're very similar to the colours of many cases. And they're made out of plastic so that you don't have to surround your anodised aluminium in a plastic case.
I own: an iPad 3 and a new Nexus 7.
The 7's user experience still isn't quite as smooth as the iPad's, for technical reasons that are fairly easy to understand: iOS has Core Animation, which handles 60fps user-interface transitions on a separate thread with all the tricky main-thread interaction handled for you, and layout is essentially programmatic, so developers need optimise for only three screen layouts. Android layout is declarative, and people can't reasonably optimise for every device out there.
If there's a reason the iPads feel more responsive then it's probably that, but it's a very shallow impression. If I actually do things like time how long it takes an application to launch, how long it takes a web page to appear, etc, then the Nexus wins — and it does so for half the price.
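If the threading point is unclear, here's a rough C sketch of the general pattern: a render thread owns the per-frame interpolation while the main thread only posts targets. It's a toy illustration of the idea, in no way Core Animation itself:

```c
/* The main thread only writes an animation *target*; a dedicated thread
 * produces the ~60fps frames, so a busy main thread can't stutter the
 * transition. Toy model of the pattern, not Apple's implementation. */
#include <pthread.h>
#include <stdatomic.h>
#include <stdbool.h>
#include <stdio.h>
#include <unistd.h>

static _Atomic double target = 0.0;    /* written by the main thread   */
static atomic_bool running = true;     /* cleared to stop the renderer */

static void *render_thread(void *unused) {
    (void)unused;
    double presented = 0.0;
    while (atomic_load(&running)) {
        /* Ease the on-screen value toward the target every frame. */
        presented += (atomic_load(&target) - presented) * 0.15;
        printf("frame: %.3f\n", presented);
        usleep(16667);                 /* roughly 60 frames per second */
    }
    return NULL;
}

int main(void) {
    pthread_t renderer;
    pthread_create(&renderer, NULL, render_thread, NULL);

    atomic_store(&target, 100.0);      /* "animate to 100": one cheap write */
    sleep(1);                          /* main thread free to do real work  */

    atomic_store(&running, false);
    pthread_join(renderer, NULL);
    return 0;
}
```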
Are you Michael Dell?
I'm curious about this too, but with the caveat that the PowerPC architecture obviously has some advantages for some demographics: the first-generation Xbox was x86-based but the 360 switched to PowerPC, even at the cost of Microsoft having to buy a company to supply emulation software for backwards compatibility. Similarly you'll find PowerPC-derived parts in the PS3 and every Nintendo since the GameCube.
Then again, both Sony and Microsoft have switched to essentially the same x86 architecture for the next generation so maybe that demonstrates nothing.
Some Android titles are built using the Native Development Kit, which compiles directly for ARM and whatever other architectures the developer chooses. If an app is compiled for native execution then it's usually a game; that's especially likely with ported titles.
Besides customisability, amongst ARM's other strengths are code density (in Thumb mode, anyway) and power efficiency.
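For a picture of what "compiles directly for ARM" means at the source level, here's a minimal NDK-style JNI function in C. The JNIEXPORT/JNICALL naming convention is real; the package, class and method names are hypothetical:

```c
/* A native function the Java side can call directly; the NDK compiles
 * this to machine code for each architecture listed in the build. */
#include <jni.h>

JNIEXPORT jint JNICALL
Java_com_example_game_Engine_updatePhysics(JNIEnv *env, jobject self,
                                           jint frame_count) {
    (void)env;
    (void)self;
    /* The hot inner loop ported games keep in C or C++ would live here. */
    return frame_count + 1;
}
```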
I guess in England & Wales they could try to argue passing off: that the second company is trying to acquire goodwill by naming its product similarly to the first company's, and in so doing is damaging the first company's standing. But it requires a proper misrepresentation whereby Bang With Friends would actually be meant to be perceived as a Zynga product rather than just having a bit of fun with the name so, yeah, it feels unlikely.
Per Geekbench numbers the iPads really divide into three groups:
The iPad 1, with a score of 454.
The iPad 2, iPad 3 and iPad Mini with scores of 781, 791 and 745 respectively.
The iPad 4 with a score of 1756.
So if you bought an iPad 2 then you've got a device a shade faster than one of the current generation models. Naturally app makers are therefore still going to optimise for that level of performance.
Conversely if you went out and bought a Motorola Xoom or one of the other early Android tablets then not only did your specific device fail to capture enough of the market to be interesting in itself but also the devices that have powered the subsequent explosion in Android share are noticeably faster (yes, even the Kindle Fire). So app makers are unlikely to spend much time optimising for that level of performance.
Did you two read the same article as I did? The third paragraph says:
"... 'pads, including white box devices, grew 43 per cent to 51.7 million standalone products."
The overall tablet market is very healthy; the story is simply that Apple's shipments are down 14% and that, in a quickly inflating market, that has translated to a drop in market share of almost 20 points. So there's a bit of a changing of the guard going on, with various flavours of Android (i.e. I'm counting Amazon's fork) in the ascendant.
I would imagine Samsung's impact on Apple is very limited compared to that of Asus/Google and Amazon.
Depends how they implement it; the (in my personal experience, quite frequent) home button failures are to do with moving parts whereas a fingerprint scanner — at least the sort you usually see on laptops — doesn't have any moving parts. So I guess it depends on what's going to happen to the button overall.
I don't think they're selling very well at all. The Apple third quarter results press release (http://www.apple.com/pr/library/2013/07/23Apple-Reports-Third-Quarter-Results.html) says this:
The Company sold 31.2 million iPhones, a record for the June quarter, compared to 26 million in the year-ago quarter. Apple also sold 14.6 million iPads during the quarter, compared to 17 million in the year-ago quarter. The Company sold 3.8 million Macs, compared to 4 million in the year-ago quarter.
The Apple TV isn't mentioned anywhere in the press release summary; one can therefore assume it's not significant to the larger report.
... and as someone aware of OS/2 and exposed to the standard array of Acorn machines that filled UK schools in the late 80s through to the early 90s, I am amused to see Amiga owners moaning that they had pre-emptive multitasking a decade before 90% of the market given that they didn't have (i) built-in buttons, menus or any other widgets; (ii) protected memory; or (iii) any significant inter-app communications.
It was a diverse mix back then — no single system ticked every box.
I think this is Google's attempt to sweep away DLNA and AirPlay in favour of building programmability into the receiver based on emerging web standards. Assuming Google's "everything must be approved by us before deployment" stance is temporary, that will remove the gatekeeper from the process while allowing parties with content to control development and deployment cycles.
Although a lot of TVs have very similar programmability built in, few of them are mutually compatible and most of them have an app store in the middle. If you're Amazon or whoever, you can't just write your mobile client and your web app and have the former instruct the TV to move to the latter.