Hooray for the removal of bloat from iTunes?
This story has one upside, at least.
Some sort of side-view real time strategy game, I think, clearly for the BBC Micro.
You should have a word with Microsoft. They seem so sure that the Intel-bearing Surface can't be made cheaply that they endured a US$900m write-down halfheartedly trying to push an ARM version.
The Surface Pro 128GB list price was £900; presumably the Haswell-powered Pro 2 will be the same. Chuck Apple an extra £200 for the entry level 13" Retina MacBook Pro and they'll give you a larger, higher resolution screen and a 50% faster CPU.
They don't say that. Only an AC could make such a ridiculous straw man argument, etc, etc, etc.
I attended university at the turn of the millennium; I was a young child during the '80s and packed my teenage years entirely into the '90s.
My experience, shortened to the interesting bits: I received an obsolete micro from the classifieds somewhere in the early '90s. Left to figure things out on my own, as by that stage the machine had no magazines or commercial support, I achieved some things I'm very proud of but was remarkably naive in other areas.
In the late '90s we got a PC and the Internet. So suddenly I had access to unending reams of documentation and properly technical people to discuss things with. My abilities took a huge leap forward. I progressed much faster than I probably would have if I'd continued in independent study or muddling through with a single book or two.
As a result, just as others above think the most educational environment was having limited choices and needing to figure everything out for themselves, I think the most educational environment was taking a bit of time to get the absolute fundamentals down and then being exposed to the breadth of everything available. Probably people a decade younger than me think the best way to learn is to be dropped immediately amongst the breadth.
It'll be interesting to see what the second article advocates, but too many of the commenters seem to be confusing correlation with causation, jumping from perceiving an experience to be common to suggesting that it's a good idea.
It runs slightly contrary to the rose-tinted nostalgia of some of the other posters, but many of the worst coders I've worked with are those who start with incremental coding and debugging and proceed to the conclusion that the correct way to figure out how libraries work is by empirical investigation. Reading documentation just takes time, right? And if nobody's going to read it, why write it in the first place?
If I were asked to come up with a related rant immediately it would be about people who think that date handling is easy, so they wrote it all themselves, and mysteriously enough their code gets the length of a day wrong twice a year. But that's okay because their 200 lines of date handling "was a quicker solution" than five lines of API calls that would require you actually to have learnt about what's already provided. If you wanted a rant tomorrow? Probably something else.
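To make the twice-a-year failure concrete, here's a minimal Swift sketch (the UK's spring 2014 clock change is purely an illustrative date of my choosing) of why "a day is 86,400 seconds" loses an hour while the few lines of API calls don't:

```swift
import Foundation

// Illustrative assumption: the UK's spring clock change on 30 March 2014,
// when the local day was only 23 hours long.
var cal = Calendar(identifier: .gregorian)
cal.timeZone = TimeZone(identifier: "Europe/London")!

let start = cal.date(from: DateComponents(year: 2014, month: 3, day: 30))!

// Naive "next day": midnight plus 86,400 seconds lands at 01:00, not midnight.
let naive = start.addingTimeInterval(86_400)

// The API call that already knows about DST gets it right.
let correct = cal.date(byAdding: .day, value: 1, to: start)!

let fmt = DateFormatter()
fmt.timeZone = cal.timeZone
fmt.dateFormat = "yyyy-MM-dd HH:mm"
print(fmt.string(from: naive))    // 2014-03-31 01:00
print(fmt.string(from: correct))  // 2014-03-31 00:00
```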
If memory serves, the Ark set is used to bookend the series — the Cyberman story at the end returns to and reuses parts of it. That was presumably some sort of budgetary jiu-jitsu?
In this case they didn't — it's a standard buffer overflow attack.
But if you're asking who would? Well, early-1990s Microsoft did in WMF. They closed that hole back in 2006 but you should be able to find ample reporting from then.
I have a Western Digital 'My Passport' USB 3 1TB device. It's just for Time Machine so I plug it in and ignore it. Everything is working without issue.
Naturally I haven't installed any of WD's custom software because I'm about as enthusiastic about that as I am about consumer printer drivers — one just assumes it'll inexplicably install a 2GB boot-time kernel extension in order to be able to say in a booming voice "You have 20GB of space left; visit www.westerndigital.com to buy another hard disk" every ten minutes. And probably require a network connection.
The shell companies comment makes it pretty clear who AC2 thinks is most culpable. It's also unnecessary, since we know exactly which company is taking legal action and which other companies are behind it.
Of the various company pairs: Nokia was first to sue Apple; Apple was first to sue Samsung and HTC; Motorola was first to sue Apple; Microsoft extracts royalty payments from all of those except Apple and has made FRAND claims against Motorola (leading to a countersuit, in layman's terms at least); Samsung was first to sue Ericsson; Sony's main involvement has been via a holding company with Nokia which has sued Apple; Google has been involved primarily through support for HTC in a countersuit and, of course, through its acquisition of Motorola; and BlackBerry (/RIM) appears to have suffered only at the hands of companies other than all of those.
Nobody's hands are clean.
@AC1: that's bound to be what he's reacting to, and he's right. The market would be much more advanced if the focus was exclusively on how the things consumers like can be made even better, not who has the best legal protection on those things. Pleasing consumers should be paramount in a functioning market but the way that inventor/creator protections have been so massively distorted tends to subjugate them somewhat.
@AC2: it's the Rockstar Consortium that is suing. Apple deserves some of the responsibility for that. So do Microsoft, BlackBerry, Sony and Ericsson.
The iPad 3 came out last March with a big press event and lots of publicity about its 'retina' screen.
The iPad 4 came out eight months later in November, having been mentioned almost in passing at the iPhone launch with "oh, and we've added the new connector to the iPad too".
The iPad Air has had the big keynote treatment again.
So, yeah, 4 versus Air is technically accurate if you're asking about Apple's year-on-year health, but 3 versus Air would be more realistic, I think. Otherwise this is more like trying to determine how much Apple benefits from its usual method of product introduction.
Blaming Apple for anything here is absurd — video games consoles have been a walled garden since the NES's lockout chip.
I think the difference is that BlackBerry were overtaken by technology — everyone else's products were much better while they ran around buying QNX and desperately trying to get BlackBerry 10 ready. The same isn't true of Apple. A majority of people prefer alternatives and there are a bunch of things Apple consciously disallows but you don't get the sense that they're technologically behind. They're just control freaks with very specific ideas of what a device should be.
But have you heard about the really amazing new bread they're getting in next Wednesday?
The Apple shop employees are paid to clap and cheer like morons at every product launch. They'll even insist on giving you high fives on your way, though they are at least doing it in order to provide themselves with shelter, warmth, food, etc, rather than because they honestly think that shaving 180g off an already successful product is really exciting.
I guess the issue on Apple's side, when launching modest annual evolutions, is whether to risk a damp squib of a launch event like this, or to forego one for the first time since the iPad was announced. Damp squib it is.
It was set in San Francisco — specifically all of those flat parts of it that look spookily like Vancouver.
Per http://www.w3counter.com/globalstats.php the global web browsing market share for every version of iOS in the top ten (ie, 6 and 7) is 8.64%. The equivalent share for Android (which includes versions 2 and 4) is 5.22%.
From that we can conclude that, for web browsing, iOS is used about 165% as often as Android. But that doesn't differentiate tablets and phones, so standard comments about pay-as-you-go contracts apply — because Android devices are more affordable, they're much more likely to be bought by people who don't intend to spend a lot on data. That doesn't mean they're not being used for Temple Run.
And to put the whole thing in perspective, Windows (XP + 7 + Vista) is used about 745% as often as iOS.
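To show the working (shares as quoted from w3counter; the combined Windows figure is derived from the 745% claim rather than stated directly):

```swift
// Shares as quoted from w3counter above.
let iOSShare = 8.64
let androidShare = 5.22

// iOS is browsed roughly 1.65x as often as Android, i.e. about 165%.
print(iOSShare / androidShare)  // ≈ 1.655

// The "745% as often as iOS" line implies a combined XP+7+Vista share of
// about 8.64 * 7.45 ≈ 64% of all web browsing.
print(iOSShare * 7.45)          // ≈ 64.4
```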
Apple of the 1990s is why Apple no longer thinks it is safe to try to fence off one corner of one market.
Given that the company is very good at product launches, it has usually followed a strategy of expanding into neighbouring markets rather than expanding its range within a market. From computer to MP3 player, from MP3 player to phone, from phone to tablet. It probably helps that it's so bad at the other idea: witness the cheap Mac (starting at only US$599!) and the cheap iPhone (only US$99 on a contract!).
They'll probably be healthy with 10% of the high end of three markets. If they want to expand they'll find some other market to try to muscle into.
H.264 is getting on a bit and if the MPEG formats remain the industry standards then the MPEG members are more likely to make some serious money on HEVC/H.265.
Cisco makes a lot of money from telepresence equipment, including software and the physical stuff you kit your boardroom out with. If it's automatically compatible with Android and iOS and everything else that comes with a browser but either has no plug-in architecture at all or would make maintaining a plug-in a major hassle, then that's a big plus.
My 4S was the same, though if memory serves it could do only barely more than a day when new, so battery decline isn't so much of an issue as the battery life being bad in the first place — albeit along with most of the rest of the industry.
I've considered all the possibilities and I think: we'd have missed out on Clippy.
Snide remarks aside, respects to Bill Lowe. How many of us can claim to be so sure in our belief about where the future lies as to overcome the inertia of a company like IBM and shape it ourselves?
There's no evidence whatsoever to suggest that "The decision to remove CD drives is also cost driven". It further strikes me as at least an order of magnitude more likely that the decision was motivated by the desire to make the machines thinner and lighter (it's one of Apple's favourite boasts) and to boost the battery life (both by eliminating the last moving part and by making more space available for those custom shaped batteries they love to glue in).
There's a difference in evaluation between a premium product and a budget product. Whether you think Apple produces a premium product or not is immaterial — some do, and they believe they're paying for (i) the hardware; (ii) the design; and (iii) to enter or remain in the ecosystem. They don't evaluate on component mark-ups alone because they believe that (ii) costs money and (iii) adds value beyond the components.
So they ask themselves "do I want the machine with the 5 megapixel screen that runs below 20 decibels for 12 hours between charges and weighs less than 2kg?" not "which machine is the cheapest for the performance level I've deemed adequate?"
If you think most people don't agree that Macs are worth the money then you're right. That's why at least 85% of buyers pick something else, even according to the most optimistic estimate of marketshare that Google could find me.
The new MacBook Pros are cheaper than the old. The 15" isn't a like-for-like comparison because the new entry level omits the discrete GPU and the old didn't, but last year's 13" was $1,499 at the time of discontinuation and the new is $1,299. That's a bit more than 13% cheaper.
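The working for that figure, for anyone checking:

```swift
// 13" Retina MacBook Pro list price, old versus new.
let oldPrice = 1_499.0
let newPrice = 1_299.0
print((oldPrice - newPrice) / oldPrice * 100)  // ≈ 13.3% cheaper
```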
I also think we're already several years past the point where people who like to upgrade their computers would consider a Mac?
On the contrary, the Mac is still better than a PC for creative work in a whole bunch of key areas — see e.g. http://arstechnica.com/information-technology/2013/09/making-the-ultimate-creative-content-os-ubercreate-os-1-0/ in which the author starts from the position that some people will be turned off by the new Mac Pro so, hypothetically, what would the perfect alternative platform be?
The Mac comes out as by far the closest thing to the ideal, based on much better out-of-the-box support for professional formats (HDR, EXR, etc), better multitasking under load (so you can do something else while a render occurs), system-wide scripting (though the author dislikes AppleScript, he likes the Automator) and search (ie, in every file dialogue in every app), much more mature HiDPI support (for people working with 4k video and without magnifying glasses) and a bunch of other things.
Windows wins only in 10-bit video output (which sounds quite significant to me, but isn't enough in isolation) and Aero Snap.
... though it's working well for me at the minute. The last update appears finally to have resolved the bug whereby new emails sometimes display as empty until you restart the client, and even the iOS version no longer occasionally decides someone is trying to email me from 1969*.
So far I've had no issues whatsoever with any part of 10.9. But why do I feel like I'm tempting fate?
(* the UNIX epoch versus the PST time zone, I assume)
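That footnote in code form: a zero (i.e. uninitialised) UNIX timestamp formatted for US Pacific time lands on the previous evening, which I assume is where the 1969 emails come from.

```swift
import Foundation

// The UNIX epoch (timestamp 0) is midnight UTC on 1 January 1970...
let epoch = Date(timeIntervalSince1970: 0)

// ...but rendered in US Pacific time it is still New Year's Eve 1969.
let fmt = DateFormatter()
fmt.timeZone = TimeZone(identifier: "America/Los_Angeles")
fmt.dateFormat = "yyyy-MM-dd HH:mm zzz"
print(fmt.string(from: epoch))  // 1969-12-31 16:00 PST
```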
Contorting Jobs' statement through selectively strict interpretation is about as meaningful as if I insisted your statement, "User-land code should only affect applications", couldn't possibly mean the logically corrected version, "User-land code should affect only applications", because that's not what you wrote.
Jobs often used 'the Mac' to mean any combination of the hardware, the OS and the applications that run on it.
User-land code did only affect applications. Apple meant "... is the number one cause of crashes on Macs", not "... is the number one reason OS X crashes".
If you get started on that line of argument we'll be overrun with Opera users. Both of them.
I know he sued EMAP — and won — over the inclusion of the Spectrum Elite plus an emulator on a PC Review cover disc in 1995. I doubt he's going to take this as good news either, especially as the brand is about to go back into use. Activision might have a thing or two to say about Pitfall, too.
146MB? So that's, what, 0.22% of even the stingiest drive* Apple has shipped in recent times?
(* the 64GB flash in older MacBook Airs was the smallest I could find)
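The back-of-envelope sum, treating 64GB as 65,536MB:

```swift
// 146MB as a percentage of a 64GB (64 * 1024 MB) drive.
print(146.0 / (64 * 1024) * 100)  // ≈ 0.22
```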
That's exactly what he meant — almost 67% of active (ie, not long ago resigned to a sock drawer) iOS devices have been updated. Which I think is meant to be a roundabout way of saying you should buy Apple because you'll definitely get OS updates, for a while anyway, and you should develop for Apple because you can adopt the latest frameworks pretty much as soon as they become available.
Per http://developer.android.com/about/dashboards/index.html about 70% of Android users now run 4.x so there are definitely more Android people — in both absolute and proportional terms — running the latest major version of Android than the latest iOS. The Android market doesn't tend to update as swiftly but at this point in the cycle that matters about as much as which phone manufacturer managed to ship colour screens first.
"Thank you so much Black berry team. I was waiting this app. It is really great user friendly and smooth."
I wouldn't describe that as *the* problem. Other features of Apple's meandering lost decade: Copland, OpenDoc, RAVE, QuickDraw GX, the Newton (especially the eMate), computers delivered by chauffeur.
Sounds more like a launch-day bug to me. Mavericks had me reinstall Java, presumably because it's still closely tied to the OS even though Apple no longer maintains it in house, but in no sense did it wipe anything out or leave any sort of trail of destruction. After a quick 66.7MB download (versus 5.<something>GB for the rest of the OS), everything worked fine — though in my case everything is: Cyberduck, Android Studio.
I haven't tried iMovie (at all, probably for years) but can confirm that all of Pages, Numbers, Keynote, iPhoto and OS X itself are definitely not freemium. Unless they're being super optimistic about the interest in OS X Server, of course.
I really thought it was more about bringing their collaboration stuff up to speed with Google Drive (née Docs), and finally updating the desktop versions of Pages/Numbers/Keynote after four years of mere maintenance, simultaneously resolving a lot of the interface deficiencies.
Microsoft's failure to bring Office to the mainstream mobile OSes has already probably sealed their fate. After years of having no choice but to sample the alternatives, people have learnt that there's life beyond Office.
Yeah, that's what happened. Customers picked ANDROID because they were all really worried about which kernel their phone runs and absolutely demanded Linux. That's why ANDROID owners often laugh at iOS users — their kernel is explicitly monolithic and they can't believe how hilarious it is that Apple's is still nominally a microkernel in many areas.
... and weren't the MacOS (/System) releases prior to 6 free as well? That would make this not exactly a new era.
Is there a hardcore of designers who obsess over every millimetre in the same way that some programmers obsess over every processor cycle? I think maybe some of us are finally learning what it feels like to be the one giving the glazed-over look following an optimisation boast rather than the one receiving it.
That completes the list of things I know about Wokingham.
Mavericks is a no-brainer only if you assume Apple still has any interest in showcasing it. Mountain Lion was announced via their web site, and its release date was unveiled as part of the Q3 earnings call. Mavericks managed to ascend to the WWDC keynote for its announcement but the Q4 earnings call is on Monday the 28th so they might easily just upload it to the App Store and not comment until then.
Having now been able to try them both, Apple's fingerprint sensor is the same sort of thing as Motorola's in the same sense that a modern scanner is the same sort of thing as a fax machine. This is a case of Apple waiting for the technology to be made sufficiently useful by external forces before implementing it.
Apple marketshare in Q2 2012 was 18.8%, with 28.9m shipments. As of Q2 2013 share had dropped to 14.2%, but shipments were up to 31.9m. Source: http://www.gartner.com/newsroom/id/2573415
So, Apple: marketshare down 25%; shipments up 10%.
Between Q2 2010 and Q2 2011, RIM's shipments increased from 11.6m to 12.7m. Share dropped from 18.7% to 11.7%. Source: http://www.gartner.com/newsroom/id/1764714
So, BlackBerry: marketshare down 37%; shipments up 9%.
I couldn't find Q3 2013 figures, but the quarter ended less than half a month ago. I picked the quarter in which RIM performed best.
By Q4 RIM were, year on year, at 40% marketshare decline and an 11% decline in shipments. So if your comparison is correct then Apple's fortunes should decline very quickly.
However, by using Q2 numbers I have completely overlooked any effect the iPhone 5S/C may have on Apple's fortunes. So I'd suggest that Apple's decline isn't going to mirror BlackBerry's. Q3 and Q4 will probably show significant shipment increases if reports so far about the 5S are correct; the question is how deeply sales will dip in the middle of next year when the bump of a new product launch fades again.
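For transparency, here's how those percentages fall out of the Gartner figures quoted above:

```swift
// Percentage change between two of the quoted figures.
func delta(_ from: Double, _ to: Double) -> Double {
    (to - from) / from * 100
}

print(delta(18.8, 14.2))  // ≈ -24.5: Apple share, "down 25%"
print(delta(28.9, 31.9))  // ≈ +10.4: Apple shipments, "up 10%"
print(delta(18.7, 11.7))  // ≈ -37.4: RIM share, "down 37%"
print(delta(11.6, 12.7))  // ≈ +9.5:  RIM shipments, "up 9%"
```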
Android should adapt relatively easily — the vast majority of apps run through the Dalvik virtual machine so all it takes is for someone at the centre to port that and those apps will take advantage of whatever ARM64 mode offers without any per-app reengineering.
The NDK apps will run because a backwards compatibility mode is present in current designs. They won't run as well as if they had been recompiled.
Although most Android devices are ARM based, there are a few that use x86 or MIPS processors. It's the Dalvik VM that mostly enables that.
Crittercism is a third-party SDK that people integrate in order to get real-time crash reports and in-the-field profiling. So the information comes from sampling the applications of those developers that have opted to use their SDK.
Apple supplies crash reports too, but only in a very rudimentary form.
From Real World Technologies more than a year ago, when ARM64 was first announced and long before people started factoring their feelings about Apple into the assessment:
"Like x86, ARMv7 had a fair bit of cruft, and the architects took care to remove many of the byzantine aspects of the instruction set that were difficult to implement. The peculiar interrupt modes and banked registers are mostly gone. Predication and implicit shift operations have been dramatically curtailed. The load/store multiple instructions have also been eliminated, replaced with load/store pair. Collectively, these changes make AArch64 potentially more efficient than ARMv7 and easier to implement in modern process technology." (http://www.realworldtech.com/arm64/)
Apple's marketing division is hyping it based on 64 being a bigger number than 32 but that side of things almost certainly isn't why moving to ARM64 is a performance win.
In any case it's false to say that if something does not need a 64-bit address space then moving from ARMv7 to ARM64 is of no advantage. The feature Apple are shouting about may or may not be pointless; the improvements they aren't mentioning are real.
I think the downvotes are more because the Linux kernel and its team are actually pretty good at security, and because Android implements Java via its own Google-specific virtual machine, using none of Oracle's code, and therefore shouldn't be tarred with the same brush.
Apple doesn't stipulate which advert libraries you can use.
Example third-party libraries with explicit iOS SDKs include Google AdMob (https://developers.google.com/mobile-ads-sdk/download), Flurry AppCircle (http://www.flurry.com/appCircle-a.html), InMobi (http://www.inmobi.com/products/sdk/) and MoPub (http://www.mopub.com/resources/open-source-sdk/).
The main reason this is far less likely on iOS is that Apple doesn't allow any application to collect text messages, phone call history or contacts. There are no APIs at all for the first two, and contacts can be collected only by a call that shows some Apple-defined user interface and eventually returns a single contact if the user confirms that course of events.
So on the iOS side it'd have to be a privilege-escalation exploit as well as a trojan, rather than merely a trojan.
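To illustrate the pattern, here's a sketch against today's ContactsUI framework (in 2013 the equivalent was AddressBookUI's people picker, but the shape is the same): the only route to a contact is to present Apple's own picker, and your code sees a single CNContact only if the user actively confirms.

```swift
import UIKit
import ContactsUI

// Sketch: an app cannot read the address book behind the user's back;
// it can only ask the system to show Apple's picker UI.
class ContactRequester: UIViewController, CNContactPickerDelegate {

    func askForContact() {
        let picker = CNContactPickerViewController()  // Apple-defined UI
        picker.delegate = self
        present(picker, animated: true)
    }

    // Called only when the user actively selects someone.
    func contactPicker(_ picker: CNContactPickerViewController,
                       didSelect contact: CNContact) {
        print("User handed over one contact: \(contact.givenName)")
    }

    // The user is free to refuse, and the app learns nothing.
    func contactPickerDidCancel(_ picker: CNContactPickerViewController) {
        print("No contact for you")
    }
}
```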