1749 posts • joined 18 Jun 2009
Re: To coin a phrase
Yeah, ignoring the Apple side of things entirely, would Samsung even be on the list if it weren't for Google? Therefore do they really deserve to be above Google?
Re: @ThomH: You were doing ok (@Tom 13)
I buy the version where the surprise success of Windows gave Microsoft the idea to renege — they stabbed their partner in the back as soon as an opportunity arose but had not expected or been planning for the opportunity.
I can't think of another reason why they'd create the multitasking DOS 4 with its new executable format, barely license it and then push all its code off into OS/2, subsequently picking DOS back up from the version 3 code base.
Re: to the extent that Microsoft deserve criticism? (@Tinker Tailor Soldier)
I'm unclear who you think is rewriting history — essentially both the PC and the Mac came from companies that understood protected memory and MMUs perfectly but chose to omit the hardware for cost purposes. There was an MMU in the Lisa, there wasn't in the Mac. There was one in any number of IBM machines going back decades, there wasn't in the PC.
The history of Apple's multiple subsequent internal OS development screw-ups is interesting but quite distinct from Microsoft's errors. The stories start similarly, but between System 6 and 7 Apple reached the point where resources were such that they could have afforded just to run multiple instances of 6 simultaneously and to multitask properly, preemptively, with memory protection on those devices with MMUs. Instead they doubled down on cooperative multitasking and spent the newly spare resources on rewriting a bunch of things in C.

After the PowerPC move they had a fully preemptive, protected-memory nano-kernel which was used to run the existing OS. For quite a while large parts of the stack remained 68000 code and just ran through an emulator; they spent engineering time getting the emulator down small enough to fit entirely within the processor cache because it was a more effective way to transition than dealing with the OS proper.
I've been on OS X for almost a decade now — I feel like I've probably seen that grey screen that tells you that you need to restart the computer three or four times. I guess that's the same as NT's blue screen, which I saw with about the same frequency before I switched to the Mac.
MS-DOS and early versions of Windows were designed for a processor without protection domains. You couldn't do anything to protect the OS against a crashing process. In Windows you not only couldn't protect the OS but you had to rely on third-party hardware providers writing good drivers.
I guess to the extent that Microsoft deserve criticism, it's not transitioning their OS fast enough as the x86 architecture matured. With computer sales growing exponentially throughout the 80s and early 90s that made legacy software compatibility much more of a problem than it needed to be. But probably they genuinely believed OS/2 would happen.
Re: I'm a Law Lecturer - FAIL (@Nicho)
... but the HRA provides that public institutions, including courts, may not contravene ECHR rights. Which has been used in cases like Campbell v Mirror Group Newspaper to import additional criteria onto existing civil wrongs — once you've established standing for an action and are in court, the argument goes that the court can't act so as to contravene your convention rights and hence those rights themselves become enforced against a third party.
You'd therefore expect that, if the professor was making a threat at all, he was meaning to say that he'd go to court on contractual grounds and, once there, could likely use the HRA to bolster his argument.
What it actually sounds like to me is not so much that anyone was disputing the right to free speech so much as that the Easyjet employee tried to divert the conversation and failed miserably.
They restricted Android testing to only the interface that has really succeeded in the market in order to test only the feature arrangement people actually seem to like enough to buy. They tested only one major version of Android because there's been only one major version of Android on the market for the last year or so.
They tested two major versions of iOS because the one segued into the other only last week.
So the circumstances are different. This particular feature of the study does not suggest bias. Though based on the comments other people have made re: sponsorship, if true, it feels moot.
Re: Can you spell horseshit? (@Sil)
Studies with massive polls always put Apple on top too — that one's about the Mac, but see last week's http://www.theregister.co.uk/2013/09/17/apple_tops_all_windows_pc_in_customer_satisfaction/
Scroll down to the 21:29 AC to see how those usually go down around here.
Re: so does that mean
OS X has had virtual desktops since v10.5, i.e. 2007. It's therefore very unlikely — though definitely not impossible — that you tried an Intel Mac that didn't come with them; likely they were just disabled. As of 10.7 they've also added a built-in widget that makes apps go full screen by creating their own distinct virtual desktop.
Mouse-over activation isn't supported. Probably you installed the X11 server (which runs on the desktop, not separately) and were using xterm. The normal terminal works exactly like every other app.
Re: I don't mind
If you're running an i5 at 4.5GHz then you probably also have a freezer down there to cool the thing? Though Google says you're not alone in pushing that clock speed.
I'm happy I can buy a Mac but I'm also happy you can mix and match and overclock, and optimise for whichever of price, performance, noise, etc, you value most.
Re: 20+ Inches of case...
You're wrong: there's room for a CD drive. Space was not the deciding factor.
Re: Apple users in a nutshell
Yeah, next thing you know those sheep will want search engines to order their results so that the results other people think are more relevant appear closer to the top. The mindless fools should click through every result and make their own minds up.
Re: 40% gross margin
I think they use it to counter the argument that Apple's future is unsteady due to the huge figures being racked up by Samsung et al — i.e. the company is very profitable. It won't close soon.
I don't think they use it in their role as surrogate salespeople.
Re: one swallow doesn't make a summer
Maybe to give themselves a chance to say things like "Have you seen the numbers? What happened?!?"?
That's a cheap shot though. Giving the device the benefit of the doubt and assuming an audience exists, does anyone imagine them queuing up overnight or otherwise dashing to the shops during the first weekend? It's a slow burn device for people to consider when their contract comes up for renewal.
Re: Absolutely Ridiculous
I think the point isn't so much muggings as ordinary thefts. But even then if you leave your device on display then thieves are probably still going to grab it since it might not be updated or attached to Find my iPhone; if they get it without seeing what it is first (via pickpocketing, stealing a bag or whatever) then they're unlikely to come and give it back.
They confirmed that a well-known way to fool fingerprint scanners fools a particular brand of fingerprint scanner — I don't think anybody was seriously expecting it to take that long.
I guess the best advice is: if you can't be bothered with a password then the fingerprint scanner is better than nothing.
The iPhone 5s introduces integrated Nike Fuelband. The Nexus 5 introduces integrated time travel.
Re: Apple copying Google again
If it's just track, wipe and remote message, and you want to reduce everything to 'copying' then that's Google copying Apple again — it's been available on iOS devices since 2010. See http://en.wikipedia.org/wiki/Find_My_iPhone
As to what the article is actually about, rendering a device permanently locked (regardless of reboots or firmware reinstalls) or having it always display a message: under the same criteria, that's not currently possible under Android and it never will be. The freedom to reflash cuts both ways.
Hackers will likely circumvent Apple's measures but it's now an arms race. The Android audience hasn't been welcoming to devices that are actively built to lock you out so the same thing likely won't happen. Which is not entirely a good thing or entirely a bad thing.
Re: Again with the server overload rubbish...
App deletion is not the norm — the Apple updater doesn't delete apps. If the employee thought it was normal, maybe that's because he specifically did a clean install?
I doubt Apple would switch to a staggered release because it would ruin their PR. The boast is always "best launch ever, <whatever>% of users updated on day one, <huge number of> downloads", trying to draw a distinction with Android in the minds of developers.
You think Apple hired a crowd... of two? They must have bought into the idea of decline.
Then don't use iTunes. It isn't required.
Re: About time!
As per jeremy 3; Apple didn't allow apps to support both the full iPhone 5 screen and the ARMv6 instruction set used by pre-3GS devices. So a lot of developers had to drop 4.2 to support the iPhone 5. Similarly Apple is not going to accept 64-bit builds that also attempt to support iOS 5.
So developers often don't shout about dropping support because there's often quite a lot of coercion involved.
A new Apple iOS device usually receives OS updates for three years. Without breaking it down, to quote Ars Technica's review of iOS 7, posted today: "The length of the iOS device support cycle remains about the same as it has been for the last couple of years. If you buy an iOS device when it’s brand new, you can (with some notable exceptions) expect three to four years of software support"
Microsoft didn't support upgrades from Windows Phone 7, which is three years old, to 8. 8 isn't yet three years old.
Taking Android's case at its strongest, Google provided updates for the Nexus One for less than two years — from the January 2010 launch of the phone until the October 2011 release of Android 4.0. It provided updates for the Nexus S for only very slightly longer — from a December 2010 launch through to the November 2012 release of Android 4.2.
The Galaxy Nexus and Nexus 4 have received the latest versions of the OS. The Galaxy Nexus will be two years old in November.
So at the very least, Apple devices are definitely not doing worse than the market average.
Re: Not Steve Jobs then...
Per Berners-Lee and museums worldwide, the NeXT was the platform on which the web was developed. That's pretty much settled.
But why does Jobs get credit for that? If SETI@home finds something then do we need to find out which particular client filtered that bit of data and award Jobs, Gates, Torvalds or whomever with credit for finding extraterrestrial life?
Further to the anecdotes: when I collected my A-Level results in 1999, from my well-to-do upper-middle-class school, exactly four people out of about a hundred had mobile phones which were passed around widely for phoning parents, and only one of those was actually the property of the student.
Starting university later that year, almost everybody ended up getting one within a term or two. They were cheaper than the halls' pay phone besides anything else.
Re: It's not about addressable memory
I was wondering whether the semantics of Objective-C might have something to do with it. With only two special cases, all objects are created on the heap. Although the C primitives are available, what that means is that quite often you're doing things that seem a little ridiculous like passing around a temporary 32-bit number by putting it on the heap and kicking around the pointer.
On OS X, 64-bit pointers have to be quad-word aligned to be valid. So three quarters of possible pointer values are invalid. Apple has used certain subsets of them so that e.g. 32-bit NSNumbers are encoded directly into the pointer, with nothing stored on the heap. So creating them, accessing them and destroying them is much cheaper. It's heap semantics but actually the entire object lives in the pointer value itself.
If iOS does the same then that presumably would be a way that the move to a 64-bit runtime benefits programs much more than just by providing extra registers.
Re: So why bother with a 64-bit chip at all?
Apple has made it clear that binaries that include 64-bit code must have a minimum deployment target of iOS 6. That's about 95% of active users so it's not so awful but it is indisputably a reduction in legacy support.
Re: Nail on the head Andrew (@Darryl)
I think Apple has solved the iTunes problem by just cutting it out — we're beyond people being grateful it's no longer needed and well into people not specifically remembering the last time they used it.
Here's what worries me: having taught us to ensure charging our phones pretty much every day, the market now thinks it can sell us things like watches that need to be charged every single day.
Re: Blah blah blah -@AC 14:24
If you've ever visited the US then the US government already has your fingerprints — all visitors are required to provide them at the border.
If you carry a mobile phone then you almost certainly already allow yourself to be tracked — probably you provided details of your identity to obtain the device but even if not then you can likely be identified by the contents of your communications.
As I've visited the US and carry a contract mobile, I seem already to have sold myself out. Something about locking stable doors jumps to mind. It'd be nice to believe that everybody else has managed to avoid becoming traceable but it sounds unlikely, so while I strongly ideologically support a stand against increasing biometric intrusion, it's likely a token gesture.
Did anyone expect affordability?
I didn't expect the processor to go 64-bit, but Apple are saying it brings performance advantages since you get twice as many registers, rather than just literally having 64-bit addressing where the previous chip had 32-bit. So that shows what I know about the ARM architecture.
I didn't expect ES 3.0 to make an appearance, but that's just because Apple has always lagged so much on the desktop. There are still no geometry shaders, and Apple already supported extensions for occlusion queries and compressed textures, so I guess the step forward for developers isn't so great.
Otherwise? If the fingerprint sensor works then it'll be useful, but Motorola's, a few years ago, was never any good. I don't have a Nike+ Fuelband but gamification of health is exactly the sort of thing a feeble-minded person like myself would be manipulated by.
Re: the first iPhone was laughable by European (ie: Nokia) standards
The idea that products succeed or fail based on tick lists of features is absurd even if you ignore the motivation behind the list selection criteria.
Here's what most people used a phone for in 2007: calls, texts. Here's what they could do with an iPhone: calls, texts, the web. With a usable interface thanks to multi-touch. And without bankrupting you, because Apple strong-armed the carriers into unlimited data.
What is laughable is the preceding interfaces for the web, full of modality and fixed-level zooms and web pages reduced to the system font, and the idea that people would pay 50p/MB for the privilege.
Apple opened the door, Google charged through it.
The move to 4k televisions will be a fantastic thing, because it'll mean that the standard panel resolution becomes 4k, to the massive benefit of every laptop that isn't the Retina MacBook or the Chromebook Pixel.
For TV itself? Films are already that resolution without the hassle of needing to be redigitised, YouTube can stream in 4k and smartphones have been announced that can record in it. So there'll probably be a pincer movement on production television. I mean, it won't make much visible difference, but you can at least realistically see it happening.
I'm not sure. I think Chrome is a product of trends rather than a producer of them.
In my opinion the development of browsers has been directed by the web's move from the desktop to phones and tablets. That has tightened optimisation requirements and, because there's still a healthy competitive market in mobile, has made everyone much keener to ensure that nobody else has de facto control over the standards.
WebKit alone is possibly the more interesting story. It was always about standards compliance, being first to pass Acid2, and has been a frontrunner on performance. Although it's most often associated with Apple it was powering the Symbian browser as early as 2005 and Google started committing more than Apple in 2010. So it's a pretty good indicator of where the major mobile players of the last few years think priorities lie.
From there Chrome is a gateway and an irony. It makes desktop browsing thoroughly up to date so as to make desktop browsing less relevant.
Re: It'll be intersting to see how Apple handles this.
It's actually a frustratingly easy mistake to make with Apple's APIs — those CFIndexes pop up in quite a few places — and Xcode ships with the implicit signedness conversions warning disabled. It's one of the things I always enable when I'm starting a new project. Just enabling that would probably help them catch stuff like this.
That said, if it's a latent problem in initial table setup then the true diagnosis is probably that whatever is supposed to guarantee the signed value is always positive needs fixing, so they'd probably just have thrown in the explicit cast and forgotten about it.
If I understood the article, it sounds like it's meant to be calculating something related to the string width, and getting that wrong because, in passing from one place to another, it interprets a glyph count of -1 as a glyph count of UINT64_MAX.
So if you're asking why floating point arithmetic is used, it's because fonts are designed with floating point arithmetic and rendered with floating point arithmetic. The OS X graphics system uses the same drawing primitives as PDF and Postscript.
When I find an article like this I make sure I click through every advert on the page. Good stuff!
Re: I wonder if it's straightforward role reversal (@JDX)
Even if it is just a fluctuation, that's surely news because it destroys the overall narrative of Android doing now what Windows did a couple of decades ago?
It all sounds healthy regardless. I like Bernard's free market interpretation: Apple is managing to respond to competition.
I always thought it was more like one of the later Freescape games — Castle Master, maybe — except that the relatively static movements were for a completely different reason.
Reaping what they sowed
When Microsoft achieved monopoly status in the Internet client market, it declared that the browser was then fully evolved and left us with the five-year gap between IE6 and IE7. When its years of work on smartphones didn't translate into sales it said that obviously consumers did not want smartphones. It essentially spent a decade pushing bad products and insisting that it was everyone else that had the problem. Besides being exceedingly late to market with the decent Windows Phone iterations, Microsoft has created for itself a very tough reputation to overcome.
When Nokia had the number-one smartphone OS it gave us extremely contextual user interfaces that only a computer scientist could love, labyrinthine APIs and barely functional tools, and expected developers to rally behind Series 60 because it omitted such frivolities as being able to display anything other than the system font. Dithering this way and that on where to go next, it was like they heard Apple had made waves and decided to copy them but accidentally copied Apple circa 1993. That's how Samsung, HTC, etc, stole the market from them.
Both companies have shown sufficient penitence since but it's probably too late. A complete merger is probably a smart move but only as a final roll of the dice.
Re: The tape is dead,
*cough* MiniDiscs were magneto-optical, exactly like CD-RWs and all rewritable optical formats since. The magnet aids in the writing process but the disc is then read optically by laser, and can be read even by mechanisms with no magnetic element.
Sony's incredibly restrictive ideas about DRM meant that you weren't permitted to use a computer directly to manipulate their data. Their licensing ideas about compression formats also meant that by the time they allowed write access by USB, that meant the computer having to transcode, usually from MP3. So they managed to reintroduce generational degradation to digital electronics and restricted everyone to their lousy software.
It was just one stupid move after another, really.
Re: Another application?
Morse code? That doesn't even have the prefix property. Maybe a Huffman coding on the alphabet with frequencies dictated by your language?
That's what the MagicCode product mentioned in the article deals with — executing ARM code on MIPS. So they've got the native code problem solved.
Re: It'll be the.. (@Ted Treen)
Apparently not obviously enough for you.
DrStrangeLug's comment parodies the way that Apple often acts and/or is treated as though it invented things like the MP3 player, smartphone, etc, rather than merely launching commercially and critically successful versions. JDX's comments parody the image of an Apple fan as genuinely believing the invention myth, and of Apple announcing each new iteration of a product as revolutionary when it's merely a minor step forwards.
The clues were: DrStrangeLug's abstract description of a desktop computer, and JDX's singling out of 1997's iMac — the first big Jobs launch that did a lot to create the modern Apple — as the original computer.
For extra amusement, lurker appears to think JDX is being serious and, apparently, so do you. I further found it amusing that, given the level of debate around here, it's unclear whether the down votes are from anti-Apple people not getting the joke or pro-Apple people taking things too reverently.
I haven't specifically checked the schedules but there are probably some sitcoms on BBC3/CBS (delete as per your side of the Atlantic) this week that will cater to those that need this level of explanation.
Re: It'll be the..
What do we think on the down votes here? Is it the blindly pro- or the blindly anti-Apple failing to get the joke?
Re: Lens Flare...
The lens flare wasn't fake, at least in the sense of being added on later. It was created by the actual set and the actual lenses.
... which should probably teach us that the difference between something being immersive and being distracting is not solely related to the extent that digital tools were used.
Re: My First...
If you'd used FUNC+K then think how much time you could have saved!
It was always Starship Command for me, and some Repton 2. Oh, and Gauntlet, the Defender clone.
Re: Ah memories
The Slogger has a really neat trick — if the program counter goes beyond D000 (if memory serves) then writes automatically go to the video page. Otherwise they go to the shadow page. Why? Because that's where the ROM routines for graphics output are, and the original machine ROM is used unmodified.
Re: Wow (@Arthur 1)
You take "Apple is preparing to train its shop staff to use Microsoft Windows in a bid to flog more Macs to business users reared on PCs" to mean "users who don't want to use OS X"? That's like saying that if shop staff are trained to demonstrate a browser then that means they're targeting customers who don't want to use desktop applications. It's a false dichotomy.
VirtualBox is available and just as good as elsewhere. Giving everyone the benefit of the doubt and assuming all decisions were made purely on merit, I imagine that Apple might prefer to push Parallels because it does a lot of work tightly integrating the two environments. For example, file associations work in both directions, from the Finder onto the Windows desktop and vice versa, the start menu is added to the dock, your Windows applications appear in /Applications and hence in the Launchpad, etc.
It's actually why I stopped using Parallels and started using VirtualBox. I needed Windows for a particular application only and otherwise to be contained in its box — I actively didn't want Parallels to start messing about with my whole system.
Re: Wow (@Arthur 1)
There's really no basis to conclude that Apple is "trying to sell systems to people that don't want to use their operating system". Much more likely they're trying to sell systems to people that do want to use the Apple operating system but falsely believe that they can't because a vital piece of software isn't available in a native port.
Otherwise they'd be demoing BootCamp rather than Parallels.