Re: Cool! (@Gav)
So did I imagine seeing Hartnell in Brighton Rock, This Sporting Life, The Mouse that Roared, etc?
The good thing about git repositories is that checking out is cloning. So everybody has their own fully functional git repository, and the remote is backed up quite well even if you've no formal process.
If GitHub vanishes temporarily, pick any local machine to be the new central repository, and remember to push from there when GitHub comes back up.
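That recovery can be rehearsed end-to-end with throwaway local repositories standing in for GitHub and two developers (all names and paths below are invented for the sketch):

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"

# 'central.git' stands in for GitHub; alice and bob are ordinary clones.
git -c init.defaultBranch=main init -q --bare central.git
git -c init.defaultBranch=main clone -q "$tmp/central.git" alice
git -c init.defaultBranch=main clone -q "$tmp/central.git" bob
(cd alice \
  && git -c user.name=alice -c user.email=alice@example.com \
         commit -q --allow-empty -m "work" \
  && git push -q origin main)

# 'central.git' vanishes: promote alice's clone (a full copy) to a bare stand-in.
git clone -q --bare "$tmp/alice" standin.git

# Everyone else just re-points 'origin' and carries on as before.
(cd bob && git remote set-url origin "$tmp/standin.git" \
        && git pull -q origin main \
        && git log --oneline | grep -q "work" \
        && echo "recovered")
```

The key step is `git remote set-url`: nothing is lost because every clone already holds the entire history.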
Not criminal, as it turns out, but definitely a good example of something. Not something I like.
Top tip: in 10.8 and above (so, I assume also the pending 10.9 though I haven't used it), hold down 'option' while the file menu is open. The hated 'Duplicate' will turn back into 'Save As...'
If your hands are up to it, shift+option+command+S is the appropriate keyboard shortcut.
Total paid to Apple since 2005 if you kept up with every release (so, you bought 10.4 and everything since) in US dollars: 129 + 129 + 29 + 29 + 19 = $335
Total paid to Microsoft since 2005 according to the same rules (starting with an upgrade to Vista, assuming 'home' versions): 99 + 119 + 119 = $337
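For the record, the sums check out (the per-release prices are those listed above; the version labels in the comments are my reading of them):

```python
# OS X paid upgrades since 2005: 10.4, 10.5, 10.6, 10.7, 10.8
apple = [129, 129, 29, 29, 19]
# Windows 'home' upgrades over the same period: Vista, 7, 8
microsoft = [99, 119, 119]

print(sum(apple))      # 335
print(sum(microsoft))  # 337
```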
Being a 1997 attempt to fix the problems stemming from a belief that "[t]he Unicode standard is a fixed-width scheme ... [that] uses 16-bit encoding", it was immediately irrelevant because UTF-8 had been presented in 1993. It's also modal, so lacks self-synchronisation, and complicates things by defining character sets by language. As the paper acknowledges, 'a' is present separately as an English character, a French character, a German character, etc, etc, with the intention being that all those different 'a's are mapped back to the same thing after the fact.
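To illustrate the self-synchronisation a modal scheme can't have: every UTF-8 byte announces its own role (ASCII, lead byte, or continuation byte, the last always `0b10xxxxxx`), so a decoder dropped into the middle of a stream finds the next character boundary with no out-of-band state. A rough sketch (the `resync` helper is mine, not from any library):

```python
def resync(data: bytes) -> int:
    """Return the offset of the first character boundary in data,
    by skipping UTF-8 continuation bytes (0b10xxxxxx)."""
    i = 0
    while i < len(data) and (data[i] & 0xC0) == 0x80:
        i += 1
    return i

text = "héllo wörld".encode("utf-8")
# Start reading from an arbitrary mid-character offset: byte 2 is the
# continuation byte of 'é', so we skip it and decode cleanly from byte 3.
start = 2 + resync(text[2:])
print(text[start:].decode("utf-8"))  # → llo wörld
```

A modal encoding can't do this: without knowing which mode the stream was in, the bytes in hand are uninterpretable.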
Per the Ars article, it's a software hack. Samsung's build of Android spots when popular performance tools are being run and turns everything up to 11. The device couldn't run like that ordinarily without significantly shortening its lifetime, both per charge cycle and in general.
What Ars did was to create two versions of the benchmarking tool Geekbench — one that the system could identify as Geekbench and one that it couldn't. The code remained identical.
When it was identifiable by the system as Geekbench, the code ran up to 20% faster.
If the phone doesn't lock when paired with a hands-free device then the phone of anyone with a Bluetooth headset is permanently unlocked. So as soon as one of those people leaves their phone on the bus, on their desk, is pick-pocketed, etc, the story would be that a security vulnerability had allowed their contacts/email/the rest to be accessed.
I also think Siri is Apple's attempt to integrate calendar, dialler, maps, GPS, etc. I don't think it's a good implementation but it's an attempt, at least.
Regardless, answer this: are there — as the poster suggests — any grounds whatsoever to conclude that because one of the supplied, first-party applications displays information when we as users wouldn't want it to, the operating system must be "creaking at the seams"?
No — that was true across the board when the feature launched but Apple lets the individual carriers dictate it. E.g. AT&T announced in May that they'd be allowing access by the end of the year (source: http://www.theverge.com/2013/5/20/4348672/att-will-allow-all-video-chat-apps-on-its-network-by-end-of-2013 ) and various individuals started reporting access somewhere around mid-June. I've no idea if it's nationwide yet but if it isn't then it will be soon. Other networks no doubt have similar plans.
Are you actually reading the stories?
Here's what users want a phone to do: (i) lock itself if left unattended; and (ii) be usable hands free, such as in cars.
The 'security' problems with iOS are nothing to do with the technical underpinnings. They're the direct result of the inherent conflict of those two goals. There's no hint of creaking internals whatsoever.
I'm not sure everyone here has quite kept up with the news.
This is Apple's solution for shipping EU-compliant chargers: http://store.apple.com/uk/product/MD820ZM/A/lightning-to-micro-usb-adapter — micro-USB goes in one end, the proprietary lightning comes out of the other.
There's no bad news for Apple here. They'll put the one very small external thing inside the box instead of the other.
The 2013 Nexus 7 replaces the 16:9 screen of the original with a 16:10. So it's not just Apple and not just expensive devices.
Personally I'm a fan of 3:2, as on the old titanium Powerbooks, amongst others. But I don't think that's likely to make a comeback any time soon.
Saying "this person also deserves criticism" with the implicit point being that blame doesn't divide along party lines isn't really an indicator of political leanings.
Conversely, trying to frame any criticism of Bush as necessarily liberal propaganda does suggest something of a bias.
My interpretation of events since 2001 — the terrorist attack, not the change of administration — would be that these agencies have spiralled beyond anyone's control. The whole point of the constitution is that it creates competing interests and no single actor has control of all powers. Trying to pin all your national problems on this president or that party is inaccurate and unhelpful.
I assume you single out Obama on technical grounds? The well-known warrantless wiretaps under his predecessor were declared retroactively legal, after all.
Yeah, with their 162 computers.
I fact-checked myself and had my numbers confused: it's 58% that would support greater gun control now, almost a year after the event. So I think probably the main point to be made is: it's no more accurate to paint America as a land of gun-obsessives than it is to paint it as a democratic panacea that European nations should aspire to. Americans are extremely diverse, and likely just as many of them have a negative opinion of someone shooting up consumer electronics for YouTube as do Brits.
(aside: as a Brit currently resident in the US with interests back home, I'm currently part-funding both governments)
Here's what's going on in the US right now: last year, one party proposed an economic plan. That plan lost in a nationwide election by a healthy margin, cost the party Senate seats and substantially reduced the number of votes it received for the House of Representatives. Now that party is saying "it's either that plan or we shut down the government". When they did the same thing twice, two decades ago, the result was that the economy lost $1.5tn. Its line is being championed by someone in the Senate who (i) demanded the bill passed up from the House contain specific provisions; then (ii) took the floor to protest for 21 hours because the bill had exactly what he wanted in it; then (iii) voted in favour — alongside every other member of the Senate — of the procedural step he'd just spent 21 hours delaying.
Here's what happened last year: following a specific extreme criminal act, 58% of those surveyed wanted stricter gun controls. Legislation was debated but failed following threats from the gun lobby.
So: should anyone, anywhere in the world, take lessons from America on its system of legislation?
Yeah, ignoring the Apple side of things entirely, would Samsung even be on the list if it weren't for Google? So do they really deserve to be above Google?
I buy the version where the surprise success of Windows gave Microsoft the idea to renege — they stabbed their partner in the back as soon as an opportunity arose but had not expected or been planning for the opportunity.
I can't think of another reason why they'd create the multitasking, new executable DOS 4, barely license it and then push all its code off into OS/2, subsequently picking up DOS from the version 3 code base.
I'm unclear who you think is rewriting history — essentially both the PC and the Mac came from companies that understood protected memory and MMUs perfectly but chose to omit the hardware for cost purposes. There was an MMU in the Lisa, there wasn't in the Mac. There was one in any number of IBM machines going back decades, there wasn't in the PC.
The history of Apple's multiple subsequent internal OS development screw-ups is interesting but quite distinct from Microsoft's errors. The stories start similarly, but by the transition from System 6 to 7 Apple had the resources to run multiple instances of 6 simultaneously, preemptively multitasked with memory protection on those machines with MMUs. Instead they doubled down on cooperative multitasking and spent the newly spare resources on rewriting a bunch of things in C. After the PowerPC move they had a full preemptive, protected-memory nano-kernel, which was used to run the existing OS. For quite a while large parts of the stack remained 68000 code and just ran through an emulator — they spent engineering time getting the emulator small enough to fit entirely within the processor cache because that was a more effective way to transition than reworking the OS proper.
I've been on OS X for almost a decade now — I feel like I've probably seen that grey screen that tells you that you need to restart the computer three or four times. I guess that's the same as NT's blue screen, which I saw with about the same frequency before I switched to the Mac.
MS-DOS and early versions of Windows were designed for a processor without protection domains. You couldn't do anything to protect the OS against a crashing process. In Windows you not only couldn't protect the OS but you had to rely on third-party hardware providers writing good drivers.
I guess to the extent that Microsoft deserve criticism, it's for not transitioning their OS fast enough as the x86 architecture matured. With computer sales growing exponentially throughout the 80s and early 90s, that made legacy software compatibility much more of a problem than it needed to be. But probably they genuinely believed OS/2 would happen.
... but the HRA provides that public institutions, including courts, may not contravene ECHR rights. Which has been used in cases like Campbell v Mirror Group Newspapers to import additional criteria onto existing civil wrongs — once you've established standing for an action and are in court, the argument goes that the court can't act so as to contravene your convention rights, and hence those rights themselves become enforced against a third party.
You'd therefore expect that, if the professor was making a threat at all, he was meaning to say that he'd go to court on contractual grounds and, once there, could likely use the HRA to bolster his argument.
What it actually sounds like to me is not so much that anyone was disputing the right to free speech so much as that the Easyjet employee tried to divert the conversation and failed miserably.
They restricted Android testing to only the interface that has really succeeded in the market in order to test only the feature arrangement people actually seem to like enough to buy. They tested only one major version of Android because there's been only one major version of Android on the market for the last year or so.
They tested two major versions of iOS because the one segued into the other only last week.
So the circumstances are different. This particular feature of the study does not suggest bias. Though based on the comments other people have made re: sponsorship, if true, it feels moot.
Studies with massive polls always put Apple on top too — it's about the Mac but see last week's http://www.theregister.co.uk/2013/09/17/apple_tops_all_windows_pc_in_customer_satisfaction/
Scroll down to the 21:29 AC to see how those usually go down around here.
OS X has had virtual desktops since v10.5, i.e. 2007. It's therefore very unlikely — though definitely not impossible — that you tried an Intel Mac that didn't come with them; likely they were just disabled. As of 10.7 they've also added a built-in widget that makes apps go full screen by creating their own distinct virtual desktop.
Mouse-over activation isn't supported. Probably you installed the X11 server (which runs on the desktop, not separately) and were using xterm. The normal terminal works exactly like every other app.
If you're running an i5 at 4.5GHz then you probably also have a freezer down there to cool the thing? Though Google says you're not alone in pushing that clock speed.
I'm happy I can buy a Mac but I'm also happy you can mix and match and overclock, and optimise for whichever of price, performance, noise, etc, you value most.
You're wrong: there's room for a CD drive. Space was not the deciding factor.
Yeah, next thing you know those sheep will want search engines to order their results so that the ones other people think are more relevant appear closer to the top. The mindless fools should click through every result and make their own minds up.
I think they use it to counter the argument that Apple's future is unsteady due to the huge figures being racked up by Samsung et al — i.e. the company is very profitable. It won't close soon.
I don't think they use it in their role as surrogate salespeople.
Maybe to give themselves a chance to say things like "Have you seen the numbers? What happened?!?"?
That's a cheap shot though. Giving the device the benefit of the doubt and assuming an audience exists, does anyone imagine them queuing up overnight or otherwise dashing to the shops during the first weekend? It's a slow burn device for people to consider when their contract comes up for renewal.
I think the point isn't so much muggings as ordinary thefts. But even then if you leave your device on display then thieves are probably still going to grab it since it might not be updated or attached to Find my iPhone; if they get it without seeing what it is first (via pickpocketing, stealing a bag or whatever) then they're unlikely to come and give it back.
They confirmed that a well-known way to fool fingerprint scanners fools a particular brand of fingerprint scanner — I don't think anybody was seriously expecting it to take that long.
I guess the best advice is: if you can't be bothered with a password then the fingerprint scanner is better than nothing.
The iPhone 5s introduces integrated Nike Fuelband. The Nexus 5 introduces integrated time travel.
If it's just track, wipe and remote message, and you want to reduce everything to 'copying' then that's Google copying Apple again — it's been available on iOS devices since 2010. See http://en.wikipedia.org/wiki/Find_My_iPhone
As to what the article is actually about — rendering a device permanently locked, regardless of reboots or firmware reinstalls, or having it always display a message under the same criteria — that's not currently possible under Android and it never will be. The freedom to reflash cuts both ways.
Hackers will likely circumvent Apple's measures but it's now an arms race. The Android audience hasn't been welcoming to devices that are actively built to lock you out so the same thing likely won't happen. Which is not entirely a good thing or entirely a bad thing.
App deletion is not the norm — the Apple updater doesn't delete apps. If the employee thought it was normal, maybe that's because he specifically did a clean install?
I doubt Apple would switch to a staggered release because it would ruin their PR. The boast is always "best launch ever, <whatever>% of users updated on day one, <huge number of> downloads", trying to draw a distinction with Android in the minds of developers.
You think Apple hired a crowd... of two? They must have bought into the idea of decline.
Then don't use iTunes. It isn't required.
As per jeremy 3; Apple didn't allow apps to support both the full iPhone 5 screen and the ARMv6 instruction set used by pre-3GS devices. So a lot of developers had to drop 4.2 to support the iPhone 5. Similarly Apple is not going to accept 64-bit builds that also attempt to support iOS 5.
So developers often don't shout about dropping support because there's often quite a lot of coercion involved.
A new Apple iOS device usually receives OS updates for three years. Without breaking it down, to quote Ars Technica's review of iOS 7, posted today: "The length of the iOS device support cycle remains about the same as it has been for the last couple of years. If you buy an iOS device when it’s brand new, you can (with some notable exceptions) expect three to four years of software support"
Microsoft didn't support upgrades from Windows Phone 7, which is three years old, to 8. 8 isn't yet three years old.
Taking Android's case at its strongest, Google provided updates for the Nexus One for less than two years — from the January 2010 launch of the phone until the October 2011 release of Android 4.0. It provided updates for the Nexus S for only very slightly longer — from a December 2010 launch through to the November 2012 release of Android 4.2.
The Galaxy Nexus and Nexus 4 have received the latest versions of the OS. The Galaxy Nexus will be two years old in November.
So at the very least, Apple devices are definitely not doing worse than the market average.
Per Berners-Lee and museums worldwide, the NeXT was the platform on which the web was developed. That's pretty much settled.
But why does Jobs get credit for that? If SETI@home finds something then do we need to find out which particular client filtered that bit of data and award Jobs, Gates, Torvalds or whomever with credit for finding extraterrestrial life?
Further to the anecdotes: when I collected my A-Level results in 1999, from my well-to-do upper-middle-class school, exactly four people out of about a hundred had mobile phones which were passed around widely for phoning parents, and only one of those was actually the property of the student.
Starting university later that year, almost everybody ended up getting one within a term or two. They were cheaper than the halls' pay phone besides anything else.
I was wondering whether the semantics of Objective-C might have something to do with it. With only two special cases, all objects are created on the heap. Although the C primitives are available, what that means is that quite often you're doing things that seem a little ridiculous like passing around a temporary 32-bit number by putting it on the heap and kicking around the pointer.
On OS X, 64-bit pointers have to be quad-word aligned to be valid, so three quarters of possible pointer values can never be real object addresses. Apple has used certain subsets of them so that e.g. 32-bit NSNumbers are encoded directly into the pointer, with nothing stored on the heap. So creating them, accessing them and destroying them is much cheaper. It's heap semantics, but the entire object actually lives inside the pointer value itself.
If iOS does the same then that presumably would be a way that the move to a 64-bit runtime benefits programs much more than just by providing extra registers.
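A toy model of that tagged-pointer trick — the bit layout below is invented for illustration; Apple's real encodings are a private runtime implementation detail and differ between platforms. Because genuine object pointers are aligned, their low bits are always zero, so a word with a low bit set can't be a real address and is free to carry a small payload inline:

```python
TAG_BIT = 1  # low bit set => the payload lives in the 'pointer' itself

def box(value: int) -> int:
    """Encode a small non-negative integer directly into a
    pointer-sized word: no heap allocation at all."""
    return (value << 1) | TAG_BIT

def unbox(ptr: int) -> int:
    """Recover the payload from a tagged word."""
    assert ptr & TAG_BIT, "not a tagged pointer"
    return ptr >> 1

p = box(42)
print(p & TAG_BIT)  # 1: recognisably not a real, aligned address
print(unbox(p))     # 42: heap semantics, but the object is in the word
```

The runtime checks the tag on every dereference; untagged words are followed as normal heap pointers, tagged ones are decoded in place.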
Apple has made it clear that binaries that include 64-bit code must have a minimum deployment target of iOS 6. That's about 95% of active users so it's not so awful but it is indisputably a reduction in legacy support.
I think Apple has solved the iTunes problem by just cutting it out — we're beyond people being grateful it's no longer needed and well into people not specifically remembering the last time they used it.
Here's what worries me: having trained us to charge our phones pretty much every day, the market now thinks it can sell us things like watches that need to be charged every single day.
If you've ever visited the US then the US government already has your fingerprints — all visitors are required to provide them at the border.
If you carry a mobile phone then you almost certainly already allow yourself to be tracked — probably you provided details of your identity to obtain the device but even if not then you can likely be identified by the contents of your communications.
As I've visited the US and carry a contract mobile, I seem already to have sold myself out. Something about locking stable doors jumps to mind. It'd be nice to believe that everybody else has managed to avoid becoming traceable but it sounds unlikely, so while I strongly ideologically support a stand against increasing biometric intrusion, it's likely a token gesture.
I didn't expect the processor to go 64-bit, but Apple are saying it brings performance advantages from having twice as many registers, rather than just literally widening the addressing from 32-bit to 64-bit. So that shows what I know about the ARM architecture.
I didn't expect ES 3.0 to make an appearance but that's just because Apple has always lagged so much on the desktop. There are still no geometry shaders, and Apple already supported extensions for occlusion queries and compressed textures, so I guess the step forward for developers isn't so great.
Otherwise? If the fingerprint sensor works then it'll be useful, but Motorola's from a few years ago was never any good. I don't have a Nike+ Fuelband but gamification of health is exactly the sort of thing a feeble-minded person like myself would be manipulated by.
The idea that products succeed or fail based on tick lists of features is absurd even if you ignore the motivation behind the list selection criteria.
Here's what most people used a phone for in 2007: calls, texts. Here's what they could do with an iPhone: calls, texts, the web. With a usable interface thanks to multi-touch. And without bankrupting you, because Apple strong-armed the carriers into unlimited data.
What is laughable is the preceding interfaces for the web, full of modality and fixed-level zooms and web pages reduced to the system font, and the idea that people would pay 50p/MB for the privilege.
Apple opened the door, Google charged through it.
The move to 4k televisions will be a fantastic thing, because it'll mean that the standard panel resolution becomes 4k, to the massive benefit of every laptop that isn't the Retina MacBook or the Chromebook Pixel.
For TV itself? Films are already that resolution without the hassle of needing to be redigitised, YouTube can stream in 4k and smartphones have been announced that can record in it. So there'll probably be a pincer movement on production television. I mean, it won't make much visible difference, but you can at least realistically see it happening.
I'm not sure. I think Chrome is a product of trends rather than a producer of them.
In my opinion the development of browsers has been directed by the web's move from the desktop to phones and tablets. That has tightened optimisation requirements and, because there's still a healthy competitive market in mobile, has made everyone much keener to ensure that nobody else has de facto control over the standards.
WebKit alone is possibly the more interesting story. It was always about standards compliance, being first to pass Acid2, and has been a frontrunner on performance. Although it's most often associated with Apple it was powering the Symbian browser as early as 2005 and Google started committing more than Apple in 2010. So it's a pretty good indicator of where the major mobile players of the last few years think priorities lie.
From there Chrome is a gateway and an irony. It makes desktop browsing thoroughly up to date so as to make desktop browsing less relevant.