Disagree on a bunch of points, but by no means all
(c) and (h): these extra features tend to be things that people don't actually care about, outside of a tech blog niche, and in any case can generally be found on Android phones too — with similarly few people particularly interested.
(e): I attended a Nokia development day recently, where we were given free phones and lots of information, and told about the latest cash prize development competitions. The Nokia employees were very nice and are clearly trying very hard.
(g): actually, I think quite a lot of people can tell you that Nokia phones come with Carl Zeiss lenses. They just also (very erroneously) think that it's a made-up brand à la Matsui. So this hits the (c)/(h) point of people not being particularly interested. On screens they're not really any better. The iPhone still has the leading DPI, and I think that Samsung's AMOLED screens provide the best overall colour. I have the feeling Nokia use a similar OLED technology at the top of the range, but they're definitely not ahead on that front.
On the purely technical/internals front (which I think people definitely don't care about at all, whereas I accept that some people do care about whether they can connect up via HDMI), Nokia are almost alone in being yet to produce an ARMv7 phone, and tend to go with the less powerful Broadcom GPUs rather than the good PowerVR stuff. So I wouldn't say they're technically brilliant.
That said, you're completely right that they do the phone stuff brilliantly. I used a Nokia phone up until 2008 and was very often the only person able to get a signal, especially when I lived in Cumbria. However, I find the OS a bit confused and inconsistent (e.g., on the N8, just talking about built-in apps, some scroll areas require you to touch and pull a scroll bar, some are direct manipulation with no inertia, some have inertia but it varies from app to app), and have never understood their holy devotion to having just one slightly peculiar font, especially as it makes web pages look really awful.
So, ummm, conclusion: hardware very good in some areas, good enough in all others, software definitely needed a change.
It'd just look like the TV is constantly halfway through a fade from one to the other, surely?
If it follows the normal App Store rules, then you can have unlimited downloads for as long as the product is available. Which, I appreciate, answers only one of your very minor concerns, but there you go.
Might be smart to do a completely clean install, grab a Time Machine backup right then, and any time you want to refresh just choose 'restore from Time Machine backup' via the recovery disk that came with your machine (which was also the OS disk, at least up until now).
Who are you talking about?
You seem to have some sort of confirmation bias. Scroll up the screen and look at the comments posted before the 31st of May at 13:28 and there's nothing like a flock of anyone in particular, and almost no whining.
It's 70% of those returned
Since the overall proportion returned is probably, I don't know, 10% at the absolute worst, nothing much is being imputed to the average smartphone punter: 70% of a 10% return rate is still only 7% of all buyers. What they're probably trying to do is put pressure on Google by stating publicly that Marketplace policies have given them a dramatic increase in returns.
Like you, I suspect that this isn't much of a problem at all to most people.
Fingers crossed for new development tools
You're probably quite safe
Demographic differences are the most relevant thing, I think: amongst the Mac demographic is a significant group of people with no technical grounding. A desktop Linux user is unlikely to believe that there's some magical piece of antivirus software installed that they didn't know about and weirdly never saw before becoming 'infected', or alternatively that you can virus scan from within a browser. They're also very unlikely to act without secondary confirmation (by manual inspection of the filing system, possibly) and without first checking the web for suitable open source tools.
At a guess...
... they'll adapt the x86 emulation code they bought with Virtual PC and deployed on the Xbox 360 for running original Xbox games. Obviously it'll need some work because the target processor is ARM rather than PowerPC, but it's probably easier than starting from nothing.
Quite the opposite
Per the Bloomberg article you link to: "Microsoft, the world’s largest software maker, will showcase the interface running on hardware with an Nvidia Corp. (NVDA) Tegra chip, the people said last week, declining to be identified because the plans are confidential."
So that would make it sound like they're interested in doing tablets with ARM and ARM only.
One more difference
Apple got out of its funk by abandoning the existing software platform, bringing in external management and merging in an external development team, then segueing into a brand new market and then several other new markets.
Nokia already switched management and are outsourcing a large part of the software stack. But they're effectively ceding a significant part of their destiny, something Apple have always managed to avoid.
That said, I agree with the article. Nokia's nothing like finished, its old strategy was on a crash course long before Elop turned up, and the platform switch gamble is the only workable way forwards. You can argue about the decision to use Windows Phone versus other comers, but it's hardly the most significant of his decisions.
There's a difference in approach though
Apple assume everything to be malevolent until they've discovered it to be otherwise. Google assume everything to be benign until they've discovered it to be otherwise. And that's without getting into the tests each applies to determine what they think shouldn't be made available to customers.
They're trojans though
So OS security doesn't really come into it. That's the whole point of the trojan horse — the security is sufficiently onerous that you just get someone trusted to let you in.
Slight problem versus the NDK, presumably?
And I was under the impression that the only way to compile C code is via the NDK, bypassing Android's virtual machine and giving a lazy option to EA, Epic, etc. when porting their engines. I guess it'll be fine though; I'll bet that 99% of applications are purely Dalvik based.
... or you could just copy and paste the file URL from the 'Activity' window to the 'Downloads' window. No need to include bit.ly or anything similar, whatever happens. Or install the ClickToFlash extension (from about halfway down the page you go to if you click 'Get Extensions'), right-click on the YouTube video and select 'Download Video'.
Then go back to doing whatever you were really browsing for in whatever browser you like.
The inclusion of an anti-malware tool with versions of the OS since 2009 — per the article — would appear to make your comment a little late.
You're accusing iTards of ad hominem attacks? Surely some sort of satire?
I'm of the opinion that the 'i' has outstayed its welcome, but I guess it makes it very easy to come up with brand names that are legally protectable and which associate new products with a person's existing perceptions of Apple.
I was sort of hoping that MobileMe indicated a move away from iEverything, with the iPad getting the name because the similarity to iPod was just too alluring, but I guess that wasn't the case. Oh well, they're just names.
From the article, Apple's complaints, and my guesses at the reasoning behind them, are:
"infringed upon patents and violated its trademark", i.e. manufactured (if he was painting them himself as other commenters allege) and sold equipment with the Apple logo on without permission.
"using deceptive practices in the creation and sale of the product", presumably by making some sort of claim that these were authentic Apple parts for genuine white iPhones rather than genuine black parts, repainted.
Though it's ironic that Apple appear to be using (amongst others) laws whose purpose is to allow a company to protect its reputation to sue a 17-year-old who, through significant initiative, managed to fill a gap they'd created when they failed to ship a simple product for an extended period of time. I think they're being really stupid on this one.
The "rabid fanboi" of your imagination doesn't exist. It's just a cheap caricature, calculated to inflame, that you've conveniently picked upon to be a scapegoat.
The default user is an Administrator in OS X parlance. Such privilege is not the same as and is significantly less than root.
Safari defines 'safe files' as: movies, pictures, sounds, PDF and text documents, and disk images and other archives. It doesn't include executable files. Having read some other sites on this issue today, it seems that the program comes as an installable application archive. So the OS launches the standard package installer, prompting the user to click onward to install the app. They have a few screens to click through, including one where they select a target drive and then confirm the installation location.
Anyway, 'execute' is the wrong verb. Safe files are opened. You can't throw arbitrary executable code onto a Mac using Safari's built in, designed behaviour.
It doesn't mean what you think it means
All the stuff you would need administrator privileges to adjust on another UNIX requires the entry of the user's password in a default OS X install. However, write privileges to /Applications are gifted without a password.
Acting as the default user, if you have to sudo to do it in Linux or BSD, you have to sudo to do it in OS X.
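If you want to see that split on your own machine, here's a rough way to check (plain Python, which ships with OS X; the directory list is just illustrative):

import os

# On a default install the first user is in the 'admin' group, and 'admin'
# has write access to /Applications, so that one should come back True with
# no password prompt; the system directories should come back False.
for path in ("/Applications", "/System", "/usr"):
    print(path, "writable without sudo?", os.access(path, os.W_OK))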
Allow me: OS X contains a BSD layer, derived from BSD. Because it has a terminal, it also contains a bunch of open source components that you commonly see included in Linux distributions. WebKit is notably a fork from KHTML and KDE is generally closely associated with Linux distributions. There's even a rootless X11 manager if you want to use it (though I don't think it's a default install).
So, fine, technically it's not Linux-derived because its original development predates that of Linux and Linux is just a kernel, whereas OS X explicitly uses a completely distinct kernel. But it's quite accurate to say that it shares a large code footprint with what people idiomatically call 'Linux' and that at least some components were part of idiomatic Linux before they were part of OS X.
I'm sure that you could find a bunch of BSD, Linux or OS X people that would be angered by the statement, but hopefully not at as irreverent a site as this.
As above, my guess is...
... downloading to another folder is achieved by supplying an archive with an absolute path, and one of the built-in extractors failing to validate that properly. bsdtar is safe, so I'll guess it's a zip problem. The default setup also doesn't allow users to write just anywhere on the system, but it does allow them to write to /Applications, so whatever they're doing probably still doesn't grant a write to arbitrary locations.
Yes, though, it's a big gaping hole.
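For what it's worth, the validation an extractor ought to be doing isn't complicated. A minimal sketch, assuming a zip archive and using Python's zipfile purely for illustration (the function name is mine, not anything Apple ship):

import os
import zipfile

def safe_extract(archive_path, dest_dir):
    # Refuse any entry whose resolved path would land outside dest_dir,
    # which catches both absolute paths and '..' tricks.
    dest_dir = os.path.realpath(dest_dir)
    with zipfile.ZipFile(archive_path) as zf:
        for name in zf.namelist():
            target = os.path.realpath(os.path.join(dest_dir, name))
            if target != dest_dir and not target.startswith(dest_dir + os.sep):
                raise ValueError("unsafe path in archive: " + name)
        zf.extractall(dest_dir)

Anything that skips that sort of check and just writes wherever the entry says is exactly the hole being described.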
They were confused
"Administrator privileges" tends to be synonymous with unfettered access to anything on a computer. A default install of OS X will require a password be entered for a bunch of tasks, such as viewing things stored on the keychain, making changes to certain system preferences and some other things.
However, you're quite right because on a default install, and I'll wager on 99.9% of machines out there, the single user has a tick against 'Allow user to administer this computer' and can write whatever they want to /Applications, whenever they want. Combine that with Safari shipping with 'Open "safe" files after downloading' ticked by default and it's easy to see how this program installs itself, given that archives are considered safe and I guess one of the archive formats doesn't properly guard against absolute paths.
All of the internal system paths should be properly locked down by default, so in theory this program shouldn't be able to do anything to stop you from just dragging it to the trash and hence uninstalling it. That said, it should still be a major embarrassment that it can install itself in the first place.
I agree and disagree
Agreed: the important asynchronous fetch parts behind AJAX originated at Microsoft; IE6 didn't hit standards very well, but it was an era before anybody did and before most of the standards it ended up living alongside even existed; Microsoft's enterprise software — whatever else you may say — sets a high bar of entry for competitors; and Gates was preaching the tablet before anyone actually wanted one and before the form factor really worked for technological reasons.
Disagreed (but not refuted as these are just opinions): much of the Apple stuff. Innovation just means to make changes in something established, especially by introducing new methods, ideas, or products. So it's actually a really easy test to satisfy — taking a good idea from one field and transplanting it into another so as to change perceptions of the market would seem to be enough, so multitouch on phones will do. They also have some genuinely novel manufacturing processes that create very robust enclosures, and Thunderbolt is very interesting.
On CPUs they use the latest Intel parts in the computers and design their own ARM-based silicon for the tablets, phones and MP3 players. Pixel densities are lagging on computers, but if pixel density were the test then you'd have to give them innovation for the latest iPhone, having jumped at least 50% ahead of the competition at launch and still being ahead almost a year later.
The thing with consumer products is that it's artificial to separate hardware and software when trying to pinpoint innovation. Which is why comparing Apple to Dell and Microsoft separately is a little silly and I think that's part of the point the article was making.
But on the other hand...
... the conclusion was drawn by comparing the proportion of this movie's audience that watched it in 3D to the proportion of those other movies' audiences that watched them in 3D: amongst films with a large enough release and sufficient success, the share of people opting for 3D was lower than previously.
There are then a bunch of possible counterarguments about why the specific nature of the film didn't cause the drop, albeit that they're speculative.
So I don't think it's franchise sickness, since the comparison isn't to other films in the same franchise or to any numbers that may have been affected by the popularity or quality of other films in the franchise.
The phone manufacturer, singular, already producing them. Let's not take away from Samsung just because Apple are rumoured to have copied their idea.
You've probably got your DVD player or Freeview player or Sky+ box or whatever set to output a 4:3 picture, so it's letterboxing the 16:9 then putting subtitles on at the bottom. Probably things would improve if you set the box to 16:9, adjusting your TV's picture stretching setting accordingly.
42" EDTV here
Yes, you know, the 480p sort. It was a hand-me-down gift. I probably sit about 3 or 4m away and can see the pixels on any high contrast objects. But I don't really care.
You could post the counterexamples if you think evidence is important.
I think I generally agree with you though, as it stands to reason. Some manufacturers lower costs by accepting money to install the Norton Tools or whatever trial versions on their machines. Apple don't. So even if Apple and those manufacturers spent exactly as much on production and applied the same markup, the Apple machine would be more expensive. You don't have to allege that Apple are charging higher margins or in any other way pumping up prices to get to the conclusion that the Apple machine should cost more.
No viruses yet...
... but a growing list of trojans. Platform security obviously helps prevent viruses (I'll bet the number for Windows 7 is tiny compared to Windows 95 when it was at the same level of adoption), but platform unpopularity is the only way to safeguard against trojans. I guess someone is dipping their toe in the water to test the viability of this sort of scam given Mac market share.
Technically he's making that allegation only if you think the Linux box is a cheap imitation of an Apple box. And I don't even agree with him that markets prefer things that are cheaper and less efficient; generally they prefer cheaper and more efficient.
I'm of the opinion that desktop computers long ago became pretty generic, though I tend to buy Macs still because they have a small physical footprint, a tiny electrical footprint, operate silently and usually last a decent amount of time. I'm also quite familiar with the software stack. However, I accept that I'm putting myself into a straitjacket in terms of customisation, and I'm not under the illusion that I couldn't get better benchmark results for less money, or that because I like the OS it must be objectively better.
The best computer is the one you like the most, and the competition is what keeps all the vendors on their toes.
Based on Microsoft's denial and the port to ARM, I'm optimistic that they're deprecating some legacy stuff by relegating it to a Windows 7 compatibility mode. So they wouldn't be re-jigging the underlying architecture, just trying to push everyone more forcibly towards the re-jigged stuff.
Would an all .NET Windows with all or most of Win16/32/64 in a sandboxed, legacy support environment really be a bad thing?
Not quite true; if I recall correctly then a PowerPC version of Windows NT 4 shipped for the PReP platform in a few extremely obscure ThinkPads that could run Intel binaries through emulation. Or my memory might be fooling me, and the emulator may have been an add-on, though I'm pretty sure it worked at the system level, to emulate the binaries but forward the relevant system calls directly to the native NT implementations. Or I'm just very confused indeed.
I completely agree; binary compatibility for the Mac goes back a decade at the most, and even that requires the installation of optional components. Obviously you can argue some virtue in that from the perspective of bloat and support, but the benefits of full backward compatibility are so obvious as not to need arguing. Microsoft aren't always 100% on the nail, but it says a lot that I can remember the only two times I've had problems, and the first of those was running a Windows 2 version of PageMaker on 3.1...
Same thing here; I think part of it is that there are no gaps between the pixels, and they've selected fonts that are relatively aggressively fitted to pixel boundaries so there's limited need for anti-aliasing anyway. http://www.bit-101.com/blog/?p=2722 makes the point quite well, especially when you get to the 400x versus 375x zooms.
Given that the proposal is being submitted to an industry-controlled standards body for the scrutiny of the normal industry-controlled standardisation processes, it's probably safe to assume that if Apple don't know what a standard is then someone will tell them soon.
The Bluetooth criticisms are valid, the video criticism isn't. Both the QuickTime container and H.264 video are industry standards, being written into the Blu-ray spec amongst other things. My Android phone can play them, most £20 DVD players with USB slots can play them, VLC can play them.
I think they already get a share of your subscription costs — that was the deal originally, at any rate. Hopefully competition from Android and others has eaten into that by emboldening networks.
I think tablets may displace laptops for users that buy a laptop to use in their own home, primarily for the web and email, tending to keep it in the lounge or some other socially oriented room. A tablet usually betters a smartphone for the same reason that A4 is what most people put content on rather than on till receipts — reading a web page at approximately the size of a full piece of paper is just easier and more comfortable.
I'm aware they also act as media centres, but I don't see that being a big use. For music you want something that'll fit in your pocket and video content tends to want as big a screen as possible, with the average TV now being probably about 30".
I'll bet that within ten years you get a tablet with your broadband just like you currently get a wireless router. People will plug the router into the wall, put the tablet in the lounge and for 90% that's the home Internet sorted.
The iPhone supports the latest HTML 5 database stuff, for local persistent store, as does Android. Blackberry doesn't.
To my knowledge, Jobs has spoken out publicly only against Flash. The arguments he made that it offers a very poor user experience on mobile have been backed up by every objective review I've seen of the Android client. I doubt I'm alone in having decided never to download the thing onto my Android phone. His conclusions — to ban Flash from his platform — betray his control tendencies but the initial observations were valid.
History has also shown Apple to be a beneficial contributor to the web ecosystem, being the driving force behind the vast majority of WebKit development (it was forked from KHTML when that project was four years old, which was nine years ago) and the originator of the canvas element and 2D/3D CSS transforms amongst others. As far as I can think, they've done nothing at any time to hurt the development and propagation of web standards.
Very exciting if true
With Honeycomb being closed source, I guess this'll be a fork in many respects — so there's a sense in which it'll add fragmentation to the Android world, but I'd expect that Amazon won't market these as 'Android' devices and won't officially support anything other than their own store, so hopefully the issue will be moot. I expect they won't have done anything dramatic with the underlying APIs either, so falling back into line at some point shouldn't be hard.
Amazon are the only company I can think of that can largely match Apple's content collection (probably the same amount of music and movies, more books and magazines, fewer apps, but the latter feels easier to fix if the devices are successful), while having a good direct-to-the-customer relationship and a fantastic retail infrastructure. If you're looking for someone other than Apple to launch a single prominent consumer device, then Amazon is your best bet. If the news is true, my money's on tablets being a two horse race, not between Apple and the field of Android players but between Apple and Amazon.
If the Kindle experience is anything to go by...
... Amazon adverts are served only if you've accepted a subsidised device. And given the work Amazon will have had to do in isolation due to Honeycomb being temporarily closed source, I'd expect them to have cut out the Google ads even if it means sending all the normal Google apps with them.
@The Fuzzy Wotnot
While I agree with your point — which I take to be that Apple shouldn't be condemned for making a computer as an appliance given that there's a market for computers as appliances and lots of people want them only as appliances — I think possibly the offence here is that Apple are adding and removing features that some people want without any sort of notice and with no regard to that particular audience.
So it's characteristic of their control tendencies and it further evidences which segments they're actually interested in selling to. It's also a sign that they don't mind deviating from industry standards if they think something is to be gained for their target audience. So I think it plays both ways. I can see why it offends a lot of El Reg's readership, but I don't agree with a lot of the motives that are assumed to be behind it, or that it imputes much to Apple's customers.
That's not quite the issue here; Apple have built a temperature sensor, plus the firmware necessary to report back from it, directly onto the drive, have created a proprietary connector to allow drive + sensor to talk back to the rest of the system, and have set things up so that any failure by the drive to say that it's safe results in the fans spinning up to the maximum extent possible.
This hinders third party upgrades, so is a negative step in the eyes of a group of their potential customers. Any individual who would have swapped out the base build drive for something larger after purchase and who now instead opts to pay for a build-to-order upgrade will have to pay more than they did previously per the industry-wide rule that build-to-order upgrades cost quite a lot more than buying the better part and performing the same task yourself (often even if, hypothetically, you were then unable to realise any value from whatever bit you remove).
Conversely, it possibly shrinks the total size of the sensor + the drive (or, more probably, the cost of the two together), and iMacs haven't been designed to contain user serviceable hard drives at any point that I remember. Most iMac purchasers already treat it as an appliance and attempts to upgrade are rare — the RAM is user serviceable and very trivial to access, everything else is hidden. Upgrading some non-user-serviceable parts of the iMac is easier in this model than the last (including the CPU, notably), but that doesn't make for an interesting story.
So: Apple have taken a step that upsets some of its customers, but not most. It's news but it's not really the end of the world and it doesn't say much about Apple beyond reinforcing whatever you already thought about them.
What are you talking about?
They all look like well-educated, open minded, pragmatic types to me.
On the other hand...
... some who use Doctor Who primarily as something to talk about in advance, with theories and leaked plot elements, seem to get many hours more enjoyment out of it than they would merely by watching the television programme. And they're probably still buying the merchandising, while those of us that prefer it just as a television programme still didn't know what was going to happen.
No, I think it's to protect at the other end...
i.e. Motorola and Samsung versus the no-name, very low specification, resistive tablets that are threatening to give Android an unfair bad name. I also think that may be why, with 'Honeycomb', they've picked a codename that sounds good and is being pushed as part of the branding, and seem to be retaining it for the next minor version.
Either that, or it really is just that the code doesn't look very nice. Not everything is a conspiracy and companies do sometimes tell the truth.
@AC "pad size"
I thought it was more that 10" screens are quite close to the size that both the international community and the Americans separately have settled on as being good for a piece of paper, so the thing ends up a natural size for web pages, PDFs, magazines, etc. The 7" screen is conveniently like a paperback, but less suited to the web. And that's before you throw in the media centre component.
I think you're right; with Clang now fully capable of C++ and Objective-C++, they've switched to a Clang/LLVM pair for Xcode 4, to power not just the compilation stage but the static analyser, the as-you-type error highlighting, and a bunch of other workspace things.
At present they're pushing all the way to a native binary, but it feels like it'd be a relatively trivial step from here to build LLVM into the OS and move towards LLVM bitcode generation being the default in the developer tools.
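To make the 'trivial step' point concrete, you can already ask Clang to stop at the LLVM IR rather than going all the way to machine code. A small sketch, assuming you have clang on your PATH and a hello.c to hand (wrapped in Python only so there's something runnable to paste):

import subprocess

# Stop at the LLVM IR: produces human-readable hello.ll instead of an executable.
subprocess.run(["clang", "-S", "-emit-llvm", "hello.c", "-o", "hello.ll"], check=True)

# The usual route, all the way to a native binary.
subprocess.run(["clang", "hello.c", "-o", "hello"], check=True)

The speculation above amounts to shipping something like the first form as the default developer-tools output and leaving the final translation to the OS.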