Re: License to print money
However, it's exactly the same price difference as officially charged by Samsung and RIM, and was formerly charged by HP. So it's a bit of an industry-wide rip off.
It's not fair to compare it to a Kindle; on electronic ink devices the pixels exactly meet up so that a grey line is a continuous black line. On LCD devices the red, green and blue elements have gaps between them so that a grey line is a series of discontinuous half-lit red, green and blue pixels. As a result the electronic ink looks infinitely better at a much lower resolution.
It's the iPad 1, but check out http://www.wired.com/gadgetlab/2010/08/pictures-kindle-and-ipad-screens-under-microscope/ to see the difference under a zoom lens and a microscope.
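To make the geometry concrete, here's a toy one-dimensional model in Python (the intensity values and the zero-intensity treatment of the gaps are illustrative assumptions on my part, not measured panel data):

```python
# Toy 1-D model of a 50%-grey line crossing four pixels.

# E-ink: each pixel is a single uniform cell and adjacent cells abut,
# so the line comes out as one continuous run of grey.
eink_line = [0.5, 0.5, 0.5, 0.5]

# LCD: each pixel is three coloured subpixels with dark gaps between
# them, so the same grey line becomes a broken sequence of half-lit
# red, green and blue elements.
GAP = 0.0  # dark inter-subpixel gap, modelled as emitting nothing

def lcd_pixel(grey):
    return [grey, GAP, grey, GAP, grey, GAP]  # R, gap, G, gap, B, gap

lcd_line = [sub for _ in range(4) for sub in lcd_pixel(0.5)]

# Fraction of the line's width that actually emits light:
eink_coverage = sum(1 for v in eink_line if v > 0) / len(eink_line)  # 1.0
lcd_coverage = sum(1 for v in lcd_line if v > 0) / len(lcd_line)     # 0.5
```

In this crude model the e-ink line covers its whole width while the LCD version is half dark, which is the discontinuity visible in the microscope shots at the link.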
Having taken ownership of one thanks to my job, I can confirm that the new iPad gets noticeably warm whereas the old did not. On the plus side, my cat loves it.
I think quite a lot of the discussion is manufacturers trying to find a way to distinguish their products and the usual partisans trying to turn it into a wedge issue. Which explains the pattern of up and down votes here, I think — even discussing it like normal people puts you a bit too close to areas heavily ploughed by the trolls.
Me? The 4:3 is closer to A4 (and US letter) so feels better for reading, especially with a pixel density that looks almost as good as printed material. Widescreen is trivially better for most modern video content because most modern video content is widescreen. So I agree with you that it's a preference.
The Maltese Falcon? The Wizard of Oz? Frankenstein? The Fly? His Girl Friday? The Man Who Knew Too Much?
Admittedly I can't think of one from the last twenty five years.
So the conditions described aren't prima facie wrong, it's merely that there's a certain amount of profit above which child labour, employee poisoning, etc becomes unacceptable?
I prefer to think all the named manufacturers are in the wrong. If placing disproportionate blame on Apple makes everyone have to act a little more properly then I accept that it isn't fair but I'm all for it.
It may also be a more muted launch but we won't know until the numbers are in. I'm of the opinion that the new iPad is a bigger step onward from the iPad 2 than the iPad 2 was from the original iPad but either of the early models easily passes the threshold for comfortable usability.
You probably have a point though — we have, in the company, several of the sort of people who probably queued last year, but all they've done is preorder, with someone planning to head down (to the SF shop, no less) at lunchtime to pick the things up.
And the weather here is indeed very glum. It's been raining since Monday so I'm starting to wonder why I didn't just stay in London.
In this case the term fanbois would offend me since exactly one person turned up four days early. What are you going to do next? Split the infinitive?
They said the pixels have to be small enough that a person with normal eyesight can't tell them apart at the distance they normally hold the device. Then they said that's 300ppi on a phone. At the iPad launch they sort of argued that, you know, people tend to hold tablets further away, so the density can be less while still fitting their definition of retina, etc.
I'm not sure I buy that since I seem to hold both phones and tablets at approximately arm's length, and in any case if I play, say, a polygon-based game without antialiasing at the full screen resolution on my 4S I can make out the aliased edges. And I've had routine eye tests so I know that I'm average and not some sort of superhuman.
In theory, if the pixels actually were too small to see it wouldn't matter exactly how source content maps to them — aliasing errors would be present but invisibly small, much as magazines have printed TV screenshots for years without anyone doing arithmetic on exactly what size the picture needs to be to come out well in print. In this case I would expect that errors will be visible to the keen-sighted but basically not a major problem.
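For what it's worth, the viewing-distance claim is easy to check with a bit of trigonometry. This sketch assumes the common one-arcminute figure for normal visual acuity, which is my assumption rather than Apple's published definition:

```python
import math

# PPI at which a single pixel subtends `arcminutes` of visual angle
# at the given viewing distance. One arcminute is a common rule of
# thumb for the resolving limit of 20/20 vision.
def retina_ppi(viewing_distance_inches, arcminutes=1.0):
    pixel_size = viewing_distance_inches * math.tan(math.radians(arcminutes / 60))
    return 1 / pixel_size

print(round(retina_ppi(12)))  # phone at ~12 inches: 286 ppi
print(round(retina_ppi(15)))  # tablet at ~15 inches: 229 ppi
```

So the ~300ppi phone figure checks out at a foot or so, and the lower tablet density only fits the definition if you accept the extra few inches of viewing distance — which is exactly the contested bit.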
Don't buy what everyone else tells you to buy! Buy what everyone else [on El Reg] tells you to buy!
Apple hasn't been lazy; this year's model has four times the pixels, twice the processing, etc, etc of last year's model. That's not below the progress rate seen across the industry as a whole.
The article's right though; it's just an iPad. This does not change the world.
The claim in the launch was that the iPad 2 remains faster than the Tegra 3; the new iPad is even faster still. I, probably like you and everyone else reading this, have never cared enough to bother seeking out an independent assessment.
But, yes, it's a product refresh. Like every other piece of consumer electronics you'll hear about this year, and probably all but one of those you've heard about in the last five.
Backlit screens have to compete with all other light sources and so perform worse as ambient lighting increases. Electronic ink screens reflect light and so perform better as ambient lighting increases. If you like to read outdoors that's a big win for the Kindle crowd.
And, yes, I have one of each.
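A crude numerical model shows why the two technologies diverge as the light gets brighter. All the luminance and reflectance numbers below are illustrative guesses on my part, not measurements of any real panel:

```python
# Backlit LCD: emitted luminance is fixed, and ambient light reflecting
# off the glass (the `reflect` fraction) washes out both black and white.
def lcd_contrast(ambient, white=400.0, black=0.4, reflect=0.02):
    return (white + ambient * reflect) / (black + ambient * reflect)

# E-ink: both states just reflect ambient light, so the ratio between
# white and black stays constant however bright it gets.
def eink_contrast(ambient, white_refl=0.4, black_refl=0.04):
    return white_refl / black_refl
```

Indoors (a few hundred lux) the LCD's effective contrast is in the hundreds and it wins easily; in direct sunlight (tens of thousands of lux) it collapses towards 1:1 while the reflective display carries on unchanged.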
It's a result we all know is coming though, right? Apple's sales, though still growing meteorically, aren't going to keep up with such a rapidly expanding market — especially now Amazon have jumped in.
The interesting thing is going to be whether Amazon's fork of Android ends up deviating so far from the original that it becomes a different market segment again.
I agree — if that's what they've done then they've basically killed the privacy of private mode.
That said, the release note is "allow cookies set during regular browsing to be available after using Private Browsing", which could equally just mean that entering private browsing mode no longer throws away all your session cookies (assuming it did before?), it simply temporarily hides them. That is, assuming Apple meant "using Private Browsing" to signify an ongoing state of affairs rather than just hitting the switch for private browsing.
To be honest, I think Apple could have been clearer. El Reg's interpretation could equally be true.
The article doesn't say Atari invented Pong (the game, not the brand), merely that it's iconic, that Bushnell and Alcorn produced the first successful arcade game and home console game with their versions of Pong and that they're looking for an iOS successor worthy of the brand (which, obviously, Atari did invent).
If anything I'd have expected ICS to run better; obviously it's not the only factor but the Google I/O presentation on how drawing has finally and definitively moved over to the GPU as of Android 3.0 sounded like a major performance win and the Galaxy S has a pretty good ES 2.0 compatible GPU.
I guess the difference might be that most people get iPhones through their mobile carrier, not directly from Apple? There was certainly a long queue outside Apple's Covent Garden shop for the iPad 2. I wasn't in it so I've no idea how demand compared to supply.
When I was a copyeditor I routinely had just Outlook and Word on screen at once, side by side and without overlap, with time spent elsewhere very minimal. That's probably not too common a use case though; even in the job before that I was juggling at least InDesign, CorelDraw and Outlook.
I think it's more that Android brings a decent interface to products across the market, from the budget end upward and in a variety of combinations. Although they've made concessions recently, such as continuing to offer the 3GS for free or almost free on a contract, Apple primarily target only the premium end of the market, with a single product.
Sure, Verizon were first to give Android a big push as something larger than the individual manufacturers combined and in that sense Apple's strategy may have given Android some early momentum, but Apple was never going to overwhelm Samsung, Motorola, HTC and a bunch of others combined.
@a_been: people who live in glass houses shouldn't throw stones. banjomike accuses Jobs of attempting to get _ownership_ of email; I point out that Unicode wasn't _owned_ by NeXT or Apple. The exact point of open standards is to divorce ownership from creation — it doesn't matter where the thing was developed; what matters in this context is that even if Unicode had been used for the initial MIME standard, that would have given NeXT and Apple no control whatsoever over email as a whole. I'd suggest that in future you pay more attention to what people have written before charging at them with playground insults.
@boltar: I said 'better', not 'perfect'.
Unicode is and was an open standard. It has never in any sense been owned by NeXT or Apple.
Had Jobs succeeded, international text support would have been better from day one, except in the real world where a whole bunch of functioning systems would have been unable to implement MIME and it would probably have taken even longer for a standards-based approach that isn't Anglo-centric to win out.
The people saying there's no performance problem with Flash on budget Android devices (and I accept that the discussion is essentially my anecdote versus yours so there's no reason for a third party to believe any of us over the others) are missing the other significant part of my point against Flash — it's deprecated. As in, by Adobe. It will no longer be developed. One day soon it simply won't work.
I would therefore maintain that it simply isn't an option, regardless.
As for the allegations that Flash isn't on the iPhone because of greed, I strongly disagree. Flash didn't make it to Android until 2010 — three years after the iPhone's launch. That suggests it genuinely wasn't ready in 2007. I think it's more likely that Apple's initial decision was a technical one and that the subsequent fighting between the companies, from which I don't think Apple comes out looking all that good, caused it to entrench its position.
If Google were to dump H.264 and go WebM/Flash on YouTube they'd instantly cut off most Android devices. Flash is deprecated and in any case works really poorly on ARMv6 models — such as most budget phones of the last few years. Hardware decoding is also generally H.264 only.
So the Apple angle hardly comes into it.
But it's really useful! For, ummmm, those turn-your-iPad-into-a-miniature-arcade-machine joysticks?
Or you could just buy the HDMI 'adaptor' (cable, essentially, but to female HDMI). Everyone seems to want about £25 for it though (including third-party alternatives), so you're already a quarter of the way to an Apple TV.
The new one does 1080p — at last! — which presumably is why this issue has just now come to light.
While Apple are clearly profiting disproportionately more from the higher cost models, the original point alleged this behaviour was unique to Apple. Going to Amazon and searching for 'Galaxy Tab' shows that Samsung's list prices charge the consumer... £100 extra for 3G. Or £100 extra to move from 16GB of built-in storage to 32GB.
So in that sense the original point doesn't really stand at all. This isn't an example of Apple overcharging, it's an example of the market overcharging and Apple following market trends.
If I were to buy anything, I'd want one with no vested interest in where I obtain my content. I know that's going to lock me out of iTunes (including the iTunes Match music collection I pay to keep) but for television and movie content I really want to be able to use services like Hulu with a cheap subscription model — I know Apple provide interfaces to Netflix and some other competitors but they also appear to be locking others out.
It can wait until the next television upgrade though; I've no interest in having yet another box with yet another cable, no matter how diminutive.
Per the linked story, an 8% share of US homes and a 7% share of European homes gives Apple 32% of the market. So by a clear margin most people you ask aren't going to have any device at all, and most people you ask with a device aren't going to have the Apple. That doesn't sound so unrealistic, does it?
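As a sanity check on those figures (assuming, crudely, that the 32% share applies uniformly within the US — the story's per-region breakdown may differ):

```python
# If 8% of US homes have the Apple box, and that's 32% of all homes
# with any such box, then the implied total penetration is:
apple_homes_us = 0.08      # share of US homes with the Apple device
apple_device_share = 0.32  # Apple's share among homes with any device
homes_with_any_device = apple_homes_us / apple_device_share
print(homes_with_any_device)  # 0.25
```

That is, three quarters of homes have no device of this kind at all, which is consistent with the casual-poll experience described above.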
I guess it's just that most people aren't buying these things — IT types probably just hook up their laptop or have a dedicated full computer, most others probably either have a TV, BluRay player, console or cable/satellite box with enough functionality already built in or else are happy using their computer directly to view Internet content.
And 640kb ought to be enough for anybody?
You probably don't want to live there; the residents are always jumping out and hassling you whenever you attempt to cross a bridge.
Screen resolution seems disappointingly to have plateaued in the industry as a whole. Any hope of someone bringing out a laptop with pixels I'm no longer aware of any time soon?
If you think that having the same pixel resolution on a 7.85" screen means that user interfaces designed for a 9.7" screen need no changes then you're clearly no user interface expert. Individual user interface elements would need to be almost 25% larger in pixels to have the same touch area — that's not the sort of amount you can just ignore and hope for the best; if you do then people will spend a lot of time missing what they're aiming for.
Looking at it another way: Samsung, Amazon, etc, have spent time ensuring their user interface is designed appropriately for their 7" devices. Apple can't afford not to.
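The arithmetic, for the record (diagonals as given above; the scaling is linear, since touch accuracy is a matter of physical size):

```python
# Same pixel resolution on a smaller screen: a control must grow by
# the ratio of the diagonals, in pixels, to keep the same physical
# touch size.
big, small = 9.7, 7.85  # screen diagonals in inches
scale = big / small
print(round((scale - 1) * 100))  # ~24% larger in pixels, linearly
```

Which is roughly the "almost 25%" figure above — far too much to wave away as within normal finger tolerance.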
So that's two trolls not to feed.
Apple possibly has something amongst the haul they got from FingerWorks — the latter was a gesture recognition company that specialised in multitouch but which presumably patented the gesture stuff independently of the specific technology.
Hopefully everyone will have had their fingers so badly burnt by the patent process by the time that this stuff is commercialised that it won't matter.
The high-resolution rumour seems quite likely for my money: it has proliferated enough that Apple must be aware of it, and they've released a flier that clearly shows a very high-resolution screen. That'd give Apple the highest-density display on the market by a significant margin, matching their position as the supplier with the highest-density display in mobile phones (albeit by an increasingly slender margin, it being almost two years since they took the title).
The main difference between the 4S and the 4 was the move to a dual core processor with a faster and better specced GPU. So that was unambiguously a minor update but if your fictionally generalised "Android users" expect their CPU and GPU to change with every software update then I'd suggest they look up the difference between software and hardware.
In reality I'd suggest that (i) the users of different handsets aren't locked in the eternal combat you imagine; and (ii) nobody expects more than minor incremental improvements from anyone at this point in the technology lifecycle.
DVDs and BluRay discs have DRM. Implementing that did not require any changes to the fundamental data distribution network — I can still walk into a shop, pay with cash and walk away with a disc that I can play on my player, which needn't have any sort of network connection of its own.
One would therefore assume that a minimal form of DRM might simply be to transmit video using asymmetric key cryptography, with individual vendors given a key, derived from a common root, that can decrypt the stream.
Encryption and decryption keys would be protected contractually. So to obtain a key you guarantee you're not going to publish it.
Mozilla would presumably opt out and Google would probably simply reserve it for the binary release. At least in Mozilla's case, I expect they'd provide some sort of mechanism to allow third parties to provide a binary blob that could push decrypted video in place of the normal video playback mechanism. The blob would be much simpler, and hopefully therefore much less error prone to implement, than Flash. There would be commercial value in making sure that someone supplies the blob for companies like Netflix, Hulu, etc, so it's reasonable to expect that someone would.
Of course the scheme can be broken — as others point out, DRM can always be broken because it's an inherently conflicted technology, and I'm aware that both DVD and BluRay have been broken — but as it'd be at least as secure as BluRay it'd presumably be secure enough for content providers to continue to provide content.
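For illustration only, the 'derived from a common root' part might look something like this key-ladder sketch. HMAC-SHA256 is my stand-in for whatever derivation function a real scheme would actually use, and every name here is hypothetical:

```python
import hashlib
import hmac

# The licensing authority's root secret. It is never shared; only
# derived keys are handed out, and the derivation is one-way.
ROOT_KEY = b"licensing-authority-root-secret"

def vendor_key(vendor_id: str) -> bytes:
    """Derive a per-vendor key from the common root via HMAC-SHA256."""
    return hmac.new(ROOT_KEY, vendor_id.encode(), hashlib.sha256).digest()

netflix = vendor_key("netflix")
hulu = vendor_key("hulu")
assert netflix != hulu                   # distinct keys per vendor
assert vendor_key("netflix") == netflix  # reproducible by the authority
```

The stream keys would then be wrapped under these per-vendor keys, with the contractual protection described above doing the rest; knowing one vendor's key tells you nothing about the root or about any other vendor's key.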
I have issues with DRM of content I've purchased because it's usually so over-zealous as to remove large swathes of legitimate functionality.
On the other hand, with services like Netflix that explicitly supply short, time limited rentals I think DRM is entirely appropriate. The ephemeral nature of your access to the product is explicitly part of the deal.
So I don't think it's an unethical thing to add to the spec and I'd much rather it's figured out by a broad forum like W3C, with a number of sceptics in the room, than by the highly partisan interested parties alone.
To equate the poor and needy of the UK to the poor and needy of India shows a lack of understanding of the levels of poverty throughout India.
It's a serious simplification to suggest that there is one politics of India that always moves together; it's a federation consisting of 35 states or territories, many of them no more willing to help their neighbours than you and just as quick to play the 'look, it's none of our business what happens over there' card.
If you make the fair redistribution of wealth a criterion before a country qualifies for aid then another way of phrasing that is that you're reserving aid for strict socialist countries, of which none currently exist.
The logical conclusion of 'if we help some we must help all; otherwise we should help nobody at all' is that even if we hypothetically had the money and the means to prevent the deaths of 99% of those that starve each year then we shouldn't do so.
About 40% of the population of India lives below the international poverty line and hence face a daily struggle for food, suffer from curable diseases, etc. Our aid to India helps improve conditions for some of them. If we withdraw the aid, those people will be worse off. To my mind, that justifies the aid.
Or better yet, ditch the scores entirely and assume we're capable of comprehending the prose? El Reg's reviews already have a 100-ish word summary verdict for those that just want to skip to the end.
A simple fact of the world is that equally intelligent people can always find things on which to disagree.
Trying to make the case that if people don't reach the same conclusions as you about which phone to buy then they must be stupid actually says more about your intelligence than theirs.
Sure, make fun of the iPad, but the PlayBook's mess is entirely of RIM's own making. RIM has one unique selling point — secure push mail — but has managed to give their tablet a native email client only almost a year after launch.
If you want a bargain media playback device, go Android. Let's not try to paint RIM in a positive light just because of dislike for the iPad.
No, Nokia sued first. Apple countersued. Source: http://www.businessweek.com/technology/content/dec2009/tc20091212_551557.htm
You might want to check your notes on that — Nokia sued Apple (and won), not vice versa. See e.g. http://news.bbc.co.uk/2/hi/8321058.stm
The iMac was actually quite speedy for 1998. The other issues you mention — the hugely outdated OS and screen too small for the Mac's niche audience of design and publishing houses — were greater problems.
The iPhone is usually the top selling single handset in any given portion of the year, because the Android market is more diverse and no single handset accounts for a sufficiently disproportionate share. The iPad is also currently the top selling tablet.
So while Apple doesn't hold anything even close to a majority of the phone market, its handset is still prominent enough to be worth citing in other handset reviews. The iPad just barely still holds a majority of the tablet market so is obviously relevant to tablet reviews.
I don't see any problem with reviewers comparing new products to the best selling products in the same category.
As for always mentioning legal disputes while talking about Apple? You'd have to talk to Apple's legal department to find out why that's currently almost unavoidable.
It's not really a question of right to take away, it's a question of whether the contract Apple has that ostensibly gives it rights to the name is valid. Thrown into that seems to be some sort of confusion about exact ownership of different parts of Proview. So Apple's case is that it paid to purchase the mark and Proview failed to assign it, Proview's case is that whomever Apple dealt with, it was somebody else.
I don't think there's any dispute that Apple paid someone and thought it had bought the mark in good faith, despite its history of launching products and dealing with the legal issues later (à la the iPhone and Linksys). There's definitely nobody suggesting that Apple has the right just to swoop in and take ownership of the trade mark because it's an American company or a much bigger company or anything like that.
On the other hand, to me the London Olympics logo still looks breathtakingly awful even five years after its unveiling. Though it's possible I'm a minority?
It's not the first logical fallacy nay-sayers tend to learn though; they arrive at it only after years of using less fallacious logic.