Re: Nonexistent Nexus?
Indeed, my main thoughts have been 'is $100 extra over the Nexus worth it for a 40+% bigger screen?' — I wonder if we're about to see an inversion of the usual squabbling over whether screen sizes matter above all else?
That'd require some significant rewiring within HFS+, wouldn't it? I mean it's obviously trivial to have, say, /Applications on the SSD and /users on the platter but from the announcements it sounds like they're talking about moving individual files between them but still having them appear to software to be in the same place?
Probably I'm relying on faulty information.
In 2011 96% of Google's income came from advertising (source: http://venturebeat.com/2012/01/29/google-advertising/). Even if Android were the only other one of Google's income streams, it would still be insignificant in comparison in terms of revenue and in any case I think that trying to close it up and squeeze more money by that route would be counterproductive. It's certainly not in itself going to lead to the sort of growth that is going to offset advertising losses.
So I'm confident that Android is safe exactly as it is.
The full framework documentation is at http://developer.apple.com/library/ios/#documentation/DeviceInformation/Reference/AdSupport_Framework/_index.html — if an advertiser wants explicitly to post a location then the app will have to request and be approved for location updates. Otherwise all they're getting is the "alphanumeric string unique to each device, used only for serving advertisements. [...] the same value is returned to all vendors. This identifier may change—for example, if the user erases the device—so you should not cache it."
Your 'opt out' appears equivalent to the Do Not Track HTTP header in that the advertising agent gets told that you don't want to be tracked and is then merely honour bound (or possibly legally bound, depending on your country) to obey. No technical barrier is erected. At best I guess Apple may implement some sort of vetting system for app approval.
I'm not sure this is entirely true, but only on very slender grounds: I can never find anything in any reasonable amount of time within my iPhone's settings. The whole layout seems completely counterintuitive. As a result, I don't think we can assume malicious intent from the simple fact of a new setting being in a very strange place.
Obvious examples: why is Auto-Lock a 'general' setting but Brightness & Wallpaper a top-level setting? Why is iTunes Match under 'Music' rather than under 'iCloud'? How is asking the phone right now to check for a software update a setting at all?
They desperately need an OS X-style search bar, I think.
Unless they foolishly introduce an iPad Mini in the tiny gap between iPod Touch and iPad, of course. Then they've no real flexibility to drop the price of one thing without having to drop the price of the whole lot. If the choice is between taking a hit across the range from the lowest iPod up or waiting in the expectation that Microsoft fails on its own merits, I think it's likely they'll try the latter.
They're as good in isolation as most other word processors, spreadsheets and presentation software, and as compatible with Microsoft Office as most things that aren't Microsoft Office (ie, reasonably but less so since Microsoft started creating fonts specifically for its idiosyncratic kerning and declining to license them).
I like to imagine that Microsoft's belated entry has made a lot of people realise that Office-equivalent functionality will do, whether on iOS or Android, in much the same way that IBM compatible became an acceptable alternative to buying IBM — especially when they took forever to release a 386.
Per the OED, decimate has had the meaning of 'devastate' since at least 1663; every single use of the word in the British National Corpus as maintained by Oxford University uses it in that sense, though they all seem to be from the 1990s or newer so that's not a fantastic argument.
It's the idea that it's being used incorrectly that's a modern invention. It's up there with the idea that the lack of a possessive apostrophe on 'its' is a special case* for false inventions that have somehow gripped the public imagination.
* it's not — check the other personal pronouns; 'one's' is the special case.
I always thought the approximate birth of Christ (leaving the question of divinity explicitly aside), the accession of Richard the 1st, the French Revolution and possibly even January the 1st 1970 were fairly important dates. Little did I suspect that a new laptop was going to sweep our current era aside.
I have my fingers crossed for the sake of the original author that his words have suffered an unduly literal translation.
Surely the question is: if the alleged iPad Mini is £199 then who's going to bother with an iPod Touch? The Nano and the Shuffle are already there for jogging, living in the glove compartment, etc.
Aspect ratios aside, I'll bet the new device won't simply run existing apps at a physically smaller size.
My logic is that the result would be a user experience train wreck, that Apple has put significant effort into how layouts resize under iOS 6 (struts and springs are gone; you now specify arbitrary constraints), and that it would be very uncharacteristic not to arm-twist the developer base into adopting the latest technologies.
If implemented correctly it should be imperceptible — so much effort has gone into engineering mobile phone software to be power efficient that spare processing capacity is available quite often.
We should be grateful that Google, with a vested interest in doing it well, is stepping up before manufacturers start shovelling on their own solutions. Have a wander around PC World to see the worst possible outcome.
If iOS is a copy of Android because it has lifted some good ideas (the notification area being the most obvious) then Android is a copy of iOS for the same reason (eg, pinch to zoom). Neither is inherently in the wrong, and I think it's really only ever cited as an issue because Apple insists on being so litigious.
OS X isn't built on BSD. It also isn't bug ridden.
History says Windows 8 won't bury Apple. Windows 95 and 98 were lightyears ahead of System 7 in a lot of important areas — preemptive multitasking and memory protection sound like tedious tech wedge issues but substantially improve the user experience. Regardless, Apple survived.
The Linux distros don't make OS X old fashioned any more than the Apple Magic Track Pad or whatever it's called makes mice look old fashioned. Simply being different isn't the test.
Hackintoshes are a solution for, what, the most technical 5% of people?
Samsung and Apple phones and tablets are basically indistinguishable. The fact that developing software for them is common now and that tech types like yourself therefore ascribe greater significance to brands hasn't much changed how people pick their devices; nobody outside Internet forum types thinks of either as a great satan. Ironically it's the people that most complain about Apple users slavishly following the company that most strongly define themselves by a brand — it just happens that they're defining themselves in opposition. In any case they're a tiny subset of society. Based on the value proposition Apple probably deserves some segment of the market — say 10% or maybe even 15% — and will probably end up profitably maintaining that segment.
Apple has almost $100bn in cash reserves. That's the biggest cash hoard in corporate America, and more than twice that of Microsoft. Even if nobody buys another Apple product or service ever again, they're going nowhere for a very long time.
For the purposes of anecdotes, I've seen the purple fringe too, with a lot of light coming from the top of the frame but no light source in or near the shot.
Conversely, there's no green glow, the battery seems to last about twice as long as that in my 4S (though I'm comparing one phone after several months of development use, with the huge number of partial charge cycles that result from repeatedly plugging in and unplugging the device, against the other more or less fresh out of the box), and if there are any other hardware complaints doing the rounds then I can't claim to have experienced them.
@Destroy All Monsters: The 68020 is definitely 32-bit, no matter which way you cut it — 32-bit instruction set architecture, 32-bit data bus, 32-bit address bus. The 68000 was considered 16-bit at the time because of the 16-bit data bus though I'm not sure it'd be classified that way now as it had the full 32-bit ISA.
@Gaius: the MultiFinder was released in 1987 as part of System 5. In performance terms the Mac had the edge as of the Mac II (also 1987) since the Amiga stayed at ~7MHz on a 68000 until 1990, whereas Apple more than doubled that clock rate and switched to the 68020. For GUI tasks the original Mac also outpaces the Amiga because the CPU is slightly faster and the display used for the GUI — being limited to 1bpp — is a lot more compact, and hence faster to manipulate. So there's really no tenable argument about performance.
Apple weren't shy in charging very high prices and often lagged in features (like, you know, colour) but it's rewriting history to suggest that there was "no contest".
The Apple 2 came out a year after the Apple 1, from the same team, based directly on lessons learnt and using money earned, and was explicitly a direct replacement product. There's a direct causal nexus between the two — the Apple 2 exists because the Apple 1 existed.
Conversely things using an ARM today do not exist just because the Archimedes project existed. They'd all still exist, they'd just use a different processor architecture.
I'm also unclear why you assume that if a machine can't be written off as virtueless then it must be worth a lot of money.
Surely another virtue is that it led to the Apple 2, the personal computer that ran VisiCalc and hence became the first to be popular in business? A lot of small and medium-sized businesses hadn't previously had any sort of computer at all.
Sadly not. It's one of Apple's new Lightning ports. They're technically better because the meaning of the pins is configured by a chip in the cable, allowing it to support arbitrary connectors at the other end, but practically worse than a micro-USB or traditional iPod dock connector as an example of the usual arguments against proprietary ports.
This would appear to be only a problem with that one widget, so it's a highly localised display bug. Given that the previous screw ups have been system wide issues related to comprehension of daylight savings transition days, this does technically count as an improvement...
I'm pretty sure they invented the PDA so that Hungry Horace could better schedule his skiing trips.
The back screen is 3 inches, 4:3 (to include some metadata below the photo and to allow you to shoot 4:3 if you desire) and packs 1.23 megapixels. So by my arithmetic that's a resolution of 1280x960 with real screen dimensions of 2.4x1.8 inches. Or slightly more than 533 pixels per inch.
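For anyone who wants to double-check the sums, here's a quick sketch (Python purely for the arithmetic; the 3-inch, 4:3, 1.23-megapixel figures are the ones quoted above):

```python
import math

# A 4:3 screen forms a 3-4-5 triangle: the diagonal is 5 units
# when the width is 4 units and the height is 3 units.
diagonal_in = 3.0
unit = diagonal_in / math.hypot(4, 3)        # 3 / 5 = 0.6 inches per unit
width_in, height_in = 4 * unit, 3 * unit     # 2.4 x 1.8 inches

pixels_w, pixels_h = 1280, 960
megapixels = pixels_w * pixels_h / 1e6       # 1.2288, i.e. "1.23 megapixels"
ppi = pixels_w / width_in                    # slightly more than 533 pixels per inch
```

So the quoted numbers hang together: 1280x960 on a 2.4x1.8 inch panel really does come out at just over 533 pixels per inch.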
And, to make the point, the photos look like real photos, light emittance versus light reflectance aside.
That's not the same sort of DPI. On a screen 300DPI usually means 300 RGB triplets per inch, in printing it usually means 300 single coloured dots. That's why some people prefer PPI for pixels-per-inch.
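To put a rough number on that difference (illustrative only, no particular device in mind): if each screen pixel is an RGB triplet of three single-colour subpixels, a "300 DPI" screen lays down three times as many single-coloured dots per linear inch as its PPI figure suggests:

```python
screen_ppi = 300                      # full RGB pixels per linear inch
subpixels_per_inch = screen_ppi * 3   # 900 single-colour dots per linear inch
printer_dpi = 300                     # a printer's "300" already counts single-colour dots
```

Hence the preference for PPI when talking about screens: the two "300"s aren't measuring the same thing.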
Given that there's a world market for about five computers, networking hardly feels essential at all.
Your contention is that there's some ill-defined elite of Internet overlords that, behind closed doors, configure their machines to British English, download all the browsers, type 'color' into a text box and if no spelling correction hint occurs then allow the browser to succeed in the market?
You could argue that, but on the other hand we'd not yet have had the smartphone revolution or any particularly usable tablets because it's only ARM's virtual ownership of that market that's made Intel chips even slightly suitable for low-power use.
H-1Bs are fairly easy to transfer; otherwise the problem that Microsoft (and everyone else in the article) identifies is very real. New H-1Bs run out in April or May and come into effect in October; you can expect to have to offer at least a 50% greater salary to anyone you want to hire around October simply because the pool artificially contracts.
Proprietary disk drive, proprietary joystick wiring and it _still_ doesn't do clash-free graphics? Not to mention the wallpaper operating system — we've had menus for years.
Samsung would be foolish to cut Apple off. Apple wouldn't just go home and think 'well, that showed us!', they'd take the same amount of money and give it to a competitor. All Samsung would achieve would be to boost their own competition.
iPhones outsell Macs by about 7:1. Any notion that they sell exclusively to closed-minded sheltered people who buy nothing but Apple is demonstrably false. It's likely that the vast majority of iPhones are used with computers other than Macs, if they're used with computers at all.
It doesn't sound like artificial scarcity because, as a poster noted above, there are several places where it wasn't scarce.
It's also valid to compare the iPhone specifically to the GS3 because Samsung do so in their own advertising. It's only false to draw wider inferences.
If that's true it'll be because Microsoft will supply its OS to anyone that wants to buy it and bundle it onto a phone, exactly like Google. So if WP8 were to start picking up steam then the sales people at Microsoft will be chasing Google's immediate customers, which should manifest itself as a benefit to end customers through market competition and diversity, and lead to WP8 being in a similar price bracket to Android.
Conversely the marketing support for iOS is locked in.
Apple puts live information into its notification area, not merely new messages. So, for example, the topmost thing by default is a five-day graphical weather forecast, which is updated live.
Apple's notification area also lists new information as per its Android inspiration but it goes further. It's not merely posting notifications for weather, stocks, etc.
Apple prefers to put the information that it expects people might want into the notification area — so in there you get weather, stocks, a summary of your inbox, your Twitter activity, etc. Third party apps are restricted to posting text summaries but otherwise put whatever you want in there. So it's not that iOS still doesn't have them, it's that Apple has chosen to implement something else instead.
If you look at the history of widgets elsewhere — banished from Windows, ignored in OS X, the related clutter a driving force behind the average user's preference for search engines over web portals — you can sort of understand that position, even if you're part of the large group of people that don't agree with it.
There are a bunch of new APIs under the cover. None of it is earth shattering and significant parts of it are more to do with offering centralised and properly updated support for things you could have achieved by other means (eg, the Facebook integration, the helping hands for UI state preservation and the new constraints-based layout framework, the latter being no doubt to enable iPhone 5 support) but there's quite a lot there nevertheless.
In a world where Chrome is on version 21, Firefox is on version 15, etc, I don't see anything inherently wrong with incrementing the version number just for significant internal changes, but I think your main point is correct: whether this is of interest to you at all will mainly hinge on what you think of the app updates.
The 3GS version of iOS 6 is an awful experience. Though arguing that you're better off not using it hardly seems to make up for the fact that you don't have a choice.
Microsoft got hit with an antitrust suit for using its monopoly position in one market to distort competition in another. Specifically it used its desktop monopoly to kill Netscape.
Apple doesn't have a monopoly position in the phone market. Apple's inclusion of a different mapping app isn't going to kill Google. Indeed Apple's market share in the maps market is going to remain minuscule compared to that of Google Maps.
For the record, Apple argues that the licence with Google ran out. It probably did but something tells me they didn't try very hard to negotiate an extension.
Inflation helps those in debt because it decreases the burden of debt. You can therefore be confident that when governments are in heavy debt, inflation will increase one way or the other.
It's a double win for homeowners (holders of the most burdensome debt most people will ever experience): not only do their wages go up, making their mortgage repayments look smaller, but the value of their house rises too, so they get the benefit of inflation on the portion of the property they don't yet own as well as the portion they do.
In summary: the effect of an unusual increase in inflation is to redistribute wealth from lenders to borrowers.
So inflation is a short-term pain (wages being a trailing indicator, unlike the price of bread) and becomes a serious problem if it goes on long term (lenders factor it in, debt becomes harder to obtain, the economy slows down) but in the medium term it's good for debtors.
In the US — similarly to most western nations — more than two-thirds of properties are owned by the residents. That means that inflation, if controlled, is a good thing for the majority of people even if it doesn't immediately feel like one.
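The mechanism is easy to sketch with made-up numbers (the £30k salary, £150k mortgage and 5% rate below are illustrative, not from anywhere in particular), assuming wages eventually track inflation while the nominal debt stays fixed:

```python
inflation = 0.05       # 5% annual inflation
wage = 30_000.0        # gross annual salary
debt = 150_000.0       # outstanding mortgage, fixed in nominal terms

burden_before = debt / wage        # 5.0 years of gross salary

for _ in range(5):                 # five years pass
    wage *= 1 + inflation          # wages follow prices, eventually
                                   # the nominal debt does not grow

burden_after = debt / wage         # roughly 3.9 years of gross salary
```

The debt hasn't shrunk by a penny in nominal terms, but it now represents about 3.9 years of salary rather than 5: the lender's claim on the borrower's future labour has quietly been transferred to the borrower.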
Objective-C is Java's primary influence (source: http://cs.gmu.edu/~sean/stuff/java-objc.html); while the syntax is reasonably different, switching from the one to the other isn't very tricky because the design patterns are generally very similar.
In six months I'd expect you could hire a Java developer and get him or her to learn Objective-C and write a pretty good iOS app, or vice versa.
El Reg reported the difference as two times, by the way — http://www.theregister.co.uk/2011/03/15/apple_ios_throttles_web_apps_on_home_screen/page2.html — though I guess the difference will likely have grown since then as Apple optimise the one while ignoring the other.
"I once did a hundred takes and still couldn't say the word 'incorrigible.' Great! NOW I get it! Siri, bring Jessica Tandy back to life." ?
You've never heard anyone exclaim 'GPRS Creepers!'?
In common with most other phones, it's got a browser and maps and a decent camera. So if you get lost walking somewhere, you can find out where you are. If you're sat on the bus then you can check out BBC News or whatever. If you see something interesting you can get a shot as good as a point-and-shoot.
More specialised tasks can also be achieved: if you're frugal you can scan the barcodes of products you're looking at in shops and find out how much they'd cost from Amazon or wherever. If you own a large amount of music you can put it into one of the online lockers and then your phone becomes an MP3 player with ridiculous storage. Subject to your OS and service provider you can use your phone as a portable hotspot for your computer.
I think that, at best, they stuck with the old screen size for too long. It was in the top tier back in 2007 so 'always' is stretching things a little. Making more of the surface area glass than just the front always seemed a bit stupid though — if I accidentally apply excessive force to my device I'd much rather have a dent at the point of impact than a shatter across the entire face. As for the 3:2 aspect ratio? It's the same as 35mm film and closer to the golden rectangle than 16:9 but I guess moving to the industry standard buys some advantages, and surely many more people are watching movies than looking at digitised 35mm images.
Indeed — Apple's assertion that the licence to include YouTube has merely run out sounds likely to me. It's a valuable brand and I can't think of any reason why Google wouldn't want to take back the reins. I really don't think there have been any shenanigans here from either side.
The various agencies have to investigate and find a monopoly abuse. In this case Microsoft went too far to cut Netscape out of the market. One of the things they did to achieve that was the bundling.
So Microsoft used its dominant position in one market (home computers) to distort competition in another (Internet browsers). That's the definition of anti-competitive behaviour. The idea is to prevent a company that's doing well in one market from being able to obtain control of another artificially. Control over a market isn't in itself punishable but the law intends to allow natural market forces to operate.
So there's the fact that when Microsoft acted the market for home computers and the market for internet browsers were considered to be separate. People were paying for Netscape when the behaviour began so that's not entirely unreasonable, though it's a little peculiar to think about now.
There's also the fact that Microsoft used the one monopoly to obtain the other. Were history radically different such that the iPhone ended up with 100% market share then the browser policy still might not be a problem — the product was launched with the explicit feature that third-party browsers (or, at the time, any third-party apps) weren't permitted. That's still the case. So at no point would Apple have used the one monopoly to obtain the other, rather consumers would have judged that the limitation was not a troubling impediment.
As for notepad, paint, etc, they don't matter either because they've always been inherent parts of Windows and/or because they would be considered a natural part of the desktop.
Putting the legal abstractions aside: at no point was there some other company producing far and away the most popular examples of something like notepad or like paint that Microsoft brushed aside through brute force of its desktop position. Conversely there was such a browser manufacturer. Bundling was only one part of what Microsoft were found to have done but it was something the court could realistically attempt to remedy with its existing powers.
The point is to protect the markets and allow consumers to use their free will so that if there is a single dominant player then it gets there on merit. The system's imperfect but all justice systems are and the stink of the various agencies acting against Microsoft is part of what gave Firefox a push.
The rules are in place to prevent distortion of the market that harms the consumer.
Apple are not in a position to distort the market so as to harm the consumer. Microsoft were. That's not just a legal technicality — it happened in real life. Microsoft used its desktop monopoly to kill Netscape. Once Netscape was dead it said 'Oh, IE6, that's good enough isn't it?' and more or less walked away. That left us with IE6 for five full years; it wasn't really standards compliant where it tried to be and it included deliberately proprietary technologies but there was no competition. Supporting IE6 for all that time was a significant cost to the industry. Supporting IE6 is still a significant cost for some businesses. So Microsoft very visibly distorted the market. In doing so it harmed the consumer.
Microsoft is hence required to include a browser selection screen to reduce its ability to leverage its desktop share to distort the market for browsers and to create a more competitive market. The more competitive market benefits the consumer.
Conversely, Apple has never distorted the market. If it were to release a barely compliant browser with proprietary standards then people would simply ignore that browser and possibly that device. It doesn't have a hegemony such as to be able to dictate standards for web pages, and it never will.
To be fair to Microsoft, the company has jumped quite enthusiastically aboard the standards train now and I can see no reason to believe they'd repeat the errors of the 1990s. Having a proper, standards-compliant browser engine gives them the target portability they need to compete as the set of target devices becomes more complicated than 'Intel, on the desktop'.
I think it's just a small step beyond the normal connected TV concept. Instead of getting just YouTube, iPlayer and whatever else the manufacturer signed up and included you get Android as the standard base on which to install any media access software you want — whether the media is the traditional video and audio or something like your word processor in the cloud.
Or if you prefer it's like any TV with a browser but plus whatever the same advantages phones get from supporting apps as well as just the browser.
In defence of the original poster, I took his post more in the sense of 'I wish more manufacturers had a non-crapware option' rather than being specifically about Android or about advertising, since that seems to be an Amazon-specific problem.
People like you and me may know that the first thing to do upon purchase of a new PC is to uninstall the 30-day Norton trials, vendor-specific browser extensions, etc, but if I could pay £15 to shift responsibility for that task for people like my parents from me to the manufacturer then I'd do so.
There's as much evidence for that statement as there is for this: the pointlessly angry and the clueless typically dislike Apple products as they mistakenly believe that doing so is a status symbol rather than a totem of their lack of intellect. They need to get a life and buy a clue.
In reality both statements are false.
Yuppies are as likely to buy Androids as iPhones because they can waste their money upgrading to the new flagship every few months rather than every year. Contrast with normal consumers who probably upgrade every couple of years or so, whenever their contracts make it economically sensible to do so.
The clueless are equally likely to like and dislike everything because they pick at random.
The pointlessly angry just want an excuse to shout; they'll dislike everything that comes along.
Either materials science doesn't exist or you have a point.
I'm one of those people that has dropped an iPhone down the side of a boulder (albeit only a couple of metres; being no great athlete I was merely scrambling) without any ill effects. I seriously expected the back to be smashed, or at least scratched, since it's glass and I'd dropped it onto rock.