Re: Icons (@AC)
At least that would make a change from the usual Zynga/EA/Disney/everyone approach of microtransaction-enabled Skinner Boxes masquerading as games?
I currently own an 11" MacBook Air. I don't see that it could be made any less repairable. I don't think a new floor will be forthcoming because I don't think one can be found.
Though the same comment goes for equivalently-priced TVs, etc...
Such a new machine would demonstrate the concept of diminishing returns in two senses.
http://www.techweekeurope.co.uk/news/apple-samsung-patent-truce-150286 : “Samsung and Apple have agreed to drop all litigation between the two companies outside the United States,” Samsung told TechWeekEurope.
http://www.theguardian.com/technology/2014/aug/06/apple-samsung-drop-patent-lawsuits-outside-usa : "Samsung and Apple have agreed to drop all litigation between the two companies outside the United States," the South Korean company said in a statement.
http://www.pcworld.com/article/2461940/apple-samsung-agree-to-settle-patent-disputes-outside-us.html : "Samsung and Apple have agreed to drop all litigation between the two companies outside the United States,” Samsung said in an emailed statement Wednesday.
http://gadgets.ndtv.com/mobiles/news/samsung-apple-agree-to-drop-patent-disputes-outside-the-us-571426?site=classic : "Samsung and Apple have agreed to drop all litigation between the two companies outside the United States," Samsung said in a statement.
The international lawsuits were dropped by mutual consent — both Apple and Samsung agreed — and this specific story is about an American judge disagreeing with Apple and siding with Samsung.
The B+ was the first BBC to route the write line to the paged ROMs, wasn't it? So you'd also be unable to use sideways E00 solutions on a regular B, at least without minor motherboard modifications (i.e. finding a write line anywhere and adding a patch wire).
If I dare make a possibly ill-educated comment: surely the issue with moving to a new SoC would be that the existing Broadcom is an ARMv6 device whereas anything newer is probably ARMv7. Which would make maintaining binary distributions of anything more troublesome — hardly an impossible problem, but a bit more awkward. In the intended environment, it'd at least mean remembering which pile of SD cards goes with which type of board.
As to price, I imagine there may be a calculation that with ARMv8 now filtering into mobile phones, there's about to be a whole generation of ARMv7 parts that remain incredibly capable but are suddenly a lot easier to get a good deal on just because of the realities of marketing.
There's also the fact that Broadcom has provided full documentation for the VideoCore IV, which isn't standard industry practice — GPU internals otherwise remain almost entirely proprietary — and would be very unlikely to happen with a new SoC. So that would introduce a new round of binary blobs and impediments for the bare-metal programmers.
It's not just X. From the OpenGL specification (every single version that I checked, including 1.x and 4.x):
The model for interpretation of GL commands is client-server. That is, a program (the client) issues commands, and these commands are interpreted and processed by the GL (the server). The server may or may not operate on the same computer as the client. In this sense, the GL is “network-transparent.”
The difference being that only one of those is a chore?
Yes, I know. That US English doesn't require a hyphen, and that there's no symbol for separation — an anti-hyphen of some sort — is where the amusing ambiguity springs from. So it can, correctly, be read both ways.
Were it British English that would not be the case. Were it hyphenated it would mean the wrong thing.
The empirical evidence is that not everyone finds the second meaning humorous but it's a valid parsing regardless. The further evidence is that despite my comment about ambiguity being entirely clear to my eyes, you somehow took it to mean the exact opposite. Hopefully someone can at least enjoy the irony of my failure there.
I don't know about everybody else but internet journalism has really ruined cancer for me. I haven't been able to think about it in the same light for a while.
Now, if you'll excuse me, I must learn about the ten alkaline batteries to try before you die.
(EDIT: yes, pedants, I know, but compound adjectives aren't always hyphenated in US English — even formal US English — and the author is based in the San Francisco office)
This is your area of expertise though. Would you be confident enough to claim you've never done anything that an expert in some other topic wouldn't consider equivalently dense in the handling of some other area of your life?
@Gene Cash: Swipe up from the bottom of the screen to see the view that allows you to control Bluetooth and Wifi state (and aeroplane mode, screen brightness and a bunch of other things that aren't directly relevant to battery life).
As it isn't called a 'widget' that obviously means that the iPhone is some sort of failure as a device.
@Mike Bell: the really annoying thing is that if you have Wifi switched off then every location-aware program will show a modal pop-up insisting that it isn't going to be able to do a very good job because it has only GPS to use. Every time you launch them. But that's an interface failing, not a battery-life issue.
Some things, like Bluetooth, can be turned off without affecting a large proportion of users. Naturally it turns itself on again with every software update. I half-wonder whether it's the desire to persuade somebody, somewhere that iBeacons are a good idea: the better the stats for people walking around with Bluetooth on, the more relevant they sound, right? They're otherwise about as popular as Ping.
If we're talking about just the transition period then there's also the retroactive interference issue: a responsible person who is otherwise a very good driver recently visited us here in the US and was pulled over for driving in the evening without his headlights on. As you've guessed, it's because his car at home is automated but his rental wasn't.
He'd even complained just a few minutes earlier that he was having difficulty seeing anything, without the penny dropping...
Chromebooks have a lot of uses. Amongst the peers that we, commenters on a tech news site, discuss tech purchases with, they're almost certainly bought for use as something other than a Chromebook.
One anecdote: I know a teacher who has recently kitted out her entire classroom with them. All her students need is access to certain web-based educational resources. Chromebooks are the easiest-to-administer, cheapest-to-buy-safely* reasonably large thing with a keyboard that can do that and doesn't end up tied to any particular desk.
(* Android laptops seeming to require a gamble with the supplier, being overwhelmingly obscure off-brand imports)
Ye standard Aston Martin retort: nobody has to make petrol specifically for Aston Martins. They just work on the same petrol as the £10k cars. Somebody has to make apps specifically for iOS. iPhones/etc don't just work on the same apps as sub-£100 mobile phones.
Which isn't a perfect analogy, because petrol is somewhat more fundamental to cars than apps are to mobile phones, given that phones are probably used just as much for the web, for photography and for texting. But apps are definitely used more often than petrol is put into a car, etc, so forgive me?
That being said, I'll consider Apple to be in serious trouble when the market stops supplying apps for iOS as abundantly as it does for Android. If, say, the iOS Facebook app (no, I don't use it; yes, it's the most popular app) started lagging the Android version by a year or two then I'd be likely to think that Mac-level relevance was pending.
I am to earthquakes as many are to silicon valley "innovation".
Video games are older than 40; AMD is 55; AMD's first forays into the IBM PC world were 32 years ago; its first reverse-engineered CPU is now 23; its first fully internally designed CPU is 18. Meanwhile ATI won't be 30 until next year, having been founded in 1985.
If they're just doing ATI a little early, do this year's products justify that? I feel like I'm being thick and failing to think of something.
Salaries here being absurd and iPhones being abundant.
One of the more surprising things is that car (/auto) insurance seems to be a lot cheaper here in the US _despite_ also needing to include the medical bills of anyone you hit per their lack of nationalised healthcare. I doubt that $5m of insurance will cost more than a few hundred dollars a month, especially if these cars really are much less likely to have accidents.
Replying to myself on the reliability issue, having tried to source more objective numbers...
Per SquareTrade — http://www.squaretrade.com/htm/pdf/SquareTrade_laptop_reliability_1109.pdf — an insurance company that you might expect to want to amplify the total numbers but that probably has no reason to be disingenuous about the manufacturer spread, 17.4% of Apple laptops fail within the first three years. The industry best was Asus at 15.6%; the worst was HP at 25.6%.
So the worst big supplier produces machines that are about 47% more likely to break than Apple's, and 64% more likely than the best's. Apple's are only about 12% more likely to break than the best's.
Apple is also beaten by Toshiba and Sony but it's objectively amongst the good build quality guys.
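The relative-risk comparisons are just simple arithmetic on the SquareTrade percentages. A quick sketch (Python used purely for illustration; the three rates come from the PDF linked above):

```python
# SquareTrade 3-year laptop malfunction rates, in percent, from the linked PDF.
rates = {"Asus": 15.6, "Apple": 17.4, "HP": 25.6}

def relative_risk(a, b):
    """How much more likely brand a's laptops are to fail than brand b's (%)."""
    return round((rates[a] / rates[b] - 1) * 100)

print(relative_risk("HP", "Apple"))   # worst big supplier vs Apple: 47
print(relative_risk("HP", "Asus"))    # worst vs best: 64
print(relative_risk("Apple", "Asus")) # Apple vs best: 12
```

Note that these are ratios of failure rates, not differences in percentage points (HP is 8.2 points above Apple, which is a 47% higher failure rate).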
For certain niche professions, it's a huge benefit that Thunderbolt exposes full-speed PCIe to the outside world and lets you add arbitrary expansion devices when docked. E.g. Sonnet sells a box that lets you attach a regular PCIe card. I'm told it's quite the thing for pro audio.
For the rest of us? The build quality is usually good*, the battery life usually superb.
* no, your three anecdotes plus the times it has actually been newsworthy because the build quality wasn't good don't disprove that assertion. Though my anecdote doesn't exactly prove it either...
It shows up the first time you pair the mouse with the computer. Automatically. No interaction required.
Which, yes, is like those annoying applications that force you through a tutorial because they're insufficiently easy to figure out.
Find the power button by using your eyes, probably only once. No need to read the manual any more than you had to read the manual to figure out where to plug the monitor into your desktop PC.
Ports round the back are an awful idea for USB-like things that are likely to be plugged in and unplugged frequently (you know, versus things like monitors). I feel like Apple has stumbled into the problem — when the ports first headed round the back, the keyboards were still wired and had USB ports directly on them. So no need to touch the main computer at all. Then the wireless keyboard became the default and the fumbling began. Never mind the risk of scratching the machine with the USB device. More design thought required, I think. The sides would have been a good solution but Apple has collapsed those to being about 5mm wide so that option's out.
Clicking on text:
• single click places the cursor;
• double click selects a word;
• triple click selects a paragraph.
If you don't know about triple click then you can stick with manually locating paragraph bounds and clicking and dragging over the whole thing, in much the same way that, whether a Windows or Mac user, you can use File->Save instead of pressing [Control/Command]+S. It's just less convenient.
... to help relieve the RSI that's inevitable from using an Apple mouse.
The 'Magic' Trackpad is pure genius, however.
Would that be the same Apple that was first to ship a computer completely free of legacy ports, creating the first third-party market for USB peripherals?
I thought it was more to fire something trivial off, then disappear for 5–10 seconds?
The device has a feature called 'factory reset' that doesn't reset it to a factory state. That's different from having a feature called 'format the hard drive' which establishes the correct formatting on a hard drive. The first feature doesn't do what it promises, the second does what it promises but is sometimes falsely assumed to do something else as well.
That being said, it sounds like an easy bug to fix. A quick pop-up explaining that, if the purpose is to remove confidential information, a full erase should be performed (taking X minutes rather than Y seconds), plus a couple of buttons, would do it. It's such a fringe feature that it's probably not worth investing more time in than that.
I've enjoyed this being the year of the S5 and the 5s.
Indeed. Had the author alleged a strong correlation between use of that file format and intellectual property theft then he'd have been closer to making the point he intended, I think.
I saw the headline of his death and assumed a heart attack or something. I read the article, which said suicide, and immediately thought "oh, so there was nothing wrong with him". Only then did I consciously correct myself.
So I'm thoroughly of the opinion that mental health needs to be taken seriously but apparently still catch even myself not doing so. It's deeply sad that Robin Williams was sick and that his sickness killed him; I hope regressive instincts similar to mine didn't contribute and can be overcome more widely.
Let's not forget the Motorola ROKR, no matter how much Motorola and Apple might like us to. There were quite a few MP3 apps for Series 60 phones too, so you could rehabilitate your old Nokia 7650 if you really wanted. It's a slider though, so I guess something a little later like the 6600 would be more relevant?
Internet Explorer was the default bundled browser with all Macs from 1998 until 2003 or so. It was branded as version 5 but by the time it jumped to OS X had diverged completely from the Windows branch and, at the time, was the most standards-compliant browser (no, really — look up the Tasman layout engine).
I guess its development was tied to the patent cross-licensing deal; in any case Apple seemed to be aware that the agreement wouldn't last forever as Safari was conveniently ready in the wings.
Speaking professionally, even as late as 2009 I'd get extremely lazy web developers trying to justify leaving their half-decade-old IE6 code alone despite the company's significant move into iOS development (and, therefore, Mac deployment) because "there's an IE for Mac, right?"
That article doesn't mention OpenGL even once — your conclusion is akin to citing an article that states that Ebola is quite deadly as evidence that it is more deadly than the bubonic plague — and the 0.2% of people that care about "high end gaming hardware" are unlikely to play games in their browser.
Depends how you define peak Apple. In terms of peak influence, yeah, we're past that. But iPhone sales continue to go up year-on-year. The broadening of the market just makes Apple an increasingly less singularly notable player.
Yes, that's how Net Applications classifies browser users. Users who "have to be thankful for what they get" get counted as mobile.
Also: users who recently lost a shoe get counted as desktop, users who confuse effect and affect get counted as desktop, users who daydream a little too much get counted as mobile.
The tragedy of Microsoft, from its point of view anyway, is that there are about a hundred different things it could have done to own the mobile market and it succeeded at none of them.
What's certain is that it was a losing strategy to recreate the Windows desktop at 256x160 and expect people to poke at scrollbars barely a couple of millimetres wide with a stylus, then respond to the launch of the iPhone with laughter and disdain for it as a business product — i.e. if you think a large touchscreen phone would be good for any part of your business then you're not the sort of company Microsoft is willing to treat seriously. Never mind waiting until long after Android had hit its stride finally to release a usable touch interface, then hobbling any potential uptake by applying a bunch of strict criteria about required buttons, meaning that if you're Samsung or whoever you can't just decide whether to put Windows or Android on at the final stage of manufacture.
The market is now Google's to lose.
True, but it does mean that your web page is more likely to be viewed by an Android user than an iOS user. Plan appropriately.
Umm, we don't know that because that isn't true. Chrome, Opera, etc, were counted towards the Android total. It's 44.83% Safari, 21.86% Android Browser; adding in the other options is how Android gets to number one.
In future maybe don't just make things up?
The quotes were meant to show that even the best-selling Apple-related press doesn't blindly do "Wow! Apple's revolutionised everything, again! That's the third time this year!" but I actually think the original iPhone was a fantastic deal — don't forget to factor in the unlimited data versus the 18p/megabyte sort of charging that was common at the time. Though you'll recall that Symbian kept the vast majority of the market more or less up until Android came of age, so I don't think it would be accurate to say that the majority of people were sold.
It probably wasn't published in your version of reality but from MacWorld's contemporaneous review of the original iPhone — http://www.macworld.com/article/1058733/iphone_rev.html :
"... the volume buttons are located a bit too close to the switch, and on several occasions I found myself pushing the switch (which won’t budge) in a vain attempt to boost the iPhone’s volume."
"It’s a standard 3.5-millimeter jack—the very same sort used on the iPod—but because it’s recessed many third-party headphones won’t fit, especially if they’ve got a large plug or one that turns at a 90-degree angle. It’s too bad that a clunky add-on accessory will be necessary for aficionados of high-quality headphones"
"On the iPhone’s back face is the tiny lens of its compact, two-megapixel camera. It doesn’t zoom and doesn’t work well in low light"
"... the lack of [copy and paste] to transfer text from one place to another can generally hamper interaction between different iPhone programs."
"The iPhone also lacks a quick-dial feature that you’ll find on many other phones"
"Moreover, the iPhone doesn’t filter mail, nor does it have any built-in spam catcher. That means if you’re relying on a client-side filtering program ... you’ll be stunned at the amount of spam you’ll see on your iPhone."
"Steve Jobs has promoted the Web-browsing experience on the iPhone as one that brings you the “real Internet” ... Apple has brought the iPhone most of the way toward that goal, but it still falls a few notable steps short. ... there are a few limitations that prevent Safari on iPhone from truly showing the real Internet. The biggest is the fact that perhaps the most common browser plug-in in existence, Adobe’s Flash, is nowhere to be found."
"The bad news is that Text can’t send MMS messages [...] Because of this limitation, you can’t send a picture you snap with the iPhone’s camera to another phone via Text. [...] What’s worse, the iPhone has no support for any Internet-based instant-messaging network. "
"... the Notes program is fairly useless. ... And not to get too font-nerdy on you, but the Marker Felt font used in Notes is extremely ugly and, sadly, can’t be changed."
"The only thing missing from the Maps equation is that the iPhone doesn’t know where it is. Not via built-in GPS (it has none), nor by triangulating signal strengths from nearby cellular phone towers."
"I’m aware of numerous complaints from iPhone buyers ... about long, drawn-out issues with activating their phones. Still others have complained about poor customer service on the part of Apple and AT&T during the product’s first days of existence."
You're always prompted and asked whether you want to install a new version of the OS. You opt in, not out.
As a rule of thumb, iOS gets slower with the major releases, faster with the minors. Big performance degradations seem to have followed the release of iOS 5 (which significantly rejigged the Objective-C runtime, to add self-zeroing weak references amongst other things) and iOS 7 (all the rendering changes), with iOS 4's addition of blocks probably adding up to a bit of degradation over time as developers switched to Grand Central Dispatch perhaps a little more quickly than Apple was able to optimise it. Empirically, from optimising via the profiling tools, there were definitely huge changes between 4 and 5 in the logic applied by the OS to thread pooling.
The iPad 1 I still have knocking around on iOS 5 is much, much slower than the iPad 1 kicking around the office with iOS 4.
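For anyone unfamiliar with it, Grand Central Dispatch is essentially queue-based dispatch of work items onto a shared thread pool. GCD itself is a C API (dispatch_async onto serial or concurrent queues), so this is only a loose Python analogue of the pattern, not how iOS implements it:

```python
from concurrent.futures import ThreadPoolExecutor

# Rough analogue of a 'concurrent queue': items run on a shared pool of workers.
concurrent_pool = ThreadPoolExecutor(max_workers=4)

# Rough analogue of a 'serial queue': one worker, so items run strictly
# in submission order, one at a time.
serial = ThreadPoolExecutor(max_workers=1)

results = []
futures = [serial.submit(results.append, i) for i in range(5)]
for f in futures:
    f.result()  # block until the item has run, loosely like dispatch_sync

# With a single worker draining a FIFO queue, submission order is preserved:
# results is [0, 1, 2, 3, 4].
```

The performance point in the post is that the cost of this convenience lands on the runtime's thread-pooling logic, which is exactly the sort of thing Apple kept retuning between iOS releases.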
That scam will be released as soon as Apple starts allowing third parties to post EFI updates via its software update channel.
It can be downloaded direct from the web but either way you need to supply the administrator password to install.
Yesterday I received three 2011 MacBook Airs, having seen them available for a bargain price and decided to help out a charity. I'm not a support person so having multiple of the same computer in my control has never happened before. I therefore may not have proceeded in the most intelligent fashion, criticise as you must.
I decided to run all available software updates on all three, which were otherwise seemingly identically configured — each completely clean with the SSD given a volume name of 'Mavericks 10.9.2', implying a recent wipe and install though I doubt that will have had any effect on the firmware.
All three have now been updated, including to EFI 2.9.1, and work perfectly.
The only oddity was that all of them kept presenting the update as available even after it had been installed. In all three cases I installed the update last night and then again this morning, with software update now finally silenced. In one case I attempted to install twice last night, with the second go downloading and rebooting but not attempting to install anything. In the other two cases I stopped after the first try last night.
So installation still seems to have some issues, but only so as to create extremely minor inconveniences — nothing bricked — and may just have been some sort of propagation problem as 2.9.1 replaces 2.9.
... because there are no other iOS devices. I think the market share is going to continue to contract but not precipitously. If the Mac still makes money for them, the iPhone probably always will.
Conversely, Android manufacturers are fungible, sufficient hardware only gets cheaper and Samsung continues to target Apple in its advertising, providing a textbook answer to what happens if you can't stop fighting the last war. But with its manufacturing and component expertise I doubt it'll wilt quite as quickly as did its predecessors, Motorola and HTC.
Greater diversity is arguably the sign of a more mature market so the upside is: we all win.
My contactless card is a Visa Debit. Not credit. I didn't ask for it; NatWest just sent it when the preceding debit card expired. I use it on the bus infrequently, partly because I don't live or work in London any more but, given that I often spend an entire week or two there, primarily because it doesn't aggregate with my Oyster. The Oyster caps you at the travelcard cost; if I were to use the Oyster on the tube and then the debit card on the bus, I'd quite likely end up paying more.
If the option becomes available to consolidate my Oyster entirely into my card then I will do so for the convenience. Also, in London contactless seems to be reasonably popular for things like sandwiches and beers. Anything where there's a benefit to the retailer in having the peak queue move more quickly and corresponding social pressure on individuals to pay and get out of the way.
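The capping arithmetic behind that is easy to sketch. The fares and the cap below are made-up illustrative numbers, not real TfL tariffs; the point is only that journeys charged to a second card escape the first card's daily cap:

```python
def day_total(fares, cap):
    """Capped daily total when every journey is charged to the same card."""
    return min(sum(fares), cap)

# Hypothetical day: three tube journeys at 2.80 and two bus hops at 1.50,
# with a daily cap of 8.40 (illustrative figures only).
tube, bus = [2.80] * 3, [1.50] * 2
cap = 8.40

one_card = day_total(tube + bus, cap)        # everything aggregates; the cap applies
two_cards = day_total(tube, cap) + sum(bus)  # bus fares on the other card miss the cap

# In this example, splitting across two cards costs 3.00 more.
```

Which is why consolidating everything onto one card (or one Oyster) is the rational move as soon as it's offered.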
My main problem with the Samsung advert is that Samsung is the world's most successful phone manufacturer by volume (amongst other things). So what the advert appears to be saying is:
You know those people who didn't buy the same kind of phone as the overwhelming majority of people? Who stick with a particular kind despite showing a conscious awareness that it severely limits their hardware options? Well, they're just sheep.
A PlayStation 2 does exactly everything I want from a video game console. That doesn't mean I think that people with anything from the Xbox 360 or Wii onwards are implicitly facile.