Re: As a Moto G user
A PlayStation 2 does exactly everything I want from a video game console. That doesn't mean I think that people with anything from the Xbox 360 or Wii onwards are implicitly facile.
2281 posts • joined 18 Jun 2009
I was once made fun of by a girl for wearing brightly-coloured trainers. Later I saw a different girl praising brightly-coloured trainers. This proves that all women are hypocrites and that the lot of them owe me an apology.
Don't even get me started on Google software. Of the current Android development stack, Android Studio uses "significant energy" even when left completely untouched for a prolonged period; the Android device emulator (with Intel HAXM, admittedly) does not, even when in use.
So: text editing is apparently very costly, but running a whole other virtualised OS is quite battery efficient.
Apple's implementation of timer coalescing relies on the application switching from saying "I want a timer that fires every X milliseconds" to "I want ... plus or minus Y milliseconds". Applications that don't specify a tolerance continue to receive the old behaviour. Knowing Apple, that's likely only a transitional move, but it is in effect as of v10.9.
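For illustration only, a rough sketch of the idea (not Apple's actual scheduler): a tolerance turns each deadline into a window, and overlapping windows can share a single wake-up.

```python
def coalesce(timers):
    """Greedily batch timer deadlines into shared wake-ups.

    timers: list of (deadline, tolerance) pairs; each timer may fire
    anywhere in [deadline, deadline + tolerance]. Waking at the end of
    the earliest-closing window serves every timer whose window that
    wake-up falls inside, with a single wake.
    """
    wakeups = []
    for deadline, tol in sorted(timers, key=lambda t: t[0] + t[1]):
        if wakeups and deadline <= wakeups[-1]:
            continue  # the last scheduled wake-up already falls in this window
        wakeups.append(deadline + tol)
    return wakeups

# Three timers due at 100ms, 103ms and 107ms, each with 10ms of
# tolerance, collapse into one wake-up at 110ms:
print(coalesce([(100, 10), (103, 10), (107, 10)]))  # [110]
```

A timer that specifies zero tolerance keeps its own exact wake-up, which matches the old behaviour described above.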
Incidentally, an anecdotal observation: an application uses significant battery power by waking often, regardless of the amount of work done per wake. An application I wrote woke regularly 50 times a second but used barely 3% CPU time — it was an emulator, doing the most obvious thing. It got into the battery hall of shame. After switching to an adaptive timer, it stops being named as a problem only once wakes drop below about 10 times a second. Of course, that's empirical without being particularly scientific, and I've quite possibly made a dumb mistake elsewhere, etc, etc.
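As a rough sketch of the adaptive approach, with made-up numbers rather than the app's real thresholds: stretch the interval while nothing is happening, and snap back the moment there is work.

```python
class AdaptiveTimer:
    """Back off the wake-up rate while idle; snap back when busy.

    Illustrative numbers only: 50 wakes per second when there is work,
    backing off geometrically to stay under ~10 wakes per second idle.
    """
    BUSY_INTERVAL = 1 / 50  # 50 wake-ups per second while working
    IDLE_INTERVAL = 1 / 8   # comfortably below ~10 per second at rest

    def __init__(self):
        self.interval = self.BUSY_INTERVAL

    def next_interval(self, did_work):
        if did_work:
            self.interval = self.BUSY_INTERVAL  # snap back when busy
        else:
            # double the interval, capped at the idle ceiling
            self.interval = min(self.interval * 2, self.IDLE_INTERVAL)
        return self.interval

timer = AdaptiveTimer()
print([timer.next_interval(False) for _ in range(3)])  # [0.04, 0.08, 0.125]
print(timer.next_interval(True))                       # 0.02
```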
The Z10 can be had unlocked in the US for $210.72 on Amazon which would put it in the league of this article (as £150 is $256.30) except that in the UK Amazon wants... £187.49.
They've got form: see 2012's FTC settlement over the exploit of a Safari bug — though the fine was more about promising clearly and directly that tracking wouldn't occur, then exploiting a browser bug to track regardless. So it was a false advertising issue more than anything.
Project Zero would presumably just have had a quiet word with Apple.
... it'll still know your broad location, though, as it will still know which cell towers it is near. That being said, the cell towers know when you're near them, so if a government wanted to track a mobile phone user in slightly broad terms, it could do so regardless of the handset.
The article claims that Apple's "year-on-year growth shrank by 1.7 per cent". You can't turn a positive into a negative by shrinking it by a percentage; growth that shrinks is still growth, just less of it. Therefore Apple continued to grow.
It subsequently claims that "Worldwide PC [saw] a year-on-year decline of 1.7 per cent." which appears to state that the market as a whole declined.
Given that Apple's relative share within the US declined, as you observe, and assuming El Reg has subedited properly, I guess what we have to conclude is:
• Apple grew;
• the US market grew more quickly;
• the worldwide market declined.
The real absurdity is that an article that should be about Lenovo's incredible growth seems to want to focus on a point here or there for the fourth-place supplier.
Apple's profits are likely still the best or second best in the industry but this survey is about what consumers are buying, not about whether any particular company is about to have to shutter up. So it's useful to people like software companies. Even if Apple were more profitable than every other manufacturer added together, if they had only 0.1% market share then where would you focus your development resources?
Apple did poorly against Lenovo because Lenovo did spectacularly well. Nevertheless, Apple saw growth while the market as a whole declined?
Of course, I wouldn't exactly have described the PC market as "Apple's lunch" in the first place. Compared to shipments of other types of device and of other suppliers I'd have called it maybe "that small snack somewhere around 4PM that Apple eats a very small piece of".
It's the complete absence of PCPs in my area that they cite. So there's no trunk level at which it is economical to install fibre. They'd need either to install PCPs or add fibre direct from the exchange to every individual building. The local fibre company that has jumped on that opportunity charges £20k/building for installation even after they've got sufficient interest in subsequent subscription.
Sadly the phone reception is also quite dodgy in my specific flat (though I don't think that's endemic, just bad luck on my part) so LTE-or-whatever isn't much of a fix, and the local MP has been trying to obtain funds but is a Lib Dem, so I suspect BT may just be running out the clock on any effort he's creating for them. Meanwhile Boris has earmarked his funds for Wi-Fi on the DLR, so local government doesn't have the resources.
... and, as I said, I can therefore only imagine what service people in the countryside get, probably with just as many logistical problems but most of them so specific that there's not even an MP on their side to ignore.
Ditto around the place I bought. In London's Zone 2. Apparently because it's the Docklands, with residential properties developed in a very ad hoc fashion, overwhelmingly since the privatisation of BT, they've just sort of thrown the wiring together and don't have the money to fix it. Most properties are copper all the way to the exchange, a few kilometres as the crow flies so I can only speculate what it is in cable length.
I can only imagine the problems in less-dense areas.
You've obviously missed the memo from Silicon Valley. What coders do nowadays is (i) innovate; (ii) disrupt.
Working calmly and rationally on solid products isn't part of the plan.
I think tablets are a much preferable option to laptops for a large number of people. I think that any deceleration is, as the poster suggests, saturation. Tablets are now on a refresh cycle for people that want them.
You can't soundly conclude either (i) that any particular proportion of sales were people buying into the hype (though almost certainly some will have been); or (ii) that laptop sales will climb.
... didn't it once materialise around itself?
On the contrary, it looks like you've assumed I don't have a clue and are therefore suffering confirmation bias whenever you read my comments. Try to be less prejudiced.
The specific issue being discussed is whether "as new cameras come out it's unlikely they'll get any love from Apple". i.e. if Apple never writes another line of code for Aperture, will it be able to import RAW images from newer cameras?
There's little industry consensus. The sensors companies use are almost entirely orthogonal to the file formats they use. You can find the same sensor in cameras from three different companies but to deal with processing the results, you'll have to handle three different RAW file formats. Manufacturers are also very inconsistent, constantly changing formats seemingly arbitrarily. That's why there's already something like 150 of them.
99.99% of them, across the entire range of cameras, utilise Bayer RGGB filters. That will likely be true for a long time to come. It's not great for entropy but there it is.
So actually the thing that would likely determine whether Aperture keeps working is exactly _whether it can get the sensor information from the files_. I'm saying: it will keep working because the stuff of getting the raw sensor information from the files isn't built into Aperture. It's built into the OS.
I guess a rough analogue is dealing with printers back in the 1980s.
Everyone here is fully aware that there's complicated, often proprietary mathematics behind getting to a vanilla RGB from raw sensor information, that companies like Adobe have spent a lot on doing that really well and have accumulated a lot of value in doing so.
That's a completely unrelated issue.
There's no technical reason. Apple's RAW API provides the raw sensor information if you want it. It will demosaic only if asked.
Only if you were falsely to assume that it returns RGB pixels would you think there was a problem. That's not the case, I don't believe that's the case and I didn't state or imply that was the case.
There's no reason those programs couldn't use the built-in support other than that Adobe rarely uses OS-native anything. I guess they'd argue cross-platform consistency. They managed to support the Fuji non-Bayer filter a lot more quickly than Apple so it's not necessarily a bad thing. A bit more bloat, arguably, but creating a more competitive market.
But all I meant was that, relevantly to the original poster: (i) Aperture uses it, so it will continue to support new cameras for as long as it remains compatible with the OS; and (ii) any future photography app from Apple will use it, so it'll continue to be updated.
That's not really how it works under OS X — RAW image decoding is a service the OS provides, just as it can open PNGs, JPGs, etc. As long as Apple provides any photography application at all they'll keep updating the RAW support.
That said, it took something like a year to get support for my non-Bayer X-Pro1, which is 'semi-pro' at best, so I can't completely endorse Apple's approach.
I have a Sputnik* which takes stereoscopic photos onto 120 film, i.e. images are captured at an absurdly high definition of 60mm × 60mm per lens, at the cost of only six fitting on a roll of film. On the plus side it's the same format used by hipsters in their Holgas, so film, development and printing are still widely available; also, suitable reels are available for most developing tubs so you can cut out the middleman. It's even easier to develop at home than 35mm because there is no cartridge. It's literally just a roll of film.
Good luck with 110 nowadays. I'm sure someone can handle it but it's going to cost.
* http://camerapedia.wikia.com/wiki/Sputnik — named before and independently of the satellite as the word literally means a thing that goes with a traveller, I think.
Did anybody else find that shoehorning a little absurd?
I wouldn't agree with that.
The SCOTUS found that Aereo is covered by the same rules as any other cable operator. They took a step back and said "regardless of splitting technical hairs and the exact way that individual subscribers' money is spent, the service you provide is other people's copyrighted works delivered via a cable".
They then concluded that therefore the rules that should apply are those that were made by the body the public specifically elects to make laws.
So that's just (i) taking a common-sense approach; and (ii) deferring to democracy.
The result that Aereo would have to license the content it streams is a direct, intended result of the laws that Congress has enacted. It's not the court bending over backwards for big business.
I don't think they would because that wouldn't be a performance to the public. Unless you gave the login to all your friends.
From reading the judgment, the court was of the opinion that:
(i) it had found cable television redistribution systems that used centralised aerials legal;
(ii) Congress had then legislated the 1976 act that made them 'illegal' (i.e. ensured they had to pay to repipe content);
(iii) the language used by the 1976 act also covers Aereo.
Primarily it seemed to hinge on the idea that since 1976 the mere act of transmitting a performance from point A to point B is itself a performance, and the test for whether a performance is public hinges on how widely the work is being distributed and the relationship between the various people that receive it, not on the specifics of each specific connection.
They may be individually recorded streams but they're all of the same work and the receivers don't know each other. So it's one work being performed to the public, and Aereo does not possess the rights to do so.
So my understanding would be that if you put your own TV on the web for you to watch you'd be fine because you're not performing to the public. If you invite your neighbour over then you're not performing it at all.
The full judgment is here: http://www.supremecourt.gov/opinions/13pdf/13-461_l537.pdf ; the summary is just the first four pages — then the lead judgment and the dissent are quite a bit longer.
In England and Wales at least, an advertisement is an invitation to treat; if a customer likes the offered terms then he makes the seller an offer; if the seller accepts the offer then that's a contract. E.g. that's why the well-known wisdom is that if you see something advertised at what's obviously the wrong price then you can bind the seller to that price only if they confirm the order — that's the point at which they accept your offer and a contract is formed.
The main relevant constraints are those terms that are implied and which may not be overridden (that's the Sale of Goods Act-type stuff about products being fit for purpose for a certain period, etc). There are also caveats about "mere puff" — statements that are clearly subjective and not intended to be enforceable, like one cafe's being the best cup of coffee in London or a particular make of car making you more popular at work.
With the Pixel it sounds like the terms suggested in the advertisement were specific and intended to be enforceable, and the contracts formed between Verizon and the consumers were exactly on those terms.
Even though a lot of the specifics are post-1776, US and British law is similar enough that it's the only area I can think of where a US case is commonly cited in British courts as being sufficiently informative as almost to be treatable as a precedent — Shuey v US on the extent to which the revocation of a unilateral contract needs to be communicated (though, oddly, I think British people tend to take it to be that the revocation must have the same notoriety but US people tend to think it must go through the same channels, so the conclusions drawn are not quite the same).
Consumers saw a particular product advertised and paid for it. Verizon then unilaterally decided they weren't going to supply it. In which country would you not use the law when a major corporation decides just not to do what it has accepted money for?
Click the Apple in the top left of the screen. Click System Preferences... . Click Users & Groups. Click the padlock in the lower left where it says 'Click the lock to make changes'. Enter your administrator user name and password. Click the plus symbol at the bottom of the list of users to add a user.
I don't know what you're talking about. I've still seen only about a dozen Google Glasses. Once I even met someone in a bar who didn't work in tech.
A few Apple market failures, since Jobs reconfigured it into its modern form, off the top of my head:
The G4 Cube; Ping; MobileMe; the U2 iPod; the Motorola ROKR; the iPhone 5C; the iPhone in its first few months, before receiving a doubling of storage and a hefty price cut; borderline, the Apple TV.
Never mind the various minor products Apple tries to push which everyone just ignores — remember the half hour that was spent on how the new bundled earphones were some sort of sonic revolution, and how people would buy them instead of third-party earphones for their non-Apple devices, so they were now available standalone?
The evidence appears to refute the theory that: (i) there's an Apple hardcore who will buy anything with the logo on it; and (ii) that hardcore is solely responsible for the majority of the success of the iPhone and Mac.
I guess from Apple's point of view, the downside of relying on that market would be that it exists primarily only in the imagination of blind partisans.
I think it saves more than a few pence — they're reusing the MacBook Air logic board as far as people can make out. So that's dictated by the form of the tiny machines and savings then flow from production scale rather than purely from not spending 5p on a socket.
All the ultrabooks seem to use soldered RAM so that decision at least is likely justified.
The 1.4GHz part has scored pretty much the same single-core benchmarks as the 2.7GHz model it undercuts; the big loss is the two cores instead of four, which results in the corresponding multi-core benchmark being 40% lower.
So this machine will be much faster for many consumer tasks than instinctive feelings might suggest; even though it's a direct i5-versus-i5 branding comparison the hugely reduced clock speed is barely a factor.
On iOS third-party keyboards are not permitted network access unless the user explicitly allows it. So it should be easy to avoid key loggers.
Does that sort of comment usually come from the sort of people that actually read the articles? It had Apple in the title: that's reading it.
Block Out, maybe?
Tetris 2 on the Spectrum is indeed the finest thing. I had a version for the Sam in which someone had adapted the AY music to the SAA; it's quite possibly the game I played most. Though my choice of computers didn't leave me exactly overflowing with options. Prince of Persia, anyone?
Swift uses var for variables, let for constants.
From native C or C++? With the incredibly arbitrary restriction that the feature Apple specifically supplies for this — the ability to make Objective-C calls arbitrarily at any place within C++ code — isn't to be used?
You'd use the C-level entry points to the Objective-C runtime: https://developer.apple.com/library/mac/documentation/Cocoa/Reference/ObjCRuntimeRef/Reference/reference.html
You're working up to an objc_msgSend, probably, but you can also use class_getMethodImplementation to get the C function pointer for any method on any class. Then you just need to remember to specify the instance of the class as the first parameter, the selector for the method as the second and the other arguments in sequence after that (it's a va_list).
For getting selectors, metaclasses and suchlike you'd use NSClassFromString, NSSelectorFromString and the gang. They take NSStrings as arguments but NSString is toll-free bridged — i.e. the two look the same in memory so just cast the pointer — with CFString, which is a pure-C API. So just use CFString.
There was quite a lot of speculation around 2010 of a full-scale switch to Ruby; I guess Apple ended up deciding that they liked the idea but wanted a bit more control and a completely native coupling to the existing runtime?
In what sense does Apple 'force' you to use Objective-C? You can use as much C or C++ code as you like without having to hop any sort of barriers — the three can natively call each other directly, in the same source files. No managed/unmanaged border, no wrapper libraries, straight calls.
Of the two on-the-box options offered by OS X 10.0 and 10.1 — Objective-C and Java — developers picked the former. Not Apple.
As to whether the world needs a new language? It doesn't. But Apple needed one and decided they had to engineer it themselves because (i) they wanted it exactly to fit the existing runtime; and (ii) Apple usually thinks that way anyway.
Read quickly, obviously.
Honestly, the new language looks pretty good from the thirty-or-so pages I've read so far but I've yet to get to anything particularly complicated. E.g. if it uses the same runtime as Objective-C — reference counting (automatic or otherwise) rather than garbage collecting — then is it still a programmer's responsibility to avoid retain cycles? That's the main area where I felt the existing runtime (rather than the language) was looking kind of historic.
No; COM was as much about the ABI and the incredibly painful way it interacted with every then-current language as it was about the OLE and DDE stuff.
I'm pretty sure the Swype-style keyboard shown in the app was Swype branded.
Also don't hold your breath for answering objections; even putting the crazies aside, once someone is invested in a particular platform they can usually find reasons not to switch, even if they're doing so only quietly and for their own benefit, and not to win arguments against internet twelve-year-olds.
If a majority of devices have property X then what's the probability that a device with property Y also has property X? Or, rather, what doesn't a higher probability of X given Y imply?
I heard that more house fires occur at the homes of people with Windows PCs than occur at the homes of people with Apple computers. Just sayin'. Could be because the total amount of sunlight that falls on Windows users is so much greater? Etc, etc, etc.
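The arithmetic behind the joke, with invented numbers: when X is the majority property, P(X|Y) comes out high for any Y, even one entirely independent of X.

```python
# Invented figures for illustration only. Suppose 90% of all devices
# have property X, and property Y is completely independent of X.
p_x = 0.9            # P(X): the majority property
p_y = 0.3            # P(Y): base rate of Y overall
p_y_given_x = 0.3    # P(Y|X): same as the base rate, i.e. independent

# Bayes' rule: P(X|Y) = P(Y|X) * P(X) / P(Y)
p_x_given_y = p_y_given_x * p_x / p_y
print(round(p_x_given_y, 10))  # 0.9: "most devices with Y have X" merely restates the base rate
```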
Bada's issues were more deeply ingrained than mere unpopularity. It was the endpoint of Samsung's decade-or-so of internal phone OSes and wore its baggage on its sleeve. It used a weird alternative history version of C++ (i.e. no STL, containers and primitives custom to Samsung, no exceptions, Samsung's own invention of two-step construction to try to bridge the difference) and the developer tools left a great deal to be desired (as in: I never once got the debugger to attach).
If they're pushing a higher level language and/or an up-to-date version of whichever language it is, with development tools that work properly, then they've already learnt a lot from the last endeavour.
Will the iHaters ever come close to admitting that super-arrogant Apple can roll back its mistakes*? The Dock was 2D everywhere prior to 10.5 and has remained flat on every interim release if moved to the side of the screen rather than left at the bottom.
(*albeit without ever acknowledging or, god forbid, apologising for them; humility is not part of the deal)
If anything, it's a potential future battery-life liability — the very white look precludes potential power savings from more intelligent backlighting or any self-emissive screen like an OLED, versus the much more black-oriented iOS 1–6 look.
(and never mind whatever the cost is of keeping the gyroscope going for the blink-and-you'll-miss-it parallax wallpaper)
It's not really directly comparable though, assuming new versions of OS X continue to be free, as 10.9 was. In that case the market will move forward rapidly — most people don't jump to lingering fears about compatibility when shown something new and shiny — and the hassle will be staying with an old version as the APIs move on and developers lack an incentive to program down to older versions.
Assuming that's Apple's goal, uncontroversial updates are actually beneficial.
... though the timing and quality of the photos implies fakes. I guess we won't have to wait long to see.
Read: El Reg commenter slams YouTube video as 'DREADFUL, the whole thing is DREADFUL'
I think your mistake was expecting information. The approach of this sort of thing is to give you a title that suggests an obvious conclusion, show a bunch of disjointed clips that jump straight to that conclusion, then expect you to feel a warm glow due to the lack of cognitive dissonance.
Biting the hand that feeds IT © 1998–2017