Re: 10KB for the OS?
- "Swtmr" objects (whatever they are)
Software Timer, surely.
...then Affinity show you just what a powerhouse closed source can be (I'm going to get *so* many downvotes for this) and just how good software really can be when it carefully targets, and integrates deeply with, a specific operating system.
As for "I haven't found anything in Photoshop that Gimp can't do", try > 8bpc images. Unless you're on the 2.9 beta, Gimp *still* can't do deep colour after years and years of waiting.
Amazing nobody's mentioned that this "nice skin" is an utterly shameless rip of OS X pre-Yosemite, including Finder with sidebar, pre-Yosemite full screen arrows in the same location on the title bar and even including the stuff OS X users complain about like hard-to-see "running application indicator lights" in the faux 3D dock under the icons. The Calendar interface screenshot is also disturbingly unoriginal.
I can only assume the reason they don't get sued into oblivion via design patents is that nobody's paying for the OS, so they aren't big enough to worry about.
This is dubious behaviour at best, but the title is nonsense (as are all titles I've read so far on the inevitably sensationalist reporting of the story).
Music loaded into iTunes would sync to an iPod just fine. You could buy (say) MP3s from anywhere you liked, load them into iTunes, sync and it'd work (within the normal iTunes/iPod values of "work"). Music sync'd to the iPod *BY SOME OTHER METHOD* would cause iTunes to barf.
The context is Apple's accusation that RealNetworks hacked the iPod by allowing unofficial iPod sync within the Real player, so that Real could directly sell and sync music over their own service instead of through Apple (rather than selling over their own service, then throwing the files at iTunes for playback and sync - they had an overinflated opinion of their "brand" in RealPlayer, I suspect). Apple disagreed with the reverse-engineered syncing, called it hacking and took steps to stop Real doing so. If iTunes found files on the iPod which it didn't put there itself, it would insist you reset the device.
You might very well argue that Real should've been allowed to do that (interoperability, monopoly breaking etc) and think it warranted an anti-trust suit. And lo, that's what's happening, and that's where the story comes from.
Plus the execs from Zimbra and VMware probably knew execs from Microsoft from the get-go and after lots of dual-direction corporate wining and dining, the backhanders were all sorted so the sale could go ahead.
The idea that an e-mail client could be worth $200 *million* is absurd even by typical industry absurdity, so it's very likely that there's more to it.
Well yes, obviously, it makes a lot of sense.
If you upgrade on device, then the upgrade archive has to be downloaded to the device filesystem somewhere. So that's 1GB ish. Then you need a bunch of scratch and verification space, probably room to unpack files etc., and though 5GB seems excessive, you can certainly see how there'd be escalating storage requirements - especially with on-device checks and balances to make extra sure if anything goes wrong the OS isn't stuffed.
When doing it via iTunes, all the big files can be kept on the downloading computer, with only filesystem changes sent over the wired connection to the device. It may well be possible to be much less careful about keeping a consistent system state too, since if you're connected to iTunes, the upgrade process will already have made a restoration file in case things go tits up.
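To put a rough shape on the argument above, here's a back-of-envelope sketch of how an on-device upgrade's free-space requirement might stack up. Every figure besides the "1GB ish" download is an assumption for illustration, not Apple's published breakdown:

```python
# Back-of-envelope only: why an on-device upgrade might need ~5GB free.
# All figures except the download size are assumptions for illustration.
download = 1.0   # compressed update archive, "1GB ish"
unpacked = 2.5   # assumed: unpacked payload before installation
scratch  = 1.0   # assumed: verification copies and working space
rollback = 0.5   # assumed: consistency snapshots so a failure can't brick the OS

total_gb = download + unpacked + scratch + rollback
print(f"~{total_gb:.0f}GB of free space")   # lands near the 5GB figure
```

The exact numbers don't matter; the point is that each safety measure is additive, so a 1GB download can plausibly escalate to a 5GB requirement.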
> b) let Apple/Google know all about your purchases
That's exactly what Apple Pay doesn't do, since Apple, unlike Google, don't make money off user data. They were able to make not storing information about the purchase part of a unique selling point - and point scoring! - against Google, which does make money off this sort of thing and does collect data.
Apple often seem to behave in a nasty way, but the interesting part is that with Apple Pay, they have a vested commercial interest in *not* collecting your data. They're financially motivated to be the least evil in this particular case.
CurrentC is a waste of time because it's such a myopic US-centric mess anyway; social security number? In 2014? Chortle. Meanwhile, Apple Pay might struggle outside the US just because the rest of the world was already onto Chip & Pin, and now PayWave etc. anyway. The US transaction market has always seemed pretty "quaint" to much of the rest of the world.
Lots of (entirely IMHO) missing-the-point comments about 4K.
4K is basically rubbish on large screens. It's just a 1920x1080 *equivalent area* display in most cases, albeit with the ability to scale outside the natural quad density arrangement (or in Windows' case, a general inability to scale consistently across applications with Hilarious Consequences). Most use a 4K monitor as something with 4 times the detail of the equivalent "low DPI" display, for a 1920x1080 equivalent area. At 27" and using typical laptop display area / on-screen element size as a reference, that's comically bad; UI elements are very large with a feeling of much wasted space.
Worse, 4K made it look like the industry would just settle on mass producing cheapo 4K panels and we'd get stuck in a prettified version of the 1080p rut we've endured for several years already.
Fortunately - albeit in largely niche products with a price to match - manufacturers have been making 27"-30" monitors with a 2560x1600 or 2560x1440 (former 16:10 preferred, more area / height - e.g. widescreen video editing *plus toolbars* top/bottom) resolution for a few years now. Dell's new so-called 5K display isn't just a numerical contest to try and make 4K look outdated, it's merely the natural quad density evolution of the predecessors; you end up with a 27-30" monitor that has a vaguely sane desktop "area" (think of it as 1440p), rather than something that's really just a crummy 1080p panel in disguise.
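The quad-density arithmetic above is easy to check. This sketch halves each axis to get the effective 2x working area, and computes pixel density for a 27" panel (diagonals other than 27" would shift the PPI figures accordingly):

```python
import math

def ppi(w_px, h_px, diag_in):
    """Pixels per inch for a panel of the given pixel size and diagonal."""
    return math.hypot(w_px, h_px) / diag_in

# 4K at 2x scaling is just a quad-density 1920x1080 working area:
area_4k = (3840 // 2, 2160 // 2)     # (1920, 1080)
# Dell's 5K is the quad-density evolution of 2560x1440:
area_5k = (5120 // 2, 2880 // 2)     # (2560, 1440)
print(area_4k, area_5k)

print(round(ppi(3840, 2160, 27)))    # ~163 PPI for 27" 4K
print(round(ppi(5120, 2880, 27)))    # ~218 PPI for 27" 5K
```

Same panel size, same 2x scaling: one gives you a 1080p-sized desktop, the other a 1440p-sized one. That's the whole complaint in two lines of integer division.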
As someone who was using 1600x1200 CRTs in 1994, today's "FULL HD!!1!!11" seems rather pathetic and 4K overrated and overdue. The Dell announcement is great news, though I'd be even happier with a 30" display at 5120x3200 :-P
Yes, so it's a good thing that this is just the pointless summary done by a journalist, rather than the more useful (and linked-to) summary done by Apple.
Try reading the original material - then you can complain about what Apple *actually* said, rather than assuming The Register's language is Apple's language.
Apple's software slide into utter mediocrity continues. How depressing.
I personally find all Adobe UIs I've ever used to be utterly horrible, but marginally less horrible on Windows since Windows doesn't have the same system-wide integration and toolkit approach of Cocoa on OS X. Your expectations as a user are thus much lower anyway. OS X seems to be as slow as molasses these days too (it all went horribly wrong at OS X Lion) and has at least as high a bug count as Windows now (again it all went horribly wrong at OS X Lion), so it just seems pointless to bother running it anymore, even if I'll miss Logic, text services and AppleScript support.
Much as I'm interested in WWDC as a developer, I'm not particularly looking forward to the Tim Cook keynote. Over time he's developed inter- and intra-sentence pauses that are getting so long, he's risking heat death of the universe before he finishes the speech.
Hopefully someone at Apple has noticed.
...It's getting ridiculous.
...Asked him to speed up, but somehow...
...I doubt it.
<Thunderous, slightly relieved applause>
"Iphone 5s is 64-bit. Are there benefits to that?"
I co-wrote the following article which got journalistified :-P a bit but the gist of it is there:
IME, Microsoft Word itself can't offer near-perfect fidelity when reading its own files, so I don't buy the sales line that the cut-down web version is somehow better.
""MS control the hardware and the software" except this wasn't MS. Yes, the MS GDR3 update was included, but Lumia Black is a Nokia software update for Nokia devices... er... OK."
Given how closely MS and Nokia have been working together on WinPho devices from the get-go, and given the ex-Microsoft man at the Nokia helm back then, it'd be pretty daft to think that they were genuinely separate or that MS didn't have stranglehold control of what Nokia were doing. That's now irrelevant though, since Microsoft purchased Nokia's mobile phone business in September 2013. When it comes to phones, MS and Nokia are now one and the same.
I think there are different revisions of the same model number, which is confusing.
It seems the current one does indeed have a 3200x1800, 10-point capacitive panel which, if you want to touch your laptop screen, is impressive. The big problem is the 13.3" screen size. It means an equivalent 2:1 resolution of 1600x900 in a 13.3" notebook... You'd better have *really* good eyesight. Alternatively, it means Windows is going to be asked to scale even higher than the x2 standard; and good luck with getting *that* working reliably across all apps! To make this particular panel really legible at that size, you really need an OS which can manage arbitrary scaling, or at least something around the x2.5 mark. OTOH, it's clearly really nice to be able to get something that high-res at that size if you really want it, and great to see the industry finally moving towards higher resolution panels (even if the underlying driver for all this is likely to be 4K TV, so we're probably going to be stuck with 16:9 displays everywhere still).
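A quick sketch of the scale-factor arithmetic for that panel: 2x scaling leaves a cramped 1600x900-equivalent desktop at 13.3", while the ~2.5x mark mentioned above lands at 1280x720, much closer to a typical 13" laptop layout:

```python
# Effective desktop area for a 3200x1800 13.3" panel at different
# scale factors. 2x is the straightforward "quad density" option;
# ~2.5x is the arbitrary-scaling territory the OS needs to handle.
panel_w, panel_h = 3200, 1800

effective = {s: (int(panel_w / s), int(panel_h / s)) for s in (2.0, 2.5)}
for scale, (w, h) in effective.items():
    print(f"{scale}x -> {w}x{h} effective")
```

Note that 2.5 divides neither axis into whole UI units, which is exactly why non-integer scaling is so much harder to get right across applications than a clean 2x.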
People have criticised Apple's decision to lock OS X into a 1x or 2x scaling mode, meaning that the so-called "retina" screens are always going to be x2 versions of traditional resolutions lest user interface elements appear too small. But the thing is, it works well and is cheap to implement; the illusion of arbitrary scaling can always be achieved just by running the frame buffer at a different resolution from the panel; and even though the "retina" marketing is retch-inducing, it's based on a sound principle of typical viewing distances and average ability to "see" a single pixel. Going higher resolution is pure spec-chasing with little real world benefit. As for scaling reliably - unfortunately Windows 8 hasn't quite got there yet as Microsoft historically took the more difficult, though more flexible, (near-)arbitrary scaling approach, but often apps don't seem to cope very well. I don't know what stage Linux is at here.
All-in, given the scaling mess of Windows, comparing to Chrome Pixel might make more sense.
Version 7.0.1 was specifically for the 5S and 5C only, rolled out almost immediately after the hardware was released. The only widely discussed fix was a fingerprint sensor bug on the 5S; there was also a lot of talk of regressions. Rather an obvious sign that it was all pushed out the door a bit too early...
Version 7.0.2 is being used for everyone as a universal upgrade to bring the numbers back in sync. For most devices, there's never been a 7.0.1.
Good grief, "industry analysts" are unremittingly dumb. And shame on the journalists for parroting it.
Analysts predict that Apple will sell a cheap phone, even though Apple never sold a cheap anything. Apple then release the 5C, which is in line with every other incremental phone update they've done - the old model gets dropped by around $100. In this case, they've tried to boost sales a bit by changing the case, probably boosting their margins in the process, but mostly - surely very obviously? - so that the outgoing 5 **doesn't look the same as the 5S on the shop floor**.
When the old model was the 4S, the difference was obvious because of size. Now we have Apple's traditional per-year speed bump upgrade (3G to 3GS, 4 to 4S, 5 to 5S) and the lower end model needs a bit more differentiation. Thus, case change. Surprising? Of course not.
It's nothing new, it's nothing unexpected, it's just not what the idiot analysts were claiming would happen and despite the fact that they've been completely wrong about it all, they continue to whine about Apple's phones not selling into the markets they weren't targeted for. Despite record breaking opening week sales. *Again*.
The low-middle end was not the target any more than it was the target for the 4S when the 5 was released.
When the previous gen speed bump was released (the 4S) it sold 4 million in 3 days (http://en.wikipedia.org/wiki/IPhone_4S#Commercial_reception). When the 5 was released it sold AFAICT around 5 million in the opening weekend (http://en.wikipedia.org/wiki/IPhone_5#Commercial_reception and Google). My response to the 5S personally was "meh", I'm sticking with my 4S; iOS 7 is fugly and buggy no matter what you run it on, so I'm in no hurry to waste money on a new handset just yet. But 9 million people, despite the relatively uninspiring new hardware, bought in. Y'all can moan about "sheeple" and so-on but Apple just keep on winning those consumer satisfaction surveys and their sales figures keep rising and rising, so more and more people are buying in and more and more people are re-buying in again, despite the very high hardware cost.
Surely Apple never expected the 5C to be a big hit in China, either. That's what the low-end still-sold 4 is intended to do. The 5C and 5S are far too expensive to be a mainstream market offering in China, so clearly, they're targeting the wealthy. The wealthy in China like shiny things. The gold has sold particularly well. News at 11.
Nobody but analysts could *possibly* be daft enough to have not understood any of this. SMH. Why does anyone ever pay them attention? They're completely wrong year in, year out, over and over. It's almost impressive that they manage to perform so consistently badly.
iCloud isn't competing with DropBox because it's not the same service. It's about sync - full device consistency in as many areas as possible.
That includes apps. The iOS app model, which can be frustrating, is nonetheless closer to the likes of Palm than it is to a standard desktop-like OS - your documents are per-app, app managed and the filesystem is invisible. Palm OS did that via a database, iOS does it via the sandbox, but the user sees a similar experience. It's the same idea as syncing music and video via iTunes and libraries rather than manually via files and folders. With iCloud integration, things like Pages, Numbers and Keynote, or even the humble TextEdit, can save files directly into the cloud, to be read and even live-updated - but by those apps and only those apps, or their equivalent, including web apps - on other devices / in browsers, without needing to navigate around a filesystem to discover stuff.
How well this works for large amounts of data (large numbers of files) remains to be seen. It seems very home user orientated presently (nothing wrong with that, but a definite limitation in style).
Trouble is most people are familiar with manually managed files/folders, especially if from a Windows background, so they complain and want an old-school file-like interface for this stuff. Somewhat ironically, though, Windows has itself been moving more and more to the managed model via its Libraries features and people were clamouring for something Palm-like in the Longhorn days, because of the rumoured WinFS, which would be database-style and not based around a traditional "confusing" file hierarchy.
So basically, people are just rambling on about stuff because it's fun to complain and don't realise whether they've got what they asked for or what it was they were ever asking for in the first place :-P
"Could this not also be done on various platforms and various OSes, perhaps with a bit of handwaving to account for logical sparseness of the used application/OS space vs logical contiguousness of the blocks on "disk"?"
I don't understand what you mean really... But bearing in mind the flash on phones is almost always more than 4GiB - the maximum logical address space of a 32-bit device - and given that some of the space will of course be required by the OS, then the answer is no, not without lots of trickery (usually support hardware - an IOMMU). This tends to add complexity and degrade performance. My point was, you don't need trickery or extra hardware with 64-bit. You just map stuff. You've 16 exbibytes of address space! It's lots and lots and lots.
Again, there's a huge difference between physical and logical address space and those that don't understand this need to go and read up about it to understand why 64-bit addressing has nothing directly to do with 4GB of RAM.
Given the thread size already, my comment will almost certainly never get read, but here goes anyway.
Kids today... Sigh. This isn't about physical memory and never was, it's about logical address space.
My background is post-ARM Acorn since 1996 (the company from which ARM originated) and RISC OS (the OS for which the ARM processor was originally created, via Arthur). See http://www.riscosopen.org/ and the Raspberry Pi for where things are with that these days. I'm one of the founder members of the not-for-profit RISC OS Open Limited.
When the OS moved from 26-bit (http://en.wikipedia.org/wiki/26-bit) to 32-bit addressing, one of the big benefits was increased available logical address space for applications. The overall OS memory map could make much better use of the 32-bit space available and applications could have considerably larger logical address space allocated.
Because of a dubious memory allocation API called "dynamic areas", RISC OS can still easily suffer logical address space exhaustion even if there is plenty of free physical RAM, just by asking for the reservation of dynamic areas - which are contiguous areas of reserved addresses - with the potential to grow to an indefinite physical size. This means that a large chunk of logical space has to be reserved on the off-chance that the application actually does try to extend that space to the maximum permitted size (IIRC the "maximum size" value used to be "available RAM" until 32-bit machines arrived with larger memory capacities - a 512MB RAM machine could've otherwise exhausted all virtual space in just a handful of unbounded dynamic area allocations, so application authors were encouraged to always specify an upper limit and the OS introduced a much-lower-than-RAM limit for those applications which said "as big as possible").
Under a 64-bit CPU, you have a *vast* address space you can use for logical addressing, so the memory map becomes very flexible and all sorts of constraints and limitations that don't necessarily have anything to do with physical RAM become lifted. There are usually performance benefits from increased register availability as others have correctly already pointed out but, as already pointed out too, the cost is the increased storage requirements and potential impacts on caches and memory accesses.
In any event, while it might not be *necessary* to have 64-bit address space in something as small scale as a phone, it can certainly be *advantageous* and there's no need to have anywhere near 4GB of actual physical RAM installed in order for it to be useful. One could, for example, memory-map the entire flash storage device into logical space without needing any of the additional special tricks required with 32-bit addressing.
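The flash-mapping point above is easy to demonstrate on any 64-bit machine. A sketch (assuming a 64-bit OS with sparse-file support; the 5GiB figure is just an illustrative "bigger than 4GiB" size, not any particular phone's flash capacity):

```python
import mmap
import tempfile

# 32-bit logical address space tops out at 4GiB; 64-bit gives 16EiB.
gib_32bit = 2**32 // 2**30    # 4
eib_64bit = 2**64 // 2**60    # 16
print(gib_32bit, "GiB vs", eib_64bit, "EiB")

# Map a file bigger than the entire 32-bit address space. Pages are
# only faulted in on demand, so this consumes logical address space,
# not physical RAM - no IOMMU or bank-switching trickery required.
size = 5 * 2**30                      # 5GiB: unmappable flat on 32-bit
with tempfile.NamedTemporaryFile() as f:
    f.truncate(size)                  # sparse file: no blocks written yet
    m = mmap.mmap(f.fileno(), size, access=mmap.ACCESS_READ)
    mapped_len = len(m)
    m.close()

print(mapped_len > 2**32)             # True: one flat mapping beyond 4GiB
```

On a 32-bit process the same `mmap` call would simply fail: there is nowhere in a 4GiB logical space to put a 5GiB window, which is exactly why 32-bit systems need windowing or IOMMU tricks instead.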
(It's thread necromancy time...)
You didn't; but as you can see from *actually reading* my reply, I was addressing the *thread*, not an individual.
My final comment in that reply, which I did consider for some time removing but eventually decided to keep, was an attempt to head off the next tired set of arguments of Google vs A.N.Other.Competitor wherein apparently Google aren't evil because they're "open" (FSVO "open"). Fortunately the thread died out anyway.
"Yes, it's not Apple TV. So what? It also works in a totally different way to AirPlay, but I guess you skipped over that minor detail in your rush to Apple-based condemnation?"
Good grief, epic fail doesn't quite cover it.
Congratulations to you and all the other frothing-at-the-mouth fandroids who apparently didn't read past roughly the first sentence of my post, or even read its title. You might want to look at where I talk about DLNA, Android based options and Raspberry Pi.
Meanwhile Google's product is NOT running Android and it is NOT open source.
I still don't see why Google's half-baked implementation is interesting. AirPlay does all that and more, plus it's integrated down in the A/V framework of OS X and iOS so it works from any A/V-based app that hasn't explicitly disabled it. With desktop mirroring from an OS X host, *any* visible content can be pushed to the TV. The audio side of things makes it even more interesting given the support for multiple synchronised devices across the home. AirPlay gaming may be a minority use case, but at least you can do it.
Meanwhile any existing DLNA-capable TV, STB, or handheld device, regardless of OS, stands a fighting chance of doing what Google's new toy does and more, though DLNA does seem to be a minefield of compatibility problems so getting it to actually work in the first place can be a big problem! But in any case, it's clear that from a software point of view, Google's offering looks rather sad as it stands.
The only advantage Google have for their very limited device is price, but at $99 an AppleTV does an awful lot more - it even has (shock!) its own GUI, own remote (as well as An App For That) and a whole slew of apps and self-contained ability for playing stuff. And when you can do neat things like using an AirPlay screen as a new desktop extension monitor in OS X Mavericks, it's clear that technology is not being left to languish. Meanwhile for around the $60 mark you can get a similar "HDMI stick" form factor Android 4 "tablet" device which will do far more than Google or Apple's devices, albeit at the cost of some reliability and usability.
You certainly do get what you pay for, but even then, the Google device looks comparatively expensive against existing competition, especially including things like the Raspberry Pi. The software is unimpressive and the hardware isn't interesting either; it recently became very easy to make things like that at such prices and I'm pretty sure we'll see a lot more of them. Google's device stands a good chance of being lost in the noise unless they make very major improvements to its software.
Years ago I found out that they did a decent capacity, OK-ish price 1.8" PATA SSD that would fit in a Rio Karma, so I got one via eBay. It never worked properly. Sent it back for repair, at my cost - almost 1/3 the price of the drive itself. Came back four weeks later, still didn't work properly in any PATA device. Basically just flash errors everywhere. Just had to write it off. I won't touch KingSpec kit again after that. A dud device is one thing, but acknowledging a fault, "repairing" it at the owner's cost and sending back a still broken device? No thanks.
Wireless charging: Saving the planet through fewer wires, because the vendor specific mutually incompatible wireless chargers connect to the mains by magic, nobody ships tablets with USB cables for data anyway and induction charging is 100% efficient, wasting no energy at all compared to wired charging.
Everyone also loves having to make room to position a 7" tablet flat on its back on a wireless charger and make sure it doesn't get knocked out of place. Or if the charger clips the tablet in, then everyone must love the extra plastic bulk for something that still ends up physically connected to the device. But not by a wire. Wires are bad now. Yay.
Gotta love those down to earth folks in marketing.
Got to love how some potential nutjob says "I did it! It was me!" with no evidence whatsoever (handily, he has to keep that secret to avoid "blacklists" - where's my tinfoil hat?) and, perhaps because it's Apple, every commentard thus far appears to have just swallowed the story without question.
Thus, a stream of comments about how 4 hours is too long, how the messenger gets shot and so-forth - all on the basis of one guy, on the internet, making a claim without evidence.
Thunderbolt is PCIe over a cable. Your internal expansion cards are now external. Enclosures exist that let you plug in whatever cards you like, even if you have no expansion inside your actual computer. So your laptop dock just allowed you to have a full spec desktop while docked, complete with powerful (but external, now) graphics card, arbitrary drives, monitors, other expansion ports and so-on.
USB 3 is a completely different technology, a distinct bus/interface, compatible only with itself, with huge piles of driver software layers implementing both the USB 3 and older parts of a standard that was a bit of a creaking mess even back when USB 2 was added. All these years later, there's a good reason vendors are moving to things like SPI to connect internal low bandwidth interfaces rather than dangling them off USB - it's just too slow and software-heavy, even for USB 1. Things got a lot worse when USB replaced PS/2 for those devices, and now, finally, SPI sorts that out.
So what of Thunderbolt? Why is it so rare? The one obvious area where Thunderbolt fails technically is the hugely expensive active cables. Even if they were cheap, active cables are an obvious point for things to fail (and fail badly). But even putting that aside, Intel and Apple decided to push the technology via transfer speeds - a peeing contest. Rather than just say "It's way faster than USB 3" and *then* focus on all the unique and really interesting aspects, they just said "It's way faster than USB 3" and pretty much stopped there. Customers are left in the dark. From their perspective, it's just a funny looking port that has really expensive peripherals. The genuinely remarkable opportunities it offers for new form factors - a much better approach for a hybrid desktop->laptop->tablet kind of affair for example - have been overlooked.
Net result: Very little uptake, no economies of scale, specialist peripheral vendors only, extortionate prices.
The forthcoming updated Mac Pro shows what can be done, but I fear too little, too late and too specialised. I'd love to upgrade my now very old Mac Pro with the new machine, but even if I can afford the base unit, I almost certainly cannot afford the peripherals I'll need for the expansion cards and drives I'd need to carry over.
So El Reg lapped up the nonsense about still having a "working web app", yet then went on to describe how you need to call into the Firefox OS API if you want to get any real work done. Much as HTML 5 might try, it still doesn't (thank heavens) replicate everything a native OS can do, along with general housekeeping / "being a good citizen".
For example, accessing the platform-wide GUI toolkit, or producing a platform-wide notification broadcast, requires cooperation with the system and that requires an API. Screen orientation. System settings. Bluetooth. USB. Basic telephony. APIs, APIs, APIs.
As you can see from the Wiki page, Mozilla list standards bodies they're trying to push the APIs into, or where they may already exist. If this were Microsoft, everyone would cry foul...
Bizarrely, in some respects this is actually LESS open than writing web apps under a cross platform framework such as Cordova, or even writing a "raw" un-integrated web app that just runs in the web browser and requires the user to save a link to it onto their home screen / apps list, where you really are limited to just the published and implemented APIs available in the browser environment. Or - as we used to call it once - write a web page.
And good luck writing something like Alchemy with that. Good luck even writing something like Audiobus with it!
Judging by the comments thus far, people don't know what a hotspot is. Which is hard to believe. So, one assumes a major reading comprehension fail.
This is talking about *tethering* - when the phone is set up as a WiFi hotspot; a gateway to its mobile data service. This is of course disabled by default on all smartphones (due to the major battery hit) and not even allowed by some carriers.
On the iPhone, when this is explicitly activated by the user in the Settings app, a pseudorandom password is presented to the user so that their other device can connect to the new WiFi hotspot without too much hassle. It's quite short, because the user has to read it on their phone's screen, then type it into their laptop or other device. Sounds like it's not pseudorandom enough!
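To see why "quite short" matters, here's a hypothetical keyspace sketch. All the figures are assumptions for illustration - the word-list size, suffix format and guess rate are invented, not Apple's actual generator or any measured attack speed:

```python
# Hypothetical figures only: suppose the generated password is one word
# from a small list plus a four-digit suffix, and an attacker can test a
# captured WiFi handshake offline at 100,000 guesses/second (assumed).
words  = 2000        # assumed size of the generator's word list
digits = 10**4       # four-digit numeric suffix
rate   = 100_000     # assumed offline guesses per second

keyspace = words * digits
worst_case_s = keyspace / rate
print(keyspace)                       # 20,000,000 candidate passwords
print(worst_case_s / 60, "minutes")   # ~3.3 minutes worst case
```

A password that's genuinely random over, say, 62 printable characters would put the same attack into the realm of centuries even at eight characters; the weakness is the tiny effective keyspace, not the length per se.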
Since the password verification for WiFi is done at the CONNECTING DEVICE, the iPhone has no idea that someone has tried to crack the password 10,000 times. That's arguably a basic design deficiency in WiFi (if the source of the WiFi hotspot were itself responsible for checking and validating the password before granting access, there would be the opportunity to block such attacks).
Meanwhile, whenever anyone is connected to the iOS hotspot, a permanently glowing, bright blue status bar shows the running tally of connected devices. So at least there is an opportunity for the user to see that more than one device is connected, though yes, it's unlikely most people would be paying attention to their phone's screen rather than their other, connected device's screen. I don't know if there are equivalent, prominent indicators on other popular mobile operating systems.
The iPhone version of iOS 7 is not finished. I can't say much without breaking NDA, except that there are very clearly numerous aspects which are most certainly a work in progress. The iPad version hasn't even been released to developers yet. So we're talking about an unreleased version of an unfinished variant of an unfinished OS.
I have a lot of doubts and concerns about the new interface, as well as areas I like; so as a developer, I file bugs as required then wait and see what happens. In the mean time, there's little point getting all worked up about something when it is in such a state. If it got released to the public like this - well that'd be a different story.
It seems El Reg just wants to be nice and give Apple some free publicity ;-)
Agreed. We used Cordova (née PhoneGap) for a proof of concept app, but we're recoding native for production. Why? All the usual things. The HTML stuff just "isn't very native", without decent looking buttons, incorrectly styled scroll bars, incorrect scrolling physics and so-on (you can see all this very clearly on the Sencha demo video in the article).
More importantly, you were stuffed when you hit low quality implementation issues on the native side of the barrier - e.g. a lack of decent error handling for unexpected events. Camera photos would sometimes just fail to save for no reason; error callbacks wouldn't go off but the "camera wouldn't start" (IYSWIM) or a photo would not save. And often when an error handling path *was* provided in the JS API and *did* get called, the error information would be sparse or entirely lacking (and at its best, bespoke error paths or exceptions in JS are a pathetic shadow of the facilities offered by e.g. the NSError class). All of this makes for a terrible user experience.
Worse still, the memory footprint was far larger than necessary, which ruled out smaller RAM devices on Android (the app would just randomly quit - this is HTML + JS, remember, so that kind of crash ought to be impossible unless you're encountering bugs in the underlying framework over which you have little or no control). This was a pretty small app, too. I'd wager that all the caching shown by Sencha when switching feed views or refreshing will, in fact, be a huge problem in a real app - it'd potentially lead to unbounded RAM use and random app crashes. In short we'd be back where the Facebook app started before it went (partly) native and started to solve some of those very issues.
You *could* try writing caching frameworks in JS to mitigate this, but how utterly ridiculous that would be: an OS with an HTML+CSS formatter and a JS execution engine, with an app balanced on top of all of that trying to ignore and replicate the stuff that's already implemented beneath it. If you want write-once-run-anywhere you'd be better off going back to Java (which leads us straight to Android, ironically).
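For what it's worth, the kind of thing you'd end up hand-rolling is sketched below - a bounded LRU cache over a JS Map (which iterates in insertion order), so cached views get evicted instead of accumulating until the app is killed. Names are illustrative only:

```javascript
// Minimal bounded LRU cache sketch: a Map iterates in insertion order,
// so the first key is always the least-recently-used entry.
class LruCache {
  constructor(limit) {
    this.limit = limit;
    this.map = new Map();
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    this.map.delete(key);
    this.map.set(key, value); // re-insert to mark as most recently used
    return value;
  }
  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.limit) {
      // evict the least-recently-used entry (first insertion-order key)
      this.map.delete(this.map.keys().next().value);
    }
  }
}

const views = new LruCache(2);
views.set("feed", "rendered-feed");
views.set("detail", "rendered-detail");
views.get("feed");                          // touch "feed": now most recent
views.set("settings", "rendered-settings"); // evicts "detail"
```

Perfectly doable, but it's yet another layer of memory management reimplemented in JS on top of a platform that already does this natively.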
HTML 5 + JS + CSS 3 are quite simply very badly designed if they're meant to be used for applications rather than dynamic additions to web sites. They're very large, hard-to-implement specifications, highly over-complicated by legacy baggage and a one-size-fits-all requirement to provide maximum presentation flexibility for arbitrary web sites - just when you want platform consistency for an app.
You end up either devoting huge amounts of resource to creating your own bespoke application toolkit, right down to the scrolling physics, or using a framework, in which case you've a huge API to learn. In *that* case I'd rather learn the native API and benefit from all the compilation support, IDE support and the far more professionally designed, fit-for-purpose frameworks that come with it.
Shouldn't have gone out with those faults, but storm, meet teacup:
I wrote about my own experiences with the Glo here:
The more I use it, the more I like it.
Agree about the Kobo Glo. I replaced a Sony PRS-350 with one last week, wanting the higher resolution screen and built-in light. I have a Sony case with an LED light on a "stalk", but it's hard to find an angle that avoids reflections off the screen or case and, during a long night flight one time, even the lowest brightness seemed very high for the dark cabin environment. It seemed to illuminate the people nearby more than the page!
The Kobo's lowest brightness is still quite bright, but definitely more manageable and of course reflections are not a problem. The software is a bit shaky, but the hardware's nice. If you manage your book collection using Calibre you'll be OK. Worst feature seems to be that some Calibre-loaded ePubs make the device reboot when first opened, but they work nicely after that reboot - and that doesn't take too long as the device is comparatively quick to start up. No very long startup indexing delays as with the Sony either, which is a relief.
Downsides: PDF handling is clunky; it works OK but the Sony was better. Software speed is often not good; reading books is fine (even full page refreshes are much faster than the PRS-350 and the partial refresh implementation seems basically flawless) but certain operations, e.g. changing from page to page in your list of books (annoying) or typing the first (but not subsequent) letter of a search, can be very slow.
On balance, it's quite cheap, quite well made, has lots of very cheap accessories on Amazon - oh, the irony! - and you can get it off-the-shelf from W.H.Smith so you've got a physical high street presence for technical support or returns/exchanges should anything go wrong.
Or I could just buy something like this:
...and that's the equivalent of roughly five spare batteries, but you only have to carry the one "lump", it charges loads of different devices including an iPad or non-Apple kit and you don't have to faff around individually charging lots of vendor specific battery packs. No reboots required either since you're not swapping hardware.
It works well. I've had one exactly as pictured above for a year (for less than the above price - search around, should be available for under £30). I recharged a 4S from less than 10% to 100% three times with it and there were still two LEDs illuminated on its 5 LED display so it probably really will get 4-5 charges of the phone as advertised.
There are loads of alternatives to the same idea with varying capacities, sizes and designs. Excellent for long haul flights and so-on. As for packing multiple replacement batteries for one single specific device? Don't be daft, it's not the 90s anymore :-)
Memory is not the same as storage space. The iPad 1 didn't have much RAM, forcing it to swap to Flash much more when iOS was upgraded. The speed impact could be quite dramatic at times.
As only one other poster, remarkably, has pointed out, the "megabug" is in Java 7 only.
Apple stopped supporting Java in-house at Java 6. Development was transferred to Oracle who are entirely responsible for Java 7. Apple's recent update patches currently known vulnerabilities in Java 6. It's up to Oracle to patch Java 7.
Memory 'cells' are rigged up in parallel, with clever firmware attempting to read from or write to the maximum possible number of such cells at any given time. The operating system doesn't know or care that a given single file may actually be split across lots of different locations on the actual drive silicon. Essentially it's like a RAID stripe system with a very large number of stripes.
That's presumably why this drive's performance decreases as its capacity decreases - the potential for parallel operation is reduced.
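As a rough sketch of the idea (illustrative only - real controllers do far cleverer wear levelling and page mapping): stripe pages round-robin across flash channels, and the number of distinct channels a read touches determines how much work can overlap. Populate fewer channels (less capacity) and the overlap shrinks:

```javascript
// Illustrative striping sketch: pages assigned round-robin to channels,
// RAID-0 style. Real SSD firmware is far more sophisticated.
function channelFor(page, channelCount) {
  return page % channelCount;
}

// How many distinct channels service a contiguous read of n pages?
// That's the degree of parallelism available for the operation.
function parallelism(startPage, n, channelCount) {
  const used = new Set();
  for (let p = startPage; p < startPage + n; p++) {
    used.add(channelFor(p, channelCount));
  }
  return used.size;
}

// A fully populated 8-channel drive overlaps an 8-page read completely:
const fullDrive = parallelism(0, 8, 8);  // 8 channels busy at once
// A lower-capacity variant with only 4 populated channels cannot:
const halfDrive = parallelism(0, 8, 4);  // 4 channels busy at once
```

Same controller, same firmware; fewer populated channels simply caps how many flash dies can be kept busy simultaneously.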
Hmmm, don't quite see all the fuss about a stylus... This one seems to work fine with an iPad and all of its various apps including freehand drawing thingies. It'd presumably work just as well with any other capacitive screen tablet.
Seven quid inc P&P, or probably less if you search around a bit. No pressure sensitivity of course, though an app could perhaps detect the contact area of the squishy tip and behave accordingly.
I have a Panasonic plasma that's getting a bit long in the tooth by modern consumer electronics standards. Nonetheless, a few days before the Olympics kicked off it got a software update and there, in the app collection, was "BBC Sport". All Olympic sports - every single event - streamed live in SD and HD over the wired network connection on the TV. Lots of catchup options, little 1-3 minute news snippets to summarise key moments, a medal table, the works.
Basically, a very nicely realised TV version of bbc.co.uk/olympics. Extremely impressive, along with the performance of streaming on iOS devices too. It's not often I get to say this - but from my perspective as a viewer, the technology not only worked, but excelled and exceeded my expectations in almost every respect. That's before we even consider some of the remarkable camera angles, types, speeds and tricks throughout the whole event, though this aspect wasn't down to the BBC.
I note the presence of bbc.co.uk/paralympics which implies similar live streaming over the web; the BBC Olympics iOS app is also showing signs that Paralympics live coverage will be streamed. I'm thus hopeful of just as good coverage over the 'net, whether or not C4's broadcast coverage on its main channels is up to scratch.
People don't buy hardware either. They buy a device, and the device is the integral, inseparable sum total of its hardware and software. Most people probably wouldn't even be able to precisely tell you which bit was which.
Apple recognised that very well written, stable, easy to use, tightly integrated software was a key component in devices. The original iPhone was, compared to other contemporary leading smartphones at release, poorly specified and expensive. But it sold like hotcakes not because of hype, or clichéd notions of hipsters and stereotypes, but because IT WORKED. Other smartphones were a ridiculous, confusing joke of numerous GUI metaphors, design paradigms and serious bugs all wrapped up in one uninspiring package.
Moreover, the iPad and iPhone are often touted as being *all about* the software. "The device gets out of the way". Microsoft "borrowed" this quote, along with several others, when talking about Surface in their launch event. They've realised that you need to have the most unobtrusive hardware you can, which presents the best possible software in the best possible way to the user.
This is why there are app stores, and why app stores generate lots and lots and lots of money. Software is key.
In summary, I can't imagine many ways in which Mr. Asay could have been more wrong in his analysis.
HbbTV is absolutely not based on MHP; there's no Java VM in sight. It's an XHTML+JS+CSS sort of environment (I've done a lot of software development work in this field).
Though the article told me not to, I did bother "checking the web page" and the security bulletin details are now present:
Very few fixes are specific to the Safari application. Everything else is in WebKit. The WebKit engine is shared by e.g. Chrome and most current smartphone platforms. There are many contributors involved in the WebKit fixes, including a few from Apple and some from the Google Chrome team. So, this is a case of both rolling in patches from and rolling out patches to the open source WebKit project.
So for £55 you've a battery which will only charge an iPad to about 40%.
Meanwhile, I found and bought this on Amazon. It charges an iPad to 100% or charges an iPhone fully about 5 times.
The link from my Amazon order history to the product page shows a product that differs from the one I bought last year - mine is white and had a slightly lower quoted capacity - but it seems to be the same general idea and there are several others to choose from with better specs and a much lower price than the Belkin unit reviewed on El Reg.
There's every chance that the minute I so much as go near an aeroplane with it, the thing will burst into flames; but then given my prior experiences with Belkin kit, I'd have much the same risk with the reviewed unit... :-)
...was about an awful lot more than just rounded corners. This was about the device shape, colour scheme, icons, icon arrangement, icon design, icon colours, box design and so-on, all taken together.
This is a great visual comparison of Samsung's comparable smartphone and tablet products from just before the iPhone appeared, then what happened just after:
Going after the 10.1N seems to be on much weaker ground because Samsung have in theory changed their design to avoid the worst of the infringements. The original case, however, was far from frivolous - perhaps that, rather than the opinion of armchair lawyers on web forums, is why the ban on the 10.1 was upheld.
According to the BBC, yes, some retailers saw a drop, such as Morrisons and Tesco. Sainsbury's reported record-high Christmas sales, though; Greggs and M&S also saw a boost due to food sales; and Primark and Ted Baker gave very good reports over Christmas.
Ted Baker: http://www.bbc.co.uk/news/business-16542018
Overall, US retail sales rose about 1% compared to last year (astonishing in a so-called recession), and the BRC says that Britain overall did well too, though the high street had to introduce heavy discounts, which means overall profits aren't as boosted as the sales numbers suggest.
So some retailers did better than others... Hardly news. The underlying story, that both online and high street sales *were* improved compared with last Christmas both in the UK and US, is pretty impressive given the rather dire prevailing economic circumstances.
...more green "up" arrows than orange "down" there.
So why just talk it down all the time and make out like the high street bombed? That's just not true at all. Unless the BBC are wrong - but they seem to have more video reports, statistics and external links to industry bodies to back up their figures.
Hard to use - you have to hover around the thumbnails to see what they are and the sort order seems a bit random. The 8mp Sensation's in there, though, along with the iPhone 4 and (annoyingly with the focus set to the wrong place, doh!) the 4S. Viewing the zoomed-out image gives a good feel for overall balance. But the devil's in the detail as usual... 4, 4S, Nexus S, Galaxy S2, Sensation full size:
The iPhone 4S shot is odd - the book's in perfect focus. It looks like it has quite a shallow depth of field in low light compared to other smartphones; where depth of field is an issue in other shots, the focal point is the in-shot camera's lens, not the book.
The Nexus S does extremely badly; the Galaxy S2 is probably best. Make sure you're viewing at 100% and look at stuff like noise in the background - uniform for the iPhone and very well controlled on the Galaxy S2, while the Sensation and Nexus S are awful, with big blobs of noise reduced mush. Text on the lens is much harder to read on those too.
There's a Nikon D80 shot there too for balance (DSLR circa 2006):
Looking at the depth of field, grain/noise, colour balance and so-on, I'd say both iPhone shots come very close to the same "look". Whether or not you prefer that, though, is of course a matter of personal judgement.
I really don't know what goes on with El Reg and camera quality comments. Maybe you all have horribly low expectations, but frankly, the sample shots from that HTC are terrible - washed out, colour shifted, heavy-handed noise reduction and over-sharpening. If that's "not bad" I'd hate to see what you classify as "terrible".
Consider this 5mp shot from the now old iPhone 4, with a camera you didn't seem to think was that good either:
Check out the clarity of the text at 100% scale on the postbox and the barely visible halo around letters. Then scroll down to the pavement around the postbox and note all the detail in the bricks and road surface. It's probably a bit over-saturated, but at least the postbox itself is red! Now compare to the very new 8mp "not bad" HTC post box:
It's washed out and the postbox is purple! Despite being 8mp, you can't read most of the text. There's almost no "real" detail in the pavement around the base; just multi-pixel splodges of noise-reduced blur. Scroll up a bit to look at the brick wall mid-left and paving slabs behind and they're just a smoothed out, detail-free expanse of brown and grey.
Bad light, perhaps? Check out the sky from the HTC in this sunny shot:
...and try to find detail in the brickwork. Never mind that, try to find a clear edge to a *brick* that isn't indistinct and blurred. Now look at the old iPhone 4:
Quite heavy noise on the sky, but notice the uniformity; it hasn't got the deep speckling arising from noise exceeding the noise reduction threshold in the HTC, so noise reduction in e.g. Photoshop will be much more successful. And just look at all the brickwork - clear and detailed, with shadow detail, highlight detail, good colour balance and contrast.
Unfortunately your reviews lose credibility when you play Emperor's New Clothes with sample shots that show very different results from the review text that purports to describe them.
You're paying £35K for a camera because, in part, of its 200MP headline feature. Why on Earth *wouldn't* you do pixel peeping? The individual pixels all contribute in some way; if all anyone wanted was some low resolution photos for a web site - or even A4 prints - then single digit megapixels would do. If you're not interested in the pixels don't buy the camera.
The 4x images are pleasantly reminiscent of Foveon sensor output and give me hope that the rest of the camera market, based on Bayer sensors, might in time adopt similar technology at more "everyday" prices. Then we'd have proper per-pixel sharpness and that extraordinary sense of depth that Foveon images can give, without the drawbacks of excessive shadow noise and colour accuracy problems... And with more manufacturer choices than Sigma-or-nothing.
The 6x images, on the other hand, are just plain broken and surely, at *any* price, such messy, artefact riddled output would be considered a fault. At £35K, it's a bad joke. I don't care *what* the name on the body of the unit is - it's real, factual, forget-the-brand-mythology-nonsense performance that counts. At 4x and 1x the output looks great, but 6x is broken - perhaps the review unit was genuinely faulty?
Me: "What time is it?"
Siri prints "What time is it" and spins for about a second.
Siri: "The time is 12:59."
An analogue clock which also shows my location and the time numerically is printed underneath.
On the other hand, "Where am I?" really ought to just dump the current location but instead it gives the tiresome response about only supporting US locations presently.