Re: Why try and make HTML do everything?
Agreed. We used Cordova (née PhoneGap) for a proof of concept app, but we're recoding native for production. Why? All the usual things. The HTML stuff just "isn't very native": poor-looking buttons, incorrectly styled scroll bars, incorrect scrolling physics and so on (you can see all this very clearly in the Sencha demo video in the article).
More importantly, you were stuffed when you hit low-quality implementation issues on the native side of the barrier - e.g. a lack of decent error handling for unexpected events. Camera photos would sometimes just fail to save for no reason; error callbacks wouldn't fire, but the "camera wouldn't start" (IYSWIM) or a photo would not save. And often when an error handling path *was* provided in the JS API and *did* get called, the error information would be sparse or entirely lacking (even at its best, a bespoke error path or exception in JS is a pathetic shadow of the facilities offered by e.g. the NSError class). All of this makes for a terrible user experience.
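For anyone who hasn't been bitten by this, here's a minimal sketch of the kind of thing I mean, using Cordova's standard navigator.camera.getPicture call (the option values and logging are illustrative, not our actual code):

```javascript
// Cordova camera call: the error callback gets a free-text string,
// nothing remotely like NSError's domain/code/userInfo.
function takePhoto() {
    navigator.camera.getPicture(
        function (fileUri) {
            // Success path: a URI to the (hopefully) saved photo.
            console.log("Saved photo at: " + fileUri);
        },
        function (message) {
            // Failure path: 'message' is often as helpful as
            // "Error capturing image" - and sometimes this callback
            // simply never fires at all.
            console.log("Camera failed: " + message);
        },
        { quality: 75, destinationType: Camera.DestinationType.FILE_URI }
    );
}
```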
Worse still, the memory footprint was far larger than necessary, which ruled out smaller-RAM devices on Android (the app would just randomly quit - this is HTML + JS, remember, so that kind of crash ought to be impossible unless you're encountering bugs in the underlying framework, over which you have little or no control). This was a pretty small app, too. I'd wager that all the caching shown by Sencha when switching feed views or refreshing will, in fact, be a huge problem in a real app - it'd potentially lead to unbounded RAM use and random app crashes. In short, we'd be back where the Facebook app started before it went (partly) native and began to solve some of those very issues.
You *could* try writing caching frameworks in JS to mitigate this, but how utterly ridiculous that would be: an OS with an HTML+CSS formatter and JS execution engine, with an app balanced on top of all of that trying to ignore and replicate the stuff that's already implemented beneath it. If you want write-once-run-anywhere you'd be better off going back to Java (which leads us straight to Android, ironically).
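Just to make the absurdity concrete, this is roughly the sort of thing you'd end up hand-rolling - a minimal sketch (not production code) of a bounded LRU cache in JS, sat on top of an engine that already maintains its own resource caches:

```javascript
// A hand-rolled bounded LRU cache: reimplementing, badly, what the
// browser engine underneath is already doing for us.
function LruCache(maxEntries) {
    this.maxEntries = maxEntries;
    this.keys = [];   // Least recently used first
    this.store = {};
}

LruCache.prototype.put = function (key, value) {
    if (key in this.store) {
        this.keys.splice(this.keys.indexOf(key), 1);
    }
    this.keys.push(key);
    this.store[key] = value;
    while (this.keys.length > this.maxEntries) {
        delete this.store[this.keys.shift()]; // Evict the oldest entry
    }
};

LruCache.prototype.get = function (key) {
    if (key in this.store) {
        // Move to the most-recently-used position
        this.keys.splice(this.keys.indexOf(key), 1);
        this.keys.push(key);
        return this.store[key];
    }
    return null;
};
```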
HTML 5 + JS + CSS 3 are quite simply very badly designed if they're meant to be used for applications rather than dynamic additions to web sites: very large, hard-to-implement specifications, highly over-complicated due to legacy and the one-size-fits-all requirement of providing maximum presentation flexibility for arbitrary web sites - just when you want platform consistency for an app.
One ends up either devoting huge amounts of resource to creating a bespoke application toolkit, right down to the scrolling physics, or using a framework, in which case you've a huge API to learn. In *that* case I'd rather learn the native API and benefit from all the compilation support, IDE support and far more professionally designed, fit-for-purpose frameworks that come with it.
Agree about the Kobo Glo. I replaced a Sony PRS-350 with one last week, wanting the higher resolution screen and built-in light. I have a Sony case with an LED light on a "stalk", but it's hard to find an angle that avoids reflections off the screen or case and, during a long night flight one time, even the lowest brightness seemed very high for the dark cabin environment. It seemed to illuminate the people nearby more than the page!
The Kobo's lowest brightness is still quite bright, but definitely more manageable and of course reflections are not a problem. The software is a bit shaky, but the hardware's nice. If you manage your book collection using Calibre you'll be OK. Worst feature seems to be that some Calibre-loaded ePubs make the device reboot when first opened, but they work nicely after that reboot - and that doesn't take too long as the device is comparatively quick to start up. No very long startup indexing delays as with the Sony either, which is a relief.
Downsides: PDF handling is clunky; it works OK but the Sony was better. Software speed is often not good; reading books is fine (even full page refreshes are much faster than the PRS-350 and the partial refresh implementation seems basically flawless) but certain operations, e.g. changing from page to page in your list of books (annoying) or typing the first (but not subsequent) letter of a search, can be very slow.
On balance, it's quite cheap, quite well made, has lots of very cheap accessories on Amazon - oh, the irony! - and you can get it off-the-shelf from W.H.Smith so you've got a physical high street presence for technical support or returns/exchanges should anything go wrong.
Carrying device-specific spare batteries? Really?
Or I could just buy something like this:
...and that's the equivalent of roughly five spare batteries, but you only have to carry the one "lump"; it charges loads of different devices, including an iPad or non-Apple kit, and you don't have to faff around individually charging lots of vendor-specific battery packs. No reboots required either, since you're not swapping hardware.
It works well. I've had one exactly as pictured above for a year (for less than the above price - search around; it should be available for under £30). I recharged a 4S from less than 10% to 100% three times with it and there were still two LEDs illuminated on its 5-LED display, so it probably really will get 4-5 charges of the phone as advertised.
There are loads of alternatives to the same idea with varying capacities, sizes and designs. Excellent for long haul flights and so on. As for packing multiple replacement batteries for one single specific device? Don't be daft, it's not the 90s anymore :-)
Publish a retraction or update, please
As only one other poster, remarkably, has pointed out, the "megabug" is in Java 7 only.
Apple stopped supporting Java in-house at Java 6. Development was transferred to Oracle who are entirely responsible for Java 7. Apple's recent update patches currently known vulnerabilities in Java 6. It's up to Oracle to patch Java 7.
Re: Am I missing a point -
Memory 'cells' are rigged up in parallel with clever firmware attempting to read or write from the maximum possible number of such cells at any given time. The operating system doesn't know or care that a given single file may actually be split across lots of different locations on the actual drive silicon. Essentially it's like a RAID stripe system with a very large number of stripes.
That's presumably why this drive's performance decreases as its capacity decreases - the potential for parallel operation is reduced.
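A crude sketch of the principle - nothing like real flash-translation-layer firmware, just the round-robin idea:

```javascript
// Round-robin striping: logical block N lands on channel N mod C,
// so a large read or write touches many flash channels in parallel.
var CHANNELS = 8;

function channelFor(logicalBlock) {
    return logicalBlock % CHANNELS;
}

// A 32-block file spreads across all 8 channels, 4 blocks each, so
// the controller can keep every channel busy at once. Halve the
// number of channels (i.e. the capacity) and you halve the
// available parallelism.
for (var block = 0; block < 32; block++) {
    console.log("block " + block + " -> channel " + channelFor(block));
}
```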
Hmmm, don't quite see all the fuss about a stylus... This one seems to work fine with an iPad and all of its various apps including freehand drawing thingies. It'd presumably work just as well with any other capacitive screen tablet.
Seven quid inc P&P, or probably less if you search around a bit. No pressure sensitivity of course, though an app could perhaps detect the contact area of the squishy tip and behave accordingly.
IPTV options were excellent
I have a Panasonic plasma that's getting a bit long in the tooth by modern consumer electronics standards. Nonetheless, a few days before the Olympics kicked off it got a software update and there, in the app collection, was "BBC Sport". All Olympic sports - every single event - streamed live in SD and HD over the wired network connection on the TV. Lots of catchup options, little 1-3 minute news snippets to summarise key moments, a medal table, the works.
Basically, a very nicely realised TV version of bbc.co.uk/olympics. Extremely impressive, along with the performance of streaming on iOS devices too. It's not often I get to say this - but from my perspective as a viewer, the technology not only worked, but excelled and exceeded my expectations in almost every respect. That's before we even consider some of the remarkable camera angles, types, speeds and tricks throughout the whole event, though this aspect wasn't down to the BBC.
I note the presence of bbc.co.uk/paralympics which implies similar live streaming over the web; the BBC Olympics iOS app is also showing signs that Paralympics live coverage will be streamed. I'm thus hopeful of just as good coverage over the 'net, whether or not C4's broadcast coverage on its main channels is up to scratch.
Uuuh, it seems to prove the opposite
People don't buy hardware either. They buy a device, and the device is the integral, inseparable sum total of its hardware and software. Most people probably wouldn't even be able to precisely tell you which bit was which.
Apple recognised that very well written, stable, easy to use, tightly integrated software was a key component in devices. The original iPhone was, compared to other contemporary leading smartphones at release, poorly specified and expensive. But it sold like hotcakes not because of hype, or clichéd notions of hipsters and stereotypes, but because IT WORKED. Other smartphones were a ridiculous, confusing joke of numerous GUI metaphors, design paradigms and serious bugs all wrapped up in one uninspiring package.
Moreover, the iPad and iPhone are often touted as being *all about* the software. "The device gets out of the way". Microsoft "borrowed" this quote, along with several others, when talking about Surface in their launch event. They've realised that you need to have the most unobtrusive hardware you can, which presents the best possible software in the best possible way to the user.
This is why there are app stores, and why app stores generate lots and lots and lots of money. Software is key.
In summary, I can't imagine many ways in which Mr. Asay could have been more wrong in his analysis.
Re: Path dependence
HbbTV is absolutely not based on MHP; there's no Java VM in sight. It's an XHTML+JS+CSS sort of environment (I've done a lot of software development work in this field).
Re: Neither is the number of security vulns fixed: 83
Though the article told me not to, I did bother "checking the web page" and the security bulletin details are now present:
Very few fixes are specific to the Safari application. Everything else is in WebKit. The WebKit engine is shared by e.g. Chrome and most current smartphone platforms. There are many contributors involved in the WebKit fixes, including a few from Apple and some from the Google Chrome team. So, this is a case of both rolling in patches from and rolling out patches to the open source WebKit project.
IMHO, never buy Belkin. EVER.
So for £55 you've a battery which will only charge an iPad to about 40%.
Meanwhile, I found and bought this on Amazon. It charges an iPad to 100% or charges an iPhone fully about 5 times.
The link from my Amazon order history to the product page shows a product that differs from the one I bought last year - mine is white and has a slightly lower quoted capacity - but it seems to be the same general idea and there are several others to choose from with better specs and a much lower price than the Belkin unit reviewed on El Reg.
There's every chance that the minute I so much as go near an aeroplane with it, the thing will burst into flames; but then given my prior experiences with Belkin kit, I'd have much the same risk with the reviewed unit... :-)
The original infringement...
...was about an awful lot more than just rounded corners. This was about the device shape, colour scheme, icons, icon arrangement, icon design, icon colours, box design and so on, all taken together.
This is a great visual comparison of Samsung's comparable smartphone and tablet products from just before the iPhone appeared, then what happened just after:
Going after the 10.1N seems to be on much weaker ground because Samsung have in theory changed their design to avoid the worst of the infringements. The original case, however, was far from frivolous - perhaps that, rather than the opinion of armchair lawyers on web forums, is why the ban on the 10.1 was upheld.
Are you being misleading deliberately?
According to the BBC, yes, some retailers saw a drop, such as Morrisons and Tesco. Sainsbury's reported record high Christmas sales, though; Greggs and M&S also saw a boost due to food sales; Primark and Ted Baker gave very good reports over Christmas.
Ted Baker: http://www.bbc.co.uk/news/business-16542018
Overall, US retail sales rose about 1% compared to last year (astonishing in a so-called recession) and the BRC says that Britain overall did well too, though the high street had to introduce heavy discounts, which means overall profits aren't as boosted as the sales numbers suggest.
So some retailers did better than others... Hardly news. The underlying story, that both online and high street sales *were* improved compared with last Christmas both in the UK and US, is pretty impressive given the rather dire prevailing economic circumstances.
...more green "up" arrows than orange "down" there.
So why just talk it down all the time and make out like the high street bombed? That's just not true at all. Unless the BBC are wrong - but they seem to have more video reports, statistics and external links to industry bodies to back up their figures.
Side by side
Hard to use - you have to hover over the thumbnails to see what they are and the sort order seems a bit random. The 8mp Sensation's in there, though, along with the iPhone 4 and (annoyingly, with the focus set to the wrong place - doh!) the 4S. Viewing the zoomed-out image gives a good feel for overall balance. But the devil's in the detail as usual... 4, 4S, Nexus S, Galaxy S2, Sensation full size:
The iPhone 4S shot is odd - the book's in perfect focus. Looks like it has quite a shallow depth of field in low light compared to other smartphones and where depth of field is an issue with other shots, the focal point is the in-shot-camera's lens, not the book.
The Nexus S does extremely badly; the Galaxy S2 is probably best. Make sure you're viewing at 100% and look at stuff like noise in the background - uniform for the iPhone and very well controlled on the Galaxy S2, while the Sensation and Nexus S are awful, with big blobs of noise reduced mush. Text on the lens is much harder to read on those too.
There's a Nikon D80 shot there too for balance (DSLR circa 2006):
Looking at the depth of field, grain/noise, colour balance and so-on, I'd say both iPhone shots come very close to the same "look". Whether or not you prefer that, though, is of course a matter of personal judgement.
You're paying £35K for a camera because, in part, of its 200MP headline feature. Why on Earth *wouldn't* you do pixel peeping? The individual pixels all contribute in some way; if all anyone wanted was some low resolution photos for a web site - or even A4 prints - then single digit megapixels would do. If you're not interested in the pixels don't buy the camera.
The 4x images are pleasantly reminiscent of Foveon sensor output and give me hope that the rest of the camera market, based on Bayer sensors, might in time adopt similar technology at more "everyday" prices. Then we'd have proper per-pixel sharpness and that extraordinary sense of depth that Foveon images can give, without the drawbacks of excessive shadow noise and colour accuracy problems... And with more manufacturer choices than Sigma-or-nothing.
The 6x images, on the other hand, are just plain broken and surely, at *any* price, such messy, artefact-riddled output would be considered a fault. At £35K, it's a bad joke. I don't care *what* the name on the body of the unit is - it's real, factual, forget-the-brand-mythology-nonsense performance that counts. At 4x and 1x the output looks great, but 6x is broken - perhaps the review unit was genuinely faulty?
Camera "not too bad"?
I really don't know what goes on with El Reg and camera quality comments. Maybe you all have horribly low expectations, but frankly, the sample shots from that HTC are terrible - washed out, colour shifted, heavy-handed noise reduction and over-sharpening. If that's "not bad" I'd hate to see what you classify as "terrible".
Consider this 5mp shot from the now old iPhone 4, with a camera you didn't seem to think was that good either:
Check out the clarity of the text at 100% scale on the postbox and the barely visible halo around letters. Then scroll down to the pavement around the postbox and note all the detail in the bricks and road surface. It's probably a bit over-saturated, but at least the postbox itself is red! Now compare to the very new 8mp "not bad" HTC post box:
It's washed out and the postbox is purple! Despite being 8mp, you can't read most of the text. There's almost no "real" detail in the pavement around the base; just multi-pixel splodges of noise-reduced blur. Scroll up a bit to look at the brick wall mid-left and paving slabs behind and they're just a smoothed out, detail-free expanse of brown and grey.
Bad light, perhaps? Check out the sky from the HTC in this sunny shot:
...and try to find detail in the brickwork. Never mind that, try to find a clear edge to a *brick* that isn't indistinct and blurred. Now look at the old iPhone 4:
Quite heavy noise on the sky, but notice the uniformity; it hasn't got the deep speckling arising from noise exceeding the noise reduction threshold in the HTC, so noise reduction in e.g. Photoshop will be much more successful. And just look at all the brickwork - clear and detailed, with shadow detail, highlight detail, good colour balance and contrast.
Unfortunately your reviews lose credibility when you play Emperor's New Clothes with sample shots that show very different results from the review text that purports to describe them.
Me: "What time is it?"
Siri prints "What time is it" and spins for about a second.
Siri: "The time is 12:59."
An analogue clock which also shows my location and the time numerically is printed underneath.
On the other hand, "Where am I?" really ought to just dump the current location but instead it gives the tiresome response about only supporting US locations presently.
Facepalm quotes galore
Clearly, Mr. Chen has never written JS for more than one browser and has presumably not read any of the numerous articles about JS-based exploits.
"One of the key features of the web is that it's safe to click on any link..."
Meanwhile, Mr. Upson is apparently talking about a different 'web' on some planet other than Earth.
"Before, when you downloaded a native application, you had to install it and you had to trust it"
How would providing a sandbox at the OS level be any better or worse than providing a sandbox built into the browser? Are Google completely unaware of all the code signing and sandbox technology that's been going into the mainstream operating systems for the last few years?
So this isn't about safety...
...it's about providing the legendary ease of development, ease of debugging and stability of C and C++, combined with the legendary reliability and security of the cross-platform seamless environment that is the web browser. After all, HTML and CSS are just *the* best thing in the world for making application-like user interfaces, yes?
Just a handful of quotes in the article have managed to give me stronger reasons to avoid Native Client than any number of comp.risks digests might have done! :-)
Re: Only the reg?
What's that got to do with "the iOS ecosystem"? It was doing just fine before Apple decided to insist on in-app subscriptions and ask for 30% of the pie... But the article isn't about this. It is trying to say that HTML applications threaten the iOS App Store model for certain application genres. I was just pointing out that this is nothing new, indeed, it was the *only* way to develop on iPhone OS 1.
Meanwhile, Apple revised their subscriptions policy a while ago in response to complaints. Publishers can charge whatever they want, on or off the store, so subscription-based application developers aren't (are no longer) being driven away from the App Store by the 30% 'burden'. The following article is from a very pro-Apple site but has lots of links to other sources so you can verify this for yourself:
Incidentally, on the whole short memory thing - Amazon used to charge publishers a whopping 70% to publish through the Kindle Store! They revised this to 30% at the end of 2010, though bloggers are still charged 70%. See "Terms & Conditions" in:
Grim reading. So when it comes to Amazon complaining about Apple's subscription charges, frankly, cry me a river and then remind me who's supposed to be being unreasonably expensive, again?
El Reg has a short memory
The original iPhone had no native SDK. Apple said you could do all you wanted in HTML. A native SDK came later and required iOS 2.0.
That's why Safari has the "Save to Home screen" option. It wouldn't be there if Apple wanted web applications to be discouraged. People have long been writing both HTML and native apps; there's even a web apps "store" on Apple's own site.
"Trouble a-brewing in the cosy iOS ecosystem"? Trouble a-brewing for journalistic relevance and integrity perhaps... With hundreds of thousands of apps on the store, a few publisher's applications moving to HTML only presentation will make little difference to the profitability of iOS.
The languages may have a similar name and a superficially similar syntax, but that's all they have in common.
A brief look at the Chrome extensions API shows interfaces for browser windows, visit history, cookies... Are you *sure* that extension you just downloaded hasn't been sending all your cookies off to some shady remote server somewhere?
Note the "getAll" and "getAllCookieStores" methods. Sure, the manifest needs to specify permissions for that, but we know what users do when an OS asks them about it - "<foo> wants to do <bar>, is that OK?" - "yes".
Being unable to write native code clearly reduces the range of attacks possible on the platform, but claiming that security problems are a thing of the past or trying to punt them off as a 'web problem' is nonsense. Well, it's marketing, which is much the same thing ;-)
Personally, I've adopted the "50 foot barge pole" policy with this particular OS.
There are other options
Wait for the $69 thumb drive in Autumn, or burn the install image to DVD yourself for free (see Google).
That said, Lion includes a recovery partition as part of the installation (OS X can repartition HFS volumes without data loss; the Lion installer does just that). If you reboot and hold down Command + R just after the chime, you'll boot into this partition and be able to play with Disk Utility, reinstall the OS (if somehow things *really* get badly broken) and do a bunch of other stuff. This covers the lion's share (hahaha, etc.) of rarely needed "recovery" tasks for OS X.
Of course, the above does not help if you want to install Lion onto a brand new, blank hard disc. The thumb drive, DVD or (if you've a very, very new Mac) EFI network installation options are needed for that.
Yes, you can
I typed "apple support" into Google and got: Apple - Support
www.apple.com/uk/support as the first hit. I'm no expert, but I reckon that's where you should go to find out what went wrong with your Lion installation.
Your purchase history for all things lower case "i" is indeed associated with your iTunes account. This is one of the reasons some people get all nervous about Apple and its level of knowledge / control over data. But this is also enormously useful; if you completely clean out your machine, you can download again anything at all you ever bought from iTunes. Apple are keeping a sort-of-free backup for you (this process being formalised and extended via iCloud and things like the not free music matching service).
You can repeat the download process for other machines you might own - e.g. a desktop and a laptop - or mobile devices, for content such as music which is relevant to both. The various App Stores are in essence selling you site licences, rather than single use licences, but usually selling them at or below the single use licence price seen on corresponding vendor web sites (where applicable). AFAICT from reading the lawyer-speak, you get to install on up to as "few" as 5 machines, possibly up to 10 and, some claim, as many as you own.
Allegedly a security issue
This is hardly user friendly, but:
...though if possible (and of course, it isn't always) I recommend asking your NAS vendor for a firmware update to support a more secure authentication mechanism.
Redmond Marketing Machine
It is remarkable how out of touch and wide of the mark Microsoft seem to be when it comes to much of their recent marketing. The dreadful "I'm a PC" riposte (no, I'm a person, thanks) to Apple's "Mac vs. PC" campaign is a good recent example.
The latest incarnation features a green-shirted sales drone in a room of someone's house he has invaded and turned into a clone of an Apple store. He has the temerity to greet the homeowner by saying "thanks for popping by" in her own home. Said woman is understandably delighted by this and purchases a brown laptop.
Then we come to this article and, herein, an example of Microsoft's excellent product naming strategy. BPOS - Big Piece Of... Were they *really* unaware of the widely used slang acronym "POS"?
How come even Microsoft's advertising is being outperformed by competitors - even the typically unbearably smug output from Apple? How can a company go so consistently wrong?
Well, not SQL perhaps...
...but Ruby is really rather nice.
But it doesn't matter
With respect, I'm afraid you still don't get it.
The 5GB storage could be reduced to zero free storage. If you wanted to keep the wireless sync capability, you'd pay. Otherwise, you'd just use iTunes and wires again, exactly as everyone has to today. There's no lock-in and no requirement.
Are you reaching for your keyboard to say that Apple could remove wired sync support, perchance? Well why don't they just charge for wired sync or charge for iTunes downloads, then? They've had a decade to do it. Why do you think they don't? It's the same reason they won't charge for rudimentary iCloud usage.
Your "30 days" refers to Photo Stream. It's the length of time a new photo persists on Apple's servers before evaporating. As long as your various devices "see" a network within that time, any new photos will be sync'd down them. Once they're gone from Apple's servers, it doesn't matter, because they're still on your devices and always will be until you delete them - photos are obviously always on the device that took the picture as well as on any other devices that sync'd with iCloud within the 30 day period.
If 30 days was reduced to zero without paying, you'd just lose the wireless auto-sync stuff. So you'd go back to sync'ing via iTunes and iPhoto with wires. See how this works? It's an extra facility, for free, that takes nothing away from what you already do whatsoever.
iCloud is NOT a backup service or a remote data store. It is a SYNC SERVICE with no GUI. It's plumbing. Backup is still left up to the users through Time Machine or a third party network backup service. With MobileMe going away, some people are complaining that paid services are being removed - Dropbox should be ecstatic, because MobileMe's iDisk was a competitor to them. Now it's gone. iCloud provides no arbitrary file storage, so Dropbox just got some new customers. Flickr may pick up some Gallery refugees (the iCloud photo stream is - once more, this time with feeling - just a sync service, not a gallery engine) and there will be a few web hosting companies enjoying picking off ex-iWeb people too. Of course iCloud may regain some of this in future, but if so, it'd only be going back to where MobileMe used to be.
Apple aren't treading on toes - if anything, they're actually stepping off a few.
It's a wake up call
I hadn't really realised just how low IT industry journalism had sunk until the June 6th WWDC keynote and the extraordinary deluge of journalistic tripe that followed.
iCloud boils down to a glorified seamless wireless sync system and as a result, all your data *must* be stored on your *local* hardware devices. If every server Apple owned were to simultaneously evaporate, individual users would lose precisely no data whatsoever.
But where's the anti-Apple, anti-cloud, nonsense lock-in black helicopter tinfoil hat raving lunatic ranting in that? No fun at all. So hey, let's all just make stuff up.
I was on a flight back from NZ with an iPad, via Thailand. I managed to get through five of the six parts of the SE version of the Lord Of The Rings trilogy and about an hour into the final part before we were coming in to land at London. During the brief stopover in Thailand, I did a bit of e-mail and web browsing over a 2G SIM (no 3G there) for about half an hour. I also spent around an hour listening to music when I was trying to get some sleep.
Both were night flights and obviously while in the plane the iPad was in flight mode. The backlight was turned low - about 25% - since the plane was dark anyway. Nonetheless, sitting there watching movies, listening to music and with the half-hour non-flight-mode 2G session, I must have used at least 12 hours of battery. According to the meter there was still 15% charge left when I turned it off for landing, and IME the battery indicator is fairly linear and accurate in its behaviour.
That's the closest to a continuous playback stress test I've put it through. Even so, heavy use since I got it last year has drilled home one thing that's the stand-out feature of the device for me, as someone who hates having to keep charging things: The iPad battery life is absolutely outstanding. It can be completely relied upon to last a full day without charge under heavy use.
That's what Apple say
Apple seem to agree with you! Touch is for tablets and phones, not for monitors. That's what the Magic (ugh) Trackpad is about and why they've got multitouch trackpads in the laptops. The new multitouch features in Lion are all gesture based, not point-and-touch based.
While they may yet converge the GUI of the likes of a 10" iPad and a 27" iMac, it looks some way off - if indeed it ever happens at all. Lion is about taking *ideas* from iOS and using them on OS X, not about replacing it entirely as some IMHO very misguided pundits seem to be claiming.
When the iPhone was announced, Apple said that they thought you needed an entirely new GUI for touch screen devices compared with more traditional form factors. Microsoft, trying to use the WIMP / Start menu metaphor with Windows Mobile, persisted with their dated attempts for one-size-fits-all before finally abandoning it and coming up with WP7. And rare as it is for me to say good things about Microsoft, their GUI looks like a genuinely different approach in both philosophy and design, rather than just another me-too iOS clone. But Microsoft, being champions of the entirely missed point, are now trying to bodge that back onto Windows - and using a totally different implementation language just for kicks.
With Lion and iOS 5, I really don't yet see signs of the Apple approach changing. I just see a remarkably comprehensive set of very well integrated and well regarded products, but from a company who appear to be too fond of control for our own good and seem far too happy to press the legal red button. It's a dilemma - arguably the best technology and unmatched when taken as an end to end solution, but all provided by an arguably questionable company.
It makes me wonder if in the medium to long term, there's a by-then-monopoly break up brewing; iTunes to be run by an independent organisation, for example.
...and so I'll say it *yet again*...
If people believed they couldn't get a virus, they wouldn't have believed a fake web site telling them they had a virus, wouldn't have downloaded the fake anti-virus package and wouldn't have run it and clicked through the install process (with, or more recently without, having to type in admin credentials). And then typed in their credit card details too.
The success of the recent phishing scam / trojan combo can only be put down to people believing that their Macs are definitely *not* immune to viruses. So, quite the opposite of what you suggest.
Here we go again
If people were so convinced that Macs were immune to any kind of malware, why did they download an anti-virus package from a website that said they had a virus? A virus that they believed they couldn't get? Nope; they were all too happy to believe what the web site said.
They downloaded the package, double-clicked on the installer, clicked through the warning that the package may be malicious (you get that for any downloaded executable), clicked through the installer procedure, typed in the admin user name and password and then ran the software. Some even then went and typed their credit card details into its GUI.
If people really *did* believe their machines were immune, they never would have believed the web site saying it had a virus and never would have downloaded, or been infected by, the malicious package in the first place. Oh, the irony!
Don't ask Lewis, ask the IAEA
On the ground:
"At two locations in Fukushima Prefecture gamma dose rate and beta-gamma contamination measurements have been repeated. These measurements showed high beta-gamma contamination levels" ... "results ranged from 2-160 microsieverts per hour, which compares to a typical natural background level of around 0.1 microsieverts per hour. High levels of beta-gamma contamination have been measured between 16-58 km from the plant. Available results show contamination ranging from 0.2-0.9 MBq per square metre"
"Results provided recently by the Japanese authorities range up to 55 000 Bq per kg of I-131 in samples of Spinach taken in in the Ibaraki Prefecture. These high values are significantly above Japanese limits for restricting food consumption (i.e. 2 000 Bq/kg)."
I'm sure "high levels of beta-gamma contamination" are perfectly safe, and the government have made their food safety standards far too tight, since levels of radiation over 27 times higher than the maximum amount permitted are all just fine.
After all, having to spray water from fire trucks through the shattered remains of a building housing a nuclear reactor is just all in a day's work in the world's safest power generating industry.
If you're into tinkering with fun bits of kit, you could do worse than an ARM-based BeagleBoard XM. In addition to Linux, it runs RISC OS too thanks to the shared source effort RISC OS Open (of which I'm part).
Since your screenshots were from OS X, I guess you haven't realised that OS X includes a built-in virtual keyboard. It's fairly well buried though. To set it up for easy access:
* Start System Preferences
* Go to Language & Text
* Go to the Input Sources tab
* Turn on "Show Input menu in menu bar"
You'll see a flag or other placeholder image appear in the menu bar. This is useful for various things, including accessing a very flexible full Unicode character palette for unusual character selections (select "Show Character Viewer" from the menu popped up by clicking on the menu bar icon). The virtual keyboard is accessed by the "Show Keyboard Viewer" menu entry.
The always-on-top keyboard presents a "live" view showing you keys as you hit them and updates itself as modifiers are pressed, so you can see exactly what Shift, Ctrl etc. will do. It's also bidirectional, so you can click on it to enter characters, and it can be resized from pen-or-mouse-friendly to huge-and-finger-friendly sizes.
Still missing the point
Here we go, another too-long post... :-)
An extra monitor doesn't help with e.g. portable group working and may not easily lie next to the keyboard for glance-down reading. There may be window focus and control issues (think "workflow interruption") - a touchscreen monitor may help with that. The 7" Mimo 720S is around £145, so almost 1/3 the price of an iPad, but it's stand-based (can't lie flat on its back due to USB input port), small (7" 800x480), the touchscreen is quite unresponsive (no multitouch etc., don't know how good the OS X driver is for e.g. inertial scrolling) and it requires an attached computer to be any use. All that said, if *all* someone wanted was an extra display for reading, your suggestion is a good one and who knows, I may yet go that way if I decide I just don't want to spend >= £429 on a gadget!
The two monitors I have are high quality units. Even though they're getting old, replacing one or both of them with similar quality, larger, higher resolution unit(s) to achieve greater desktop area would cost considerably more than an iPad does, although perhaps balanced by resale value of the old monitor(s). That would pay for a display-only device tethered to my desktop machine and the mains. Lots of people might find a 2nd monitor or bigger main monitor a good choice if - again - all the other functions a(ny arbitrary) tablet can perform were unimportant.
A graphics tablet connected to a laptop would allow me to use a pen or maybe fingers for quick sketching. However, as with any remote control device, the screen and the control surface are not one and the same; this is why combined graphics tablet and monitor units are becoming very popular with graphic designers and enthusiastic amateurs - who have sufficiently deep pockets. At least it can be a portable solution, ish.
Using a laptop in a meeting creates the "craned neck problem", which I imagine most El Reg readers are familiar with - having the laptop at an angle which makes it visible to two or more people yet also accessible for typing and the trackpad (or a tablet, maintaining visual coordination between tablet and screen) is difficult. Being able to lay the screen flat on the table and still interact with it would help. Better still, if screen and control surface are one, everyone can get involved with the hands-on work. There are laptops with twistable screens which turn into tablets but these tend to be expensive and the full-on laptop compromises of weight and battery life bite. There are interesting cheap devices like Always Innovating's Touchbook too. That might be a contender were it not for its unresponsive touchscreen and a lack of software support. It's a great device, but it doesn't fit my use cases; if you're going to do fat-finger computing, you really do need a fat-finger GUI, not just a launcher on top of a standard WIMP environment.
You're all correct: there are other ways to do all the things I mentioned, but there are devices which fit particular use cases better or make life easier. One can find cheap solutions or kludges to achieve a goal, but it's not necessarily the best way to go about something. Tablet computers are often expensive but, as I said before, this doesn't necessarily make them overpriced or worthless. IMHO the key to it all has been to recognise the need for a completely dedicated tablet UI. Apple got there first in the mainstream (albeit not first overall) with the iPhone, but the post-iPhone smartphone GUI "boom" shows that lots of other people now have the kind of software stacks which could do just as good a job, or better.
The computer industry obviously perceives numerous use cases for tablets, because it's been making them for years and years now. The original question was whether Apple's specific "locked down" spin on a tablet is a good solution and personally, I can see lots of uses for it already; I'm pretty sure that I'll discover all sorts of things I hadn't thought of if I end up owning one. I'm rather hoping, however, that someone will release a really strong Android (or similar) contender in the meantime, because I'd quite like to have a choice of similar devices - right now Apple are the only company to have produced something that actually works properly. While lots of people are complaining about it not being a useful device, lots of other companies are busy copying it :-)
I have uses for it
I'm a software engineer with many years of experience on multiple platforms. Most projects I work on have one thing in common: Large, highly interrelated APIs. Learning the languages for development is relatively easy, even if it does take a while to understand the deeper nuances of each; learning the APIs, on the other hand, is a very big job. API documentation tends to be in heavily linked HTML or well indexed PDF format. Being able to very swiftly move around the documentation, both within a document and by switching between them, is vital to keep productivity high and frustration down.
I usually want both of my two monitors for development (one for the code, the other for the results of the code). Switching between virtual desktops or shuffling windows around interrupts workflow, so I tend to read documentation on a third display - a laptop at present. As with most laptops it has a landscape orientation display which is poor for reading reference material. It doesn't easily sit next to my keyboard for "glance down reading" because it's got a keyboard, which is essentially wasted space; with the display folded back to a viewable angle it has a large combined footprint. So - it can be used for the task, but it's a long way from ideal.
What I really need is an e-reader. An e-ink display is no good because it's too unresponsive for rapid navigation. This puts me into the domain of tablets. For those, I have other use cases - infinite virtual paper for sketches during client meetings; custom applications for mock-up page layouts for web design; that kind of thing. An accurate, responsive touchscreen is thus a benefit. It'd be good to allow remote desktop connections, and sure, decent web browsing for general use when I'm sat square-eyed in front of the TV but want to be even *more* square-eyed at the web. There were numerous applications covering all of this even before the iPad was released.
So we get to the price issue. Before launch, prophecies of a $999 starting price abounded. The web was full of confident assertions that the iPad would cost a cool grand, usually coupled with statements along the lines of "...needs to cost half as much to consider it". So guess what? It costs half as much, but people still say it's too expensive.
IMHO it's not particularly *overpriced*, it's just *a lot of money* for the unit. There's a difference. It's annoying in the UK that the economy has managed to tank so hard (!) that the exchange rate to USD is so unfavourable - a 1.6/1.7 rate or better would probably have meant a sub-£400 entry price - but exchange rates plus mandatory VAT and duty hikes apply to any US company.
What of the competition? What other high accuracy touchscreen tablet computer can I get which will do what I require? Well - not many. The JooJoo tablet proved, by sadly sucking really hard, that everything Apple was saying about Atom chipsets and Flash power consumption is pretty much spot on. The HP tablet, which was going to be the "iPad Killer", was such a great success that HP have killed it and are rumoured to be working on a WebOS alternative. As for Microsoft, they haven't even released any real Windows Phone 7 Series devices yet and Courier was nothing but imagineware.
Many companies claim to be working on similar tablets, though. Considering the iPad is supposed to be - many commenters would have me believe - so very overpriced and very useless, it's surprising that so many other companies seem to want to copy it.
My biggest concern is the lockdown issue - but lo and behold, it's already been jailbroken. Problem solved. Meanwhile, if you don't want a locked or unlocked Apple tablet because you want one with a "full OS", then why haven't you bought one already? They've been around for the last decade or so.
The web supports pictures, you know...
Thanks for a group test of printers which didn't at any time show a single solitary scan or photograph of the printed output from the printers you tested. The reader is left guessing at how things look from your prose and trusting that your reviewer judgement and priorities for image quality happen to precisely match theirs.
Re: HTML 5
Couldn't agree more.
While everyone is arguing over video tags, I love the way that audio tags are mostly ignored. For example, Firefox won't handle MP3s - only Vorbis files. Far more people have heard of MP3 than of H.264 or Theora, so to them the idea that you can't use MP3 audio files will seem particularly bizarre. The same reason exists, of course - Vorbis is open/free, MP3 is not.
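So authors end up encoding every clip twice and feature-detecting at runtime, along these lines (a minimal sketch; the file names are illustrative):

```javascript
// Find a format this browser will actually admit to playing.
// canPlayType returns "", "maybe" or "probably"; Firefox (at the
// time of writing) returns "" for audio/mpeg, so Vorbis gets
// shipped alongside the MP3.
var player = document.createElement("audio");
var canPlay = function (type) {
    return player.canPlayType && player.canPlayType(type) !== "";
};

if (canPlay("audio/mpeg")) {
    player.src = "clip.mp3";   // MP3 for most browsers...
    player.play();
} else if (canPlay('audio/ogg; codecs="vorbis"')) {
    player.src = "clip.ogg";   // ...Vorbis for Firefox et al.
    player.play();
}
```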
These technologically incompetent arguments continue to amaze me. If you must integrate support for specific CODECs into your core browser, then do so. If you really *must* insist on that bloat, fine. But if you encounter a CODEC you can't handle, don't sit there like a lemon; use the OS you're running on. That's what it's there for.
I've said it before...
Every major OS ships with a video playing framework; it's barking mad for a web browser to implement its own framework and CODECs internally. Just call the OS! That's what it's there for.
If you really want to ensure Theora is present, then fine, support that in your browser if you must. And if you encounter H264 video with no internal decoder? Just call the OS.
There seems to be no rational technical argument here whatsoever IMHO.
Re: Could be useful
"And surely the iPad is powerful enough to cope?"
This just shows the Citrix client software; the iPad emulator is running a dumb terminal, not running Windows. The copy of Windows is running on another machine. The whole thing is just a demonstration of a remote desktop client running on the iPad with a publicity stunt screenshot that seems to have worked wonders!
Mired in politics
Here we are running operating systems such as Linux, OS X and Windows that typically require gigabytes of disc space on installation and almost as much RAM on boot up, yet apparently nobody expects the operating system to supply facilities for image decoding, or video decoding, or audio decoding. People seem to be expecting it to be all built into some humungous great monolithic browser "lump" because "we don't want plugins".
Making use of your host operating system's API to render images, render video, render audio, render fonts, handle colour, do printing etc. etc. is exactly what a browser *should* be doing, all the time. Just because the browser recognises the <video> tag and decides it doesn't need to launch an external plugin - and by the way, it could just as easily do that with HTML 4's <object>, for which HTML 5's <video> is really just a shorthand subset - does NOT mean that the browser has to have an entire media playback engine complete with CODECs all *built into itself*. That would be a preposterous solution.
So why is a mandatory video format being pressed for? Because big companies are involved and the whole thing has become mired in corporate politics. This has nothing to do with the engineering abilities of those companies IMHO.
Incidentally, just because a browser recognises <video> does not mean it avoids plugins, just as handling <object> does not mean that a plugin must be launched. The browser may choose to simply show a "play" icon placeholder and, when the user activates it, fire up some external helper application or embed a known video handling plugin instead. The helper application option may well be a sensible thing to do for web browsers where screen real estate is limited (e.g. mobile phones), making it easier for the user to control the video and, in particular, show it full screen.
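To illustrate, pages already do this sort of dance for themselves; a sketch (MIME type and file names illustrative) of falling back from <video> to an <object> that hands the job to whatever plugin or helper the platform provides:

```javascript
// If the browser has no native H.264 decoder, swap the <video>
// element for an <object> and let the platform's plugin/helper
// machinery pick a handler - which is all <video> really adds over
// HTML 4's <object> in the first place.
var video = document.getElementById("player"); // an existing <video>
if (!video.canPlayType ||
    video.canPlayType('video/mp4; codecs="avc1.42E01E"') === "") {
    var fallback = document.createElement("object");
    fallback.setAttribute("type", "video/mp4");
    fallback.setAttribute("data", "movie.mp4"); // illustrative name
    fallback.setAttribute("width", "640");
    fallback.setAttribute("height", "360");
    video.parentNode.replaceChild(fallback, video);
}
```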
Re: 240 lines
In the broadcast world, "lines" of resolution is not the same thing as scanlines. It's giving you an indication of "horizontal" / per-scan-line / columns of picture detail. Moreover, since the Elgato device has an S-Video port on it, presumably they're expecting some users to play back SVHS cassettes. More information on resolution and VHS / SVHS can be found on this NTSC-centric Wikipedia article:
Most VHS and SVHS decks record and play back all 625 scan lines of broadcast PAL (of which roughly 576 are intended to be visible, apart from anything lost to overscan). They vary in the amount of detail captured on each of those lines, though. Nonetheless, even mediocre VHS is usually good enough to reliably record and play back analogue widescreen signalling, and decent SVHS can sometimes record and play back teletext - interesting if you're trying to capture page 888 subtitles.
The NTSC 640x480 resolution of the Elgato device is poor (if depressingly common for "cheap" video capture units) and makes it look rather like a kind of lazy afterthought import with PAL bolted on, throwing away real scan lines which even VHS does record. There isn't much detail to start with so throwing away entire lines of it is just silly - the less you have to start with, the more important it becomes to retain as much of it as possible. No mention of interlaced recording is made and I imagine it does its own deinterlacing job without asking.
If you want to burn a DVD from your 640x480 25 FPS new digital movie, you're going to have to scale it back up to 576 visible lines again and transcode it to MPEG 2, *and* you've lost the interlace information; the net result will most likely be quite rough. If Elgato are going to keep the hardware cheap by omitting a dedicated encoder chip, the least they could've done would be to offer an "advanced" settings window which presents the standard QuickTime encoder settings, allowing the user to choose their preferred CODEC (e.g. DV, AIC or ProRes for a near-lossless master, or MPEG 2 for straight-to-DVD use).