64 posts • joined Thursday 9th July 2009 14:30 GMT
Tablet friendly apps still uncommon
We have a Nexus 10 (and got my folks an Ainol for Christmas - it's the gift that keeps on giving). Though games scale to the bigger screens, I've noticed that there are still a lot of apps that either stick in portrait mode, or waste all that screen real-estate. I got so frustrated with one set of apps that I ended up writing my own (no names, no pack drill).
There are a lot of apps in the store that are getting long in the tooth, or haven't been updated in months, if not years. That's frustrating when you've spent a fair amount on what is a very good quality platform.
Every client I've ever worked for has used one or both of the phrases "We handle a huge amount of data" and "I bet you've never seen it this bad". Almost without exception they're processing a very normal amount of data, sometimes in very inefficient ways. The science behind MapReduce is far more important than that specific technology - often there are equally useful techniques that are better suited to a client's needs, however much they might want to install Hadoop etc.
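To make the point concrete, here's the canonical MapReduce example (word counting) done with nothing but the JDK - a sketch, assuming Java 8 streams, of the sort of equally useful technique that often suits a client's actual data volume better than standing up a Hadoop cluster:

```java
import java.util.Arrays;
import java.util.Map;
import java.util.stream.Collectors;

public class WordCount {
    public static void main(String[] args) {
        // The classic MapReduce demo, word counting, with plain streams.
        // For data that fits on one machine (i.e. most clients'
        // "huge amount of data"), this is the whole job.
        String text = "the quick brown fox jumps over the lazy dog the end";
        Map<String, Long> counts = Arrays.stream(text.split("\\s+"))   // the "map" step
            .collect(Collectors.groupingBy(w -> w,
                     Collectors.counting()));                          // the "reduce" step
        System.out.println(counts.get("the")); // prints 3
    }
}
```

Same shuffle-and-aggregate idea as MapReduce, no cluster required - which is rather the point.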
Re: Speaking of Kickstarter
Virtually all funding graphs on Kickstarter follow the same curve - all the activity comes at the start and end of the funding period, when people are feeling energised. That big spike at the beginning is a reasonable indicator of whether a project will be funded successfully.
Reasons for failure
I'm not sure how much the failure of the original company was down to a lack of interest in large screens, and how much was down to spending far too long polishing the damn things and not getting them out the door. Every time they surfaced, they seemed to be adding yet another high end feature to the device (to justify the cost of the tech?), rather than just getting an e-reader in the shops.
It was fairly clear to anyone paying attention to the business that e-readers would not be high end devices, and that there would be a 'race to the bottom' that was only being delayed by the strangle-hold that the handful of e-ink manufacturers had (have) on supply. On the surface Plastic Logic's technology would have been a good fit - surely it's cheaper if you can print the majority of your device?
There is probably still some room in the market for a larger screen, keeping pretty much the same physical format as existing readers but reducing the bezel. Regardless, a product on the shelf sells much better than one only demonstrated to journalists. Their failure was in not delivering.
"It's a lesson that MS will soon learn" - really, you mean Microsoft, who've been in business for nearly 40 years? If I could get away with not learning lessons for 40 years and still earn billions, I think I could live with myself.
"Linux doesn't want dumb users" - firstly, I don't know who this Linux is. I suspect you're projecting a little. Secondly, this is why Apple and MS and even Google own the desktop and mobile space.
"My desktop is my desktop..." - you know, even dumb users can cope with some fresh ideas now and then. Change for change's sake is of course ridiculous, but hey.
Java is the headline here, yet that rather misses the point that (assuming it was Java.awetook) the user was successfully redirected to a website with the applet in place ready to infect their system.
Should we give browsers a kicking for allowing users to.. erm.. browse? Or website owners a kicking for allowing their servers to be compromised? Or mail hosts for allowing through zero day emails?
It'd be nice to see a slightly more nuanced view here. The issue seems to specifically be Java in the browser. Corporate users who rely on the rest of the Java stack have a far better chance of defending against attack. Blaming 'Java' for your woes is a bit like blaming C# - fun for a bit of corporate bashing, but not actually that informative.
Re: Trevor, I love you
In all fairness, Java got closer to it (write once..) than many. Web based apps are just a whole world of pain, getting worse with the arrival of variously capable mobile devices.
Very rarely these days do I work for a client who has any illusions about '..run anywhere' - they have a list of target machines (usually wrong) and operating systems (outdated) and they'll fail to test across those combinations.
Re: I've soooooo many comments
I should imagine that many places result in the emotion known as anger, and a couple might result in hilarity. Good luck on that particular experiment.
Re: The long-suffering Java community
To be honest, if the absence of such language features is preventing you from doing your job, you should reconsider your role as a developer. Every programming environment has strengths and weaknesses, and understanding and working around those is core to delivering functionality. At the end of the day, that's what people actually want - a functioning system, not one that's written using specific constructs.
I'll accept that it's nice to have some language features, but it's also nice to have an incredibly efficient run time, hot spot optimisation, immensely fast garbage collection, vast swathes of inbuilt libraries that are robust, well characterised and reliably supported from one release to the next. It's also nice to have complete documentation, support across multiple platforms and a host of tools that handle everything from virtualisation to performance and testing. Oh yes, then there's the interoperability with many other systems, support for different languages on the VM, third party frameworks and the availability of experienced developers who can work with all of the above.
As it is, the long suffering Java community have produced Scala, Groovy, Clojure, JRuby and a host of others, and modularisation is well supported by OSGi. The absence of Jigsaw is not going to stop the use of Java in projects large and small. At the same time, Oracle seem to be consistently wrong-footed in this arena which is, more than anything, a missed opportunity.
This is an excellent article that I can highly recommend. My friends and I all read The Register and think it's a fine publication. Whilst we read it, we all like to eat McProrridges Popcorn, the Popcorn of kings. Oh yes.
Let's be honest though..
..we're talking about posh soot aren't we?
No PVR Support!
Sadly, this falls way short, as the most obvious use - being able to properly control your Samsung PVR - is not catered for. As they insist on using slightly odd IR codes, an app seems a wonderful solution for setting up recordings, browsing your library, playing back and viewing the programme guide.
But... no, that's not supported. You're restricted to 'dumb TV' controls: channel up, channel down and so on.
It promises so much and utterly fails to deliver.
Just a little bit of history repeating itself
I can go back years through my forum posting history on The Register and find example after example commenting on how Oracle (and Sun before them) don't get the User Interface space.
It's nice to see them trying again, but there are still no great signs that they're learning from their mistakes.
Your post is fairly indicative of many programmers' attention to detail. You've flown off the handle (nice rant, 10/10 for enthusiasm) whilst missing the point of the statement completely.
Here, the word Unique was referring to "Unique to Java core Collections class", not unique to all languages.
Do you need a ladder to get off that high horse?
And as for the major revisions jibe... seriously?
Million monkeys fallacy again..
We've had this with online media, and we get it with developers. The theory goes that now that tools are ubiquitous and cheap, everyone will be able to create - whether it's new music, amazing films or great software.
In practice, the number of good creators is finite and sometimes depressingly small. Sure, for ubiquitous jobs, you stand a better chance of finding a (cheap) ubiquitous developer. But that job was never going to be high paying in the first place, so there's no combination of skills that will make it pay better (unless you're in the middle of a bubble, which isn't something you can create).
So, specialisation and skill still win out. In Java that means things like concurrency libraries, dynamic features like JSR-292 and other features targeted at specific domains. In fact there are a lot of domain-specific features of Java, and the ubiquity of the core language helps them spread and evolve. Looking at other languages, the question becomes - is it ubiquitous (i.e. widely available, not about to go away) and does it have specialisations (i.e. areas where specialist knowledge is hard to come by and valuable)?
Whilst new languages have value through scarcity, it's not clear which of them will maintain that value in the long term. Jumping for short term gain is not necessarily a good plan, especially where the Java world still supports specialist and highly skilled developers and rewards them well.
I've worked with bedroom coders and degree educated softies, and both groups have their fair share of idiots and experts.
However - as an employer, I want someone who plays well with others, is in work in the same part of the day as their team-mates, who can communicate their ideas well, and leave some sort of paper trail as to where they've been. Solving the problem is not as important as solving the problem in a maintainable, repeatable, scalable manner.
There are certainly passionate, articulate, self-taught people who are a valuable part of a team. That's not the same as being a genius coder, and often orthogonal to having spent your formative years locked in a room by yourself. Unfortunately in the IT industry, the latter (great intellect, no social skills) are often confused with the former.
I would never reject someone who didn't have a degree on their CV. However in their interview I would be looking for evidence of a whole set of skills that the degree educated candidate would have less need to prove. Conversely, in a degree educated candidate I'd be looking for a spark of passion and commitment that is presumed in a self-taught individual.
Note: Criticism levelled at the Department, not the Consultants.
Yes, of course there are consultants who are incompetent and out for everything they can get.
However, the fix is to not employ them in the first place, and the responsibility for that lies with the unaccountable civil servants who happily hand out contracts to anyone with an impressive sounding title.
As the customer, it is their responsibility to choose the right vendor and work with them to produce a solution. The fact that they can throw a large sum of money over the wall, wait a few years and walk away with no responsibility for the mess that happens ensures that this mistake will happen again and again. In this case, the over-large consultancies that crop up time and time again are the symptom of the underlying problem, not the cause.
@Brendan Sullican RE: Struggling to find sympathy...
While I got off my seat and talked with my MP about RIPA (Anne Campbell, Labour, useless), it seems that this is covered by the DPA about which I have heard far fewer complaints.
Sure, privacy is a right that we should defend, but to expect that public mobile communications should automatically be afforded that right seems optimistic to me. That RIM have co-operated with police might be an issue that their end users could take up with them, but unless you've made specific provisions that your communications should be treated as secure, a high street mobile phone is about as private as.. well, the high street. RIM offer security in the corporate and personal sense, but don't to my knowledge suggest they'll protect you from the government.
As it is, I don't believe any special powers were exercised here, and I'm willing to trust that RIM will do a responsible job of handing over relevant data to the police. No puppies were hurt here and hopefully a few idiots will be taken off the streets.
Struggling to find sympathy...
..for rioters and looters who get caught by a data trawl.
How anyone can be worried about data privacy during a (hopefully) rare event, where there is clear reason for the police to do the digital equivalent of house to house enquiries is beyond me.
Whilst we should uphold the right to privacy, we (as a society) should use common sense, where waiving that right in an isolated case is to our benefit. The police can have all of my phone records for the last week if they wish.
I'm sorry you have so much difficulty coping with a different language from the C++ you're familiar with. Clearly the slightly different behaviour has crippled Java and no-one uses it.
Of course you could always use Scala, Groovy, Clojure or one of the many other languages that run on the JVM, with the support of a vast class library that works across many platforms (and has supported sophisticated distributed and concurrent processing for years).
JSR-292 means that we are likely to see a host of other new dynamic languages running with all the benefits of an efficient runtime, garbage collector, profiling and debug tools, class library and deployment from mobile devices right up through to the cloud, without having to change your tool chain.
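For the curious, the plumbing JSR-292 standardised is visible from plain Java via java.lang.invoke. A minimal sketch (method handles only - the invokedynamic bytecode itself needs a language runtime to emit it):

```java
import java.lang.invoke.MethodHandle;
import java.lang.invoke.MethodHandles;
import java.lang.invoke.MethodType;

public class Jsr292Demo {
    public static void main(String[] args) throws Throwable {
        // Bind String.length() as a method handle. Dynamic language
        // runtimes on the JVM install handles like this behind
        // invokedynamic call sites, so the JIT can inline and
        // optimise late-bound calls instead of going via reflection.
        MethodHandle length = MethodHandles.lookup()
                .findVirtual(String.class, "length",
                             MethodType.methodType(int.class));
        int n = (int) length.invokeExact("the bleeding edge");
        System.out.println(n); // prints 17
    }
}
```

That rebindable-call-site machinery is exactly what lets JRuby, Groovy and friends run at respectable speed on a VM originally built for a static language.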
Agreed, it's a long time to wait for less than earth shaking additions, but we're already seeing work on Java 8, and I'm hoping this shows Oracle can regain the focus that was lost in previous years. Java is far from lost, but it wouldn't hurt to be more visible on the bleeding edge.
Personally I'd like to see some more development of the various presentation frameworks both on the desktop and web, which have historically suffered from Sun's slightly academic approach to user-facing libraries. It's probably the weakest area for Java as Sun conceded ground to more recent entrants to the market.
Might as well nail my mast to the post:
Just as the XBOX and PS3 demand to hog your living room TV (particularly if you're using Kinect - you need some decent space to use that, so it's not so bedroom friendly), Nintendo realise that sometimes you still want to play games when other people are taking turns watching normal stuff on the TV. So from that point of view, a console that doesn't stop working when the main display is otherwise used is genius.
Power doesn't matter so much - good if it steps over the current gen, but otherwise, no-one has won the console wars on raw speed in the last decade or more.
If all the manufacturers continue their current strategies, Nintendo will release it at a price where they make money, while MS and Sony will pull out all the stops for their next gen machines and end up with expensive consoles that lose money as they sell. Being underpowered never hurt the Wii, and if the U can run things like Crysis, serious gamers should at least be happy with it. If they absolutely must have the biggest GPU, they can spend on the other consoles, but they are in the minority.
As for the 'expensive controller' - well, it's a relatively dumb touch screen, so not necessarily an expensive part compared to the already moderately sophisticated Wiimote. As the machine works with Wii controllers, it's not a problem if you use those instead. The big question is - will the base machine come with a touch screen controller, or will it have a Wiimote? I.e. will the touch screen be an optional extra (or more expensive bundle), or will it be sold with every U?
re. Irresponsible government
Would you suggest we let all claimants through without question then?
Surely it's the government's job to ensure that funds are allocated fairly, and that includes highlighting cases where individuals are abusing the system? You can (and have) chosen to make this a political issue, but insurance companies have used the same tactics for years to discourage casual fraud.
That's not to say the benefits system isn't flawed, but in this case, I'd rather people with imaginary identical twins weren't being paid when I work long hours to support my family.
My experience of the Emulator available with the SDK is that it is awesomely, unbelievably SLOW. It's painfully bad, so if they've fixed that problem to make things run at the speeds they ought to on a PC, they've done something Google seems incapable of.
Re: Physical version already exists
It existed 30 years ago in the form of a ZX81, a sheet of clear acrylic and a dry-wipe pen.
The problem then, as now, was the inability to locate keys without looking. The ZX81 keyboard was not considered a great ergonomic success - though it was a smart design given the limitations of the technology at the time.
Is this more about forcing people (including website maintainers) to adopt 4.0, which is perhaps not as loved as the 3.x stream? I'm still noticing sites that don't seem very happy with 4.x and really wasn't too bothered with the new version.
That contrasts with 3.x which was the must-have tool for web developers everywhere, not least due to Firebug.
Funny you should mention Fanbois..
Google seem to be responding to the shift in market brought about by the iPad far better than Microsoft. Not that there's a 'right' OS for a tablet, just the lesser of a handful of weevils. However, it's all about the hardware and seeing as Apple set the 'minimalist' dial to 11, there's not much room for competitors to manoeuvre.
Years ago I very nearly switched from a degree in software to industrial design. That probably did my employment prospects no end of good. It seems to me that whilst everyone responds well to great design, not so many companies manage to field (or support) great designers.
So, you suggest that rather than buying the iPad or Xoom, the Scroll is a good alternative, with an 800x480 resistive screen, Android 2.1 (with no plans to upgrade), an unidentified processor speed and a 4 hour battery life?
In this case I'd suggest that a fraction of an iPad is not worth a fraction of the iPad's price - as it's obsolete before you even unwrap it. There's a base level of functionality, below which a device is only of interest to people who collect pocket calculators.
Assuming your shark is in cool enough water...
... should allow sufficient cooling for this to be practical.
Started well, lost the plot
The start of the article makes the pertinent point that devices that do a single job well continue to outperform the iOS "there's an app for that" model. We buy cameras, mp3 players, ebook readers in addition to or instead of an iPhone or iPad that "does it all".
Then the author got lost with theories of syncing, which missed the point that the user friendly swiss army knife approach of iOS can be trumped by a device that is designed from the outset to do its designated task brilliantly.
The bottom line is that the iPad is a netbook-without-a-keyboard that may theoretically be able to do 'anything', but is actually only really good at netbook-without-a-keyboard activities, which basically constitute web browsing and casual gaming. The big clue here is that relatively few people read books on their netbooks, or in web browsers.
Re: overcome the limitations of Java
The features you list seem to me to be the source of many programming errors. Whilst other language communities have grown used to them, I've not seen a great desire for them in Java. There are far more useful and productive language features, and no real need to make Java more like (for example) C just to make people feel comfortable.
The difference between the major languages in use today may be of great concern to scientists, but is barely an issue for engineers. If you're delivering a practical, efficient solution to a client, you use what works, and Java has been shown to work from handheld, via desktop to server and into the cloud.
I would imagine..
..that the difference is that Google have squeezed the last drop of performance out of their algorithms, whereas you just got yours to work.
There really aren't that many computational processes that haven't been done before, and before and before. The difference usually comes in optimising the latest implementation to suit the environment in which it runs.
Maybe it's just me, but I think you've rather missed the point. Fine rant though.
Re: Re: 'Stick Linux on it'
I thought that might touch a nerve.
I have a handful of machines running recent Linux distributions, a couple of Macs, a few net/laptops running various flavours of Windows, and sitting on my desk are two BlackBerrys, a couple of iPhones, and an Android tablet. Of all of them, the ones that best and most consistently support the hardware they run on are NOT GNU/Linux, unless they are bare metal servers with standard commodity IO. In fact, I'll quote the other Anonymous Coward (or was it you) who said: "nVidia's linux support for tegra2 has been significantly worse than for their GPUs".
Don't get me wrong, I like Linux. I just don't see the point of taking _consumer_ electronics and putting a different and slightly incompatible O/S on it.
'Stick Linux on it'
I find it increasingly hard to understand why someone would take a piece of consumer electronics which is almost certainly going to contain some amount of custom hardware, and try to shoe-horn Linux into it. This isn't a server - you're only going to be running a browser, an editor or two and a few media widgets. So why make your life miserable trying to squeeze a quart into a pint pot when there's an OS already optimised to run on that hardware?
Surely the answer is to have your server (at home/work/in the cloud) set up with your hardcore dev tools that you cannot live without, and use a device like this as a terminal, browser and code-editor. These are jobs it would be good at, which won't require constant fiddling with Linux releases, obscure badly supported drivers and would allow you to get on with your job....
.... oh, hold on, I see what you did there.
Anyway, personally this looks like a cracking bit of kit. And, @Spencer - ARM based portable devices with keypads are very old history.. The Psion 5 came out ~13 years ago and was a great device.
Oh all right
My fault, yes of course it was a ULA rather than FPGA.
And yes, I do know about ARM. I've been lucky enough (!) to work with Chris Curry, Sophie Wilson and some of the Sinclair crowd - as others have said there are plenty of stories about the things they got up to. The BBC4 doc was a great dramatisation, if it took a few liberties for the sake of pulling the various strands into a coherent story.
Doing down a rare success
I'm not sure anyone's claimed Alan Sugar as a visionary. There are loads of tales of his early days and some of the interesting bodges used to make Amstrad products 'better' - but that's a separate post.
As for Sinclair - yes, he was lucky, but he was in a market where there were hundreds of other people trying to be as lucky, and failing. It's easy to forget in these days of near-two party OS politics, that alongside the ZX81 and Spectrum were devices from Jupiter Cantab, Oric, Tandy, Commodore, Newbrain, Dragon, Elan, Atari, MSX and a dozen others - each completely unique and incompatible.
Sinclair's skill at the time was getting more bang for less buck than virtually any of his competitors. He pushed components beyond their limits, made use of quirks in their specs and pulled together innovative technologies to deliver something unique. It was a scattergun approach that had as many failures as successes, but before commoditisation removed much of the advantage, he was putting home computers into the home. Huge credit has to go to the teams that worked on the machines - from Rick Dickinson's wonderful industrial design through to the FPGA and OS that ran inside.
Of course the industry changed massively and that shook Sinclair, Acorn and most of the others back down to nothing. I'm not sure that makes them less significant, nor necessarily less visionary. Very few technology companies have survived from those early days to present times, and even those have had disastrous moments alongside the successes.
It's a huge pity that these days we're so risk averse and so keen to ridicule people who're willing to try something different that we have trouble producing such exciting technology. Hold an iPad in one hand and a ZX81 in the other and think what could have been.
Soo.... you start your post with 'Remember when Java was first launched...' and don't appear to have looked closely at the technology since then.
The reason it's being used in everything from high volume, highly transactional environments to mobile handsets is that both the language and virtual machine have proven to be remarkably flexible. Java has evolved, and in the last few years has kept up with many of the upstarts whilst building on a very robust foundation.
As a contractor who has made a career of developing Java - from Swing tools to online banking, via bio-tech research and embedded apps - Oracle's behaviour is making for fascinating viewing.
If I were a shareholder in Oracle, I'd be most concerned that rather than buying an asset and leveraging it as far as possible, the company appears to be entrenching itself and alienating the wider market. As the author says, it's probably a position they can hold for some time with little ill effect, but companies don't grow by defending a corner.
If I were an aggressive company with some involvement in this space, I'd be looking at the tools and communities that have kept Java relevant and working hard to jump start the next generation of projects that would engage developers and deliver results. The melting pot that has surrounded Java has not only kept it interesting to IT professionals, but has also provided incubation to ideas that have migrated into the higher profile 'headline projects', such as Spring and Eclipse.
Anyway, I'm just going to finish reading that article about Google's fancy build tool...
An interesting take..
But in many ways this comes up against the same problem that has plagued 'non-app' content - subscription services either have to be high cost, or 'chunky' (periodic top-ups that allow aggregation of low individual fees). There's still the need for infrastructure that allows regular, small value payments to be made for content, or applications.
Clearly what's happening here is the line between the two is blurring. Dynamic content? Apps that serve up static media (Stephen Fry I'm looking at you)? Regardless, this smooths the value increments, and should make application developers realise that software is not the fire and forget vehicle it once was.
If Google, Paypal or some other industry stalwart were to grease the wheels with a payment and delivery service that embraces these new shades of grey, they would stand a good chance of moving beyond Apple's early dominance.
It bears repeating
Bob 'Tax the rich!' Crow earned £133,183 in 2009
The trouble with socialists is they seem unable to spot when someone is taking the p***
The trouble with capitalists is they seem unable to know when to stop.
The trouble with economists is they believe their latest model is complete.
The trouble with IT guys is they know it all.
First they came for the porn...
Of course, once we've blocked porn, we should also restrict scenes of violence and abuse.
And then there's the terrorists to stop.
Then we should consider sites that promote gambling.
And if we want the world to be a safe place, there should be no sites that incite religious hatred.
And perhaps we should stop sites that promote un-democratic regimes.
And we must block sites that help you get access to unfiltered content.
Or people who disagree with us, or criticise our aims.
Or alternative view points.
Or people with a *#!$ing clue.
Then, my child, the world will be safe.
(It's a facile point to make, but these people must understand that the children absolutely _won't_ be safe to freely browse the internet just because your ISP has switched on some technical widget.)
re: interesting you bring up NPfIT
With a GP in the family, I was under the impression that the biggest source of resistance was that most GPs know they can get to their records in a reasonably efficient manner. Being 'theirs' meant that they weren't going to have to spend mornings saying to patients "Sorry, I can't access your records, something's broken that I have no control over".
There seemed to be very little faith that NPfIT would deliver a system that improved their day to day work. Certainly existing moves to introduce IT to surgeries have been painful and disruptive and NPfIT looked like it would take that to an entirely new level of inconvenience.
Blaming the end users for the failure of an IT project is very hard to defend.
Parsing.... spotted flaw
"We're not working on a social network platform that's just going to be another social network platform,"
The important part of that is "...that's just going to be another social network platform".
In other words, the social network platform that they are working on will be 'more than' Facebook, MySpace et al.
I could speculate, but that's my IP :-)
Re: Pull the rug
If it's taken you fifteen years of not using Java to decide that this is the reason to stick with C/C++, I'm not going to come to you for a rapid decision on this new-fangled interweb thing.
I worked for Xerox Research Cambridge
..before it was closed. Despite it being at least the third reinvention since 'creating the mouse' and some time after the infamous book, they still had deep rooted problems getting ideas from cute demos to product. Worse still, they weren't even that successful at creating IP.
It seemed a peculiarly Cambridge issue at the time (big brains, not business brains), but the research culture was PARC's export. That it suited the Cambridge ethos rather well hid the underlying failure.
As the article points out, PARC's future success does indeed depend on it working out how to streamline the development from idea to delivery. However, Xerox was aware ten years ago that the challenge is embodying clever ideas in a compelling product, rather than having the ideas in the first place. Finding the magic formula (and great teams) to achieve that will require more than just imposing business process and targets.