77 posts • joined Thursday 9th July 2009 14:30 GMT
Given how the Register produces fascinating 'history of computing' articles, the fuzziness around the development of the first ARM processor, and what exactly Acorn did is a bit disappointing.
As is the lack of technical detail of how exactly Apple surprised ARM with their implementation. If someone tells you that they were surprised by something, wouldn't you ask what and why?
Re: Missing the point
If you're building thousands of devices, the same device is available as a solder on module and I imagine the company can sell you a private bit of cloud that you can operate yourself. They've solved the integration and configuration issues that you would otherwise have to deal with if you buy one of those other SoCs.
The fact that we're still seeing different offerings like this suggests to me that none of the other products have created the universal solution to IoT devices. To claim this is less valid when all of its competitors fall down in various ways is somewhat mean-spirited.
Re: Missing the point
Except (as I understand it) your zigbee module (that offers no real cost savings or simpler integration than this unit) needs a separate base station and further integration. If I want to control a zigbee unit from my android phone, how much extra kit do I need to buy and configure?
Don't get me wrong, ZigBee, RasPi and all the others have specific niches - and it seems to me that this has its own useful niche too. All the posters attacking it because it isn't one of those other bits of technology make no sense to me.
Re: Do you need a degree to...
I'm sorry, but good grief, your idea of computer science sounds terrifyingly like the typical gormless middle manager who's convinced he doesn't need to understand the technical details because he's got a firm hand on the budget.
To be honest, the thing that would most invigorate the entire industry is returning the focus to getting actual value from the work we're doing. Creating something. A real result rather than the hyped-up nonsense of improved social media penetration and merry-go-round startups whose only purpose seems to be to insert themselves in the middle of a perfectly functional value chain to no good end.
We can create and transform industries with the work we do. We can discover new science and help people live longer, healthier lives. We can deliver outstanding entertainment and we can improve every one of the human senses. We can teach. We can save lives and predict deaths. We can create delightful experiences. We can aid discovery and remember for you. Yet all of these things get lost in the mire of big data projects with no discernible outcome or over-hyped startups with paper-thin business models that boil down to selling more adverts. If you ask people today what computers do for them, they'll tell you Google and Amazon - not for the feats of engineering that uphold those companies, but for the experience of being sold stuff at every point of interaction with a machine. If you want people to be excited about computers, we need to start being excited ourselves, and to throw off the hype around businesses whose only value is coincidental to the actual technology and function being created.
Doctors and lawyers get a good rap because people can see what they do. Make people well, prosecute the guilty, protect the innocent. Computer scientists have lost their identity to telephone sanitisers and snake oil salesmen. Real outcomes excite people, not vague nonsense.
The same applies to kids. They want to see outcomes. In our day, getting an LED flashing was still relatively novel - and a sufficiently big step that moving on to a fully working robot seemed only another small step away. The excitement and imagined possibilities drew us in and we learnt around them. These days, getting a RasPi to light an LED or launch a website is utterly mundane and children are left asking where they can go next. The things that excite them (high quality games, Facebook et al.) seem just as far away as they were when they didn't know how to get a Linux partition to boot. The challenge to educators is to get children to a platform where they can achieve things that involve them before having to understand decades worth of technological advancement.
Missing the point
I think a few of the commentards are missing the point here.
Ignoring the 'I can hack anything' market (which is small, but noisy), this is aimed at the 'deliver something to the consumer' market which is much more boring but far larger in scale. If I, as a manufacturer of light fittings, wanted to make a light fitting that I could control from a mobile, I'd want a module that does all the boring stuff for me, leaving me to do the last step (switch the light on). I'd want it in a tiny form factor and available as a solder-down module that I can just fit into my light fittings.
Arduino, RasPi and so on don't do that. They're large and designed for hobbyists - general purpose hacking devices. That's great, but they're the most expensive option when it comes to integrating with a bit of consumer kit that just needs to perform simple functions and be controlled over the 'net.
I'm not sure if this is the 'right' answer, but it's a lot better than being told to hack around with a Pi just to perform the most basic of activities.
That's the problem with left wing readers - they get confused between economic arguments and moral/social ones. You can have a 'right wing' view on the world and still believe granny should get her hip - but you don't start mixing it up with some made up economic benefit.
Try re-reading the article without the knee-jerk reaction. Just because his politics are different to yours doesn't make the discussion moot. And, again, the article (as I read it) doesn't argue against national health care.
Seriously, why do people think they have some monopoly on caring? This is the biggest fraud perpetrated in modern politics - that somehow any given political group has the unique ability to care about old ladies and children. This comment will be downvoted, of course, but it doesn't stop me wanting to see my dear old mum with a knee replacement, or my kids getting a good education.
Agreed. Teabags only need a quick swill round or the tannins overwhelm the taste.
At the risk of losing any tea tasting credentials, I've fond memories of tea served by a rather lovely flatmate who would add a generous slug of brandy to a mix of earl grey and assam. Most evenings would end up with us around the kitchen table setting the world to rights.
I have one in my loft, which has only been switched on a couple of times. I have a feeling the keyboard membrane has suffered in the mean time - when it was last tried out, half a row of keys didn't respond.
One day I'll find the time to have a proper play with it.
I don't think that's the case. The vast majority of consumers use tablets for casual browsing and gaming and are happy to cut down on their demands to fit a form factor and user experience that is more convenient and instantly to hand.
Gamers and pro-sumers are more likely to get their speed-fix by getting a new Playstation or XBox this Christmas, and there are some bargain retina-display laptops around in the wake of Intel's Ultrabook push.
The bottom line is there are very few apps demanding enough to warrant a high-end processor and display. I've never heard anyone complaining "My tablet isn't powerful enough" - expectations have been managed, and so price becomes a factor. Bargain Android tablets should do well, and the big boost to the ecosystem will help the high end devices sell as the platform is seen as increasingly ubiquitous.
Re: Dyson have a point.
Name names, which is this mythical super-efficient hoo.. dy.. vacuum cleaner? Enquiring minds have nothing better to do during their lunch break.
There still seems to be many cases of "if we build it, they will come" - mining any data that comes to hand in the hope that some third party will see value in it.
The bottom line has to be that a company has to be willing and able to take some clear action in response to big data insights. Knowing more about your business/customers does not in itself unlock value.
On the other hand, I've seen clients add 8 figure sums to their annual turnover by incorporating simple insights to their interactions. If a company can make that connection, from insight to action then it can see good returns on big data.
Storm is a useful framework, but Nathan Marz seems to have been overwhelmed by the demands of a wider community - documentation is poor, features unexplained and integration with the rest of the enterprise spotty.
If moving under the Apache banner can address these issues, Storm offers some unique capabilities.
The complete data centre example is perhaps a bit misleading. When you're running with continuous/one touch deploy, speed matters. Being able to rapidly deploy or revert a configuration, preferably without absorbing all the time of your dev-ops guys becomes very valuable.
Tablet friendly apps still uncommon
We have a Nexus 10 (and got my folks an Ainol for Christmas - it's the gift that keeps on giving). Though games scale to the bigger screens, I've noticed that there are still a lot of apps that either stick in portrait mode, or waste all that screen real-estate. I got so frustrated with one set of apps that I ended up writing my own (no name, no pack drill).
There are a lot of apps in the store that are getting long in the tooth, or haven't been updated in months, if not years. That's frustrating when you've spent a fair amount on what is a very good quality platform.
Every client I've ever worked for has used one or both of the phrases "We handle a huge amount of data" and "I bet you've never seen it this bad". Almost without exception they're processing a very normal amount of data, sometimes in very inefficient ways. The science behind MapReduce is far more important than that specific technology - often there are equally useful techniques that are better suited to a client's needs, however much they might want to install Hadoop etc.
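To make the point concrete: the "science" behind MapReduce is essentially group-by-key aggregation, and for the very normal data volumes most clients actually have, it can be done in memory without a Hadoop cluster. A minimal sketch (the class and method names are mine, purely for illustration):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class WordCount {
    // The classic MapReduce demo, word counting, expressed as a plain
    // Java stream: "map" is the split into words, "reduce" is the
    // group-by-word counting. Same technique, no cluster required.
    static Map<String, Long> count(List<String> lines) {
        return lines.stream()
                .flatMap(line -> Arrays.stream(line.split("\\s+")))
                .filter(w -> !w.isEmpty())
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        System.out.println(count(List.of("big data", "big hype")));
    }
}
```

If the data genuinely outgrows one machine, the same function decomposes naturally across nodes - which is exactly why the technique matters more than any particular installation of it.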
Re: Speaking of Kickstarter
Virtually all funding graphs on Kickstarter follow the same curve - all the activity comes at the start and end of the funding period, when people are feeling vitalised. That big spike at the beginning is reasonably indicative of the likelihood that a project is funded successfully.
Reasons for failure
I'm not sure how much the failure of the original company was down to a lack of interest in large screens, and how much was down to spending far too long polishing the damn things and not getting them out the door. Every time they surfaced, they seemed to be adding yet another high end feature to the device (to justify the cost of the tech?), rather than just getting an e-reader in the shops.
It was fairly clear to anyone paying attention to the business that e-readers would not be high end devices, and that there would be a 'race to the bottom' that was only being delayed by the stranglehold that the handful of e-ink manufacturers had (have) on supply. On the surface Plastic Logic's technology would have been a good fit - surely it's cheaper if you can print the majority of your device?
There is probably still some room in the market for a larger screen, keeping pretty much the same physical format as existing readers but reducing the bezel. Regardless, a product on the shelf sells much better than one only demonstrated to journalists. Their failure was in not delivering.
"It's a lesson that MS will soon learn" - really, you mean Microsoft, who've been in business for nearly 40 years? If I could get away with not learning lessons for 40 years and still earn billions, I think I could live with myself.
"Linux doesn't want dumb users" - firstly, I don't know who this Linux is. I suspect you're projecting a little. Secondly, this is why Apple and MS and even Google own the desktop and mobile space.
"My desktop is my desktop..." - you know, even dumb users can cope with some fresh ideas now and then. Change for change's sake is of course ridiculous, but hey.
Java is the headline here, yet that rather misses the point that (assuming it was Java.awetook) the user was successfully redirected to a website with the applet in place ready to infect their system.
Should we give browsers a kicking for allowing users to.. erm.. browse? Or website owners a kicking for allowing their servers to be compromised? Or mail hosts for allowing through zero day emails?
It'd be a nice to see a slightly more nuanced view here. The issue seems to specifically be Java in the browser. Corporate users who rely on the rest of the Java stack have a far better chance of defending against attack. Blaming 'Java' for your woes is a bit like blaming C# - fun for a bit of corporate bashing, but not actually that informative.
Re: Trevor, I love you
In all fairness, Java got closer to it (write once..) than many. Web based apps are just a whole world of pain, getting worse with the arrival of variously capable mobile devices.
Very rarely these days do I work for a client who has any illusions about '..run anywhere' - they have a list of target machines (usually wrong) and operating systems (outdated) and they'll fail to test across those combinations.
Re: I've soooooo many comments
I should imagine that many places result in the emotion known as anger, and a couple might result in hilarity. Good luck on that particular experiment.
Re: The long-suffering Java community
To be honest, if the absence of such language features is preventing you from doing your job, you should reconsider your role as a developer. Every programming environment has strengths and weaknesses, and understanding and working around those is core to delivering functionality. At the end of the day, that's what people actually want - a functioning system, not one that's written using specific constructs.
I'll accept that it's nice to have some language features, but it's also nice to have an incredibly efficient run time, hot spot optimisation, immensely fast garbage collection, vast swathes of inbuilt libraries that are robust, well characterised and reliably supported from one release to the next. It's also nice to have complete documentation, support across multiple platforms and a host of tools that handle everything from virtualisation to performance and testing. Oh yes, then there's the interoperability with many other systems, support for different languages on the VM, third party frameworks and the availability of experienced developers who can work with all of the above.
As it is, the long-suffering Java community have produced Scala, Groovy, Clojure, JRuby and a host of others, and modularisation is well supported by OSGi. The absence of Jigsaw is not going to stop the use of Java in projects large and small. At the same time, Oracle seem to be consistently wrong-footed in this arena which is, more than anything, a missed opportunity.
This is an excellent article that I can highly recommend. My friends and I all read The Register and think it's a fine publication. Whilst we read it, we all like to eat McProrridges Popcorn, the Popcorn of kings. Oh yes.
Let's be honest though..
..we're talking about posh soot aren't we?
No PVR Support!
Sadly, this falls way short, as the most obvious use - being able to properly control your Samsung PVR - is not catered for. As they insist on using slightly odd IR codes, an app seems a wonderful solution to being able to set up recordings, browse your library, play back and view the programme guide.
But... no, that's not supported. You're restricted to 'dumb TV' controls, channel up, channel down and so on.
It promises so much and utterly fails to deliver.
Just a little bit of history repeating itself
I can go back years through my forum posting history on The Register and find example after example commenting on how Oracle (and Sun before them) don't get the User Interface space.
It's nice to see them trying again, but there are still no great signs that they're learning from their mistakes.
Your post is fairly indicative of many programmers' attention to detail. You've flown off the handle (nice rant, 10/10 for enthusiasm) whilst missing the point of the statement completely.
Here, the word Unique was referring to "Unique to Java core Collections class", not unique to all languages.
Do you need a ladder to get off that high horse?
And as for the major revisions jibe... seriously?
Million monkeys fallacy again..
We've had this with online media, we get this with developers. The theory goes that now tools are ubiquitous and cheap, everyone will be able to create - whether it's new music, amazing films or great software.
In practice, the number of good creators is finite and sometimes depressingly small. Sure, for ubiquitous jobs, you stand a better chance of finding a (cheap) ubiquitous developer. But that job was never going to be high paying in the first place, so there's no combination of skills that will make it pay better (unless you're in the middle of a bubble, which isn't something you can create).
So, specialisation and skill still win out. In Java that means things like concurrency libraries, dynamic features like JSR-292 and other features targeted at specific domains. In fact there are a lot of domain specific features of Java, and the ubiquity of the core language helps them spread and evolve. Looking at other languages, the question becomes - is it ubiquitous (i.e. widely available, not about to go away) and does it have specialisations (i.e. areas where specialist knowledge is hard to come by and valuable).
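As a small illustration of the kind of specialist knowledge I mean: java.util.concurrent rewards developers who know which primitive fits which contention pattern. The sketch below (class and method names are my own, for illustration only) uses a LongAdder, which soaks up heavily contended counter updates far better than a naive synchronised counter:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.LongAdder;

public class ConcurrencyDemo {
    // A pool of workers bumping a shared counter. LongAdder stripes the
    // count internally, so the threads rarely contend on the same cell;
    // sum() folds the stripes back together once the workers are done.
    public static long countTo(int tasks, int perTask) {
        LongAdder total = new LongAdder();
        ExecutorService pool = Executors.newFixedThreadPool(4);
        for (int i = 0; i < tasks; i++) {
            pool.execute(() -> {
                for (int j = 0; j < perTask; j++) total.increment();
            });
        }
        pool.shutdown();
        try {
            pool.awaitTermination(30, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return total.sum();
    }

    public static void main(String[] args) {
        System.out.println(countTo(8, 100_000));
    }
}
```

Knowing when to reach for LongAdder rather than AtomicLong or a lock is exactly the sort of hard-to-come-by knowledge that keeps specialist Java developers valuable.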
Whilst new languages have value through scarcity, it's not clear which of them will maintain that value in the long term. Jumping for short term gain is not necessarily a good plan, especially where the Java world still supports specialist and highly skilled developers and rewards them well.
I've worked with bedroom coders and degree educated softies, and both groups have their fair share of idiots and experts.
However - as an employer, I want someone who plays well with others, is in work in the same part of the day as their team-mates, who can communicate their ideas well, and leave some sort of paper trail as to where they've been. Solving the problem is not as important as solving the problem in a maintainable, repeatable, scalable manner.
There are certainly passionate, articulate, self-taught people who are a valuable part of a team. That's not the same as being a genius coder, and often orthogonal to having spent your formative years locked in a room by yourself. Unfortunately in the IT industry, the latter (great intellect, no social skills) are often confused with the former.
I would never reject someone who didn't have a degree on their CV. However in their interview I would be looking for evidence of a whole set of skills that the degree educated candidate would have less need to prove. Conversely, in a degree educated candidate I'd be looking for a spark of passion and commitment that is presumed in a self-taught individual.
Note: Criticism levelled at the Department, not the Consultants.
Yes, of course there are consultants who are incompetent and out for everything they can get.
However, the fix is to not employ them in the first place, and the responsibility for that lies with the unaccountable civil servants who happily hand out contracts to anyone with an impressive sounding title.
As the customer, it is their responsibility to choose the right vendor and work with them to produce a solution. The fact that they can throw a large sum of money over the wall, wait a few years and walk away with no responsibility for the mess that happens ensures that this mistake will happen again and again. In this case, the over-large consultancies that crop up time and time again are the symptom of the underlying problem, not the cause.
@Brendan Sullican RE: Struggling to find sympathy...
While I got off my seat and talked with my MP about RIPA (Anne Campbell, Labour, useless), it seems that this is covered by the DPA about which I have heard far fewer complaints.
Sure, privacy is a right that we should defend, but to expect that public mobile communications should automatically be afforded that right seems optimistic to me. That RIM have co-operated with police might be an issue that their end users could take up with them, but unless you've made specific provisions that your communications should be treated as secure, a high street mobile phone is about as private as.. well, the high street. RIM offer security in the corporate and personal sense, but don't to my knowledge suggest they'll protect you from the government.
As it is, I don't believe any special powers were exercised here, and I'm willing to trust that RIM will do a responsible job of handing over relevant data to the police. No puppies were hurt here and hopefully a few idiots will be taken off the streets.
Struggling to find sympathy...
..for rioters and looters who get caught by a data trawl.
How anyone can be worried about data privacy during a (hopefully) rare event, where there is clear reason for the police to do the digital equivalent of house to house enquiries is beyond me.
Whilst we should uphold the right to privacy, we (as a society) should use common sense, where waiving that right in an isolated case is to our benefit. The police can have all of my phone records for the last week if they wish.
I'm sorry you have so much difficulty coping with a different language from the C++ you're familiar with. Clearly the slightly different behaviour has crippled Java and no-one uses it.
Of course you could always use Scala, Groovy, Clojure or one of the many other languages that run on the JVM, with the support of a vast class library that works across many platforms (and has supported sophisticated distributed and concurrent processing for years).
JSR-292 means that we are likely to see a host of other new dynamic languages running with all the benefits of an efficient runtime, garbage collector, profiling and debug tools, class library and deployment from mobile devices right up through to the cloud without having to change your tool chain.
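For anyone who hasn't looked at it, the machinery JSR-292 added is visible from plain Java through java.lang.invoke: language runtimes bind call sites to method handles like the one below, which the JIT can then optimise much like an ordinary Java call. A minimal sketch (class and method names are mine, for illustration):

```java
import java.lang.invoke.MethodHandle;
import java.lang.invoke.MethodHandles;
import java.lang.invoke.MethodType;

public class Jsr292Demo {
    // Look up String.length() dynamically and invoke it through a
    // MethodHandle - the building block that invokedynamic call sites
    // in dynamic-language runtimes are linked to.
    static int lengthOf(String s) {
        try {
            MethodHandle length = MethodHandles.lookup()
                    .findVirtual(String.class, "length", MethodType.methodType(int.class));
            return (int) length.invokeExact(s);
        } catch (Throwable t) {
            throw new RuntimeException(t);
        }
    }

    public static void main(String[] args) {
        System.out.println(lengthOf("dynamic")); // 7
    }
}
```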
Agreed, it's a long time to wait for less than earth shaking additions, but we're already seeing work on Java 8, and I'm hoping this shows Oracle can regain the focus that was lost in previous years. Java is far from lost, but it wouldn't hurt to be more visible on the bleeding edge.
Personally I'd like to see some more development of the various presentation frameworks both on the desktop and web, which have historically suffered from Sun's slightly academic approach to user-facing libraries. It's probably the weakest area for Java as Sun conceded ground to more recent entrants to the market.
Might as well nail my colours to the mast:
Just as the XBOX and PS3 demand to hog your living room TV (particularly if you're using Kinect - you need some decent space to use that, so it's not so bedroom friendly), Nintendo realise that sometimes you still want to play games when other people are taking turns watching normal stuff on the TV. So from that point of view, a console that doesn't stop working when the main display is otherwise used is genius.
Power doesn't matter so much - good if it steps over the current gen, but otherwise, no-one has won the console wars on raw speed in the last decade or more.
If all the manufacturers continue their current strategies, Nintendo will release it at a price where they make money, MS and Sony will pull out all the stops for their next gen machines, and end up with expensive consoles that lose money as they sell. Being underpowered never hurt the Wii, and if the U can run things like Crysis, serious gamers should at least be happy with it. If they absolutely must have the biggest GPU, they can spend on the other consoles, but they are in the minority.
As for the 'expensive controller' - well, it's a relatively dumb touch screen, so not necessarily an expensive part over the already moderately sophisticated Wiimote. As the machine works with Wii controllers, it's not a problem if you use those instead. The big question is - will the base machine come with a touch screen controller, or will it have a Wiimote? I.e. will the touch screen be an optional extra (or more expensive bundle), or will it be sold with every U?
re. Irresponsible government
Would you suggest we let all claimants through without question then?
Surely it's the government's job to ensure that funds are allocated fairly, and that includes highlighting cases where individuals are abusing the system? You can (and have) chosen to make this a political issue, but insurance companies have used the same tactics for years to discourage casual fraud.
That's not to say the benefits system isn't flawed, but in this case, I'd rather people with imaginary identical twins weren't being paid when I work long hours to support my family.
My experience of the Emulator available with the SDK is that it is awesomely, unbelievably SLOW. It's painfully bad, so if they've fixed that problem to make things run at the speeds they ought to on a PC, they've done something Google seems incapable of.
Re: Physical version already exists
It existed 30 years ago in the form of a ZX81, a sheet of clear acrylic and a dry-wipe pen.
The problem then, as now, was the inability to locate keys without looking. The ZX81 keyboard was not considered to be a great ergonomic success - though it was a smart design given the limitations of the technology at the time.
Is this more about forcing people (including website maintainers) to adopt 4.0, which is perhaps not as loved as the 3.x stream? I'm still noticing sites that don't seem very happy with 4.x, and I really wasn't too bothered by the new version myself.
That contrasts with 3.x which was the must-have tool for web developers everywhere, not least due to Firebug.
Funny you should mention Fanbois..
Google seem to be responding to the shift in market brought about by the iPad far better than Microsoft. Not that there's a 'right' OS for a tablet, just the lesser of a handful of weevils. However, it's all about the hardware and seeing as Apple set the 'minimalist' dial to 11, there's not much room for competitors to manoeuvre.
Years ago I very nearly switched from a degree in software to industrial design. That probably did my employment prospects no end of good. It seems to me that whilst everyone responds well to great design, not so many companies manage to field (or support) great designers.
So, you suggest that rather than buying the iPad or Xoom, the Scroll is a good alternative with an 800x480 resistive screen, Android 2.1 (with no plans to upgrade), an unidentified processor speed and a 4 hour battery life?
In this case I'd suggest that a fraction of an iPad is not worth a fraction of the iPad's price - as it's obsolete before you even unwrap it. There's a base level of functionality, below which a device is only of interest to people who collect pocket calculators.
Assuming your shark is in cool enough water...
... should allow sufficient cooling for this to be practical.
Started well, lost the plot
The start of the article makes the pertinent point that devices that do a single job well continue to outperform the iOS "there's an app for that" model. We buy cameras, mp3 players, ebook readers in addition to or instead of an iPhone or iPad that "does it all".
Then the author got lost with theories of syncing, which missed the point that the user friendly swiss army knife approach of iOS can be trumped by a device that is designed from the outset to do its designated task brilliantly.
The bottom line is that the iPad is a netbook-without-a-keyboard that may theoretically be able to do 'anything', but is actually only really good at netbook-without-a-keyboard activities, which basically constitute web browsing and casual gaming. The big clue here is that relatively few people read books on their netbooks, or in web browsers.
Re: overcome the limitations of Java
The features you list seem to me to be the source of many programming errors. Whilst other language communities have grown used to them, I've not seen a great desire for them in Java. There are far more useful and productive language features, and no real need to make Java more like (for example) C just to make people feel comfortable.
The difference between the major languages in use today may be of great concern to scientists, but is barely an issue for engineers. If you're delivering a practical, efficient solution to a client, you use what works, and Java has been shown to work from handheld, via desktop to server and into the cloud.
I would imagine..
..that the difference is that Google have analysed the last digit of performance out of their algorithms, whereas you just got yours to work.
There really aren't that many computational processes that haven't been done before, and before and before. The difference usually comes in optimising the latest implementation to suit the environment in which it runs.
Maybe it's just me, but I think you've rather missed the point. Fine rant though.