Legit?
Whose word do we have for this being real? Apple's?
I could well envision this being knocked up as a PR exercise.
At Apple's iPhone event yesterday, the company talked-up its voice-activated AI assistant, Siri - a feature of iOS 5 that'll first see beta release on the iPhone 4S. This concept is nothing new, though. In fact, Apple predicted a virtual assistant would hit tablet-like devices back in the 1980s, when Steve Jobs was busy with …
"a feature of iOS 5 that'll first see beta release on the iPhone 4S."
Although I'd seen the beta symbol plastered all over the Siri release, are you implying it's part of iOS 5, being beta-tested on the 4S and presumably then cascaded across the rest of the iOS 5 devices?
Seems very odd, regardless, especially if the processing is done "in the cloud".
"Your plastic pal who's fun to be with".
It never fails to amuse me that despite 20 years of everyone experiencing just how dumb computers are, they are still prepared to believe that this kind of intelligent behaviour is just a few years away. If you meet someone who claims he can build such a system, don't lend him any money.
I had some thoughts this evening about AI, and I thought I'd run them up the flagpole here:
A computer is a deterministic phenomenon. It takes a variable input, combines it with another, carefully designed input called the "program", and produces an output which is determined by the input data and the input program. If the output is not determined in this way, it has no use, and either the computer or the program is faulty.
In order to design a computer/program combination which emulates a human mind, we will need a deterministic model of the human mind. This is not a programming problem. I'm a programmer, and I can write you a program that will behave in exactly the same way that a human mind will, if you can tell me the exact rules about how the human mind operates.
You can't, though, and nor can anyone else. Psychologists - REAL psychologists, not the unlicensed psychiatrists - will admit that we're not even 1% of the way to a deterministic model of human thought.
In short - artificial intelligence of the sort portrayed in this video, and in lots of other science fiction, is just that - fiction.
Advice to those who can't grasp the above:
1. Be deeply suspicious of any computerised system which claims to emulate human judgement.
2. If you're a programmer - seize any opportunity you see to get into AI. It's a job for life.
This is Apple getting its 'prior art' onto the table before the patent trolls start getting their lawyers to sue Apple.
'Here you go, boys.' Don't even bother trying to sue us over this tech.
Obviously, someone could claim to have developed it prior to that, but even their patents from the time will have expired by now
{or will they?}
The actual Crystal Ball is owned by one Anonymous Coward who posted at 7:38 GMT today (in response to someone else's suggestion that Orange's "Wildfire" service had got there first) the following:
"No chance Orange can sue, Apple invented it first, they just haven't got round to rewriting history on it yet."
Well done that Coward! Now could you just tell me the Euromillions numbers for Friday please....
What really?
You think that Siri is an artificial intelligence?
Again... really??
The work they've done could be impressive, but I already have a mic button on my virtual keyboard. When I talk to my Nexus One it sends what I've said back to the cloud, that analyses it and presents me with a nonsensical alternative to what I actually said. I can insert this nonsense into email or text messaging whenever I want to confuse people and make myself look like I've finally lost all my marbles.
What's worse is that without a great signal, it gives up on being able to send what I said back to the cloud.
Oh Siri is different? Obviously someone wasn't paying attention to the keynote, where the cloud and Apple's data centres were mentioned in relation to Siri.
This is not so far removed from the capabilities of the Apple Newton software being able to 'Fax Bob' back in the '90s.
Sure there is voice recognition but that has always been on the cards.
Also much better contextual understanding plus t'internet but this is just scale.
The Newt was the iPad v0.1
IT'S NOT A TABLET! It's a DESK. Look at the thing- it doesn't move when he slides the external storage device onto it, so it's rigidly fastened down.
Also, it predicted external storage on Apple devices and bowties on people, so it's not THAT close. And that academics would decide to use the same data format for complex data modelling, which means it's predicting the utterly impossible!
... that the tablet in the future of 1987 had an external storage slot ?
Not in the iPad version of that future of course. ;)
Also notice that a lot of the "voice interaction" is actually nothing of the sort - it's extraneous expository commentary. "Thinking aloud" moments. Precisely the sort of thing that computers are hopeless at interpreting and impossible to accommodate in such user interfaces.
There are basically two approaches when a computer receives voice instruction that it doesn't understand -
1) ask for clarification
This will lead to constant requests for clarification when you have those "thinking aloud" moments. "Hmm, that's not really very useful." Computer: "I'm sorry, I don't understand what you need. Try rephrasing"
So the obvious alternative is:
2) ignore voice commands that aren't understood since they may not be commands at all.
This of course leads to even more frustration when what should be perfectly valid commands are apparently ignored.
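The dilemma above can be sketched in a few lines. This is a minimal illustration with a made-up three-word command vocabulary (the command set and function names are my own assumptions, not anything from Siri): the "strict" handler interrupts every thinking-aloud remark, while the "lenient" handler silently drops anything it doesn't recognise, including commands it merely mis-heard.

```python
# Hypothetical vocabulary for illustration only.
KNOWN_COMMANDS = {"call", "open", "search"}

def first_word(utterance: str) -> str:
    """Crude 'recognition': look only at the leading word."""
    words = utterance.split()
    return words[0].lower() if words else ""

def handle_strict(utterance: str) -> str:
    """Strategy 1: ask for clarification on anything unrecognised."""
    if first_word(utterance) in KNOWN_COMMANDS:
        return f"executing: {utterance}"
    return "I'm sorry, I don't understand. Try rephrasing."

def handle_lenient(utterance: str) -> str:
    """Strategy 2: silently ignore unrecognised input."""
    if first_word(utterance) in KNOWN_COMMANDS:
        return f"executing: {utterance}"
    return ""  # thinking-aloud remarks pass without comment

# Strategy 1 nags at every stray remark:
print(handle_strict("Hmm, that's not really very useful."))
# Strategy 2 would also swallow a valid command the recogniser garbled:
print(handle_lenient("paul Bob"))  # user actually said "call Bob"
```

Neither branch escapes the frustration described above; the sketch just makes the trade-off concrete.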
Can all this be solved with sufficient technology and computing power?
Maybe, but unless and until it can be demonstrated as having been achieved, you might just as well say "This will be solved by magic".
Voice interaction with computers today is not much more advanced than it was 10-15 years ago. What other aspect of software or hardware technology has stood so still for so long?
Back in the 1990s, they even had a tablet computer you could actually do something on. It was called the Newton. It was able to recognise what you drew on the screen and act on it. Even in its earliest incarnations you could just write a name somewhere and mark it so you could call that person (by beeping out DTMF; mobile phones weren't that popular back then).
Of course Jobs cancelled the project just when its main points of criticism had been fixed and Apple was about to get a huge order.