"Up your shaft"
At Google's Horizon cloud event in San Francisco on Tuesday, the ad giant demonstrated its Cloud Vision API through what it called the GCP Emotobooth. Less of a booth than a corner of a room covered by a connected camera, the experience combined all the forced fun of making faces on demand with the dubious pleasure of having one …
My instructions are to amuse visitors with information about themselves.
I don't see anything amusing about spying on people.
Human beings feel pleasure when they are watched. I have recorded their smiles as I tell them who they are.
Some people just don't understand the dangers of indiscriminate surveillance.
The need to be observed and understood was once satisfied by God. Now we can implement the same functionality with data-mining algorithms.
"It would be a mistake to believe feelings can be measured, sorted, and dealt with based on calculations in the cloud."
Ah. But they will. And those categorised emotions will end up being the source of 'truth', and if you don't fit that category, you will just be plonked into the nearest one, or you will be in the wrong.
As anyone who has dealt with any bureaucracy can tell you...
Feelings can be measured, sorted and dealt with - it's something every one of us does all the time. And, among other things, we use 'big data' to do it, i.e. the experiences we have accumulated over time. There are even recurring ways in which we do it, i.e. basically algorithms.
So, in principle, a machine should be able to do it too.
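The "plonked into the nearest one" mechanism an earlier comment complains about is, in software terms, just nearest-centroid classification. A toy sketch of that idea follows; the emotion labels, feature names and numbers are all invented for illustration, not taken from any real emotion-detection API:

```python
import math

# Toy nearest-centroid "emotion" classifier. Imagine the features are
# (smile score, brow-raise score) from some face API; the centroid
# values below are made up purely for demonstration.
CENTROIDS = {
    "joy":      (0.9, 0.2),
    "surprise": (0.5, 0.9),
    "neutral":  (0.1, 0.1),
}

def classify(features):
    # Whatever you actually feel, you are assigned the nearest
    # predefined bucket -- there is no "none of the above".
    return min(CENTROIDS, key=lambda label: math.dist(features, CENTROIDS[label]))

print(classify((0.7, 0.3)))  # -> joy
```

Note the design flaw the commenter is pointing at: `min` always returns *some* label, so a face that fits no category still gets filed under the closest one.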
What is still missing, though, is the ability to feel feelings, which has something to do with being a sentient, self-conscious entity that is capable of empathy. I think this is what makes the difference here. Right now, machines can't do it; they are faking it.
However, a lot of humans are faking it too. And the expectations for being understood are lower than you'd think. And humans have a strong tendency to provide the missing bits and project them. Case in point: ELIZA. People "talked" to it for hours on end because it "understands me so well".
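ELIZA's whole trick was pattern matching plus pronoun reflection, and it fits in a few lines. Here is a minimal sketch in that spirit; the rules and reflections are invented for illustration and are not Weizenbaum's original DOCTOR script:

```python
import re

# Swap first and second person so the echoed fragment reads naturally.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "I", "your": "my"}

# A few illustrative rules; the real ELIZA script had many more.
RULES = [
    (re.compile(r"i feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.+)", re.I),   "How long have you been {0}?"),
    (re.compile(r"(.+)", re.I),        "Tell me more."),
]

def reflect(fragment):
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(text):
    for pattern, template in RULES:
        m = pattern.match(text.strip())
        if m:
            return template.format(*(reflect(g) for g in m.groups()))

print(respond("I feel nobody understands my work"))
# -> Why do you feel nobody understands your work?
```

No model of the speaker anywhere, just regexes, yet people projected understanding onto it for hours, which is exactly the point being made above.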
Right now, I'd say what Google has is a faster ELIZA, upgraded with optical input. It will get better and better at 'faking it', and for a lot of applications 'faking it' will be good enough.
Actually 'doing it' will require self-awareness, sentience, empathy and conscience, and that is still a long and steep road ahead for AI.