My moment face-to-face with Google's AI: It feels your pain, sometimes

At Google's Horizon cloud event in San Francisco on Tuesday, the ad giant demonstrated its Cloud Vision API through what it called the GCP Emotobooth. Less of a booth than a corner of a room covered by a connected camera, the experience combined all the forced fun of making faces on demand with the dubious pleasure of having one …

  1. Mike Shepherd

    "Transporter Room"

    "Thank you"

    "Up your shaft"

    1. David 132 Silver badge

      Re: "Transporter Room"

      Well of course the elevator's emotions were insincere - it was voiced by Spock.

  2. Anonymous Coward

    My instructions are to amuse visitors with information about themselves.

    I don't see anything amusing about spying on people.

    Human beings feel pleasure when they are watched. I have recorded their smiles as I tell them who they are.

    Some people just don't understand the dangers of indiscriminate surveillance.

    The need to be observed and understood was once satisfied by God. Now we can implement the same functionality with data-mining algorithms.

    1. allthecoolshortnamesweretaken

      Interesting. Personally, I feel the need to be understood, but not the need to be observed.

  3. find users who cut cat tail

    If they can detect fake joy now, they still have some way to go...

    http://images-cdn.9gag.com/photo/4748011_700b.jpg

  4. m0rt

    "It would be a mistake to believe feelings can be measured, sorted, and dealt with based on calculations in the cloud. "

    Ah. But they will. And those categorised emotions will end up being the source of 'truth' and if you don't fit that category, the you will just be plonked into the nearest one, or you will be in the wrong.

    As anyone who has dealt with any bureaucracy can tell you...

  5. allthecoolshortnamesweretaken

    Feelings can be measured, sorted and dealt with - it's something every one of us does all the time. And, among other things, we use 'big data' to do it, i.e. the experiences we have accumulated over time. There are even recurring ways in which we do it, i.e. basically algorithms.

    So, in principle, a machine should be able to do it too.

    What is still missing, though, is the ability to feel feelings, which has something to do with being a sentient, self-conscious entity that is capable of empathy. I think this is what makes the difference here. Right now, machines can't do it; they are faking it.

    However, a lot of humans are faking it too. And the expectations for being understood are lower than you'd think. And humans have a strong tendency to provide the missing bits and project them. Case in point: ELIZA. People "talked" to it for hours on end because it "understands me so well".

    Right now, I'd say what Google has is a faster ELIZA, upgraded with optical input. It will get better and better at 'faking it', and for a lot of applications 'faking it' will be good enough.

    Actually 'doing it' will require self-awareness, sentience, empathy and conscience, and that is still a long and steep road ahead of AI.
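    For the curious: the Cloud Vision API behind the Emotobooth doesn't output feelings so much as per-emotion likelihood buckets on each detected face. A minimal sketch of picking the dominant bucket — the face annotation here is hand-written sample data in the API's response shape, not a real API call:

    ```python
    # Likelihood scale used in Cloud Vision face detection responses,
    # ordered from least to most likely.
    LIKELIHOOD_ORDER = [
        "UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
        "POSSIBLE", "LIKELY", "VERY_LIKELY",
    ]

    def dominant_emotion(face_annotation):
        """Return the most likely emotion from one faceAnnotation dict."""
        emotions = {
            "joy": face_annotation.get("joyLikelihood", "UNKNOWN"),
            "sorrow": face_annotation.get("sorrowLikelihood", "UNKNOWN"),
            "anger": face_annotation.get("angerLikelihood", "UNKNOWN"),
            "surprise": face_annotation.get("surpriseLikelihood", "UNKNOWN"),
        }
        return max(emotions, key=lambda e: LIKELIHOOD_ORDER.index(emotions[e]))

    # Made-up sample in the shape the API returns for one face.
    sample_face = {
        "joyLikelihood": "VERY_LIKELY",
        "sorrowLikelihood": "VERY_UNLIKELY",
        "angerLikelihood": "VERY_UNLIKELY",
        "surpriseLikelihood": "POSSIBLE",
    }

    print(dominant_emotion(sample_face))  # joy
    ```

    Note there is no "sincerity" field — the API reports how strongly the face pattern matches an expression, which is exactly the 'faking it' gap discussed above.
    
    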


Biting the hand that feeds IT © 1998–2020