Voice assistants are always listening. So why won't they call police if they hear a crime?

If you saw someone being assaulted, you'd probably whip out your phone and dial for help. But when one of our newly ubiquitous devices hears a crime, it does nothing. If Alexa or Google Assistant or Siri hears an assault, or a rape, it sits there waiting for its cue to act. But that's a load of malarkey. Listening for a …

  1. Anonymous Coward
    Anonymous Coward

    What do you suggest they do?

    The only thing I can think of is that when they hear the cues that someone needs the emergency services, they activate. But then who is going to be on the other end? You would need the device to first send a location, then connect and share audio with a person who could determine whether help was needed or it was a mistake. That's a lot of people, and you couldn't just use the 999 operators because there aren't enough as it is and they need to be dealing with confirmed emergencies. There's also the privacy aspect: where do you draw the line? Do you really think the government won't see this as an opportunity to spy on people? Mention drugs, anything terrorist-related or something against the government, and it gets grabbed.

    It shouldn't happen, in my opinion, regardless of the perceived benefits.

    1. AMBxx Silver badge

      There have already been stories of these things placing orders for stuff when adverts are shown on TV.

      I don't think I dare watch another episode of CSI! Heaven forbid I watch a film about terrorism or wife beating.

      1. David Gosnell

        Somehow this conjures reverse images of Kevin in Home Alone, scaring the burglars with the video film

      2. boltar Silver badge

        "I don't think I dare watch another episode of CSI! Heaven forbid I watch a film about terrorism or wife beating."

        Quite. This whole article is a load of touchy-feely guff by someone who clearly didn't spend even two minutes considering the dangers of false positives. But then this sort of "journalism" is where El Reg appears to be heading these days.

    2. sorry, what?
      FAIL

      "There's no easy way through this"...

      Really?!

      I take the easy way: I reject voice assistants. Not a single one is enabled on any device in my household.

      Very easy.

      1. Archtech Silver badge

        Re: "There's no easy way through this"...

        "I reject voice assistants. Not a single one is enabled on any device in my household".

        As far as you know.

        1. sorry, what?
          Meh

          Re: "There's no easy way through this"...

          @Archtech, You have heard of outgoing firewall rules, right?

          1. Anonymous Coward
            Anonymous Coward

            @Boltar

            100% agree, there is something fishy in the cupboard and it isn't an old tin of cat food.

            I've just been listening to the news and how all the trains in the East of England are being shut down or limited, with no bus replacements, because there is snow *on the way*!!

            The lunatics have taken over the asylum and sanity no longer means what we used to think it did. Being sane in this country now feels dangerous. I think it's time to bug out.

            PS It's now a moot point, but the benefits of these types of systems are not for the *consumer*, so why is anyone surprised when they are not geared up to provide things people actually need?* Plus, where is the very obvious solution of NOT HAVING THE DAMNED THINGS?! It isn't like they are providing an essential service, like electricity or water, is it?

            *Although as has been mentioned, I'm not entirely sure how a monitoring system would cope with TV shows, or the actors guild rehearsals of Macbeth, for example.

            1. ma1010 Silver badge
              Happy

              Re: @Boltar

              Being sane in this country now feels dangerous. I think it's time to bug out.

              Ditto here. For $DEITY'S sake, don't look for sanity in the U.S. (where I live - not that you're likely to look here, but just making sure). Where do you suggest we go to find this sanity you speak of?

              1. Sir Runcible Spoon Silver badge

                Re: @Boltar

                I'm not aware there is much left to head towards, unfortunately.

                Even the once great and practical land of Oz seems to have been infected with it.

                20 years ago: house on fire = stick another shrimp on the bbq. Now? They're more likely to ban house ownership.

                1. The Central Scrutinizer

                  Re: @Boltar

                  Don't bother coming to Australia looking for sanity. It fell down the back of the couch years ago and hasn't been seen since. The drive to back door crypto is on, house prices are mental, business is actively trying to worsen working conditions for wage slaves and the politicians are too busy infighting to even pretend that they're running the place. You'll have to look elsewhere.

                2. onefang Silver badge

                  Re: @Boltar

                  "Even the once great and practical land of Oz seems to have been infected with it."

                  Yeah, I've been thinking that leaving Oz would be a good idea, if only there was a decent place to go to that isn't below freezing a lot of the time. I hate the cold.

              2. JohnFen Silver badge

                Re: @Boltar

                "Where do you suggest we go to find this sanity you speak of?"

                Even if there were such a place, remember that almost no nation will take you unless you're wealthy or have a special skill that they are badly in need of.

            2. peterjames

              Re: @Boltar

              Well (on Macbeth) - it would detect the guilt, and if at all sane (unburdened by the said self-inflicted emotion) it would probably decide to do what you just described: check out.

              Do correct me if I'm wrong, but once the asylum decides its collective guilt must be suffered out, there is not much one can do but wait for it to take place. Any help only makes the ones who know (or consider) they have done wrong accumulate the need for self-punishment. And they know how they want it too: not too soft, not too hard either, no permanent damage, just cleansing of the soul. I dare say this as a complete non-Briton, and so I'm to be booed off the stage for exposing the mechanism - greetings from a place I won't name.

        2. Muscleguy Silver badge

          Re: "There's no easy way through this"...

          I have denied the Google App permission to access the microphone on my phone. Saying 'Okay Google' does nothing. I NEVER tap the Google search bar either, I use the Startpage shortcut instead. On my previous phone I managed to disable the Google App completely. It required a hard reset when I tried it on this iteration.

          I run Oversight on this laptop, our TV is decidedly non smart and the digibox is not a voice activated one. I'm not interested in that. As a scientist I am adept at forming specific search queries and finding stuff myself.

      2. Mark 85 Silver badge

        @sorry, what? -- Re: "There's no easy way through this"...

        Yes, it's easy for us techies who have a bit of deserved paranoia. But what about the average "Joe User" who hasn't a clue or isn't one of us? They have no idea what's being slurped and most don't care. If they did, FB wouldn't ever have been a "thing". And let's not even get into "smart" devices that send everything back to the mothership.

    3. John Smith 19 Gold badge
      Gimp

      The mfg don't mind spying on you. They do mind being responsible for a response

      But make no mistake, they are spying on you. 24/7/365

      But with a non-requested response, people might start to realise it a bit more.

  2. Anonymous Coward
    Anonymous Coward

    It's not *your* security they are listening-in on to protect

    It's *Theirs*, you seditious bastard.

    If you've got nothing to Hide, then you've got nothing to Fear, Winston.

    1. AMBxx Silver badge

      Re: It's not *your* security they are listening-in on to protect

      We need Anonymous Cowards to have the option to be 'Sarcastic Anonymouse Cowards'. I think the spirit of this post was lost on a few people.

  3. ratfox Silver badge
    Stop

    First, we need the assistants to be able to distinguish what's real and what's TV. I bet the plods won't be too happy to be called from 100,000 different homes at the same time during Dexter reruns.

    1. Symon Silver badge
      Coat

      I always found Dexter very sinister. ---->

      1. Archtech Silver badge

        You are definitely

        Right!

        1. Anonymous Coward
          Anonymous Coward

          Re: You are definitely

          don't you mean "Left"?

    2. peterjames

      If only real humans could do that...

  4. smudge Silver badge

    Is it live, or is it Memorex?

    These devices, or the cloud services that power them, can easily understand when someone is angry, or terrified or in pain.

    And can they tell when that someone is physically present in the room, and when they are being relayed through the loud, high-quality home cinema setup in the room?

  5. Halcin

    What about a spoilt child expressing genuine distress over not being allowed to stay up late?

    1. Pascal Monett Silver badge

      Indeed

      How is Siri going to tell the difference between a child being assaulted by a pedophile and one who stubbed his/her toe and is expressing justifiable but innocent pain?

      Siri won't.

      So then the argument will become "put a 911 operator on the line to listen and decide", and then we have two issues: violation of a person's property by forcing it to dial 911 without consent (plus all the people in the vicinity - great use of resources there, no possibility of confusion at all), and shoving the responsibility onto a harassed phone operator. Sounds like kicking the problem into the sidelines to me.

      This whole article is a brilliant demonstration of the stupidity of the "think of the children" mentality. Oh, so phones can do this now? Well, enroll them into the police force and have them tell on everything all the time, because that is the only thing that can protect the children.

      No thought of any consequences, or even of the practical issues that can be encountered.

      In short, useless and dangerous.

    2. Anonymous Coward
      Anonymous Coward

      What about a spoilt child

      simples, you come up to them real quick, whack them on the head and muffle the cries, silly!

      1. Anonymous Coward
        Anonymous Coward

        Re: What about a spoilt child

        They'd first have a lot of explaining to do about how they got out of the cupboard!

  6. gnasher729 Silver badge

    I don’t know what Alexa does, but Siri waits for the words “hey Siri”, hard coded in a very low power chip, and doesn’t listen to anything until it gets that signal.
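    Purely as an illustration (frames standing in for audio, all names invented, and no claim that this is Apple's actual firmware), the two-stage gating described here looks like:

    ```python
    from collections import deque

    WAKE_WORD = "hey siri"  # matched locally; nothing leaves the device before this

    def run_assistant(audio_frames, wake_word=WAKE_WORD, window=2):
        """Simulate the two-stage pipeline: a tiny local matcher (the
        'very low power chip') gates what would be sent upstream."""
        recent = deque(maxlen=window)   # small on-device buffer, never uploaded
        uploaded = []                   # what would actually reach the cloud
        listening = False
        for frame in audio_frames:
            if listening:
                uploaded.append(frame)  # only post-trigger audio is processed
            else:
                recent.append(frame)
                if " ".join(recent).endswith(wake_word):
                    listening = True    # trigger heard: start the real session
        return uploaded

    # Everything before the trigger phrase stays on-device:
    print(run_assistant(["what a", "nice day", "hey siri", "set a", "timer"]))
    # -> ['set a', 'timer']
    ```

    The point of the design is that the pre-trigger buffer is bounded and local; what the replies below dispute is only whether the vendors keep it that way.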

    1. Adam 52 Silver badge

      Quite. The article is based on a misunderstanding, deliberate or otherwise, of how the technology works.

      1. Anonymous Coward
        Anonymous Coward

        It isn't a matter of understanding the technology

        and how it works. These things have the potential to be used by anyone who can hack the system or gain access via legal means to snoop on you and what you are doing, 24/7.

        I disable the assistant in my phone and check that it remains disabled after every update (rare these days)

        Having been around tech and worked on it for more than 40 years, I can say that NONE of the bits of kit that run Alexa/Siri/Bixby or whatever will ever come into my home, let alone be plugged in and switched on.

        I've done without them for 60 years and can carry on doing without them for a few more decades thank you very much.

        IMHO, the sooner these die a slow lingering death the better but people love to show off their blingy toys. 'Alexa, tell Bezos to get F*****d'.

      2. AMBxx Silver badge
        Big Brother

        >> doesn’t listen to anything until it gets that signal.

        Didn't you read the small print on page 44 of the latest update?

        Joking aside, I wouldn't trust them not to change it in the future.

        1. Warm Braw Silver badge

          The trend is definitely to move more of the voice recognition into the device itself - at the moment there is a lot of cost tied up in cloud-side systems that isn't covered by the low-margin bug speaker. That could be used to send less data to the cloud, or to process and retain more data locally and send it to the cloud on demand. It will all depend on which is more profitable...

      3. David Nash Silver badge

        Misunderstanding of how the tech works

        "...can easily understand when someone is angry, or terrified or in pain"

        I would doubt this very much. They are only machines. They don't actually "understand" anything.

    2. Joe Harrison Silver badge

      That's what they want you to think

      Have you ever actually seen this very low power chip mmmhey?

    3. Anonymous Coward
      Anonymous Coward

      Alexa does the same. I thought the linked article might explain that they have secretly been 'listening to everything', but alas not.

      The device doesn't send conversations back until it hears its trigger word. The linked article was just explaining that the developers could get the exact words used to trigger their skill (rather than them having to set the trigger words and just get told that that pattern had been activated).

      So regardless of the fact that there doesn't appear to be any evidence that it is forever sending every conversation back to the mothership (sure, they could; sure, they or the government might want it), the number of false alarms to a busy service would be ridiculous.

      A far better solution would be to be able to record a hardcoded 'code word' into the device that would trigger the emergency service call and tell the operator to listen in, with the current location. Just make sure the code word is not something that might be said during 'normal times'.
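      A sketch of that code-word scheme (the banned-word list and length check are invented stand-ins for "not something that might be said during normal times"):

      ```python
      COMMON_WORDS = {"help", "stop", "no", "police"}  # likely in normal speech or on TV

      def register_code_word(word):
          """Reject code words that could fire during 'normal times'."""
          word = word.strip().lower()
          if word in COMMON_WORDS or len(word) < 4:
              raise ValueError(f"{word!r} is too likely to trigger falsely")
          return word

      def check_utterance(utterance, code_word):
          """True when the registered code word is heard, i.e. when the device
          should place the emergency call and share its location."""
          return code_word in utterance.lower().split()

      cw = register_code_word("persimmon")
      print(check_utterance("the safeword is PERSIMMON", cw))  # -> True
      print(check_utterance("help help help", cw))             # -> False
      ```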

      1. Francis Boyle Silver badge

        Just

        make sure you don't use your 'usual' safeword.

      2. chrisf1

        Or indeed a simple 'Alexa, call the police' for most purposes. Given Drop In functionality etc., that seems pretty rational.

    4. onefang Silver badge
      Alert

      'I don’t know what Alexa does, but Siri waits for the words “hey Siri”, hard coded in a very low power chip, and doesn’t listen to anything until it gets that signal.'

      Which would fail the other way, "HEY SIRI, HELP! I'M BEING MURD...'' gurgle, thump. Hmm, thinks Siri, they are mixing their English and French?

  7. Lysenko

    SWATted by Siri?

    No thanks. It is as dumb an idea as a car that does an emergency stop every time it hears the word "brake" (or was it "break"?). If they have to do something like this (and Alexa at least could do it now), it should be based on a safeword (or phrase), not AI trying to disentangle someone doing DIY while watching a horror movie.

    1. Charles 9 Silver badge

      Re: SWATted by Siri?

      But verbal cues will immediately become less than useless if someone is being STRANGLED.

      Though I completely agree about the playback issue. Alexas ALREADY respond to the TV.

  8. Scott Broukell
    Meh

    I threw my latest voice assistant in the bin after it blatantly took no emergency action whatsoever upon my extremely audible reaction to the news that the local Krappy Fried Chicken shop was going to be closed for a whole week! Yes, seven whole days! It just bloomin' sat there and did absolutely nothing. I was anticipating maybe an automated emergency call for a supply of spicy fried wings to be air-lifted to my address, but no, nothing. I had to put the call in myself. What's the point of these devices? Waste of money!

    1. disgustedoftunbridgewells Silver badge

      There is a Just Eat integration that could have saved you.

      1. Anonymous Coward
        Anonymous Coward

        "There is a Just Eat integration that could have saved you."

        ..or potentially make you ill.

  9. Korev Silver badge

    Liability?

    You could also imagine a scenario where $VENDOR implements this and something bad happens and no Police are called. The victim's family could then sue Amazon, Google etc.

    1. Brian 18

      Re: Liability?

      Or as others have mentioned, $VENDOR implements this and people have SWAT show up for watching a horror movie or crime drama.

  10. Christian Berger Silver badge

    Given the current track record of voice recognition...

    ... you'd surely not want them to call the emergency services. After all the vast majority of calls would be "butt-calls".

    Besides, the majority of people still believe that they don't send everything to be processed in some manufacturer's cloud.

  11. Dan 55 Silver badge

    Hey Siri! Argh! Help!

    They do wait for keywords before slurping, or at least that's what we're told

    1. Christian Berger Silver badge

      Re: Hey Siri! Argh! Help!

      Yes, but voice recognition is a fuzzy thing, and typically more users will complain when it didn't recognise the keyword than when it falsely did and silently sent a recording to the server to double-check.
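      That silent double-check can be caricatured with a string-similarity score standing in for acoustic confidence (thresholds invented; real recognisers score audio, not spellings):

      ```python
      from difflib import SequenceMatcher

      TRIGGER = "alexa"
      SURE, MAYBE = 0.95, 0.6  # hypothetical confidence thresholds

      def classify(heard):
          """Above SURE: wake locally. In the grey zone between MAYBE and
          SURE: ship the uncertain clip to the server to double-check,
          i.e. the silent upload described above. Otherwise: ignore."""
          score = SequenceMatcher(None, heard.lower(), TRIGGER).ratio()
          if score >= SURE:
              return "wake"
          if score >= MAYBE:
              return "send to server"
          return "ignore"

      print(classify("alexa"))               # -> wake
      print(classify("alexia"))              # -> send to server
      print(classify("a lecture on ethics")) # -> ignore
      ```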

    2. MrXavia

      Re: Hey Siri! Argh! Help!

      Of course they wait for keywords. Can you imagine the processing required to do 24/7 speech recognition in the cloud? And then imagine the cost! I would bet most companies want to shift the speech processing out of the cloud back to the device if they can, and reduce their costs.

  12. Old Timer

    This 'journalist' has no idea how the technology works. Oh for the days of Mike Magee.

  13. Archtech Silver badge

    Very valuable article!

    I think we are all deeply indebted to Mr Pesce. In the first place, his article outlines a significant improvement on the current interpretation of the Turing Test. As originally conceived, the Turing Test seems unduly reliant on conversational tricks. Of course, Alan Turing had little alternative, as at the time computers could barely produce English output on teletypes; they couldn't do much else.

    But when you come to think of it, isn't the ability to detect and report a crime a far better test of humanity? Bear in mind that most human beings would fail such a test some of the time. Mr Pesce seems to be thinking mainly of overt, violent crimes that are accompanied by lots of screaming and roaring. But, as others have observed, how does a machine limited to the sense of hearing distinguish between a massacre in progress, an angry child arguing with its parent, or for that matter a group of teenage boys having fun? Not to mention TV, radio, etc. And what electronic assistant could listen to the average rapper, turned up loud, without calling 999 immediately?

    And that's before you even delve into the intricacies of the law. In today's Western nations there are so many laws and regulations that most of us are no doubt breaking some of them most of the time. Should our little electronic buddies begin to act like the stereotypical Hitler Youth child who informed on his parents?

    1. AMBxx Silver badge
      Mushroom

      Re: Very valuable article!

      Siri could start reporting micro-aggressions to keep the snowflakes happy.

      1. Sir Runcible Spoon Silver badge
        Coffee/keyboard

        @AMBxx

        Priceless :) +10 virtual upvotes in addition to the actual one.

        1. AMBxx Silver badge
          Happy

          Re: @AMBxx

          @Sir Runcible

          Looks like at least 2 snowflakes disagree with us!

          A big Register Welcome to the easily offended who haven't worked out how to manage a pithy reply.

    2. ma1010 Silver badge
      Big Brother

      Re: Very valuable article!

      Should our little electronic buddies begin to act like the stereotypical Hitler Youth child who informed on his parents?

      Big Brother says "Yes" to that question.

      I expect the U.S. Congress would, too, and probably the lawmakers in most countries, including the U.K. (see Snooper's Charter).

      <sarcasm>After all, if you are not engaged in crimethink, what do you have to fear from the Ministry of Love?</sarcasm>

  14. Chris King Silver badge

    False positives

    So how are they going to prevent false positives?

    Do we want our phones to go all Clippy on us, shouting "Hey! It sounds like you're committing a murder!"

    Google would only go and commercialise that, telling you where you can buy shovels, bags of lime and old carpet.

    1. disgustedoftunbridgewells Silver badge

      Re: False positives

      Commit lots of murders? Perhaps you'd like a Dash button for shovels, bags of lime and old carpet.

    2. Doctor Syntax Silver badge

      Re: False positives

      "Google would only go and commercialise that, telling you where you can buy shovels, bags of lime and old carpet."

      You sound as if you're having problems with your boss. Do you want to buy a cattle-prod?

      1. Symon Silver badge
        Big Brother

        Re: False positives

        Customers who committed murder also liked:-

        Arson

        Rape

        Genocide

        The world according to Clarkson

        1. Anonymous Coward
          Anonymous Coward

          Re: False positives

          Don't forget truck driving :)

    3. Anonymous Coward
      Anonymous Coward

      Re: False positives

      "Hey! It sounds like your wife is beating you. Would you like some help?"

      STFU Alexa! We're into S&M. I like it and you're spoiling the mood!

      "You have new recommendations: The 50 Shades of Grey Collection and padded underwear to wear tomorrow." *sounds of Amazon Echo being smashed*

      1. Anonymous Coward
        Anonymous Coward

        Re: False positives

        "Hey! It sounds like your wife is beating you. Would you like some help with that?"

        Yes! Hurt me Alexa! I've been a bad boy!

    4. Fruit and Nutcase Silver badge
      Joke

      Re: False positives

      @Chris King

      Clippy: "Hey! It sounds like you're committing a murder!" Have you a plan to dispose of the body? May I suggest...

  15. Timmy B Silver badge

    An actual useful scenario...

    We care for a couple of very elderly and infirm relatives. We currently have an emergency phone line with buzzers and such that they can use if one of them has an emergency. Alexa could perhaps do this job quite well. Have buttons that trigger Alexa to make an emergency call and then turn into a wireless phone that they can use to talk to the emergency operator. Much cheaper than the subscription we have to pay for the current service.

    User presses button - Alexa calls 999 and says "I have an emergency at such-and-such an address where there are x elderly people with x medical needs. Please send an ambulance. Do you wish to speak to them?"

    There are some issues with power cuts but then the current system has to be plugged in too.
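    The button flow described above could be sketched as an ordered action list (field names are hypothetical; this is not a real Alexa skill API):

    ```python
    def emergency_call_flow(address, residents):
        """Return the steps the device would perform on a button press:
        dial, read the prepared announcement, then hand the line over."""
        needs = "; ".join(f"{r['name']} ({r['age']}): {r['needs']}" for r in residents)
        announcement = (f"I have an emergency at {address}. Residents: {needs}. "
                        "Please send an ambulance. Do you wish to speak to them?")
        return [
            ("dial", "999"),
            ("announce", announcement),
            ("handover", "speakerphone"),  # becomes a wireless phone to the operator
        ]

    steps = emergency_call_flow("12 Example Lane",
                                [{"name": "Edna", "age": 91, "needs": "diabetic"}])
    print(steps[0])  # -> ('dial', '999')
    ```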

    1. Baldrickk Silver badge

      Re: An actual useful scenario...

      You can buy apps that enable an "emergency button" on smartphones. You can buy a Bluetooth alert button that does the same, and you just wear it like a necklace.

      In both cases, they just call the emergency services or another number (e.g. a carer). Emergency calls do allow the operator to get your location these days, I believe.

      And I can always just say "OK Google, phone 999 on speakerphone" if I have, say, fallen through my glass coffee table and my hands are busy trying to staunch any wounds.

    2. zaax

      Re: An actual useful scenario...

      They already have that in my area; it's called Careline.

      1. Lysenko

        Re: An actual useful scenario...

        The hole in all those systems is that someone who collapses (stroke, fall, cardiac event, etc.) is quite likely incapable of coherent speech or of pressing buttons. We've been asked to build something to plug that gap, but short of death detection (by thermal imaging), we haven't come up with a workable approach that doesn't require strapping sensors onto the target or making huge assumptions (e.g. no-one sits still on the floor, ergo alarm) triggering false positives.

        1. 's water music Silver badge
          Joke

          Re: An actual useful scenario...

          huge assumptions (e.g. no-one sits still on the floor, ergo alarm)

          #thefloorislava

        2. Timmy B Silver badge

          Re: An actual useful scenario...

          @Lysenko.

          Those things already exist. You can get g-force monitors to see if people have fallen, proximity monitors to see if they leave a particular area, and also bed pressure monitors that call a care company if a person has got up within certain times and not returned after a set period.

          1. Anonymous Coward
            Anonymous Coward

            Re: An actual useful scenario...

            Remote monitoring of stroke patients can be achieved with an upper arm patch to see if they are doing their exercises and talking.

            Some sensors go further than that without being particularly bulky - for example.

          2. Lysenko

            Re: An actual useful scenario...

            Those things already exist. You can get g-force monitors to see if people have fallen and ...

            Yes, I know. However, we were asked to cover the "20" bit of the 80/20 rule and (crucially) not to depend on instrumenting the individual directly. That means accelerometers and magnetometers (for example) are out for posture detection, and in with OpenCV. Same for temperature (microbolometers, not thermocouples). Way too many corner cases for my liking with either approach.
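            The "huge assumption" approach can be caricatured on a low-resolution thermal grid (temperatures in degrees C; thresholds, grid size and frame counts all invented, and a real system would run OpenCV over microbolometer frames):

            ```python
            FLOOR_ROWS = 2      # bottom rows of the thermal frame, roughly floor level
            BODY_TEMP = 30.0    # hypothetical 'warm blob' threshold
            STILL_FRAMES = 3    # consecutive frames before alarming

            def frame_has_floor_blob(frame):
                """True if any floor-level cell of the frame is body-warm."""
                return any(t >= BODY_TEMP for row in frame[-FLOOR_ROWS:] for t in row)

            def fall_alarm(frames):
                """Alarm only after a warm blob stays at floor level for several
                consecutive frames: the 'no-one sits still on the floor, ergo
                alarm' assumption, false positives included."""
                streak = 0
                for frame in frames:
                    streak = streak + 1 if frame_has_floor_blob(frame) else 0
                    if streak >= STILL_FRAMES:
                        return True
                return False

            cold = [[20.0] * 4] * 4                               # empty room
            warm = [[20.0] * 4] * 3 + [[20.0, 36.5, 36.5, 20.0]]  # body heat on the floor
            print(fall_alarm([cold, warm, warm]))  # -> False (not still for long enough)
            print(fall_alarm([warm] * 4))          # -> True
            ```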

            1. Sir Runcible Spoon Silver badge

              @Lysenko

              Do you think it would be possible to hook up multiple MS Kinects to a monitoring system?

              Programming the correct criteria would take a bit of doing obviously, but I believe the Kinect has quite a decent sensor array.

              1. Lysenko

                Re: @Lysenko

                Do you think it would be possible to hook up multiple MS Kinects to a monitoring system?

                That's exactly what we did for the PoC, with a FLIR Lepton bolted on the end. It's workable.

        3. Steve Davies 3 Silver badge

          Re: An actual useful scenario...

          I'm not sure why you got a downvote. All I can hope is that the person/bot that did that does not get old.

          My 95-year-old Mother had one of these systems and it worked. The people on the other end of the phone did not mind the odd false alarm. You actually gave them one every month when you tested the system.

          These things allow people who would otherwise be in a care home to live independently.

          My Mum got one of these (you have to pay a rental fee, btw) about 10 years ago after she had her second knee replaced. That meant that she could not kneel, and if she fell over she could not get up again without help.

          She's now in a care home and will celebrate her 96th next week. I am firmly of the opinion that these things work and there is no need to complicate them with any more technology. They can't be hacked either.

        4. NXM

          No-one sits still on the floor

          I've not only stayed still on the floor for long periods, I've actually fallen off it after a particularly arduous 'board meeting' at the pub.

      2. Timmy B Silver badge

        Re: An actual useful scenario...

        Careline...

        This is the service we have. But they charge for the equipment, batteries and the service. £14 to change a battery that's £3 on eBay...

        The one time we have actually needed them, they spent more than 20 minutes trying to get sense out of a 98-year-old with dementia before ringing the ambulance.

    3. Anonymous Coward
      Anonymous Coward

      Re: An actual useful scenario...

      "Have buttons that trigger Alexa to make an emergency call and then turn into a wireless phone that they can use to talk to the emergency operator. "

      The Doro 610/810 mobiles have a button on the back which can be preprogrammed to contact a series of people.

      1. Anonymous Coward
        Anonymous Coward

        Re: An actual useful scenario...

        I am currently building a gadget to remind me when to take my pills and do my glucose blood tests. The major problem was how to get it to recognise a meal time that could be quite variable.

        The answer appears to be to make the device a mug "coaster". If my mug is sitting on it at the appropriate time - and occasionally being removed - then I am alive and well and eating a meal.

        As always the problem is finding the right plastic box for the job - the electronics and programming are easy.

        Basically an Arduino Nano with an accurate TCXO clock backed up by battery - and periodically synchronised with MSF/DCF etc. radio time transmissions. Infra-red collision detectors that work through a plastic box lid to see reflections off the mug. Other available modules could provide Bluetooth or Wi-Fi comms.
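        The coaster logic could be simulated like this (sampling times, meal window and lift count are all invented; minutes are counted from midnight):

        ```python
        def meal_detected(mug_present_log, window_start, window_end, min_lifts=1):
            """mug_present_log: (minute, present) samples from the infra-red
            sensor. A meal is inferred when the mug sits on the coaster during
            the meal window and is lifted at least once, so it is being drunk
            from rather than just parked there."""
            in_window = [(t, p) for t, p in mug_present_log if window_start <= t <= window_end]
            ever_present = any(p for _, p in in_window)
            lifts = sum(1 for (_, a), (_, b) in zip(in_window, in_window[1:]) if a and not b)
            return ever_present and lifts >= min_lifts

        # Breakfast window 07:00-08:00 (minutes 420-480):
        log = [(425, True), (430, False), (435, True), (440, True)]
        print(meal_detected(log, 420, 480))                # -> True: present and lifted once
        print(meal_detected([(425, True)] * 5, 420, 480))  # -> False: never lifted
        ```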

  16. This post has been deleted by its author

    1. I ain't Spartacus Gold badge
      Devil

      Re: False positives

      Depending on your country of residence of course: The police break down the door, shoot all your children and then blessed peace is restored. You get to snooze in front of the telly in peace.

      Oh sorry, did I say that out loud. Erm...

      1. Mark 85 Silver badge
        Facepalm

        Re: False positives

        Or.. they shoot all the adults first.

        1. I ain't Spartacus Gold badge
          Happy

          Re: False positives

          Or.. they shoot all the adults first.

          Well OK, that's bad. No snoozing in front of the telly with a cuppa and some cake.

          But at least you'll still get some bloody peace and quiet...

  17. zaax

    Someone who has no idea of how technology works, why is this post allowed?

    1. Sir Runcible Spoon Silver badge

      You ask why?

      Because El Reg seems to have been bought out by /.

  18. Ambivalous Crowboard

    "Assistants are always listening, this is bad. I want them to listen more to fix this"

    You do realise that you contradict yourself throughout this article?

    "Voice assistants are always listening. So why won't they call Police if they hear a crime?"

    Because that device listens for something that resembles a button being pressed. It does not continually upload the stream of audio to the cloud to be processed. Except you appear to not want this...

    "These devices, or the cloud services that power them, can easily understand when someone is angry, or terrified or in pain. It should almost be trivial to detect when something is way out of range, and flag that."

    It isn't, unless you want them to continually stream your home audio to the cloud for processing, which you appear to be against.

    "Listening means being responsible for whatever you hear."

    Which is why they don't listen. They wait for a predictable nudge; a vocal button-press. And if you don't press the button (say the wake-word), they don't hear anything.

    "We're listening as never before, and we have to do something about it."

    No, we aren't. This has always been the same; saying "Alexa" to wake up the hockey-puck is no different to pressing the "voice command" button on the side of your BlackBerry 9700, except instead of pressing a button with your fingers, you press it with your voice.

    The one exception would be Samsung's smart TV; I don't see that brand name in your diatribe.

  19. Nick Kew Silver badge
    Devil

    On hold

    Oh yes please. Alert the emergency services when I'm on forever-hold trying to call someone's customer services modelled on Kafka's castle. Preferably before rather than after the rising blood pressure leads to a fatal heart attack.

    Better still, as soon as the "on hold" becomes a yob screaming too aggressively.

  20. Velv Silver badge
    Big Brother

    Only a small step...

Since devices are already listening it’s only a small step for Theresa May and Amber Rudd to demand the vendors build access to the device for the security services. It’s (apparently) easy to build back doors into security, so why not a bug in everyone's home?

And before you say “but you need to activate it”, the devices are listening.

    1. Anonymous Coward
      Big Brother

      Re: Only a small step...

( fap fap fap ) Call all his mates. With instructions to the receiving 'phones to put it on speakerphone.

  21. iron Silver badge
    Big Brother

    the stupidity of humanity

    Well done NSA, GCHQ, et al you have convinced Mark Pesce that 1984 is a blueprint for the way society should work. I've only been up an hour and already I'm depressed about the stupidity of humanity and our society, as if Mondays weren't bad enough so thanks for that.

    I would never allow Alexa / Siri / whatever device in my home to spy for its owner Amazon / Apple / etc. If they spied for the authorities as well I'd go out of my way to disable them in other people's homes too.

    1. Charles 9 Silver badge

      Re: the stupidity of humanity

What makes you think you'll have a say in the matter? Soon they'll be in things brought in without your knowledge (unless you routinely use metal detectors) or simply mandated for safety EVERYWHERE, meaning moving won't be an option, either.

  22. whatsyourShtoile

    the real issue

    If you use the phone to make a call from a library, it will work but it will go straight through to The Police.

  23. Jason Bloomberg Silver badge
    Joke

    It gets harder every day

    So, when I am breaking into a property to beat the living crap out of someone, I have to remember to cut the fibre as well as the telephone lines. Thanks for the heads-up.

    1. Aladdin Sane Silver badge
      Trollface

      Re: It gets harder every day

      Nah, just yell "Siri/Alexa/Cortana/OK Google open the door" through the letter box.

    2. Mark 85 Silver badge

      Re: It gets harder every day

      There's a fatal flaw in your plan.... wireless. You'll need to search the house* for any devices first before engaging in the mayhem.

*Using some super-duper equipment only available to CSI types.

  24. Geekpride

    Bug report for The Register: You appear to have posted your April Fools article over a month early.

  25. poli0121

As far as AI is today, this is an unworkable idea

Imagine the situation when you get the police kicking in your door at 10pm because you were just watching a sizzling CSI episode on TV or some gory horror movie :D

AI still has a long way to go to be able simply to recognise correctly what you are saying when you make a request - not to mention to judge what's going on in your house from the context of what is heard.

    Having a sort of voice "PANIC button" implemented where they call 999 on a cue might be doable.

  26. scrubber

    "Crime"

    Theresa May's definition of a crime and my definition are quite far apart. That's why we have courts and juries.

    1. Anonymous Coward
      Anonymous Coward

      Re: "Crime"

      "That's why we have courts and juries."

      Theresa will soon change that if she gets her way.

  27. Anonymous Coward
    Anonymous Coward

    I once had a key to a neighbouring friend's house because we shared some resource. One day I let myself in - expecting no one to be home. Inside there was a loud sound of bumping, screaming, and moaning. Oops - a quiet retreat to avoid disturbing their "I'll have what she's having" moment. NSFW

    A bit like the lodger next door who was quickly labelled "Roger the Rabbit" for the wall penetrating loud noises made by his girlfriend in apparent ecstasy late at night.

  28. Notwork

    I really don't want a voice assistant calling the police every time I watch a ripped episode of GOT.

  29. sisk Silver badge

    I would be very impressed indeed if you could design an Alexa-like device that could tell the difference between someone screaming in fear and someone screaming in pleasure. Humans have been known to fail that particular test, to the great embarrassment of many. Or, even more difficult, tell the difference between a child screaming because they're hurt and a child screaming because an older sibling is tickling them. Oh, and while you're at it, you'd better be able to tell the difference between an actual human and a TV.

    The technical challenges with this entire idea are probably insurmountable without some serious AI. And, frankly, if they tried to implement this anyway the number of false alarms coming into the local police department would ultimately end up costing more lives, not saving lives.
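The false-alarm point above can be made concrete with a back-of-envelope calculation. All the numbers below are made up for illustration; the point is the base-rate effect, not the specific figures:

```python
# Back-of-envelope: even a very accurate detector, listening constantly,
# buries rare real emergencies under false positives. Assumed numbers
# are illustrative only.
windows_per_day = 24 * 60 * 60 // 10   # 10-second analysis windows
false_positive_rate = 0.001            # 0.1% per window (optimistic)
real_events_per_day = 0.0001           # genuine emergencies are rare

false_alarms_per_day = windows_per_day * false_positive_rate
print(false_alarms_per_day)                        # several per device, every day
print(false_alarms_per_day / real_events_per_day)  # false alarms dwarf real events
```

Multiply that by millions of deployed devices and the emergency services are swamped before the first genuine call gets through.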

    1. onefang Silver badge
      Childcatcher

      "Or, even more difficult, tell the difference between a child screaming because they're hurt and a child screaming because an older sibling is tickling them."

I'm well aware of that particular problem. Due to odd acoustics in my area, I'm within earshot of three local schools. I don't think I could tell the difference between school lunch time and some crazed loon on the loose with a machete.

      Very odd acoustics, I've overheard conversations between construction workers building a hospital, in the next suburb.

  30. Anonymous Coward
    Anonymous Coward

    Hey Siri

    Dial 0118 999 881 999 119 725... 3

  31. Whiznot

    Keeping up with the times, the Register descends into idiocy.

  32. Greg D

    So....

    Extrapolating this logic into real life, when watching GoT with my surround sound on, I can expect a visit from all the local emergency services then??

    1. onefang Silver badge

      Re: So....

      "Extrapolating this logic into real life, when watching GoT with my surround sound on, I can expect a visit from all the local emergency services then??"

Including the fire brigade, and pest control with really large nets, coz they heard there were dragons in the area.

  33. ab-gam
    Thumb Down

    Even More Intrusion is Needed!

    Otherwise their AI will not be able to properly identify Vigorous and Enthusiastic Sex vs, well....

  34. JohnFen Silver badge

    A profoundly terrible idea

    This strikes me as being a profoundly terrible idea. At least in the US, the potential for harm and loss of life as a result of this would outweigh the benefit.

Even ignoring the obvious and severe privacy implications, there is no way that these devices could understand context. What if it's triggered by the audio of a movie? What if it misinterpreted your kinky sex roleplaying? And so on. The end result will be that your "assistant" would be SWATting you, and putting you at very real risk of being seriously injured or killed by the police.

  35. Kaltern

    Scaremongering...

    ...scaremongering everywhere!

    Too many people believe too much crap on the internet.

    * Assistants are not clever.

    * They are not self-aware.

    * They do not make decisions.

    * They are not listening to every word you speak and sending them to SPECTRE

If every conversation was recorded and sent back for analysis, that would be a SHIT load of data, which would be nigh on impossible to sift through with any degree of accuracy.

    Of course, there might be hidden keywords we know nothing about.. Jihad...Allah...Liberal...

    Bottom line though, these devices are just voice activated internet data grabbers. They won't help you in a crisis, they won't ask if you're ok, and they won't call 999/911 if they think you're not.

    Because they can't think.

    Terrible article, El Reg. Must be a REALLY slow news day.

    1. Charles 9 Silver badge

      Re: Scaremongering...

"If every conversation was recorded and sent back for analysis, that would be a SHIT load of data, which would be nigh on impossible to sift through with any degree of accuracy."

      Isn't that the purpose of that data center in Utah (well, that and act as a cover for the working quantum computer)? And let's not forget China's ambitions.

  36. Old Handle
    Thumb Down

    What a silly article

    How exactly is Alexa supposed to know whether something is a crime? Do you really want it analyzing everything it hears to that extent?

  37. Cynic_999 Silver badge

    Employment problems

    Please would The Register refrain from employing "journalists" who have been fired from the Daily Mail for not understanding what they are writing about?

  38. Fruit and Nutcase Silver badge
    Black Helicopters

    I could murder

    a cup of tea right now.

    1. onefang Silver badge
      Black Helicopters

      Re: I could murder

      Here's $20, get me a gram of leaves. Earl Grey, or some fruit infusion, don't forget to get change this time.

  39. Graham Cobb

    Irony

    I am disappointed that almost all the commentards here have missed the irony in the article. It is actually really quite thought-provoking.

    Of course we are told that the devices are just listening for their wake-up keywords. And some of them probably are. But we have no way of knowing what undocumented wake up keywords are built in, or whether there are any other circumstances in which they will start to record, send and process audio.

    There have been various rumours of Google, Amazon and Smart TVs listening in for shopping-related terms in order to target advertising. And if they aren't doing that today, they certainly will be just as soon as they can get good enough local processing (which won't be hard in mains-powered devices).

    The article raises the question: if they are going to do that for their own commercial ends why wouldn't we require them to also do similar things for social good reasons? Good question.

    It also highlights the fact that if that question is asked, the manufacturers will push back very hard because the last thing they want is for us to be reminded that they are listening all the time and could be processing anything we say. They either will want to make a virtue of not being advertising-driven (Apple) or they need us to forget all about them being there and being unguarded in what we say (everyone else).

    And, of course, that is without even getting into the surveillance issues.

    Good, thought-provoking article. Pity that we don't teach irony any more and people started discussing how a device would decide automatically whether to call the police (particularly as the answer is obvious: do what a human would do, ask "are you all right?").

    1. JohnFen Silver badge

      Re: Irony

      "if they are going to do that for their own commercial ends why wouldn't we require them to also do similar things for social good reasons?"

      Because the proposal is impossible to do, at least right now and in the foreseeable future, without causing much greater harm than any benefit it could result in.

    2. Steve Knox Silver badge
      Facepalm

      Re: Irony

      But we have no way of knowing what undocumented wake up keywords are built in, or whether there are any other circumstances in which they will start to record, send and process audio.

      Bullshit. Code review, disassembly, fuzzing inputs, monitoring network traffic. Talk to any security researcher before you spout off on what we have "no way of knowing".

      There have been various rumours of Google, Amazon and Smart TVs listening in for shopping-related terms in order to target advertising.

      There have been various rumours of Elvis sightings and lizard overlords. Cite credible sources or don't repeat shit.

      And if they aren't doing that today, they certainly will be just as soon as they can get good enough local processing (which won't be hard in mains-powered devices).

      This is the one thing you've said so far that I agree with. Amazon, Google, et al are motivated by selling stuff or advertising. Anything they can do to increase profit from that is likely to happen. The only reason I don't think it's happening right now is that the local processing requirements are higher than what we find in the relatively lightweight devices available today. The only ones that might be able to approach this computing-wise are smartphones, and they're too motivated by keeping battery life within reason to go very far with this.

      The article raises the question: if they are going to do that for their own commercial ends why wouldn't we require them to also do similar things for social good reasons? Good question.

      Remember what I just said about local processing requirements? Okay, now scale that up exponentially. We can't be just talking keyword recognition here, because voice recognition, as good as it is, still has a lot of trouble, especially with similar words like "grape" and "rape". It'll need full contextual recognition, which even the full-bore cloud "AI" systems haven't been able to even start to get right. Otherwise your phone or in-home device will be asking "are you alright" so often that you'll likely smash it just to get some peace.
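The "grape"/"rape" confusion above is easy to demonstrate even at the text level, never mind acoustically. A quick illustration using edit similarity as a crude stand-in for an acoustic confusion model (the threshold and word choices are just for the example):

```python
# Why keyword spotting alone is fragile: similar words score as
# near-matches. SequenceMatcher's ratio is a crude textual proxy
# for the acoustic confusability the commenter describes.
from difflib import SequenceMatcher

def similarity(a, b):
    return SequenceMatcher(None, a, b).ratio()

print(similarity("grape", "rape"))   # near-identical: one letter apart
print(similarity("grape", "help"))   # much lower
```

A naive keyword spotter with any realistic match threshold fires on both words, which is why full contextual recognition, not keyword matching, would be needed.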

  40. Anonymous Coward
    Boffin

    If you're going to have it then make it work for you

It would be possible to localise some analysis to the voice assistant device, and an owner could choose what they wanted it to listen out for: not just "OK Google" but certain other sounds, such as the smashing glass of a break-in, or cries for help.

    The voice assistant could then initiate a conversation, asking "Are you alright? Do you need help?", or give a warning that it is programmed to auto-respond in 10 seconds, or something to that effect.

    These trigger situations and sounds could be established at initial set-up by those who feel insecure; for it to be successful it would need to be set up before any situation arose, so it would be active when one occurred.

    The public should press manufacturers to provide these facilities in their voice assistants.
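The opt-in flow proposed above (owner-chosen triggers, a confirmation question, then a countdown before escalation) could be sketched roughly like this. Everything here is illustrative: the trigger set, countdown, and function names are invented for the example, not any real product's behaviour:

```python
# Toy sketch of an owner-configured panic flow: trigger sound ->
# confirmation question -> auto-escalation if no reassurance arrives.
# Names and behaviour are illustrative only.

TRIGGERS = {"breaking glass", "cry for help"}   # chosen by the owner at set-up
COUNTDOWN = 10                                  # seconds before auto-response

def respond(sound, owner_reply=None):
    """Return the assistant's action for one detected sound."""
    if sound not in TRIGGERS:
        return "ignore"
    if owner_reply == "I'm fine":
        # Owner answered the "Are you alright?" prompt: stand down.
        return "stand down"
    # No reassurance within the countdown: escalate.
    return f"call emergency services after {COUNTDOWN}s"

print(respond("doorbell"))
print(respond("breaking glass", owner_reply="I'm fine"))
print(respond("breaking glass"))
```

The confirmation-plus-countdown step is what separates this from the article's always-escalating proposal: the human gets the first and last word, and the device only acts on silence.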

  41. DougS Silver badge

    What a ridiculous idea

    How the heck is it going to be able to tell the sounds of an assault versus the sounds of an assault happening on TV? Or two boys fighting? (like many of us did with our brothers all the damn time) Or (consensual) rough sex?

Can we as humans reliably tell when there's something criminal happening in the next apartment over? So how the hell is Alexa going to? The kind of person who comes up with this boneheaded idea is the kind of person who thinks Alexa is "AI".

  42. Paper

    Not just listening...

    As an aside, Google (and I assume Apple) keeps anything we ever said to them, tracks where we go, what we search, who we call, what our interests are, when we use our devices, which apps we use, etc, etc.

Five years ago, had either company just introduced all these privacy invasions in one go, would any of us have been happy to say yes? Or would we have freaked out and said it was too much?

    I just went through my Google account removing personal info, a shocking amount. It's impossible to disable the Google Assistant on my Nexus 6P - it's always listening when I'm on the home screen. So I denied the Google app permission to use the microphone!

    1. onefang Silver badge

      Re: Not just listening...

      "It's impossible to disable the Google Assistant on my Nexus 6P - it's always listening when I'm on the home screen. So I denied the Google app permission to use the microphone!"

      Or install a different home / launcher app. There are quite a few good ones.

      1. Paper

        Re: Not just listening...

        Ah thank you for the tip. I didn't realise it could be swapped out.

        1. Charles 9 Silver badge

          Re: Not just listening...

          You can swap it out, but are you sure it's completely turned off? Some system apps (like the default home app) can't be disabled, not even with the App Settings.

  43. TrumpSlurp the Troll Silver badge
    Windows

    Solution looking for a problem?

Posters are suggesting things like listening for the sound of breaking glass to identify a break-in. No! Sensors on the glass linked to a burglar alarm. Panic button if you are really worried. Problem already solved.

    Monitoring the aged and infirm? They have to give up all ideas of personal privacy once serious nursing care is involved. After all in a hospital or nursing home you have none. So cameras in every room with a remote 24/7 monitoring staff. You can cover an awful lot of homes that way. You can connect remotely to the house, sound a loud alarm, talk via a speaker, and call the emergency services. There may be a role for a speaker system in this, but I don't see it as the central role.

    What I am seeing is "We have a product. What new markets are there?" not "We have a problem to solve. What is the most effective technology?".

    When you only have a hammer every problem looks like a nail.

    1. Anonymous Coward
      Anonymous Coward

      Re: Solution looking for a problem?

      "After all in a hospital or nursing home you have none."

      Wanna bet? Last I checked there are still no cameras in the toilets, as there's still too much risk of lawsuits even in a hospital setting. Same for home care, there WILL be no-camera zones...or there will be peeping-tom lawsuits.

  44. Anonymous Coward
    Anonymous Coward

These things have called the police in the past based on television, which is why they can't do it now. Imagine the police showing up at your door because they think a crime is going down, guns drawn, all because you decided to watch <insert gangster/crime-drama here> - AI just isn't able to grasp context well enough to be expected to do this accurately, not even remotely close. Until it is, doing this is just going to create more problems than it solves.

    1. Charles 9 Silver badge

      And here's the catch. Humans can't tell the difference most of the time. An episode of Adam-12 had the police respond to a call of two women screaming (possible assault). Turns out the two ladies were practicing karate and their screams were just their kiai. If WE with our highly-evolved brains and senses can't tell the difference, what chance does a machine have?


Biting the hand that feeds IT © 1998–2019