Apple: Ok, ok, we'll stop listening in on your Siri conversations. For now, but maybe in the future

Apple has hit pause on contractors listening in on and making notes about recordings of people using its Siri digital assistant after the secretive practice was exposed. Cook & Co has tried to differentiate itself from the rest of the tech industry by stressing its privacy credentials, but it has notably not ended the Siri …

  1. Marketing Hack Silver badge
    Go

    Business opportunity for Apple

    On the bright side, maybe Apple can monetize all those recordings by offering a service for use during marital/relationship arguments, where one spouse can access Siri's snooping to prove to the other spouse that, yes, they had in fact agreed to pick up the dry cleaning by today, or had agreed to pick up their better half at the airport at such-and-such time.

  2. N2 Silver badge

    Just another piece of intrusive shyte

    That gets turned off and blocked

    1. Chronos Silver badge
      Holmes

      Re: Just another piece of intrusive shyte

      ..or, perhaps, don't buy one in the first place?

      1. Fatman Silver badge
        FAIL

        Re: ..or, perhaps, don't buy one in the first pla

        I wish I could give you 1000 upvotes.

        I cannot understand (actually, in a way, I do) the gullibility and stupidity of `Joe and Jane Sixpack`.

        WHO in their right mind would want a FUCKING LIVE MICROPHONE listening to the daily goings on in your home?

        ONLY the terminally stupid, who cannot grok the privacy implications, would be so goddamn reckless.

        1. The_Idiot

          Re: ..or, perhaps, don't buy one in the first pla

          "...ONLY the terminally stupid"

          Perhaps not. Maybe high-risk, low-capability individuals, like my wife, who has advanced Multiple Sclerosis and is totally wheelchair-bound? We can't afford daytime or live-in care, so while I'm at work, she's on her own. While the FUCKING LIVE MICROPHONE system (and recording/playback) I have in place is NOT Siri, it is _still_ a fucking live microphone, so if something bad happened I would stand a chance of knowing. Does that make her, in your view, 'terminally stupid'? You are, of course, entitled to your opinion either way.

          1. Benson's Cycle

            Re: ..or, perhaps, don't buy one in the first pla

            I think there is merit in both your posts, and possibly you don't disagree as much as you think. Obviously there are cases where a microphone with remote access and speech recognition is extremely useful. The problem is that you will not get one with adequate security from Apple, Google or Amazon.

            Back in the 1990s we were discussing such things as part of future home automation, but it never occurred to anybody that what actually arose would be mediated by three US companies all of whom were trying to sell us something. My last built-in car satnav had speech recognition without being connected in any way to the Internet, so it's possible. Just being able to make a phone call over the PSTN or send an SMS would be adequate for many disabled people. My father has a phone with a fall detector that just gets the GPS location and starts to dial the emergency call list. The fact that people are disabled and need support should not mean handing over their voice recordings to US (or any) corporations.

        2. LDS Silver badge

          Re: ..or, perhaps, don't buy one in the first pla

          Please tell me how I could buy a phone without a mic?

          1. SkippyBing Silver badge

            Re: ..or, perhaps, don't buy one in the first pla

            Why are you talking to your phone?! Use it for passive aggressive text based communication like God intended.

          2. Chronos Silver badge
            Big Brother

            Re: ..or, perhaps, don't buy one in the first pla

            Telephones can be disconnected from any network activity when not needed (a[e|i]r[o]plane mode) which precludes snoopage. Google's creepy voice "search" assistant can be lobotomised quite easily. Echo devices and Google's equivalent are on all the time and have no other utility than to listen for a "wake word" which, since you don't know how many there are, could be anything from "Alexa" to "fuck" and anything in between. Disable network traffic to and from those and they become just another useless lump of plastic with blinkenlichts or, at best, a rather expensive Bluetooth speaker. Basically, you're paying for a bug to be installed into your home.

        3. gnasher729 Silver badge

          Re: ..or, perhaps, don't buy one in the first pla

          "WHO in their right mind would want a FUCKING LIVE MICROPHONE listening to the daily goings on in your home?"

          Siri doesn't listen "to the daily goings on". It only starts listening when you say "Hey Siri".

  3. DougS Silver badge

    What the article leaves out

    Is that when recordings were sent to Apple for "grading" they were anonymized, with no way to link them back to a particular person. Now if you said your name or address in the recording, that would "out" you for that particular recording, but there isn't any way for them to search for other recordings from that particular person even if others had happened to also be uploaded.

    Google and Amazon upload ALL of yours, and with your ID attached - that's how they're able to offer you the option to delete your saved recordings. Apple can't offer you that, because it has no idea which recordings it has "graded" are yours versus anyone else's.

    1. fwthinks
      Big Brother

      Re: What the article leaves out

      I am willing to accept that each company does things differently, but they are all the same when it comes to explaining what happens behind the scenes - which is to try to hide or obfuscate the process as much as possible.

      The whole discussion is about the human review of recordings, but this makes one big assumption - that they have sufficient controls around segregation of duties. While there may be external teams given some recordings, I hate to think about the number of people who would have access to the raw data just from an IT admin perspective.

      Working in IT, I know that what is recorded in the security policy often does not align with reality. If Apple/Amazon/Google say they are strictly controlling access to this data, how would we know this is really true unless there was an impartial review? So all we are left with is their word, and unfortunately they all lost my trust years ago.

    2. NetBlackOps

      Re: What the article leaves out

      According to the whistleblower, PII features prominently in Apple's Siri recordings.

    3. The_Idiot

      Re: What the article leaves out

      "there isn't any way for them to search for other recordings for that particular person"

      Well, with respect, 'there isn't any way' can, like most absolutes (and potentially all of them, but I wouldn't wish to use an absolute (blush)) trip you up. Recordings could, at least in principle, be processed for audio pattern recognition (I'm not fond of the term 'voiceprint'). Would it be 100% reliable? Probably not. Would it qualify as 'a way to search for other recordings for that particular person'? I would suggest it might - but, of course, I'm an Idiot, so I'm sure others (and possibly you) might think otherwise :-).

      1. TechnicalBen Silver badge

        Re: What the article leaves out

        "Siri how do I get to [street number and postcode]".

        "Siri, Call [full name] please"

        Yep, easy to de-anonymize.

        1. DougS Silver badge

          Re: What the article leaves out

          Sure, for that specific recording. But if you have two queries that both happened to be uploaded for review, one is "Siri how do I get to <home address> from here?" and the other "Siri where can I find donkey porn?" it is easy to link the first to you because PII is included in the query. The second can't be linked to you, because it doesn't have any information with which to do that, and there is no way to link the first and second query because they don't include a common "ID" or anything to link them as coming from the same phone.

          The only possible way to link them would be if you could listen to ALL recordings and match by voice, which I doubt is any more reliable for either false positive or false negative matches than facial recognition when using an unconstrained search set.

    4. Jove Bronze badge

      Re: What the article leaves out

      ... and yet we know that the process of "anonymizing data" is circumvented by use of third-party data to restore details of identities.

      1. DougS Silver badge

        Re: What the article leaves out

        That's only true for "anonymized data". If the data is truly anonymous, i.e. nothing in it to tie to a particular device, person or other recording then the only way you can de-anonymize it would be based on the content of the recording. How many interactions with Siri, Alexa, Google, Cortana do you think include anything like that?

        The problem you speak of is when they, for instance, remove the name and address from personal information, but leave the sex, age and city. Combined with other information, something like a "30-year-old female living in <town with population of 20,000>" would already be narrowed down to perhaps a hundred or so, so it wouldn't take much additional information like "single mother of two and Alexa user" coming from another source to narrow it down to a single individual.

        1. Jove Bronze badge

          Re: What the article leaves out

          The data collected from devices is rarely anonymous in the first instance - it is tied to a device, individual, or agent in some way.

          You are also ignoring the third-party data gatherers that make other data available - either to customers or via leaks ;) - that can be used to restore identity information to anonymized data sets.
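[Editor's note: the quasi-identifier narrowing described in this thread can be sketched in a few lines. This is a hypothetical illustration of a linkage attack, with invented names and attributes, not anything from Apple's, Google's or Amazon's actual pipelines.]

```python
# Hypothetical linkage attack: an "anonymized" record is re-identified
# by intersecting quasi-identifiers with a second data source.
population = [
    {"name": "Alice", "sex": "F", "age": 30, "town": "Smallville", "alexa_user": True,  "children": 2},
    {"name": "Beth",  "sex": "F", "age": 30, "town": "Smallville", "alexa_user": False, "children": 0},
    {"name": "Carol", "sex": "F", "age": 30, "town": "Bigcity",    "alexa_user": True,  "children": 2},
    {"name": "Dan",   "sex": "M", "age": 45, "town": "Smallville", "alexa_user": True,  "children": 1},
]

# Released record with name and address stripped, but sex/age/town left in:
leak = {"sex": "F", "age": 30, "town": "Smallville"}
candidates = [p for p in population if all(p[k] == v for k, v in leak.items())]
print(len(candidates))  # still ambiguous: matches more than one person

# Second data source: "single mother of two, Alexa user"
extra = {"alexa_user": True, "children": 2}
candidates = [p for p in candidates if all(p[k] == v for k, v in extra.items())]
print([p["name"] for p in candidates])  # narrowed to one individual
```

The point of the thread stands in both directions: each attribute on its own is harmless, but their intersection shrinks fast, and a second data set finishes the job; whereas a recording with no attached ID at all leaves nothing to intersect on.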

  4. Woodnag

    participate?

    "users will have the ability to choose to participate in grading".

    I wouldn't call being given the option to allow 3rd parties to listen to previously-presumed-private sounds "participation".

  5. Anonymous Coward
    Anonymous Coward

    Salt the data ?

    Maybe Siri users could get together and organise a series of events to fill Apple's database up with loads of dodgy data. Maybe tens of thousands of people suddenly plotting a murder, or terrorist act, within earshot of Siri?

    The only way the public will ever wrest control of their data back from these internet behemoths is to ensure it is of no value to them by poisoning the well.

    The above would also have the added bonus of sticking it to all the slavering LEAs worldwide who like snooping in places they shouldn't be.

    1. The Dark Side Of The Mind (TDSOTM)

      Re: Salt the data ?

      This might work if privacy had the same meme potential and media coverage as Area 51. Until that happens there can only be a bunch of geeks playing pranks on phones, like it was in the (not so distant) past...

    2. Jove Bronze badge

      Re: Salt the data ?

      ... they could organise a "Speak Welsh Only" day - that should cause a few problems for the listeners.

  6. Anonymous Coward
    Anonymous Coward

    We get the problems...

    ...but to the mindless army of bearded millennial baristas trying to follow in our footsteps - lack of privacy is the same as lack of socks.

  7. I.Geller Bronze badge

    Without personal profiles, updates and purging - the listening is the lost of money and time.

    1. Jove Bronze badge

      Without personal profiles, updates and purging - the listening is the lost of money and time.

      ... is this an example transcript from Siri? It is certainly as unintelligible as some of the recordings might be.

Biting the hand that feeds IT © 1998–2019