How do we stop facial recognition from becoming the next Facebook: ubiquitous and useful yet dangerous, impervious and misunderstood?

Facial recognition is having a rough time of it lately. Just six months ago, people were excited about Apple allowing you to open your phone just by looking at it. A year ago, Facebook users joyfully tagged their friends in photos. But then the tech got better, and so did the concerns. In May, San Francisco became the first …

  1. Anonymous Coward
    Anonymous Coward

    Facial recognition itself isn't bad

    It is the storage and widespread use of facial matching that makes it terrible. It's fine on an iPhone, since the data it uses to match your face is stored in a secure area of the phone and never leaves it (if you back up and restore a phone via iCloud, you have to re-train it to recognize you). An Android phone that did similar and made sure Google couldn't get their dirty paws on your picture would also be fine.

    The terrible thing is when companies/governments that have a picture of your face along with your name (e.g. Facebook, DMV, customs) use it for purposes you never intended and have no control over, like Facebook trying to tag you in pictures you didn't post - or maybe even aren't on Facebook - or governments using it for surveillance and tracking your every move.

    No doubt if Google has your photo and matching name they'll be wanting to set up cameras in meatspace for Minority Report-style maximally invasive advertising that will shout at you as you walk by: "hey DougS, you searched for jeans 31 days ago, come on in to Eddie Bauer and check out our selection of jeans, if you buy in the next 30 minutes we'll give you 20% off!"

    1. macjules

      Re: Facial recognition itself isn't bad

      We have already had the invasive system you mention in place for some years, in the form of iBeacon. To date I have only seen iBeacon implemented successfully in one place in the UK - in a Wetherspoon pub trial where it worked perfectly with iOS devices running iOS 6 or later, but crashed when handling Android. If an iOS device owner purchased, say, 2 pints of lager via iPay, he/she might then receive a push notification saying "Buy another 2 pints now and we'll chuck in 2 packets of crisps".

      I have seen intelligent ideas for using the technology, such as Waitrose shopping lists that use ML, based upon your past 3 months' purchases, to estimate what you probably need, but as yet have not seen any successful real-time deployment.
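      Such a list is easy enough to mock up. Here is a minimal Python sketch of the idea (the purchase history, the dates and the "average interval" heuristic below are purely my own illustrative assumptions, not anything Waitrose actually does):

          # Minimal sketch: suggest items whose usual purchase interval has elapsed.
          # Purchase history, dates and the averaging heuristic are illustrative only.
          from datetime import date

          history = {                      # item -> dates purchased over the last 3 months
              "milk":   [date(2019, 4, 1), date(2019, 4, 8), date(2019, 4, 15)],
              "coffee": [date(2019, 4, 2), date(2019, 5, 2)],
          }

          def shopping_list(history, today):
              suggestions = []
              for item, dates in history.items():
                  dates = sorted(dates)
                  if len(dates) < 2:
                      continue
                  gaps = [(b - a).days for a, b in zip(dates, dates[1:])]
                  avg_gap = sum(gaps) / len(gaps)           # average days between purchases
                  if (today - dates[-1]).days >= avg_gap:   # overdue, so probably needed
                      suggestions.append(item)
              return suggestions

          print(shopping_list(history, date(2019, 6, 28)))  # -> ['milk', 'coffee']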

      1. Anonymous Coward
        Anonymous Coward

        Re: Facial recognition itself isn't bad

        Facial recognition advertisements require only that you have eyes, while beacons require that 1) you are carrying a phone that supports them, 2) it has Bluetooth enabled, and 3) it is running an app that can interpret them.

        It can't be triggered by paying with Apple Pay (which uses NFC, not Bluetooth), so that test must have involved running an app of some sort - i.e. you would have to deliberately opt in.

    2. Pascal Monett Silver badge

      Re: Facial recognition itself isn't bad

      Yes, at this point it is. Did you miss the part where it only works averagely well if you're white? I think black people already have enough chances to be wrongly arrested (and possibly shot) without adding a "computer says so" to the list.

      1. Anonymous Coward
        Anonymous Coward

        Re: Facial recognition itself isn't bad

        That issue may happen regardless - i.e. cops might be more likely to think "yeah he looks like the guy on the wanted poster" if the suspect is black than white. Banning facial recognition doesn't eliminate bias, in many cases it will make it worse.

        With facial recognition it is possible to put rules in place. For example, they could require that the software only put forth a match when there is less than a 1% chance of a false positive. That might drive the false negative rate to 20% and they'd miss a lot of suspects, so you'd need to balance the two. But whatever metrics they choose, if they enforce them, there won't be any bias by race. If the software performs better for some and worse for others, then the chances of catching a suspect who is one of the "worse for others" will be lower because their false negative rate would be higher. Which will cause the police to put pressure on the developers to fix that shortcoming and make it work equally well across all races.
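        To make that trade-off concrete, here's a minimal Python sketch with entirely made-up score distributions (the 0.75/0.45 means, the spread and the thresholds are illustrative assumptions, not figures from any real system); raising the threshold pushes false positives down and false negatives up:

            # Hypothetical sketch: how a single match-score threshold trades false
            # positives against false negatives. All numbers here are invented.
            import random

            random.seed(0)

            # Simulated similarity scores: genuine matches tend to score higher than impostors.
            genuine  = [random.gauss(0.75, 0.10) for _ in range(10_000)]   # same person
            impostor = [random.gauss(0.45, 0.10) for _ in range(10_000)]   # different people

            def rates(threshold):
                """Return (false positive rate, false negative rate) at a given threshold."""
                fp = sum(score >= threshold for score in impostor) / len(impostor)
                fn = sum(score <  threshold for score in genuine)  / len(genuine)
                return fp, fn

            for t in (0.55, 0.65, 0.75):
                fp, fn = rates(t)
                print(f"threshold={t:.2f}  false positives={fp:.1%}  false negatives={fn:.1%}")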

  2. JohnFen

    But that's accurate

    "especially since it has led to a broader sense that the technology is inherently dangerous."

    But it really is inherently dangerous.

    That said, I don't think bans are the way to handle this because banning technology never actually works. Regulating it has a much better (although imperfect) track record.

  3. Rol

    The antisocial network.

    My darling niece tagged me in a group photo on Facebook.

    She's currently serving a very long and arduous sentence - reading every word of Facebook's T&Cs until I am satisfied she understands what she did wrong and how nothing she does can undo the damage.

    She has now stopped tagging photos and is talking her friends into doing the same, despite the fact it's too late for most of that generation who unwittingly worked for the Stasi network.

    After extensive plastic surgery, I feel free again, but the legal battle with Facebook to pay for it all is just rumbling on and on.

    1. Tromos
      Flame

      Re: The antisocial network.

      Facebook holds photo and contact data for millions of people who have not signed up to their services and hence have even less control of this data. This blatantly flouts the intention of GDPR as they are using an army of 'darling nieces' to provide contact lists and names to link to faces, etc. without the explicit permission of the people involved.

    Facebook MUST be made to go through all the data they hold and delete everything for which they cannot show explicit consent.

      1. DontFeedTheTrolls
        Headmaster

        Re: The antisocial network.

        And Twitter

        And Snapchat

        And Instagram

        And Reddit

        And ...

        1. Charles 9

          Re: The antisocial network.

          And if you go that far, you might as well ban cameras, since it's going to be impossible to track down - or even obtain the explicit informed consent of - every bystander caught in the frame by accident, who just as quickly vanishes into the crowd never to be seen again...

          1. Intractable Potsherd

            Re: The antisocial network.

            It is interesting that, if the camera were invented for the first time today, there would be far more scrutiny of its privacy implications. The freedom of people to take photographs of whoever and wherever they want would certainly not be assumed as it has developed since Fox Talbot's day.

  4. Danny Boyd
    Thumb Down

    "Moore admits that facial recognition use is going to be based on a "social contract""

    Yeah, sure. And "social credit" too.

    1. Anonymous Coward
      Anonymous Coward

      There are any number of terms a "social contract" can contain. In this case, my terms are simple: you use it, I wear a ski mask in public. You ban ski masks, I stop leaving my home. If you want my participation in your system, that's the price of admission. That's the true meaning of a "social contract"; like any contract, a meeting of the minds is required. If there is no meeting of the minds or I don't agree to your terms, there is no contract.

      1. Danny Boyd

        That was a reference to China, not a lack of understanding of the term "social contract". Thanks for the clarification though.

      2. Anonymous Coward
        Anonymous Coward

        Works both ways. Where are you going to get your daily bread without some kind of interaction with the rest of society?

  5. Steve Davies 3 Silver badge
    Joke

    And in other news...

    Sales of Donald Trump face masks in areas where facial recognition has been deployed have quadrupled.

    Records of arrest warrants for one Donald J Trump have soared as he becomes the country's most wanted fugitive, overtaking Ronald Reagan.

    [see icon]

  6. T. F. M. Reader

    Real-time surveillance is not the biggest problem. Scouring footage after the fact is.

    "[Moore] doesn't know of any law enforcement agency that wants to use the technology for real-time surveillance. They want to use it as an investigative tool after the fact by scouring footage."

    After what fact? The fact that someone with access - authorized or not - decides that an investigation may serve his or someone else's purposes? The extent to which even seemingly intelligent and educated people cling to the fallacy that "only the good guys will use it and only for noble purposes" is absolutely staggering, IMHO.

    I would be much less concerned about real-time as long as nothing is stored for any significant duration. I am still absolutely horrified by the prospect of real-time surveillance, mind you, but it is the prospect of data stored indefinitely that covers me in cold sweat.

    I have a proposition that Mr. Moore won't like very much, but then I disagree with him on a very basic level. There must be laws that ban storing visual information that can be used for facial recognition for longer than absolutely necessary - which is, in my mind, something like a week at most. If no crime is reported within a week, then any later investigation absolutely must proceed without footage. This is because the benefits (which no one denies) are so outweighed by the potential for abuse, and not just by the government. The penalty for not clearing stored footage must be really heavy, and much heavier still for proven abuse. This applies to static CCTV cameras and the like - it can't effectively be enforced for mobile phone footage, TV crews, etc., but then any such footage will be occasional, not regular.
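    Enforcing a limit like that is mechanically trivial, by the way. As a purely hypothetical sketch (the directory path and the one-week window below are my own assumptions, not anything proposed in the article), a nightly purge job could be as simple as:

        # Hypothetical sketch of a nightly purge job enforcing a one-week retention limit.
        # The footage directory and the 7-day window are illustrative assumptions.
        import time
        from pathlib import Path

        RETENTION_SECONDS = 7 * 24 * 60 * 60         # one week
        FOOTAGE_DIR = Path("/var/cctv/footage")      # hypothetical storage location

        def purge_old_footage(root: Path, max_age: int) -> int:
            """Delete files older than max_age seconds; return how many were removed."""
            now = time.time()
            removed = 0
            for path in root.rglob("*"):
                if path.is_file() and now - path.stat().st_mtime > max_age:
                    path.unlink()
                    removed += 1
            return removed

        if __name__ == "__main__":
            print(f"purged {purge_old_footage(FOOTAGE_DIR, RETENTION_SECONDS)} files")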

    Yes, police work will be more difficult on occasion. Get over it. My work is difficult, too. And I insist that police work must be difficult. It is easy only in police states.

    1. Anonymous Coward
      Anonymous Coward

      Re: Real-time surveillance is not the biggest problem. Scouring footage after the fact is.

      "Yes. police work will be more difficult on occasion."

      Which means crimes don't get solved. Which means people get victimized, even killed. Your loved one could easily be next.

      1. Intractable Potsherd

        Re: Real-time surveillance is not the biggest problem. Scouring footage after the fact is.

        Utter nonsense. This country has never been safer.

        1. Anonymous Coward
          Anonymous Coward

          Re: Real-time surveillance is not the biggest problem. Scouring footage after the fact is.

          Uh...9/11? Nukes all over the place? If this is the safest the world has been, it's never been all that safe to begin with.

          1. Intractable Potsherd

            Re: Real-time surveillance is not the biggest problem. Scouring footage after the fact is.

            I said "the country", and was specifically referring to the UK at the time. However, taking you up on your points: the actual event of 9/11 (flying aircraft into buildings) was ultimately trivial for everyone except those directly affected (and my condolences still go to those who lost people on the day, but not more than the people who have lost others due to the idiotic "War on Terror" around the world). The response of the USA and others since then are causes for more concern, and there is no doubt that the world would be even safer if the US had taken a more reasoned response.

            With regard to "nukes everywhere", that is sheer hyperbole. Yes there are a lot, but there is no evidence anyone is going to use them in the foreseeable future. The threat of mutually assured destruction has made the world significantly safer for the last 70 years or so than it was before.

  7. Pascal Monett Silver badge

    "It is not the right way to regulate,"

    Well obviously it doesn't suit Moore, but his arguments don't really suit me.

    1) he doesn't know of any law enforcement agency that wants to use the technology for real-time surveillance

    The word missing there is "yet". He also states that we don't have the processing power, again missing the word "yet". We will get there, and the police already have a fairly extensive track record of abusing their powers and tools with gay abandon.

    2) He says that combined with improvements in the technology, we are rapidly getting to the point where within two-to-three years, the degree of accuracy in facial recognition will be in "high 90s" for all types of people

    Come back in two-to-three years then, and we'll talk about it again.

    3) it would be harder for a police officer to justify, say, stopping a black man because he thought he looked like a suspect if there was a facial recognition result that said it was only 80 per cent accurate

    I think a policeman would take a 4 out of 5 as a perfect reason for stopping the guy, with whatever consequences that may follow.

    4) the issue only got a "spotlight on it because facial recognition was in the same sentence."

    Well duh, if there hadn't been a camera, the guy wouldn't have felt the need to hide his face. Facial recog was at the very base of the problem, so yeah, it got the spotlight and rightly so.

    5) "Guns are a serious problem," he notes. "This technology is there to make better decisions."

    Sure, because FR is going to keep someone from pulling a gun. Way to go there, Moore. Let's not address the issue of guns, let's just put a band-aid over it and we can all feel all nice and fuzzy.

    6) We have turned down multiple clients where their use of the technology was not aligned with what we wanted to do.

    I am so impressed. How lucky we are to have you. Now what are you going to do about your competition? Are you going to ensure that they act with the same admirable attitude? How?

    It may be that regulation should happen at a federal level, I'm not qualified to have an opinion on that either way.

    But I'm pretty sure that, whatever the level, the regulation should give clear guidance as to where FR is acceptable, how the data should be treated, how long it can legally be stored, and what procedures should ensure that the data is properly deleted once its expiry date has passed.

    Oh, and selling the database should be a federal crime punishable by 5-10 years without parole.

  8. Anonymous Coward
    Anonymous Coward

    How do we stop facial recognition

    yeah, like, how do you stop London cops using facial recognition?

    ...

    ok, so we shot all the politicians, we also shot those who promptly stepped into their boots and, think of the children, so we shot their children and their children's children. We then shot all the other unrelated wannabe politician-zombies piling up to replace the ex-politicians, and now we've bolted the stable door. Only there's this constant, growing banging from the outside, and we've run out of bullets and it's kinda stuffy and dark in here and the superman's late again... ANY ideas?!

    ...

    Hey, I got an idea: we could all post a "stop facial recognition" message on facebook, and all the posts will have that camera icon crossed out, that should work!

  9. DontFeedTheTrolls
    Boffin

    "They want to use it as an investigative tool after the fact by scouring footage"

    Two words - scope creep

    As the technology matures the real-time use will increase and new use cases will emerge. Some of those could be very beneficial in reducing and resolving crime, but at what cost? So it does make sense to close the stable door for now and keep the horses under control.

    1. MrXavia
      Big Brother

      I can see many valid uses of facial recognition,

      probably the best use is with CCTV footage to track a suspect's movements away from a crime, or to help identify a criminal against a database of convicted criminals, basically doing what human eyeballs do now...

      But that is why we need legislation restricting its usage, especially use by the police and government, they are much less trustworthy than any private corporation.

      I think if you are innocent then you should not have biometrics stored in any police database; this includes all situations such as being arrested but released without charge, charged but with the charges dropped before trial, or charged but found not guilty at trial.

      And even if found guilty, I think you should be removed from the databases a set time after the sentence has been served.

      I even think that it is not generally in the public interest for convicted criminals details to be released to the public, I believe everyone has the right to anonymity. (the daily mail would be upset)

      The whole criminal justice system is flawed, it harms the innocent, prevents rehabilitation of criminals, and doesn't address the root causes of crime.

      1. Charles 9

        "But that is why we need legislation restricting its usage, especially use by the police and government, they are much less trustworthy than any private corporation."

        Corporations can go transnational and play sovereignty against you, so your claim is debatable.

      2. Charles 9

        "The whole criminal justice system is flawed, it harms the innocent, prevents rehabilitation of criminals, and doesn't address the root causes of crime."

        Root causes are usually human or societal factors: both of which tend to have long histories and will be difficult to solve without side effects (due to institutional, societal, and cultural momentum). Some people are simply dead-ended; dealing with dead ends is a moral quandary.

        As for rehabilitation of criminals, one must recognize when criminals don't want to be rehabilitated. For example, there's no real way to change a sociopath. That means again you're dealing with dead ends. Not only that, erring on the side of caution can result in collateral damage of its own: like the "suspects" not brought in that end up going on rampages. Feels much like a dilemma: damned if you do, damned if you don't.

        If the criminal justice system is flawed, it's because it's the product of humans, which are hopelessly flawed themselves. Thing is, no one's been able to do much better, meaning we could be staring at a least-worst system that's still unacceptable.

  10. Anonymous Coward
    Anonymous Coward

    As Private Eye points out, it is still unable to identify any of the winners of last year's Love Island/X-Factor/Britain's Got Talent.

  11. Anonymous Coward
    Anonymous Coward

    Full Disclosure

    We've seen a number of prosecutions collapse lately (in the UK) because of lack of full disclosure, where more extensive investigation has brought to light new evidence. So if someone is prosecuted on the basis of a 90%-accurate facial recognition match, but claims they were elsewhere, what is the judicial obligation to examine all other digital trails - other facial recognition systems in and out of the area, card and phone usage, CCTV, etc. - on the grounds that something in them might exculpate the accused?

    1. Anonymous Coward
      Anonymous Coward

      Re: Full Disclosure

      Sounds like an open argument: meaning one that can be presented before the courts. Can someone be compelled (say, by subpoena) to allow someone under criminal charge (and thus subject to rights allowed under the Sixth Amendment), or an agent thereof, access to potential exculpatory evidence such as you describe? Not being allowed could be construed as a violation of the Amendment's requirement of fairness and impartiality.

  12. Cynic_999

    Legal fact

    Everything you are doing routinely today could make you a serious criminal next year merely by the act of a politician signing a new piece of legislation that makes it illegal.

    People need to realize that not all criminal acts are bad or even undesirable - in some cases it can be exactly the opposite. Recently an Italian ship's captain was arrested & charged for the crime of saving people's lives ...

    1. Anonymous Coward
      Anonymous Coward

      rubbish

      Not that old chestnut again. From her activities it's more than likely she is involved in people trafficking, and on a near-industrial scale.

      1. Cynic_999

        Re: rubbish

        It is a very recent event, so could hardly be called an "old chestnut".

    2. Anonymous Coward
      Anonymous Coward

      Re: Legal fact

      "Everything you are doing routinely today could make you a serious criminal next year merely by the act of a politician signing a new piece of legislation that makes it illegal."

      But usually not retroactively. The US specifically forbids this under the Constitution's Article I, Section 9.

      "People need to realize that not all criminal acts are bad or even undesirable - in some cases it can be exactly the opposite. Recently an Italian ship's captain was arrested & charged for the crime of saving people's lives ..."

      Springing a condemned prisoner from death row may not be construed in a positive light. Everything in context.

      1. Cynic_999

        Re: Legal fact

        "

        But usually not retroactively. The US specifically forbids this under the Constitution's Article I, Section 9.

        "

        It would be outrageous if you *could* be charged for doing something that was not a crime when you did it. Although there are some things that can effectively make you a criminal for an act you carried out in the past when it was legal - if I bought a drug that was legal when I bought it but has since been made illegal (of which there are many), then I can still be charged with "possession" unless I get rid of it immediately the legislation takes effect, because "possession" is a continuous act.

        However that's not what I was meaning. If you have become accustomed to doing a certain thing (e.g. having a glass of wine with dinner), and that thing is made illegal (e.g. alcohol prohibition), then you will be inclined to ignore the law and become a criminal. Or if the thing that is made illegal is extremely popular but very difficult to police (e.g. make it illegal to have sex with a person who has consumed alcohol, which is a possibility given the direction we are heading), then many people will become criminals and thus become vulnerable to being imprisoned should the authorities (or a spiteful ex-partner) wish to silence or punish them. Which is the case in China, for example, where certain popular Internet activities are illegal but not usually policed, making a high percentage of the population frightened to "make waves" lest they draw the authorities' attention to their illegal activities.

        1. Intractable Potsherd

          Re: Legal fact

          Excellent points, Cynic. Too many people forget these things.
