We listened to more than 3 hours of US Congress testimony on facial recognition so you didn't have to go through it

AI experts, lawyers, and law enforcement urged US Congress to regulate the use of facial recognition technology during a hearing held by the House Committee on Oversight and Reform on Wednesday. The technical issues and social impacts of using AI software to analyse images or videos are well known. There have been repeated …

  1. JeffyPoooh
    Pint

    File under: No Sh1t Einstein

    "AI systems perform worse on data that they haven’t seen before."

    1. JeffyPoooh
      Pint

      Re: File under: No Sh1t Einstein

      ...and that's why, given that A.I. is famously 'hard', A.I. outdoors is even harder.

      'Hard' in the I.T. sense of nearly impossible.

      Outdoors is where you'll more often see things that you've not seen before. There's no training data set for everything outdoors. It's different every day.

      "A.I. is hard."

      "A.I. outdoors is even harder."

      These observations can be chiseled into the granite walls of I.T. campuses everywhere.

      The hopelessly naive Self-Driving fanboi crowd will eventually relearn this.

    2. pavel.petrman

      Re: File under: No Sh1t Einstein

      In my office we have a glossary, in which the term AI stands for "hyped up limited state machine".

    3. The Indomitable Gall

      Re: File under: No Sh1t Einstein

      Said like that it's obvious, but the limited public understanding of AI and machine learning means it's crucially important that it is said.

  2. Anonymous Coward
    Anonymous Coward

    What's the issue?

    False Positives aren't a big deal. Innocent people will be occasionally dragged out of a restaurant, threatened with guns, violently handcuffed, thrown in jail for days or weeks, but then they'll eventually be released without charge (usually).

    As long as they're not actually convicted, then there's really zero harm at all. Right?

    /sarcasm alert

    1. jmch Silver badge

      Re: What's the issue?

      "False Positives aren't a big deal"

      You jest of course, but this is the crux of a problem that many people seem to struggle with. A 100% success rate is impossible, so either the system is calibrated to catch as many people as possible (but will create lots of false positives), or the system is calibrated to be as sure as possible of its results (resulting in many false negatives).
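That calibration trade-off can be sketched with a toy score threshold (all scores below are made up for illustration; real matchers emit similarity scores on their own scales):

```python
# Toy illustration: a matcher emits a similarity score per comparison,
# and the operator picks a decision threshold. Lowering the threshold
# catches more true matches but drags in more innocent people.
impostor_scores = [0.10, 0.25, 0.40, 0.55, 0.70]   # non-matches (innocent)
genuine_scores  = [0.45, 0.60, 0.75, 0.85, 0.95]   # true matches

def error_counts(threshold):
    """Count errors at a given decision threshold."""
    false_positives = sum(s >= threshold for s in impostor_scores)
    false_negatives = sum(s < threshold for s in genuine_scores)
    return false_positives, false_negatives

for t in (0.3, 0.5, 0.8):
    fp, fn = error_counts(t)
    print(f"threshold={t}: {fp} false positives, {fn} false negatives")
```

Sweeping the threshold only moves errors from one column to the other; it never eliminates both.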

      Not sure if I remember correctly, but wasn't it one of the US founding fathers who said something like (paraphrasing generously here) "better 10 guilty men go free than 1 innocent man be jailed"?

      1. BebopWeBop

        Re: What's the issue?

        I think it was a British legal scholar, Henry Blackstone (late 18th century), but I am happy to be corrected. It is known as 'Blackstone's Ratio'. It was frequently cited by the founding fathers though.

        1. Michael Wojcik Silver badge

          Re: What's the issue?

          William Blackstone, in the case of Blackstone's Ratio. You're right about British and 18th century, though.

          Henry Blackstone (the 18th century British legal scholar) was William Blackstone's nephew. It was William who wrote Commentaries on the Laws of England and coined the Ratio. He formulated it as "It is better that ten guilty persons escape than that one innocent suffer".

          Benjamin Franklin cranked it up by an order of magnitude, to 100 guilty : 1 innocent. But Franklin liked to be hyperbolic.

      2. Michael Wojcik Silver badge

        Re: What's the issue?

        either the system is calibrated to catch as many people as possible (but will create lots of false positives), or the system is calibrated to be as sure as possible of its results (resulting in many false negatives)

        I'm no fan of facial recognition systems, but I have to point out that the binary you present is not accurate.

        In machine learning, researchers generally talk about "precision" (the fraction of returned results that are actually correct, i.e. 1 minus the false-discovery rate) and "recall" (the fraction of the desired results that were actually returned, i.e. 1 minus the false-negative rate).

        Those are often combined in a single metric, the F-measure. F1 weighs precision and recall equally; for other values of the β-parameter (the subscript) the F-measure returns a metric weighted by assigning β-times as much importance to recall as to precision. (So β > 1 values recall over precision - you're interested in more results, even if more of them may be wrong - and 0 < β < 1 values precision over recall, preferring fewer but more accurate results.)

        With most classifying systems, you can decide what β value you believe best matches your goals (possibly by using a held-aside corpus or other training set and an optimization process). Then you can tune the parameters of your system to maximize that Fβ, again using an optimization process such as Expectation Maximization (assuming your system's parameterization is convex; there are more complex training approaches for non-convex models).

        But while the situation is more nuanced than a "prefer false positives" / "prefer false negatives" toggle, you and other commentators are of course correct that these systems will always be imperfect and hampered by biases in their training data, among other problems.

  3. Mark 85

    The hearing took place at the same time as Amazon shareholders tried to stop Rekognition being sold to law enforcement. The proposal was defeated, but the vote tallies were not immediately disclosed

    Would anyone expect otherwise? Corporates aren't about ethics but profits. AI still has a long way to go and it's really not intelligence but pattern matching. But it has a catchy buzz phrase. <sigh>

    1. big_D Silver badge
      Coat

      And this is why the current definition of capitalism just doesn't work.

      It cares about profit. It doesn't care about longevity, people, the environment. A new definition of how well a company is doing needs to be implemented, one that takes into account the harm the company does and subtracts that from its profits for shareholder value...

      But as long as the rich keep holding the majority stakes in companies, this probably won't happen. Monetary wealth is just so much more important than being able to breathe or being free...

      1. imanidiot Silver badge

        According to pure capitalist doctrine, companies SHOULDN'T care about "longevity, people, the environment". In pure capitalism, if the people/consumers actually cared about those things they wouldn't buy the product. In reality people don't give a shit. And THAT unfortunately is why we can't have nice things.

        1. Anonymous Coward
          Anonymous Coward

          "According to pure capitalist doctrine, companies SHOULDN'T care about "longevity, people, the environment". In pure capitalism, if the people/consumers actually cared about those things they wouldn't buy the product. In reality people don't give a shit. And THAT unfortunately is why we can't have nice things."

          In Capitalism as defined historically, companies would care because the people who purchased the item lived in the same place as the manufacturer. If the manufacturer acts malevolently, the people can stop purchasing and put the company out of business. The problem is that the folks who defined capitalism never imagined globalism... that you can manufacture and pollute and basically just shiite on all those around you because THEY aren't your customers... some guy halfway across the world is. And he doesn't care. And that is the problem:

          Capitalism does not work very well any more because products are not made where they are sold. The consequences of doing business (pollution, low wages, etc.) no longer matter because businesses aren't relying on the people they're screwing to purchase the product. Business can be as evil as it wants and suffer no consequences on the sell side. Thus, Beijing and the Chinese countryside are blanketed by air pollution, and no one in the UK or US feels the brunt of that pollution enough to stop buying their products (even if they could).

          When manufacturing location and destination markets are divorced, Capitalism has no incentive to play nice. Pure capitalist doctrine works locally, fails globally.

        2. Crazy Operations Guy

          The problem is also that a lot of people give a shit, but lose their enthusiasm when they realize that companies are being controlled by shareholders that don't care about the company at all beyond the few bucks they can make in the short term. The biggest problem in our modern form of capitalism is that companies are being controlled more and more by short-term traders that care little about the future of the company and only care about short-term gains so they can pump the stock then dump it for a few bucks of profit. The way everything is set up, it encourages shareholders to buy into a company and force shoddy, but heavily-hyped, products onto the market. Then, before the flaws in the product are noticed by the public, the shareholders have ridden off into the sunset with their piles of cash, heading off to ruin another company.

          For the last few years, I've been in favor of moving away from a stock market system and replacing it with a corporate bond market. At least then you'll get investors that give at least a fraction of a shit about what happens with the company 5, 10 years in the future.

      2. Doctor Syntax Silver badge

        "A new definition of how well a company is doing needs to be implemented that takes into account the harm the company does and subtract that from its profits for shareholder value."

        Even under the old-fashioned definition, successful suits from those false positives, or fines for breaking privacy laws, can subtract from those profits. And on the subject of the latter, if any EU citizens are included in the training data I'd expect that to raise GDPR issues.

        1. big_D Silver badge

          All training/test data has to be anonymized.

          1. SteveMcG

            How do you anonymise a face?

            1. big_D Silver badge

              For such training data, it falls under the same GDPR categories as other personal data. You need the written consent of the owner of the face in order to use it.

              Test databases (i.e. not production) need to have all data anonymized.

        2. Anonymous Coward
          Anonymous Coward

          The big problem is that even if someone was acquitted due to a false positive, their life is still in shambles. Some years ago I was running to get to a meeting but, unbeknownst to me, I was also running in the same direction as a suspect in a murder. I was held for 3 days until they figured out I wasn't their guy (I was arrested on Friday; they couldn't get an eyewitness in until Monday evening), but in that time I lost my job, everyone who knew me found out I was picked up as a murder suspect, and since the police never picked up anyone else for it, I got saddled with the stigma of being the sole suspect in a murder. Every time a prospective employer did a background check, they'd find my arrest and would refuse to hire me even though I was cleared of all suspicion. A lot of background check companies won't bother updating their databases, so even if the police expunge your record, it's still there in these companies' databases, and getting them to fix their mistake is a Kafkaesque / Sisyphean nightmare.

          Several years of my life out the window because I ran from where I took lunch to the office. Wasn't even an important meeting anyway.

  4. P. Lee

    Wrong Problem

    The problem is not that it works badly.

    The problem is what it enables if it works well.

    Smile for the nice Chinese social credit score system coming to a place near you.

    You will smile for it, or you won't be able to get home, or a home or online, or log into a computer or phone. Get caught signing "ok" to a friend and get fired, lose your banking etc.

    You think people in the West don't want this? Take another look at twitter.

    1. Anonymous Coward
      Anonymous Coward

      Re: The Big Brother is watching.

      So smile!

    2. vtcodger Silver badge

      Re: Wrong Problem

      Take another look at twitter.

      Why would I do that? I took a couple of looks at twitter when it first started and concluded that twitter is for twits. Nothing that I've seen in the decade since has caused me to question that conclusion.

      1. Anonymous Coward
        Anonymous Coward

        Re: Wrong Problem

        "Take another look at twitter."

        "Why would I do that? I took a couple of looks at twitter when it first started and concluded that twitter is for twits. Nothing that I've seen in the decade since has caused me to question that conclusion."

        But I am sure that an awful lot of the 'Training Data' generated over that decade has reinforced the correctness of your initial conclusion to somewhere in the 99.999% region !!! ;) :)

    3. Graham Cobb Silver badge

      Re: Wrong Problem

      This is my concern as well.

      I know policing is a hard job. And I have respect for the people who take it on. But I strongly believe, unfortunately, that it has to remain at least as hard.

      In order to keep society free, it is important that "the establishment" (of whatever type, colour, view, etc) cannot control the population. Government must be by consent.

      That means that campaigners, researchers, journalists (formal and informal), radicals, politicians (formal and informal) and demonstrators must all be protected from identification (at least if they wish to be). Realistically that means it must be too hard/expensive to collect and identify them.

      The most effective way, over the years, to achieve that seems to be to limit the resources of the tools (particularly, police, military and secret services) the establishment can bring to bear so that they will, at least, prioritise and be selective.

      Yes, sorry, that means some small criminals will get away. That is what we mean when we say that liberty means giving up some safety.

      1. Anonymous Coward
        Anonymous Coward

        Re: Wrong Problem

        "Yes, sorry, that means some small criminals will get away. That is what we mean when we say that liberty means giving up some safety."

        But if a small criminal can get away, so can a big one as there's no way to tell the difference until after the fact, by which point many people (including your loved ones) could well be horrifically dead. Then what?

        1. Graham Cobb Silver badge

          Re: Wrong Problem

          No. The point is that if resources are restricted then priority calls have to be made. Serious crimes (including murder) will still get investigated.

          1. Anonymous Coward
            Anonymous Coward

            Re: Wrong Problem

            Which only get investigated after the fact, by which time it's too damn late. How many mass murderers were on investigatory watchlists and still raised hell and got families to chew out governments for not doing enough, eh?

      2. ParksAndWildlife

        Re: Wrong Problem

        The solution is not to protect anyone - including the police, government officials, and everyone else. Surveillance data collected by government is public property and should be available to everyone. In the United States of America, this falls under the equal protection amendment. If the police can track members of the public, the members of the public should be able to track them and their political masters. After all, if they have nothing to hide....

  5. Updraft102

    Racist indeed.

    I am not sure how the software expressed the opinion that some races are superior to others, but I can tell you that as a white male, I am highly offended that it treats me worse than a black female.

  6. Anonymous Coward
    Anonymous Coward

    "Long story short: Models are ineffective, racist, dumb..."

    So a perfect representation of American society...

    1. Anonymous Coward
      Anonymous Coward

      "Long story short: Models are ineffective, racist, dumb..."

      So a perfect representation of American society...

      While you might have got away with that 5 years ago, the entirety of Europe has shown itself to be a racist cesspool. Hell, that is THE primary reason for BREXIT. Pot, meet kettle...

  7. Bronek Kozicki

    " It’s simply absurd for elected politicians to be wanted criminals."

    It bloody is NOT absurd at all.

    1. jmch Silver badge
      Thumb Up

      Re: " It’s simply absurd for elected politicians to be wanted criminals."

      I saw that line and jumped straight to the comments to vent, half-sure that someone would already have singled that line out for the bit of bullshit it is

      1. Loatesy

        Re: " It’s simply absurd for elected politicians to be wanted criminals."

        me too, though I suspect the author was being extreeeemly sarcastic!

    2. Rich 11

      Re: " It’s simply absurd for elected politicians to be wanted criminals."

      You may have missed a little sarcasm there.

    3. BebopWeBop

      Re: " It’s simply absurd for elected politicians to be wanted criminals."

      See Pterry's commentary on the land of XXXX where, as sensible people who have learnt from some of their mistakes, they lock up politicians on election. [The Forgotten Continent]

      1. Evil Scot

        Re: " It’s simply absurd for elected politicians to be wanted criminals."

        "The Last Continent" in some markets.

        See also Sorcerer's Stone and Midnight Riot.

        1. Michael Wojcik Silver badge

          Re: " It’s simply absurd for elected politicians to be wanted criminals."

          I don't recall any particular political commentary in Midnight Riot. The first Rivers of London novel, yes? Am I forgetting something?

          (I admit I had mixed feelings about MR. Not bad, but why did it take so long for Peter to make the association to a certain well-known fictional character? I've never seen that sort of performance in the flesh, as it were, but I recognized it on the first description. I'm always annoyed when I figure out aspects of a mystery or crime drama before the detectives.)

  8. BebopWeBop
    Devil

    Are you sure

    showed Amazon Rekognition incorrectly matched members of the US Congress to criminal mugshots

    Was this actually a mistake?

    1. lglethal Silver badge
      Go

      Re: Are you sure

      Only the crime that they're matched up with. No way would a Politician even get out of bed for less than a few million. Most of the crims in the Police database are guilty of stealing a lot less than that...

      1. BebopWeBop

        Re: Are you sure

        Upvote, but I should point out that the number of multi-millionaires grows by the day.........

  9. Doctor Syntax Silver badge

    Michael Punke ... said at the time it has “not received a single report of misuse by law enforcement.” It’s difficult to verify that claim, however, considering that the police haven’t been transparent about how it’s used.

    More to the point, is there any definition of what legitimate use and, by implication, misuse would be? Without that it would be impossible to make such a report.

  10. Kevin Johnston

    Legal Options?

    There is an option here which would be simple and also allow properly managed deployment of systems like this.

    You simply put into law that biometric storage/analysis systems are unlawful EXCEPT when specifically permitted and only under the terms specifically listed. This prevents scope-creep while providing a route for these systems to go beyond the labs.

  11. Earth Resident

    That's not all

    While they are at it, San Francisco and other American cities should ban the use of Stingray devices by their police to hijack mobile phone connections in order to tap conversation and geo-locate the owner.

    1. JeffyPoooh
      Pint

      Re: That's not all

      "...Stingray devices by their police to hijack mobile phone connections..."

      As a public service, perhaps the police could do us a favor by connecting their Stingray devices directly to a local fiber optic connection. At least we'd get much better data connections to the internet while we are being tracked and surveilled.

  12. Hans 1

    @elReg

    Long story short: AI is ineffective, racist, dumb...

    #TFTFY

  13. Phage

    Terrifying

    Did you see the objector who had his photo taken by force and given a fine for non-compliance ? In a policy vacuum, terrible things rush in to fill the space.

    https://www.bbc.co.uk/news/av/technology-48228677/could-facial-recognition-cut-crime

  14. John Smith 19 Gold badge
    WTF?

    TL;DR. GIGO

    I think that summarizes how well this BS works IRL.

  15. jelabarre59

    Wanted or unwanted?

    Amazon Rekognition incorrectly matched members of the US Congress to criminal mugshots,...

    No, those were probably correct matches.

    It’s simply absurd for elected politicians to be wanted criminals.

    Well at least that's correct. Politicians are *UN-wanted* criminals.

  16. Randy Hudson

    Misread that

    At first I read that as:

    “Feces may be the final frontier of privacy”

    Is nothing sacred?
