Uncle Sam stonewalls probe into its secretive airport facial-recognition technology. Now the ACLU is suing

The American Civil Liberties Union (ACLU) is suing Uncle Sam's Homeland Security and multiple other government agencies, claiming the g-men have stonewalled over what they are doing with people's faces scanned at US airports. The civil-rights warriors hope to extract the information from the organizations via the courts. In December, Homeland …

  1. Local Laddie

    Facial Recog..Same old, same old...

    Seeing how Uncle Sam has got away with this for so long kinda explains why the same is happening in UK airports. Liverpool Airport has very obvious facial-recognition cameras placed just after security and throughout the terminal, but when asked, no one in security (police included) seems able to explain what happens with the images taken.

  2. Pascal Monett Silver badge

    "analyses of the effectiveness of facial recognition technology"

    We have one already, and the effectiveness is 12.5%.

    I doubt US airports are doing any better.

  3. VibhorTyagi

    Uncle Sam able to engineer AI facial tracking software but...

    The worst part about dealing with algorithms is their affinity for unpredictability. Researchers engineered AI sorting software back in 2017, and it randomly discriminated against women, almost on a whim. It also showed a bias against people of color in favor of their much fairer counterparts.

    https://www.theguardian.com/inequality/2017/aug/08/rise-of-the-racist-robots-how-ai-is-learning-all-our-worst-impulses

    Yet ICE, the TSA, and CBP are allowed to engineer AI trackers for their own uses (which, as mentioned, produce almost 80% false positives).

    ~Vibhor Tyagi (Techie at Engineer.AI)

    1. Anonymous Coward
      Anonymous Coward

      Re: ... this AI randomly discriminated ...

      It's never been quite clear to me ... is it better that (a) the automated surveillance system is good at recognizing you, or (b) that it's bad at recognizing you? As someone who objects to all this unnecessary and intrusive tracking, I might prefer to be in category (b) ... or maybe only until it misidentifies me as Osama Bin Laden or whoever. But since I'm not actually OBL, is the occasional inconvenience of producing hardcopy ID really worse than being accurately tracked everywhere for no good reason?

      It's a bit of a conundrum.

      1. Danny Boyd

        Re: ... this AI randomly discriminated ...

        [The problem with using FR for law enforcement is,] it is bad when the AI system recognizes you as somebody else [the cops are after].
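
    To put some shape on the numbers being thrown around in this thread, here is a minimal sketch in Python. Every figure in it is made up for illustration; none of them come from the article, the ACLU, or any agency. The point it shows is that when a watchlist covers only a tiny fraction of travellers, even a fairly accurate matcher produces mostly false alerts.

        # Hypothetical numbers, purely for illustration of the base-rate effect.
        travellers = 100_000            # people walking past the cameras
        on_watchlist = 10               # travellers actually being looked for
        true_positive_rate = 0.99       # chance a wanted face is flagged
        false_positive_rate = 0.01      # chance an innocent face is flagged

        true_alerts = on_watchlist * true_positive_rate                    # ~10
        false_alerts = (travellers - on_watchlist) * false_positive_rate   # ~1,000

        share_false = false_alerts / (true_alerts + false_alerts)
        print(f"False alerts as a share of all alerts: {share_false:.0%}")  # ~99%

    Under those invented assumptions, roughly 99 per cent of alerts point at the wrong person, which is the same base-rate effect behind claims like "almost 80% false positives", whatever the exact real-world figures turn out to be.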

  4. Jimmy2Cows Silver badge
    FAIL

    We won't scan everyone. Pinky promise!

    In December, Homeland Security dropped plans to scan the faces of everyone traveling through US airports, and instead focused on identifying anyone not a US citizen or permanent resident.

    And how do you make that determination without scanning everyone?

    1. This post has been deleted by its author

    2. onemark03

      Re: We won't scan everyone. Pinky promise!

      @ jimmy2cows:

      "And how do you make that determination without scanning everyone?"

      Er, by checking passports first, perhaps?

      1. John Brown (no body) Silver badge

        Re: We won't scan everyone. Pinky promise!

        So the camera turns off when it detects a passport of a US citizen or permanent resident approaching? Or are these "special" cameras only located in highly specific locations so that they can only scan the relevant people through one access point while everyone else gets funneled through the non-camera aisle?
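
    For what it's worth, the logic being argued over above can be written down in a few lines. This is a minimal sketch with entirely hypothetical function names and rules, not a description of how CBP's system actually works; its only point is that the exemption can be applied only after a traveller's document, and usually their face, has already been captured.

        def run_facial_recognition(face_image: bytes) -> None:
            """Stand-in for whatever matching the agency actually performs."""
            print(f"matching {len(face_image)} bytes of image data against a gallery")

        def handle_traveller(nationality: str, permanent_resident: bool,
                             face_image: bytes) -> None:
            # The exemption can only be decided once the document has been read and
            # the traveller is standing in front of the camera -- John Brown's point.
            if nationality == "USA" or permanent_resident:
                return                                  # exempt: image (in theory) discarded
            run_facial_recognition(face_image)          # everyone else gets matched

        handle_traveller("GBR", False, b"\x00" * 1024)  # visitor: scanned
        handle_traveller("USA", False, b"\x00" * 1024)  # citizen: exempt, after the fact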
