
America's top maker of cop body cameras says facial-recog AI isn't safe

martinusher Silver badge

It's the legal implications.

Making a body camera is a trivial exercise these days. Making a system that can collect evidence-quality video and audio, and manage it through a chain of custody, is a whole different game. These companies sell systems; that's how they can justify selling the individual units at silly prices.

They're also aware of the legal implications of getting things wrong. No facial-recognition system is going to be 100% perfect, so it will make mistakes, and those mistakes could cause legal blowback, including a liability issue for the manufacturer. Hence the need to be super-cautious. However, in real life these systems only need to work as well as, or preferably somewhat better than, human recognition to be useful, because they're not going to be used instead of humans but as an assistant.
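To put rough numbers on why "not 100% perfect" matters, here's a minimal Python sketch of the base-rate arithmetic. Every figure in it (faces scanned, watchlist rate, error rates) is a hypothetical assumption for illustration only, not anything from the article or a vendor:

```python
# Back-of-the-envelope sketch (hypothetical numbers): even a matcher that is
# right 99% of the time produces mostly false alerts when genuine matches are rare.

def expected_alerts(population, watchlist_rate, true_positive_rate, false_positive_rate):
    """Return (true alerts, false alerts) for a given number of faces scanned."""
    actual_matches = population * watchlist_rate
    non_matches = population - actual_matches
    true_alerts = actual_matches * true_positive_rate
    false_alerts = non_matches * false_positive_rate
    return true_alerts, false_alerts

if __name__ == "__main__":
    # Assume 100,000 faces scanned, 1 in 10,000 genuinely on a watchlist,
    # and a 99% accurate system either way.
    true_alerts, false_alerts = expected_alerts(
        population=100_000,
        watchlist_rate=1 / 10_000,
        true_positive_rate=0.99,
        false_positive_rate=0.01,
    )
    print(f"True alerts:  {true_alerts:.1f}")   # ~9.9
    print(f"False alerts: {false_alerts:.1f}")  # ~999.9 -- each one a potential legal headache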

There's a lot more to this technology than just matching a face to a name. If you arrive at Los Angeles airport at the international terminal and you're a US passport holder, you'll be processed automatically by a kiosk. Image processing is also being used to identify people who are acting oddly or looking stressed (or rather, "more stressed than normal"). This stuff is intrusive and potentially a civil-liberties killer, but then it's really only ANPR (automatic number plate recognition) for people, and everyone in the UK has been living with ANPR and its consequences for years, haven't they?
