Facewatch, a major biometric security company in the UK, is in hot water after its facial recognition system wrongly identified a 19-year-old woman as a shoplifter.

    • Thassodar@lemm.ee · 12 points · 4 months ago

      Shit, even the motion sensors on automated sinks have trouble recognizing dark-skinned people! You have to show your palm to turn the water on most of the time!

    • nyan@lemmy.cafe · 8 points · 4 months ago

      Technically, there’s a tendency for them to be trained on datasets that don’t include nearly enough dark-skinned people. As a result, they don’t learn to make the necessary distinctions. I’d like to think that the selection of datasets for training facial recognition AI has improved since the most egregious cases of that. I’m not willing to bet on it, though.
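
      A toy sketch of that training-imbalance point (synthetic embeddings, not a real face-recognition pipeline; the group names, dimensions, and sample counts are all made up for illustration). Each identity is "enrolled" as the mean of its training samples, so the underrepresented group gets a much noisier template and is misidentified more often, even though the matcher itself treats everyone identically:

      ```python
      import numpy as np

      rng = np.random.default_rng(0)
      DIM = 64  # embedding dimension (arbitrary)
      TRAIN_N = {"well_sampled": 1000, "undersampled": 5}  # training imbalance

      # Two identities per group; each identity is a random point in embedding space.
      centers = {g: [rng.normal(scale=0.5, size=DIM) for _ in range(2)]
                 for g in TRAIN_N}

      def samples(center, n):
          # Observations of one identity: its center plus unit Gaussian noise.
          return center + rng.normal(size=(n, DIM))

      # Enroll each identity as the mean of its (possibly few) training samples.
      templates = {(g, i): samples(c, TRAIN_N[g]).mean(axis=0)
                   for g, cs in centers.items() for i, c in enumerate(cs)}

      # Evaluate with equal-sized test sets: classify by nearest template.
      for g, cs in centers.items():
          errors = 0
          for i, c in enumerate(cs):
              for x in samples(c, 500):
                  pred = min(templates, key=lambda k: np.linalg.norm(x - templates[k]))
                  errors += pred != (g, i)
          print(f"{g}: identification error rate {errors / 1000:.3f}")
      ```

      Running this, the undersampled group's error rate comes out far higher than the well-sampled group's, purely because of how much training data went into each template.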

    • CeeBee@lemmy.world · 6 up / 30 down · 4 months ago

      No, they aren’t. This is a narrative that keeps getting repeated over and over. The usual citation for it is the ACLU’s test of Amazon’s Rekognition system, which was deliberately set up to produce exactly this outcome, and people are still repeating the result years later.

      The top FR systems have no issues with any skin tones or complexions.
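
      A sketch of the threshold side of that ACLU argument (synthetic score distributions that model no real system; the 80% vs 99% values echo the confidence settings commonly discussed around the Rekognition test, which reportedly used the default 80% threshold rather than the 99% Amazon recommended for law enforcement). Lowering the match threshold lets impostor comparisons cross it, inflating false matches without the underlying model changing at all:

      ```python
      import numpy as np

      rng = np.random.default_rng(1)
      # Assumed score distributions: genuine (same-person) pairs score higher
      # on average than impostor (different-person) pairs.
      genuine = rng.normal(loc=0.995, scale=0.005, size=100_000)
      impostor = rng.normal(loc=0.70, scale=0.08, size=100_000)

      for threshold in (0.80, 0.99):
          fmr = (impostor >= threshold).mean()   # false match rate
          fnmr = (genuine < threshold).mean()    # false non-match rate
          print(f"threshold {threshold:.2f}: "
                f"false matches {fmr:.2%}, missed matches {fnmr:.2%}")
      ```

      With these made-up distributions, dropping the threshold from 0.99 to 0.80 multiplies the false match rate by several hundred times, which is the kind of effect the comment attributes to the test setup.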