Facewatch, a major biometric security company in the UK, is in hot water after its facial recognition system caused a major snafu: it wrongly identified a 19-year-old girl as a shoplifter.

  • CeeBee@lemmy.world · 1 month ago (+6/-4)

    What I’m saying is that we had a deployment in a large facility. It was a partnership with the org that owned the facility, which let us use their location as a real-world testing area. We’re talking about multiple buildings, multiple locations, and thousands of people (all aware that the system was being used).

    Two of the employees were twins. It wasn’t planned, but it did give us a chance to see if twins were a weak point.

    That’s all I’m saying. It’s mostly anecdotal, as I can’t share details or numbers.

    • boatswain@infosec.pub · 1 month ago (+9/-2)

      Two of the employees were twins. It wasn’t planned, but it did give us a chance to see if twins were a weak point.

      No, it gave you a chance to see if that particular set of twins was a weak point.
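      One way to put a number on this objection: with n independent trials and zero observed failures, the "rule of three" gives an approximate 95% upper bound of 3/n on the true failure rate. A single twin pair is n = 1, which bounds almost nothing. A quick illustrative sketch (the trial counts are arbitrary, chosen only to show how the bound tightens):

      ```python
      # Rule of three: 0 failures in n independent trials gives an
      # approximate 95% upper bound of 3/n on the true failure rate.
      for n in (1, 10, 100, 1000):
          print(f"n={n:5d} trials, 0 failures -> failure-rate upper bound ~ {3 / n:.3f}")
      ```

      At n = 1 the bound is 3.0, i.e., vacuous; one pair of twins passing cleanly is consistent with almost any underlying error rate.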

      • CeeBee@lemmy.world · 1 month ago (+4/-1)

        By that logic, we would need to test the system on every living person to see where it fails.

        The system had been tested ad nauseam in a variety of scenarios (including with twins and every other combination you can think of, and many you can’t). In this particular situation, a real-world test in a large facility with many hundreds of cameras everywhere, there happened to be twins.

        It’s a strong data point regardless of your opinion. If it were the only one, then you’d have a point. But like I said, it was an anecdotal example.
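
        For context on why twins are the stress test of choice here: most modern face recognition systems reduce each face to an embedding vector and declare a match when a similarity score clears a threshold. Below is a minimal sketch of that decision, with entirely synthetic vectors, a hypothetical 0.6 cosine cutoff, and a twin modeled crudely as a small perturbation of the enrolled embedding; none of this reflects CeeBee’s actual system.

        ```python
        # Toy illustration of the match decision in embedding-based face
        # recognition. Everything is synthetic: random vectors stand in for
        # learned embeddings, the 0.6 threshold is a made-up cutoff, and a
        # twin is modeled as the enrolled embedding plus small noise.
        import numpy as np

        rng = np.random.default_rng(seed=42)

        def cosine(a, b):
            """Cosine similarity between two embedding vectors."""
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

        DIM = 128        # typical face-embedding dimensionality
        THRESHOLD = 0.6  # hypothetical accept/reject cutoff

        enrolled = rng.normal(size=DIM)                     # person in the gallery
        twin = enrolled + rng.normal(scale=0.15, size=DIM)  # near-identical face
        stranger = rng.normal(size=DIM)                     # unrelated person

        for name, probe in (("twin", twin), ("stranger", stranger)):
            score = cosine(enrolled, probe)
            verdict = "MATCH" if score >= THRESHOLD else "no match"
            print(f"{name:8s} score={score:+.3f} -> {verdict}")
        ```

        In this toy picture the twin’s embedding sails past the threshold while the stranger’s does not; whether real twin embeddings land close enough to cross a deployed cutoff is exactly the kind of question the field testing discussed above is meant to answer.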