• ISOmorph@feddit.de

    I just rewatched Minority Report a couple weeks ago for some reason. There was this scene where Tom Cruise had to get a new set of eyes because he got automatically identified everywhere. Everyone in the movie was constantly bombarded with personalized ads everywhere they went… That movie is over 20 years old and they knew exactly what was coming. Fucking scary shit…

    • qooqie@lemmy.world

      They already have systems in some airports that scan you and show your boarding info on screens. The screens are designed to be visible only from the angle you’re standing at, so it’s not too far a leap to imagine future ads combining facial recognition with this technology.

  • ShakeThatYam@lemmy.world

    I was just at an airport that was doing this. I entered my ID into a slot and they scanned my face. There was a sign that said that the face scan would be immediately deleted, but I don’t believe it. I didn’t really want to be face scanned but I also didn’t want to make a scene.

  • ApeNo1@lemm.ee

    I won’t comment on the ethical pros and cons of this being deployed in airports, but from a systems perspective the accuracy needs to be much higher than 97%. LAX processes about 240,000 travellers a day, so a 3% error rate translates to over 7,000 travellers a day being incorrectly processed. What you want is closer to 99.9%, so the errors number in the hundreds and can reasonably be corrected with human intervention. This may sound like an easy push, but anyone experienced in training AI/ML systems knows it’s still a fair bit of work; every single percent increase in accuracy is significant.
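
    A rough back-of-the-envelope sketch of that arithmetic (the 240,000/day figure is the one cited above, not an official statistic):

```python
def errors_per_day(travellers: int, accuracy: float) -> int:
    """Travellers incorrectly processed per day at a given match accuracy."""
    return round(travellers * (1 - accuracy))

lax_daily = 240_000  # approximate daily traveller count cited above

print(errors_per_day(lax_daily, 0.97))   # 3% error rate -> 7200 per day
print(errors_per_day(lax_daily, 0.999))  # 0.1% error rate -> 240 per day
```

    At 97% accuracy the daily error count is far beyond what human agents could plausibly re-check; at 99.9% it drops into the low hundreds.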

    • cyd@lemmy.world

      I imagine the vast majority of those 3% error cases get rerouted to a human border official for handling. This is basically a sanity check, and sounds reasonable. The use of AI in the first instance shouldn’t be making things worse, since AI is already superior to humans at facial recognition. I wouldn’t be surprised if normal border officials have a significantly higher than 3% error rate in face matching.

      • Riskable@programming.dev

        rerouted to a human border official for handling.

        No, the person who was misidentified will be routed to a human TSA agent for harassment. Every single time they fly.

      • mightyfoolish@lemmy.world

        Will this be like the “random” checks if your complexion is olive or darker or if your name seems kind of funny?

      • ApeNo1@lemm.ee

        100% this will already be better than humans, but, as with autonomous driving, the goal should be better-than-human performance; otherwise we see vendors doing just enough to save costs or make sales. I would hope they run this in parallel, with the system flagging anything below a confidence threshold for human scrutiny and comparison. Analysing the human decisions alongside the AI decisions would help refine the models and also give some visibility into the actual accuracy of human-only checks. This training and review aspect is a lot of work.
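
        The parallel-review idea could be sketched like this (the names and the 0.999 threshold are illustrative assumptions, not any real TSA system):

```python
THRESHOLD = 0.999  # assumed confidence cutoff; not an official figure

# Log every decision so human and AI outcomes can be compared later
audit_log: list[tuple[float, str]] = []

def process(match_confidence: float) -> str:
    """Accept high-confidence matches automatically; flag the rest for a human."""
    decision = "automated" if match_confidence >= THRESHOLD else "human_review"
    audit_log.append((match_confidence, decision))
    return decision

process(0.9995)  # -> "automated"
process(0.42)    # -> "human_review"
```

        The audit log is the key piece: reviewing where humans and the model disagree is what lets you refine the model and measure the human baseline at the same time.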

    • Riskable@programming.dev

      I also want to know the statistics in regards to people in makeup. With a bit of makeup I bet you could get this system to think you’re whoever’s photo is on your ID.

      Camera-based systems are usually quite easy to fool so it could result in a seriously false sense of security.

      • Piecemakers@lemmy.world

        A “false sense of security” has always been TSA’s mission statement, so that’s on-brand AF.

  • Gr8fulZach@lemmy.world

    Came in on an international flight from the UK to US recently and at US CBP they barely said “hello,” didn’t even ask for passports, declaration form or anything. Had all four of us in my family get a quick pic on the webcam and pushed us through within about 30 seconds. It was nice to keep the line moving but a little freaky that he didn’t even ask to see our actual passports.

    • Cryst@lemmy.ca

      I dont know why, but a lot of these folks have a stick up their ass. Maybe they need therapy.

    • Riskable@programming.dev

      Casinos have completely different goals for facial recognition technology. They’re looking for specific people: card counters, pickpockets, fraudsters, etc. If someone matches, they’ll have a security person do a real-world double-check, which only requires knowing the person’s profile (height and estimated weight, tattoos, typical behaviors, and other distinguishing features).

      The TSA is looking for anything suspicious. And “suspicious” has an extremely broad meaning when it comes to a poorly-paid person who would be over the moon if they actually caught a real criminal/terrorist instead of just spending day after day being a professional annoyer of completely normal people.

      If the facial recognition system thinks a person doesn’t match their picture ID that doesn’t mean they’re a criminal, a terrorist, or any problem whatsoever. It just means that the system disagrees with their appearance. Is that enough to warrant harassing someone? Making them miss their flight? Invading their privacy and permanently recording all sorts of information about them in the TSA’s databases?

  • betterdeadthanreddit@lemmy.world

    Why does the TSA even get funding for stuff like this? They’re the tiger-repellant rock of government agencies. Every TSA employee’s shift should begin with a self-report phone call to the Fraud, Waste & Abuse hotline.

  • Pyr_Pressure@lemmy.ca

    The more I read about the States, and the more time goes on, the less desire I have to ever visit.

  • jmp242@sopuli.xyz

    I’m not really thrilled about this. I don’t like biometrics, really. Plus masks. Also, I suspect a lot of these systems aren’t actually 97% accurate. Marketing.