“A reasonably good clone can be created with under a minute of audio, and some are claiming that even a few seconds may be enough.” Mom is wary of answering calls for fear her voice will be cloned for a future virtual kidnapping.

  • BassTurd@lemmy.world · ↑77 · 1 year ago

    Unless I know who you are, I’m not answering your call. 90% of the time it’s a robot and the other 10% can leave a voicemail.

    • radix@lemm.ee · ↑10 · 1 year ago

      Isn’t a voicemail worse for detecting deepfakes, since the fake doesn’t have to listen and respond dynamically?

      • BassTurd@lemmy.world · ↑27 · 1 year ago

        I’m not personally concerned about getting duped by a deepfake. I just don’t want to talk to any robots, solicitors, or scammers.

    • Coliseum7428@kbin.social · ↑6 · 1 year ago

      I have had calls from numbers similar to my own, and seen caller IDs for people who aren’t in my contacts. I haven’t picked them up, but the temptation to do so was there.

    • bluekieran@lemmy.world · ↑3 · 1 year ago

      You might know the number. My wife used to live in Kenya, and she renamed her “Mum”/“Dad” contacts after they once got a call from her stolen phone claiming she’d been arrested and they needed to send money for bail.

    • 857@fedia.io · ↑4 ↓1 · 1 year ago

      I’ll go one further: unless it’s my doc, my wife, or my boss, I’m neither answering the call nor listening to the voicemail. That’s what easily skimmable voicemail transcription is for…

      I don’t love the privacy implications of transcribed voicemail, ofc, but it’s better for my own privacy/threat model than answering the phone to robots, scammers, etc. It’s also a hell of a lot better for my mental health than listening to them.

  • chamaeleon@kbin.social · ↑56 · 1 year ago

    Real kidnappers will not be happy about this as deepfakes become more prevalent and ransom calls get ignored more and more. Do they have a union that can go on strike to raise awareness about this?

    • Johnny Utah@lemmy.dbzer0.com · ↑4 · 1 year ago

      As awful as it sounds, this needs to be set up between family members. Agree on a phrase or code word to confirm they are who they say they are (a rough sketch of the idea is below). This is already common with alarm system monitoring companies: they have to make sure the intruder isn’t the one answering the phone and claiming it’s a false alarm.
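
      For illustration only, here is a minimal sketch in Python of the kind of pre-agreed code-word check described above; the salt, phrase, and function names are all hypothetical examples, not anything a monitoring company or phone vendor actually provides.

      ```python
      # Hypothetical sketch: verify a caller with a pre-agreed family code word.
      # The salt and phrase below are made-up examples.
      import hashlib
      import hmac

      SALT = b"family-salt"

      def _digest(phrase: str) -> str:
          """Hash a normalized phrase so the phrase itself never has to be stored."""
          return hashlib.sha256(SALT + phrase.strip().lower().encode()).hexdigest()

      # Only the hash of the agreed phrase is kept around.
      AGREED_PHRASE_HASH = _digest("purple giraffe")

      def caller_knows_code_word(spoken_phrase: str) -> bool:
          """Return True only if the caller can produce the pre-agreed phrase."""
          # Constant-time comparison avoids leaking information through timing.
          return hmac.compare_digest(_digest(spoken_phrase), AGREED_PHRASE_HASH)

      if __name__ == "__main__":
          print(caller_knows_code_word("Purple Giraffe"))     # True: treat as verified
          print(caller_knows_code_word("please send money"))  # False: treat the call as suspect
      ```

      In practice the check is done by a human ear, of course; the point is just that the phrase is agreed in advance and has to be volunteered by the caller, not suggested to them.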

  • RFBurns@lemmy.world · ↑6 · 1 year ago

    The ‘hostage-taker’ will never be able to duplicate my family’s grammar and sentence structure quirks, so I won’t care how it “sounds”…

  • Raphael@lemmy.world · ↑8 ↓16 · edited · 1 year ago

    I’m pro-AI, but any technology that can be used to create deepfakes must be explicitly banned.

    Naturally, we’re already talking about criminals, but you combat this issue the same way you combat school shootings: ban the root of the problem and actively prosecute anyone who dares to acquire it illegally.

    EDIT: The victims wouldn’t have fallen for this deepfake scam if they had their own deepfake scam. Scam the criminal before he scams you!!!

  • solarview@lemmy.world · ↑4 ↓13 · edited · 1 year ago

    Perhaps there should be government-controlled licenses for some technologies, like for gun ownership? Although there are probably all sorts of ways that could be circumvented. Not sure how best to control this, though.

    *edit: wow, thanks, nice to know what sort of community this is. Nothing in the responses so far has told me anything I didn’t already know, and I did point out that it would be circumvented. I don’t see any other ideas, though. Maybe with that sort of negative, defeatist response we’ll never even try. Fuck it, right, let’s just watch the world burn. /s

    • Gray@lemmy.world · ↑15 ↓3 · 1 year ago

      Ah yes, because making something illegal stops criminals from using it. Problem solved.

    • Barbarian@sh.itjust.works · ↑12 ↓3 · 1 year ago

      Basically impossible.

      It’s against the ToS to use tools like TeamViewer to run support scams, for example, but people do it anyway. You can’t legislate against criminals: if they’re already breaking the law, why would they care about another law?

      The only way forward here is enforcement. There needs to be better coordination between governments to track down and prosecute those running the scams. There’s been a lot of pressure on India, for example, to clean up its very lax cybercrime enforcement, but it’s very much an uphill battle.

    • SmashingSquid@notyour.rodeo · ↑8 · 1 year ago

      Not really comparable to guns: making it harder to get a physical object is very different from preventing people from downloading software. Even 3D-printed guns require equipment and knowledge to make use of the download.