• Hodor@sh.itjust.works · 16 hours ago

    Except now they record your voice and use it to train a voice AI so they can scam you harder. My coworker’s ex-husband got a call from their “daughter”, distressed, saying she’d been “kidnapped” and needed ransom money. He sent it, then called the ex-wife. Their daughter was asleep at home.

    • thethunderwolf@lemmy.dbzer0.com · 41 minutes ago

      I’ve heard this scenario cited as an example of why not to put your face on the internet. Now, with AI, it’s actually happening.

    • T156@lemmy.world · 14 hours ago

      I wonder if they do. That seems like a lot of effort for a scammer to go to for the average person.

      It seems easier to use a generic voice, rely on poor phone audio quality to bridge the gap, and take a shotgun approach.

      Some do, since there have been a few high-profile attacks, but those nearly all targeted organisations, with the caller pretending to be the CEO or something.

      • SaharaMaleikuhm@feddit.org · 9 hours ago

        I still have an ace up my sleeve: I don’t pick up the phone unless I know who’s calling or am otherwise expecting a call.
        Right now I just get the occasional one-liner email: “hey Sahara, what are you doing tonight?” Who the hell falls for that?

      • TehWorld@lemmy.world · 12 hours ago

        Once it’s automated, it’s the same effort either way. Probably something even vibe coding could pull off.