Basman

Scammers are now using AI to sound like family members in distress (article)

7 posts in this topic

https://web.archive.org/web/20230305142817/https://www.washingtonpost.com/technology/2023/03/05/ai-voice-scam/

Quote

But such safeguards are too late for victims like Benjamin Perkin, whose elderly parents lost thousands of dollars to a voice scam.

His voice-cloning nightmare started when his parents received a phone call from an alleged lawyer, saying their son had killed a U.S. diplomat in a car accident. Perkin was in jail and needed money for legal fees.

The lawyer put Perkin, 39, on the phone, who said he loved them, appreciated them and needed the money. A few hours later, the lawyer called Perkin’s parents again, saying their son needed $21,000 ($15,449) before a court date later that day.

Perkin’s parents later told him the call seemed unusual, but they couldn’t shake the feeling they’d really talked to their son.

The voice sounded “close enough for my parents to truly believe they did speak with me,” he said. In their state of panic, they rushed to several banks to get cash and sent the lawyer the money through a bitcoin terminal.

When the real Perkin called his parents that night for a casual check-in, they were confused.

It’s unclear where the scammers got his voice, although Perkin has posted YouTube videos talking about his snowmobiling hobby. The family has filed a police report with Canada’s federal authorities, Perkin said, but that hasn’t brought the cash back.

It seems the best way to handle a fishy call from a distressed family member is to hang up and then call the family member in question directly.


AI is going to usher in a whole new wave of crime, much like the internet did. This is only the tip of the iceberg.


 

 


@Basman

On 03/04/2023 at 11:56 AM, Basman said:

https://web.archive.org/web/20230305142817/https://www.washingtonpost.com/technology/2023/03/05/ai-voice-scam/

It seems the best way to handle a fishy call from a distressed family member is to hang up and then call the family member in question directly.

Yes, in these situations it's best to trust your intuition. If something feels off, even when the caller looks or sounds very much like a loved one, stop, take a few slow breaths, and double- or triple-check before making any decisions, especially ones involving money or anything else that could change your life.


@aurum

On 04/04/2023 at 1:07 AM, aurum said:

AI is going to usher in a whole new wave of crime, much like the internet did. This is only the tip of the iceberg.

It sucks, but with so many scammers out there it was inevitable that AI would end up being used by them too.


My bank doesn't let me use ChatGPT-4 because it says it's a scam.

So even the creators of AI are scammers.

5 hours ago, Blackhawk said:

My bank doesn't let me use ChatGPT-4 because it says it's a scam.

So even the creators of AI are scammers.

So that tech is blocked for you. I had a similar rule in the army: I wasn't allowed to use a smartphone during my service for security reasons.

I would say top-down orders usually come with double standards. Big corporations and the military-industrial complex can easily bypass the nation-state structure, while individuals are stuck with regulations and left with misinformation.

 

