Scammers Use AI to Clone Girl’s Voice in $1 Million Kidnapping Scheme

The rapid advance of artificial intelligence has put powerful tools within reach of anyone, including scammers always looking for new ways to wet their beaks. Bad actors are now using AI to clone voices for extortion scams.

Jennifer DeStefano, a mom from Arizona, picked up a call from an unknown number and heard crying from someone she was sure was her 15-year-old daughter Brie. “Mom, I messed up,” she heard her daughter say, before a male voice started to make demands.

“It was never a question of who is this? It was completely her voice, it was her inflection, it was the way she would have cried – I never doubted for one second it was her. That was the freaky part that really got me to my core,” she said. 

The supposed kidnapper demanded US$1 million but agreed to lower the amount to $50,000. While DeStefano kept him on the line, one of her friends called her husband and confirmed that her daughter was home, safe, and had never been kidnapped.

The police are investigating the scammer, but it’s unknown how many others have fallen for the scam.

In March, an older couple in Regina, Saskatchewan received a call from someone they were sure was their grandson, who said that he was in jail without his wallet or cell phone and that he needed cash for bail. The couple rushed to their bank and withdrew CA$3,000, the maximum amount allowable, then hurried to another branch to withdraw more.

Like DeStefano, they were saved from being scammed when a banker at the second branch pulled them aside to tell them that another couple had received a similar call and later found out it had been faked.

The Federal Trade Commission (FTC), which has warned about AI technology “turbocharging” fraud and scams, said that these scammers are known to ask victims to send crypto, wire money, or pay the ransom with gift cards, and once the money or gift card numbers have been transferred, there’s almost no way of getting them back.

The FTC found that the impostor scam was the second most popular racket in the US in 2022, with more than 36,000 cases reported; the 5,100 of those cases that happened over the phone accounted for $11 million in losses.

Information for this story was found via First Post, the Independent, CNN, Washington Post, Arizona Family, and the sources and companies mentioned. The author has no securities or affiliations related to the organizations discussed. Not a recommendation to buy or sell. Always do additional research and consult a professional before purchasing a security. The author holds no licenses.