Scammers Use AI to Clone Girl’s Voice in $1 Million Kidnapping Scheme

The rapid advance of artificial intelligence has made powerful tools accessible to almost anyone, including scammers always looking for new ways to wet their beaks. Bad actors have been using AI to clone voices for extortion scams.

Jennifer DeStefano, a mom from Arizona, picked up a call from an unknown number and heard crying from someone she was sure was her 15-year-old daughter Brie. “Mom, I messed up,” she heard her daughter say, before a male voice started to make demands.

“It was never a question of who is this? It was completely her voice, it was her inflection, it was the way she would have cried – I never doubted for one second it was her. That was the freaky part that really got me to my core,” she said. 

The supposed kidnapper demanded US$1 million but agreed to lower it to $50,000. While DeStefano was talking to him, one of her friends called her husband and confirmed that her daughter was home and safe.

The police are investigating the scammer, but it’s unknown how many others have fallen for the scam.

In March, an older couple in Regina, Saskatchewan, received a call from someone they were sure was their grandson, who said he was in jail without his wallet or cell phone and needed cash for bail. The couple rushed to their bank to withdraw CA$3,000, the maximum allowable, then drove to another branch to withdraw more.

Like DeStefano, they were saved from the scam when a banker at the second branch pulled them aside and told them that another couple had received a similar call, only to find out later that it had been faked.

The Federal Trade Commission (FTC), which has warned about AI technology “turbocharging” fraud and scams, said that these scammers are known to ask victims to send crypto, wire money, or pay the ransom with gift cards, and once the money or gift card numbers have been transferred, there’s almost no way of getting them back.

The FTC found that the impostor scam was the second most popular racket in the US in 2022, with more than 36,000 cases reported; about 5,100 of those cases happened over the phone, accounting for more than $11 million in losses.


Information for this story was found via First Post, the Independent, CNN, Washington Post, Arizona Family, and the sources and companies mentioned. The author has no securities or affiliations related to the organizations discussed. Not a recommendation to buy or sell. Always do additional research and consult a professional before purchasing a security. The author holds no licenses.
