Scammers Use AI to Clone Girl’s Voice in $1 Million Kidnapping Scheme

The acceleration of artificial intelligence has made powerful tools accessible to everyone, including scammers always looking for new ways to wet their beaks. Bad actors have begun using AI to clone voices for extortion scams.

Jennifer DeStefano, a mom from Arizona, picked up a call from an unknown number and heard crying from someone she was sure was her 15-year-old daughter Brie. “Mom, I messed up,” she heard her daughter say, before a male voice started to make demands.

“It was never a question of who is this? It was completely her voice, it was her inflection, it was the way she would have cried – I never doubted for one second it was her. That was the freaky part that really got me to my core,” she said. 

The supposed kidnapper demanded US$1 million but agreed to lower it to $50,000. While DeStefano was talking to him, one of her friends called her husband and confirmed that her daughter was home and safe.

The police are investigating the scammer, but it’s unknown how many others have fallen for the scam.

In March, an older couple in Regina, Saskatchewan received a call from someone they were sure was their grandson, who said he was in jail without his wallet or cell phone and needed cash for bail. The couple hurried to their bank to withdraw CA$3,000, the maximum allowable, then drove to another branch to withdraw more.

Like DeStefano, they were saved from being scammed when a banker at the second branch pulled them aside to tell them that another couple had received a similar call, which later turned out to be fake.

The Federal Trade Commission (FTC), which has warned about AI technology “turbocharging” fraud and scams, said that these scammers are known to ask victims to send crypto, wire money, or pay the ransom with gift cards, and once the money or gift card numbers have been transferred, there’s almost no way of getting them back.

The FTC found that the impostor scam was among the top two most common rackets in the US in 2022, with more than 36,000 cases reported; the roughly 5,100 of those cases that happened over the phone accounted for $11 million in losses.


Information for this story was found via First Post, the Independent, CNN, Washington Post, Arizona Family, and the sources and companies mentioned. The author has no securities or affiliations related to the organizations discussed. Not a recommendation to buy or sell. Always do additional research and consult a professional before purchasing a security. The author holds no licenses.
