When Debbie Shelton Moore picked up the phone, she heard her 22-year-old daughter’s terrified cries for help. Then a man came on the line and made a ransom demand.
“The man had said, ‘Your daughter’s been kidnapped and we want $50,000,'” the Georgia mother told 11Alive, an Atlanta NBC affiliate. “Then they had her crying, like, ‘Mom, mom’ in the background.”
“It was her voice and that’s why I was totally freaking out,” Shelton Moore continued.
Her daughter wasn’t on the phone, though. It was a synthetic clone of her voice created by AI specifically to con the worried mother.
As AI has gained popularity, criminals across the country have begun using the technology to create phony voices for extortion schemes. According to a May McAfee survey of 7,000 people, one in four respondents had either fallen victim to, or knew someone who had fallen victim to, a scam that used an AI-cloned voice.
In two separate incidents in Arizona in March, scammers demanded ransoms using AI-generated voices of family members.
The call came to Shelton Moore’s phone from a number with the same area code as that of her daughter, Lauren, who lived in that area. The alarmed Georgia mother assumed her daughter had been in a car accident.
“My heart is beating and I’m shaking,” Shelton Moore said of the moment when she received the ransom call. The voice of her daughter “was 100% believable,” she continued, “enough to almost give me a heart attack from sheer panic.”
“It was all just kind of a blur because all I was thinking was, ‘How am I going to get my daughter? How in the world are we supposed to get him money?'” she told 11Alive.
One of the male voices on the phone told Shelton Moore that they had her daughter in the back of a truck.
Shelton Moore’s husband, who works in cybersecurity, was able to reach their daughter on FaceTime and expose the hoax.
Lauren was safe when local police arrived in response to the ransom call, according to 11Alive. Still, Shelton Moore urged families to take precautions against such scammers.
“Of course, when you hear their voice, you’re not going to think clearly and you will panic,” she said. “The whole family needs to have a safe word or safe phrase that they’re not going to forget under duress.”
The Cobb County Sheriff’s Office also advised families earlier this month to adopt a safe word or phrase, saying AI scams have become common in the county.