A Los Angeles man lost $25,000 to fraudsters who used artificial intelligence technology to clone his son's voice in an elaborate telephone scam, highlighting the growing sophistication of AI-enabled fraud schemes.
The victim, identified only as Anthony for privacy reasons, received what he believed was a distressing call from his son claiming involvement in a vehicle accident with a pregnant woman. The scammers subsequently posed as attorneys requesting bail money through multiple transactions.
"It was his voice. It was absolutely his voice," Anthony said of the initial call he received. "There was no doubt about it."
Los Angeles Police Department detective Chelsea Saeger said, "The scammers are just becoming more clever and sophisticated." The technology requires just seconds of audio to generate a realistic voice duplicate. "They are using social media and technology to craft these very believable and convincing stories, and people really do believe they're talking to a grandchild or a government official," she said.
"They call, and when you answer, and it's a scammer, there's silence," Saeger said. "They want you to say 'hello' or 'is anybody there?' All they need is three seconds of your voice to input it into AI and to clone it."
The fraud unfolded when a caller claiming to be attorney Michael Roberts contacted Anthony requesting $9,200 for his son's bail, threatening 45 days of jail time without payment. After Anthony's attempts to reach his son went to voicemail, he withdrew the money from his bank.
The perpetrators instructed Anthony to hand over the cash to an Uber driver who arrived at his residence. Surveillance footage captured the exchange as Anthony's daughter delivered the money in a manila envelope after verifying the driver's credentials.
The scheme escalated when a second caller, identifying himself as attorney Mark Cohen, claimed the pregnant woman had died and demanded an additional $15,800, bringing the total to $25,000. Anthony complied with the second payment, again handing the cash to an Uber driver for the transfer.
As the father and daughter researched the situation online, it was the daughter who delivered the bad news. "'Dad, I hope I'm wrong. I think you've just been scammed out of $25,000.' It never even crossed my mind until she said those words," Anthony recalled.
LAPD investigators noted that the Uber and Lyft drivers hired in these schemes are typically not complicit; often they do not even realize they are part of a fraud. Detectives declined to disclose further details, citing the ongoing investigation.
The scammers maintained pressure through artificial time constraints and emotional manipulation, preventing the victim from thoroughly verifying the situation. "They moved me so fast," Anthony said. "I never had a chance to do a second call unless I were to say to them, 'Hold it. I'm stopping this whole thing for a minute. I want to talk to my son. I don't care if he's in jail or where he is; I want to talk to my son.' You don't think that way. You don't."
Police advise against sending money to anyone you do not know, even if the caller claims to represent a government agency or financial institution. Legitimate organizations will never call and demand that you send money immediately.
Saeger explained that scammers often gather voice samples through social media platforms. "They'll go through your video posts, and if you or a loved one are speaking, they can grab your voice that way," she said.
"Most recently, they've been asking victims to deposit money into crypto ATM machines or transfer money into crypto accounts," Saeger said. "So if you receive a call and they're requesting you to do any of those things, that is an immediate red flag, and it's probably a scam."
Despite the embarrassment, Anthony shared his experience publicly to raise awareness about AI-enabled voice scams. "That's my message to everyone watching - to protect themselves and their own families," he said. "That's why I'm doing this."