Artificial Intelligence

AI voice cloning ushers in new scamming frontier


Our voices are what make us unique. From opera singers and yelling sports officials to crying babies, our voices project personality, urgency and language.

But what happens if that uniqueness can be captured by artificial intelligence (AI) cloning software?

Voice clones can now be created quickly and cheaply, and the targets are no longer limited to celebrities and politicians.

The technology, if used with nefarious intent, may also pose a risk to consumers.

“It makes it scarier, makes it harder to defend against,” said Chris Carlis, a consultant for the Lombard-based security firm Dolos Group.

AI voice cloning technology is making scams harder to detect.

“The scams where, you know, my grandson called me up, he’s on vacation. He says he needs bail money or whatever. Using AI voice cloning could now potentially sound exactly like the grandson’s voice,” Carlis noted.

Carlis said voice cloning has also hurt businesses by tricking employees into inadvertently wiring money to scammers.

Carlis played a sample that used a clone of NBC Chicago reporter Patrick Fazio’s voice to speak the words: “In your inbox is an invoice. You need to process it right away.”

Carlis cloned the voice and added muffled airport noise in the background to create a sense of urgency.

He used a website that charged just $5 to create the fake voicemail, cloning Fazio’s voice from audio he found online.

The ability of AI to trick people is not limited to voicemails and audio recordings.

“There is a possibility for real time voice cloning, and it’s getting better," Carlis noted.

The Federal Trade Commission and the Federal Bureau of Investigation have both issued warnings on what artificial intelligence technology is capable of.

“It can be very challenging to find the people perpetrating these crimes, because they generally are very technologically savvy,” said FBI Special Agent Siobhan Johnson.

Johnson said it can take longer to track down cyber criminals, including those who use artificial intelligence.

“The technology is really very new. And so it’s really just starting to trickle in. But there is concern that we’ll see a lot more of it in the coming years," Johnson added.

The FBI’s internet crime report showed Americans lost more than $10 billion last year to internet scams, including those involving AI or voice cloning.

So what can you do to protect yourself—specifically from voice cloning?

“Go online. Look at your social media. Is it locked or private? Can anybody just go and watch your videos and steal your voice or your visual content?” Johnson said.

“You should really trust your gut if something seems a little bit off,” added Carlis.

Carlis also had advice for people who think they might be getting a voice cloning call.

“Slow down. Run it past somebody else. Double check,” said Carlis. “Give it some thought instead of losing a whole bunch of money.”

It’s not just voice cloning. Security experts believe AI video manipulation will advance to the point where scammers could impersonate someone with a full AI face and voice swap so convincing that it would be even harder for people to detect.

If you feel you’ve been the victim of an internet crime, complaints can be filed with the FBI’s Internet Crime Complaint Center (IC3).
