Voice cloning is only going to get more common
Key Takeaways
AI scams that use cloned voices of people you know are a real and growing threat.
Spotting AI voice cloning scams is getting harder.
Protecting yourself is much the same as with regular scam calls.
Imagine getting a call from a family member or close friend. They're in trouble, and they need money. The number checks out. It's their voice. But it's not them. It's an AI impersonator scamming you.
According to a report from McAfee, these AI voice scams are on the rise. And while phone scams have been around in one form or another forever, AI is making the con particularly convincing. Caller ID spoofing combined with AI voice cloning would probably fool most people. So how does it work? And how can you protect yourself?
"It is true that AI voice generation is improving extremely quickly. ElevenLabs voice cloning technology is scarily good, especially when you consider the small amount of data needed to make it work. I can see situations where people may be fooled, especially if they are placed in a state of heightened emotions," AI consultant Richard Batt told Lifewire via email.
Voice Cloning Leads to AI Voice Scams
Armed with a cloned voice, a scammer can call up and pretend to be somebody you already know. For instance, in April, there was a story about a mother targeted by a fake kidnapping scam that used an AI clone of her daughter's voice.
It seems utterly incredible that such a thing is even possible, but our incredulity may make these scams even more effective.
So, how does it work? To clone a voice, you need a recording of the original voice. Thanks to TikTok and Instagram videos, plenty of training data is out there. And your social media posts leak plenty of other details about your relationships that can be used to make the scam more convincing.
Right now, the level of effort is still high. You have to pick a target, find a voice example, clone it, and make the call, spoofing caller ID along the way. But if we know anything, it's that scammers are likely to become ever more sophisticated.
"AI voice scams can be tailored to build highly personalized profiles of targets by ingesting data from public internet sources. Social Engineering Toolkits have been around for years and are readily used by modern-day cyber criminals to aggregate information about target victims. Unlike modern-day cyber cons, AI does not get tired or discouraged in the least, not to mention it can synthesize just about any voice and language," James Leone, cybersecurity consultant at IBM, told Lifewire via email.
And a scam doesn't have to be a fake kidnapping attempt. It could equally be an AI chatbot phoning you up and pretending to be from your bank. That puts voice scams at the same level of automation as 419 scams, those classic emails in which a Nigerian prince wanted to get his money out of the country. If you don't need people to do the calling, you can flood the world with voice call spam.
"The biggest danger is just how easy and cheap it is to commit this type of scam. In the past, these types of fakes would require 10 hours of audio (or video) of a subject, but now, because the technology is advancing so quickly, all anyone needs is just 10 seconds of audio," Rijul Gupta, CEO and co-founder of DeepMedia, a deepfake creation and detection company, told Lifewire via email.
Can You Protect Against AI Voice Cloning Scams?
Despite the high-tech nature of these scams, the way to protect yourself from phone scams hasn't changed much.
First, don't trust caller ID. Second, always hang up and call back on a number you already have or can look up yourself.
That takes care of pretty much all the 'regular' scams and should also be effective against cloned-voice scams. The problem is the social engineering element. If caller ID primes you to expect to hear your spouse, and the voice clone is good enough to sound like them, then you may be convinced. Right now, the models are good, but not that good.
"[I]t's important to understand that these models are not yet capable of realistic shifts in tone. A small clip of the voice which only requires a tone may fool people, a longer clip probably won't," says Batt.
But given the speed of AI development right now, voice cloning scams will only get better, so we'd all better watch out, and maybe stop taking calls from numbers we don't recognize.