Saturday, 13 May 2023 04:11

Why AI voice scams mean you should probably never answer your phone


Voice cloning is only going to get more common

Key Takeaways

AI scams, using cloned voices of people you know, are a thing.

Spotting AI voice cloning scams is getting harder.

Protecting yourself is just the same as with regular scam calls.

Imagine getting a call from a family member or close friend. They're in trouble, and they need money. The number checks out. It's their voice. But it's not them. It's an AI impersonator scamming you.

According to a report from McAfee, these AI voice scams are on the rise. And while impersonation scams have been around in one form or another forever, the technology makes the con particularly convincing. Caller ID spoofing combined with AI voice cloning would probably fool most people. So how does it work? And how can you protect yourself?

"It is true that AI voice generation is improving extremely quickly. ElevenLabs voice cloning technology is scarily good, especially when you consider the small amount of data needed to make it work. I can see situations where people may be fooled, especially if they are placed in a state of heightened emotions," AI consultant Richard Batt told Lifewire via email.

Voice Cloning Leads to AI Voice Scams

Armed with a cloned voice, a scammer can call up and pretend to be somebody you already know. For instance, in April there was a story about a mother targeted with a fake kidnapping scam that used an AI clone of her daughter's voice.

It seems utterly incredible that such a thing is even possible, but our incredulity may make these scams even more effective.

So, how does it work? To clone a voice, you need a recording of the original. Thanks to TikTok and Instagram videos, plenty of training data is already out there. And your social media posts leak plenty of other details about your relationships that can be used to make the scam more convincing.

Right now, the level of effort is still high. You have to pick a target, find a voice example, clone it, and make the call, spoofing caller ID along the way. But if we know anything, it's that scammers are likely to become ever more sophisticated.

"AI voice scams can be tailored to build highly personalized profiles of targets by ingesting data from public internet sources. Social Engineering Toolkits have been around for years and are readily used by modern-day cyber criminals to aggregate information about target victims. Unlike modern-day cyber cons, AI does not get tired or discouraged in the least, not to mention it can synthesize just about any voice and language," James Leone, cybersecurity consultant at IBM, told Lifewire via email.

And a scam doesn't have to be a fake kidnapping. It could just as easily be an AI chatbot phoning you up and pretending to be from your bank. That would bring voice calls up to the automated scale of 419 scams, those classic emails in which a 'Nigerian prince' needed help getting his money out of the country. If you don't need people to make the calls, you can flood the world with voice spam.

"The biggest danger is just how easy and cheap it is to commit this type of scam. In the past, these types of fakes would require 10 hours of audio (or video) of a subject, but now, because the technology is advancing so quickly, all anyone needs is just 10 seconds of audio," Rijul Gupta, CEO and co-founder of DeepMedia, a deepfake creation and detection company, told Lifewire via email.

Can You Protect Against AI Voice Cloning Scams?

Despite the high-tech nature of these scams, the way to protect yourself from phone scams hasn't changed much.

First, don't trust caller ID. Second, always hang up and call back on a number you already have or can look up yourself.

That takes care of pretty much all the 'regular' scams and should also be effective against cloned-voice scams. The problem is the social engineering element. If caller ID primes you to expect to hear your spouse, and the voice clone is a good one, you may well be convinced. Right now, the models are good, but not that good.

"[I]t's important to understand that these models are not yet capable of realistic shifts in tone. A small clip of the voice which only requires a tone may fool people, a longer clip probably won't," says Batt.

But given the speed of AI development right now, voice-cloning scams are only going to get better, so we'd all better watch out, and maybe stop taking calls from numbers we don't recognize.

 

Lifewire
