Saturday, 13 May 2023 04:11

Why AI voice scams mean you should probably never answer your phone


Voice cloning is only going to get more common

Key Takeaways

Scams that use AI-cloned voices of people you know are real and on the rise.

Spotting AI voice cloning scams is getting harder.

Protecting yourself is just the same as with regular scam calls.

Imagine getting a call from a family member or close friend. They're in trouble, and they need money. The number checks out. It's their voice. But it's not them. It's an AI impersonator scamming you.

According to a report from McAfee, these AI voice scams are on the rise. And while impersonation scams have been around in some form or other forever, technology is making the con particularly convincing. Caller ID spoofing combined with AI voice cloning would probably convince most people. So how does it work? And how can you protect yourself?

"It is true that AI voice generation is improving extremely quickly. ElevenLabs voice cloning technology is scarily good, especially when you consider the small amount of data needed to make it work. I can see situations where people may be fooled, especially if they are placed in a state of heightened emotions," AI consultant Richard Batt told Lifewire via email.

Voice Cloning Leads to AI Voice Scams

Armed with a cloned voice, a scammer can call up and pretend to be somebody you already know. In April, for instance, a mother reported being targeted by a fake kidnapping scam that used an AI clone of her daughter's voice.

It seems utterly incredible that such a thing is even possible, but our incredulity may make these scams even more effective.

So, how does it work? To clone a voice, you need a recording of the original voice. Thanks to TikTok and Instagram videos, plenty of training data is out there. And your social media posts leak plenty of other details about your relationships that can be used to make the scam more convincing.

Right now, the level of effort is still high. You have to pick a target, find a voice example, clone it, and make the call, spoofing caller ID along the way. But if we know anything, it's that scammers are likely to become ever more sophisticated.
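To make that level of effort concrete, here is a minimal sketch of the cloning step in Python, assuming the elevenlabs SDK's older top-level helpers (set_api_key, clone, generate, play); newer versions of the library expose the same features through a client object, so treat the exact function names as illustrative. The voice name, file name, and placeholder API key are hypothetical. The point is how short the script is, and how little audio it needs.

# Illustrative only: cloning a voice from a single short clip.
# Assumes the older top-level helpers of the `elevenlabs` Python SDK;
# current SDK versions route these calls through a client object instead.
from elevenlabs import set_api_key, clone, generate, play

set_api_key("YOUR_API_KEY")  # hypothetical placeholder

# One short clip, of the kind easily pulled from a public social media
# video, is enough training data for a usable clone.
voice = clone(
    name="demo-clone",                # hypothetical voice name
    files=["ten_second_sample.mp3"],  # hypothetical local audio file
)

# Generate speech in the cloned voice; a harmless demonstration line is
# used here rather than anything a scammer would actually say.
audio = generate(
    text="This is a demonstration of a cloned voice.",
    voice=voice,
)
play(audio)  # playback requires a local audio player such as ffplay

Everything after that step, researching the target, spoofing the caller ID, and making the call, is social engineering rather than machine learning, which is exactly why the experts quoted below expect these scams to keep scaling up.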

"AI voice scams can be tailored to build highly personalized profiles of targets by ingesting data from public internet sources. Social Engineering Toolkits have been around for years and are readily used by modern-day cyber criminals to aggregate information about target victims. Unlike modern-day cyber cons, AI does not get tired or discouraged in the least, not to mention it can synthesize just about any voice and language," James Leone, cybersecurity consultant at IBM, told Lifewire via email.

And a scam doesn't have to be a fake kidnapping attempt. It could just as easily be an AI chatbot phoning you up and pretending to be from your bank. That would bring voice scams to the automation level of 419 scams, those classic emails in which a Nigerian prince needs help getting his money out of the country. If you don't need people to do the calling, you can flood the world with voice call spam.

"The biggest danger is just how easy and cheap it is to commit this type of scam. In the past, these types of fakes would require 10 hours of audio (or video) of a subject, but now, because the technology is advancing so quickly, all anyone needs is just 10 seconds of audio," Rijul Gupta, CEO and co-founder of DeepMedia, a deepfake creation and detection company, told Lifewire via email.

Can You Protect Against AI Voice Cloning Scams?

Despite the high-tech nature of these scams, the way to protect yourself from phone scams hasn't changed much.

First, you should not trust caller ID. Second, you should always hang up and call back on a number you already have or have looked up yourself.

That takes care of pretty much all the 'regular' scams and should also be effective against cloned voice scams. The problem is the social engineering element. If caller ID primes you to expect to hear your spouse, and the voice clone is good enough to sound like them, then you may be convinced. Right now, the models are good, but not that good.

"[I]t's important to understand that these models are not yet capable of realistic shifts in tone. A small clip of the voice which only requires a tone may fool people, a longer clip probably won't," says Batt.

But given the speed of AI development right now, voice cloning scams will only get better, so we'd better watch out and maybe stop taking calls from numbers we don't recognize.

 

Lifewire
