Science and Tech

"Hello, I am your grandson and I need 3,000 dollars": there are already scams for the elderly with voices generated by AI

Don't believe everything you hear: audio deepfakes have arrived

Ruth Card, 73, received a call from her grandson Brandon. Or so she thought, because the voice on the other end of the line sounded exactly like his. "Grandma, I'm in jail, with no wallet and no phone," the voice told her, leaving her shaken. "I need money for bail."

Ruth and her husband rushed to the bank and withdrew 3,000 Canadian dollars. When they tried to do the same at another branch, the manager sensed something was wrong and pulled them aside: another customer had gone through the same thing and had realized that the voice on the other end was not who it claimed to be. It was a voice generated by an artificial intelligence system. An audio deepfake.

The story was reported by The Washington Post, which warned of a new wave of fraud in which AI-generated voices are used to deceive and scam all kinds of people, often the elderly. Pretending that a family member is in trouble tends to work on these victims.

FTC data show that in 2022 impersonation fraud was the second most common type, with more than 36,000 reports of people who were deceived (or nearly deceived) by others posing as friends or relatives. Large-scale scams using this kind of technology are commonplace: at the end of 2021, someone managed to steal 35 million dollars from a bank using it.

With the new generation of AI engines capable of emulating any human voice after minimal training (Microsoft presented the powerful VALL-E just a few weeks ago), things have gotten even worse. The scheme is essentially always the same: an impostor poses as someone the victim trusts (a child, a partner, a friend) and convinces them to send money because they are in serious trouble.

The key here is that the deception is far more convincing when you hear that person's own voice describing the problem. The voice is artificially mimicked, but advances in AI (these systems have been in development for years) make it hard to tell apart from the real thing, especially when the call arrives with that tone of urgency and distress. For older people, who are less familiar with these advances and what this technology can do, the threat is very real.

The voices don't even need to be provided voluntarily: a YouTube video, a podcast or a clip on TikTok, Instagram or Facebook is enough to generate a surprisingly credible artificial voice. From there, the cybercriminal can make that voice say whatever they want.

Going after these scammers is especially difficult. Will Maxson, of the FTC's division of marketing practices, explained that these criminals can operate a phone from anywhere in the world, which complicates questions such as which jurisdiction applies to each case.

The agency's advice is clear: if you receive a call from a loved one urgently asking for money, put that call on hold and try to reach that person separately. The caller's number may look like theirs, but scammers may have spoofed it too.

You should also never send money in the form of gift cards, because they are hard to trace, and of course you should stay alert and not give in to these requests for money, which are almost always presented as urgent.

Image: Javier Pastor with DALL-E 2
