They thought their loved ones were calling for help: It was an AI scam (Getty)

The man who called Ruth Card sounded like her grandson, Brandon. So when he told her that he was in jail, without a wallet or cell phone, and that he needed money to pay his bail, Card was quick to do whatever she could to help.

“It was definitely a feeling of… fear,” she said. “That we had to help him right now.”

Card, 73, and her husband, Greg Grace, 75, rushed to their bank in Regina, Saskatchewan, and withdrew 3,000 Canadian dollars (US$2,207), the daily maximum. They hurried to a second branch for more money. But the bank manager pulled them into his office: another customer had received a similar call and learned that the eerily accurate voice had been faked, Card remembers the banker saying. The man on the phone was probably not her grandson.

It was then that they realized that they had been mistaken.

“They tricked us,” Card said in an interview with The Washington Post. “We were convinced we were talking to Brandon.”

FILE PHOTO: US startup Replika shows a user interacting with a smartphone app to customize an avatar for a personal AI chatbot (Reuters)

As impostor scams rise in the United States, Card’s experience points to a worrying trend. Technology is making it easier and cheaper for bad actors to imitate voices, convincing people, often the elderly, that their loved ones are in danger. In 2022, impostor scams were the second most common scam in the United States, with more than 36,000 reports of people being deceived by those posing as friends and family, according to data from the Federal Trade Commission. More than 5,100 of those incidents took place over the phone, accounting for more than $11 million in losses, FTC officials said.

Advances in artificial intelligence have added a terrifying new layer, allowing malicious actors to replicate a voice with just an audio sample of a few sentences. Powered by AI, a slew of cheap online tools can turn an audio file into a voice replica, letting a scammer make it “speak” whatever they type.
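As a rough illustration of that “type text, get audio” step, here is a minimal sketch using the open-source pyttsx3 Python library, which reads typed text aloud with a stock system voice. This is generic text-to-speech, not voice cloning, and it is not any of the commercial tools described in this article; the file name is an arbitrary choice.

```python
# Minimal text-to-speech sketch with pyttsx3 (stock system voices only).
# It illustrates the basic "type text, get audio" step; it does NOT clone
# anyone's voice and is not the commercial software discussed in the article.
import pyttsx3

engine = pyttsx3.init()                   # uses the operating system's built-in voices
engine.setProperty("rate", 175)           # speaking speed, in words per minute
text = "Hi, it's me. I'm in trouble and I need your help."
engine.save_to_file(text, "message.wav")  # queue synthesis of the text to an audio file
engine.runAndWait()                       # run the queued command and wait for it to finish
```

Voice-cloning services differ in that they replace the stock voice with one learned from a sample of the target’s own speech.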

Experts say federal regulators, law enforcement and the courts are ill-equipped to thwart this growing scam. Most victims have few clues to identify the perpetrator, and it is difficult for police to trace calls and funds from scammers who operate around the world. And there is little legal precedent for courts to hold the companies that make these tools accountable for their use.

The RAE chose the term “artificial intelligence” as the word of the year for 2022 due to its relevance in technology

“It’s scary,” says Hany Farid, a professor of digital forensics at the University of California, Berkeley. “It’s sort of the perfect storm… [with] all the necessary ingredients to create chaos.”

While impostor scams take many forms, they work in essentially the same way: a scammer poses as someone the victim trusts – a child, a partner or a friend – and convinces them to send money because they are in trouble.

But artificially generated voice technology is making the hoax more convincing. Victims say they react with visceral horror when they hear their loved ones in distress.

It is a dark consequence of the recent boom in generative artificial intelligence, which underpins software that creates text, images or sounds from the data it is fed. Advances in math and computing power have improved the training mechanisms for this type of software, spurring a fleet of companies to release eerily lifelike chatbots, image generators and voice generators.


AI voice-generation software analyzes what makes a person’s voice unique – age, gender, accent and so on – and then searches a vast database of voices to find similar ones and predict patterns, Farid explains.

It can then recreate the pitch, timbre and individual sounds of a person’s voice to create a similar overall effect, he added. It requires only a short audio sample, taken from places like YouTube, podcasts, commercials, TikTok, Instagram or Facebook videos, Farid said.

“Two years ago, even a year ago, you needed a lot of audio to clone a person’s voice,” Farid said. “Now… if you have a Facebook page… or if you recorded a TikTok and your voice is there for 30 seconds, people can clone your voice.”
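For a concrete sense of the measurable properties Farid describes, here is a minimal sketch using the open-source librosa Python library on a short local recording (the file name sample.wav is an assumption). Real cloning systems learn far richer neural speaker representations from such audio; these hand-picked features only illustrate what can be extracted from roughly 30 seconds of speech.

```python
# Minimal voice-analysis sketch with librosa: extract simple acoustic features
# (timbre and pitch) from a short recording. Illustration only; real
# voice-cloning systems learn much richer speaker representations.
import librosa
import numpy as np

# Load roughly 30 seconds of speech; "sample.wav" is an assumed local file.
y, sr = librosa.load("sample.wav", sr=16000)

# Timbre: mel-frequency cepstral coefficients summarize the spectral shape of the voice.
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)

# Pitch: estimate the fundamental frequency (how high or low the voice sits) over time.
f0, voiced_flag, voiced_prob = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7")
)

print("Average timbre profile:", np.round(mfcc.mean(axis=1), 2))
print("Median pitch (Hz):", float(np.nanmedian(f0)))
```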

Speech generation software searches a vast database of voices to find similar voices and predict patterns

Companies like ElevenLabs, an AI speech-synthesis start-up founded in 2022, turn a short voice sample into a synthetically generated voice via a text-to-speech tool. ElevenLabs’ software can be free or cost anywhere from $5 to $330 a month, depending on the plan, with higher tiers allowing users to generate more audio.

ElevenLabs hit the news after criticism of its tool, which was used to replicate the voices of celebrities saying things they never did, like Emma Watson falsely reciting passages from Adolf Hitler’s “Mein Kampf.” ElevenLabs did not return a request for comment, but in a Twitter thread the company said it is implementing safeguards to curb misuse, including banning free users from creating custom voices and releasing a tool to detect AI-generated audio.

But those safeguards come too late for victims like Benjamin Perkin, whose elderly parents lost thousands of dollars in a voice-cloning scam.

His voice-cloning nightmare began when his parents got a call from a supposed lawyer saying their son had killed an American diplomat in a car accident. Perkin, the caller said, was in jail and needed money for legal fees.

A keyboard is reflected on a computer screen showing the ChatGPT website, an OpenAI AI chatbot (REUTERS/Florence Lo)

The lawyer put Perkin, 39, on the phone, who said he loved them, appreciated them and needed the money. A few hours later, the lawyer called Perkin’s parents again, saying their son needed 21,000 Canadian dollars (US$15,449) before a court appearance later that day.

Perkin’s parents later told him that the call seemed unusual, but they couldn’t shake the feeling that they had really talked to their son.

The voice sounded “close enough that my parents actually believed they had spoken to me,” he said. Panicked, they rushed to several banks to withdraw money and sent the money to the lawyer via a bitcoin terminal.

When the real Perkin called his parents that night for a casual check-in, they were confused.

It’s unclear where the scammers got their voice from, although Perkin has posted videos on YouTube discussing his love of snowmobiling. The family filed a complaint with Canadian federal authorities, Perkin explains, but were unable to recover the money.


“The money disappeared,” he said. “There is no insurance. It cannot be retrieved. It’s gone.”

Will Maxson, deputy director of the FTC’s division of marketing practices, said that tracking down voice scammers can be “especially difficult” because they could be using a phone based in any part of the world, making it hard even to identify which agency has jurisdiction over a given case.

Maxson called for constant vigilance. If a loved one tells you they need money, put the call on hold and try calling your family member separately, he said. If a suspicious call comes from a family member’s number, be aware that it can also be spoofed. Never pay with gift cards, because they are difficult to track, he added, and be wary of any solicitation of money.

Eva Velasquez, executive director of the Identity Theft Resource Center, says police find it difficult to track down voice cloning thieves. Velasquez, who spent 21 years at the San Diego District Attorney’s Office investigating consumer fraud, said police departments may not have enough money and staff to fund a dedicated unit to track fraud.

Larger departments have to allocate resources to cases that can be solved, she said. Victims of voice scams may not have much information to give police for investigations, making it difficult for agencies to dedicate much time or staff, especially for smaller losses.

“If they don’t have any information about it,” she said, “where do they start?”

Farid said courts should hold AI companies liable if the products they make cause harm. Legal experts, including Supreme Court Justice Neil Gorsuch, said in February that the legal protections shielding social networks from lawsuits may not apply to work created by AI.

For Card, the experience made her more vigilant. Last year, she spoke to her local newspaper, the Regina Leader-Post, to warn people about these scams. Since she didn’t lose any money, she didn’t report it to the police.

Most of all, she says, she feels ashamed.

“It wasn’t a very compelling story,” she says. “But it didn’t have to be any better than it was to convince us.”

© The Washington Post 2023
