If somebody called you from an unknown number, claiming to be from your bank, and asked you to “confirm” your sensitive account information, you would probably recognize right away that this was a scam and hang up.
But what if the voice on the other end of the line belonged to a loved one? You recognize it as a family member's voice. They sound upset as they describe an emergency and beg for your help. Who could be so cold-hearted as to refuse their request for money?
Unfortunately, that is exactly the setup for a new kind of scam that uses the power of Artificial Intelligence (AI) to fool tens of thousands of people into sending millions of dollars to online thieves.
The Washington Post tells the story of a Canadian couple in their 70s who received a call from someone who sounded exactly like their grandson Brandon. He said he was in jail, with no wallet or cellphone, and needed cash for bail.
“We were sucked in,” his grandmother said. “We were convinced that we were talking to Brandon.”
The couple dashed down to their bank and withdrew the daily maximum ($2,207 in U.S. currency). Then they hurried to a second branch for more money. But an alert bank manager pulled them aside and told them that another patron had gotten a similar call and learned that the eerily accurate voice had been faked. That's when they realized they'd been duped.
The Post reports that technology is making it easier and cheaper for bad actors to mimic voices, convincing people, often the elderly, that their loved ones are in distress. “In 2022, impostor scams were the second most popular racket in America, with over 36,000 reports of people being swindled by those pretending to be friends and family, according to data from the Federal Trade Commission. Over 5,100 of those incidents happened over the phone, accounting for over $11 million in losses, FTC officials said.”
Advancements in AI technology now allow bad actors to replicate a voice from an audio sample of just a few sentences. Easily available online tools can generate the replica from that short sample, allowing a scammer to make the cloned voice "speak" whatever they type.
It’s difficult for law enforcement to find and prosecute these thieves. And Vice reports that “the courts have not yet decided when or if companies will be held liable for harms caused by deepfake voice technology—or any of the other increasingly popular AI technology, like ChatGPT—where defamation and misinformation risks seem to be rising.”
One way to short-circuit this type of scam is to ask the "loved one" about something only he or she would know. But the surest way to confirm that the need is genuine is to contact other family members and verify that the loved one is actually in the location and situation they claim. This would have saved the Canadian couple a trip to the bank and the stress of the entire ordeal.
With scams like this on the rise, be highly vigilant of any unsolicited requests for money or personal information. Your trusted advisor is happy to share resources to help you protect yourself from online thieves. Schedule a call with me by clicking this link.