They thought loved ones were calling for help. It was an AI scam.
The man calling Ruth Card sounded just like her grandson Brandon. So when he said he was in jail, with no wallet or cellphone, and needed cash for bail, Card scrambled to do whatever she could to help.
“It was definitely this feeling of … fear,” she said. “That we’ve got to help him right now.”
Card, 73, and her husband, Greg Grace, 75, dashed to their bank in Regina, Saskatchewan, and withdrew 3,000 Canadian dollars ($2,207 in U.S. currency), the daily maximum. They hurried to a second branch for more money. But a bank manager pulled them into his office: Another patron had gotten a similar call and learned the eerily accurate voice had been faked, Card recalled the banker saying. The man on the phone probably wasn’t their grandson.
That’s when they realized they’d been duped.
Advancements in artificial intelligence have added a terrifying new layer, allowing bad actors to replicate a voice with just an audio sample of a few sentences. Powered by AI, a slew of cheap online tools can translate an audio file into a replica of a voice, allowing a swindler to make it “speak” whatever they type.
It sounds similar to a popular scam here in Japan called "ore ore," in which the caller just says "It's me" and lets the victim guess who "me" is. That's obviously much more primitive, but it occasionally works. Elderly people are typically targeted.
Anaxagoras wrote: ↑Mon Mar 06, 2023 3:24 am
It sounds similar to a popular scam here in Japan called "ore ore," in which the caller just says "It's me" and lets the victim guess who "me" is. That's obviously much more primitive, but it occasionally works. Elderly people are typically targeted.
So you finally admit that Ayumi-chan was not calling you?
But rest assured it was totally Axl Rose who was calling the wife. . . .
Excuse me, I have Mila Jovasomethingthingvich on the other line.