The rapid rise of artificial intelligence has made, and will continue to make, life easier in many ways, but AI also has a dark side that keeps rearing its ugly head in new forms.
A couple of weeks ago, a woman in Manitoba received a call from a private number. When she picked it up, an entity that sounded exactly like her son was on the line. The voice began asking her peculiar questions.
“Hi mom, can I tell you anything?” it said. “Without judgement?”
The call didn’t last long before Leann Friesen grew uncomfortable, said she would call back, and hung up. She then rang her son to ask him about the call, and he had no idea what she was talking about.
“Mom, I didn’t call you,” he said.
Telephone scams are nothing new, but mimicking the voices of family members with such precision is a phenomenon that has only started becoming prevalent within the past couple of years.
Canadian law enforcement says fraudsters need only short voice clips, pulled from social media or elsewhere, to feed into an AI program and target unsuspecting people like Friesen. The goal is usually to convince the victim to send money or reveal sensitive information.
“They only need about a three-second clip to have their system be able to recreate that voice and then create whatever they want that voice to say,” David Maze, a 27-year veteran of the Lethbridge Police Service, said in a statement sent to the CBC in September.
Earlier this year, Thailand’s Prime Minister Paetongtarn Shinawatra received a call from an AI bot impersonating an unspecified world leader. It was trying to swindle her into making a donation by claiming that Thailand was the only country in Southeast Asia that hadn’t done so.
In the United States, a 2023 Federal Trade Commission report found that about 5,100 Americans fell victim to similar impersonation calls in 2022, losing a combined total of approximately US$11 million.
⚠️ these scammers are getting wild gang
now they are using AI to try to sound exactly like me
This is happening more and more, it’s about to get scary as hell with all these fakes so just be careful and don’t fall for traps 🙂 pic.twitter.com/7nsqypKoRg
— T E L L E (@tellesmith) May 13, 2025
Read more: Sick of telescammers pestering you? Now there’s an ‘AI Granny’ designed to waste their time
Read more: Visa introduces an agentic artificial intelligence pilot program
Follow Rowan Dunne on LinkedIn
rowan@mugglehead.com
