
Fraudsters use AI to imitate children’s voices to steal millions of pounds from unsuspecting parents

Fraudsters are using AI to imitate children’s voices in a new phone scam targeting parents with fake calls for help.

Ministers are now warning families to develop secret “passwords” or phrases to use in times of distress so they do not get fooled by the calls and part with their money.

Home Office sources say just three seconds of speech from videos on TikTok, Instagram or other social media sites is enough to generate a clone of someone’s voice.

The AI fraud is the latest incarnation of the ‘Hi Mum’ scam, which has been used in recent years to steal millions of pounds from unsuspecting parents in the UK.

In this scam, parents usually received a WhatsApp message from someone posing as their child in distress, saying they had lost their phone and urgently needed money to help them get a taxi home.

Founder of internet forum Mumsnet, Justine Roberts (pictured), said: ‘Hi mum scams, where fraudsters pose as a child in need of help, are designed to prey on parents’ emotions.

Speaking to the Mail on Sunday, Fraud Minister Lord Hanson urged parents to agree on a ‘safe phrase’ that their children can always use (stock photo)

Now, thanks to AI technology, scammers – often based abroad – are leaving recorded messages or making calls that perfectly mimic the voice of someone’s child. This means a parent may receive a call and hear their child’s voice telling them that they are in danger and urgently need money – and using AI, fraudsters can hold a two-way conversation with the parent.

Their “calls for help” vary. The fraudster may claim the child’s bank card and phone have been stolen, but that they are with a friend who can withdraw money for them if funds are transferred immediately to that third party’s account.

Or they may claim to urgently need money to pay their landlord, or someone they have fallen out with who is threatening them.

Speaking to the Mail on Sunday, Fraud Minister Lord Hanson urged parents to agree on a “safe phrase” that their children can always use when they call for help to let parents know the message is authentic. He said: “AI presents incredible opportunities for our society, but we must also remain alert to the dangers.

“We are delighted to support this initiative through the Stop! Think Fraud campaign and make sure the public receives practical advice on how to protect themselves against these types of scams.”

A source close to Home Secretary Yvette Cooper added: “Imagine the risks when this technology allows fraudsters to steal the face and voice of the person they claim to be and have a video chat with their intended victim.

“The government is getting to grips with this threat, but it also shows why families need measures like ‘safe phrases’ to protect themselves.”

In the year to March, 3.2 million incidents of fraud against households were reported to the Crime Survey for England and Wales.

Fraudsters often “spoof” a bank’s number to gain a victim’s trust, before convincing them that their account has been the subject of fraudulent activity (file photo)


During the same period, UK Finance – the umbrella body for UK banks – said 554,293 financial frauds were reported by its members, including ‘Hi Mum’ scams, an increase of 20 per cent on the previous year’s total of 460,537.

Justine Roberts, founder of internet forum Mumsnet, said: “Hi Mum scams, in which fraudsters pose as a child in need of help, are designed to prey on parents’ emotions. If you think your son or daughter is in trouble, it’s natural to want to act quickly, and many parents who would normally consider themselves fairly scam-savvy have been caught out by anxiety-inducing texts purporting to come from their child.

“Using AI technology to imitate voices will make it even more difficult to detect fraudsters.

“Parents on Mumsnet have suggested agreeing on a password or question to use in these scenarios, so it’s great to see the Government promoting this measure.

“Taking the time to sit down and have a conversation with your children about how you will handle an emergency reduces the risk of being caught out, and having a password established is a simple way to protect yourself and your family.”

A Home Office spokesperson said: “Artificial intelligence has already started to transform the way scams are carried out and how convincing they appear to victims.

“The government’s new anti-fraud strategy will have the threat of AI-driven scams at its heart. It is essential that we get ahead of these risks, otherwise we will see even more innocent people fall prey to fraudsters.”