Starling Bank has warned that just three seconds of audio is enough for criminals to clone your voice and attempt to scam your friends and family.
The bank warns that voice recordings posted online can be harvested and used, with AI technology, to mount scam attempts.
These voice cloning scams involve fraudsters imitating a person’s voice using videos uploaded online or on social media.
Scammers can then identify that person’s family members and use the cloned voice to stage a phone call, voice message or voicemail to them.
While impersonating the loved one, the fraudsters ask for money, often stressing that it is needed urgently.
Nearly half (46%) of people do not know this type of scam exists, according to a 3,000-person study conducted across the UK in August.
One in 12 (8%) of those surveyed said they would send whatever money was requested, even if they thought the call seemed strange.
Nearly three in 10 (28%) people believe they have potentially been targeted by an AI voice cloning scam in the past year, according to Mortar Research's study.
How to avoid an AI voice cloning scam
People could consider agreeing a “safe phrase” with close friends and family members to help them verify the caller is genuine, Starling Bank has suggested.
However, there is a chance that a safe phrase could be compromised.
With this in mind, the Take Five to Stop Fraud campaign urges people to pause and take time to think if it could be a scam.
People with doubts could call a trusted friend or family member to “sense check” a request or they could call 159 to speak directly to their bank.
Many banks can be reached this way, including Bank of Scotland, Barclays, Co-operative Bank, First Direct, Halifax, HSBC, Lloyds, Metro Bank, Monzo, Nationwide Building Society, NatWest, Royal Bank of Scotland, Santander, Starling Bank, Tide, TSB and Ulster Bank.
What to do if you think you have been scammed
If you believe you have been scammed, you should:
- contact your bank or payment provider
- contact the police
For more advice on how to avoid and report scams, visit the Citizens Advice website.
“People regularly post content online which has recordings of their voice, without ever imagining it’s making them more vulnerable to fraudsters," Lisa Grahame, chief information security officer at Starling Bank, said.
Ms Grahame continued: “Scammers only need three seconds of audio to clone your voice, but it would only take a few minutes with your family and friends to create a safe phrase to thwart them.
"So it’s more important than ever for people to be aware of these types of scams being perpetrated by fraudsters, and how to protect themselves and their loved ones from falling victim.
“Simply having a safe phrase in place with trusted friends and family – which you never share digitally – is a quick and easy way to ensure you can verify who is on the other end of the phone.”
Actor James Nesbitt, who is taking part in Starling Bank’s campaign, said: “I have children myself, and the thought of them being scammed in this way is really scary. I’ll definitely be setting up a safe phrase with my own family and friends.”
Lord David Hanson, Minister of State at the Home Office with responsibility for fraud, said: “AI presents incredible opportunities for industry, society and governments, but we must stay alert to the dangers, including AI-enabled fraud.
“As part of our commitment to working with industry and other partners, we are delighted to support initiatives such as this through the Stop! Think Fraud campaign and provide the public with practical advice about how to stay protected from this appalling crime.”