Fraudsters are turning to artificial intelligence to create deepfake videos of trusted public figures to target victims, it is claimed.
The videos can effectively put words into the mouths of individuals to direct people to websites that are used to hoover up personal and financial information.
In some cases, targets are conned into handing over cash for bogus investment schemes.
The fact that conmen use well-known and trusted faces, such as Martin Lewis, Lorraine Kelly or Dragons’ Den investors, in online ads to target victims is well understood.
However, scammers are now taking advantage of AI to take the scams one step further by creating videos that use manipulated voices to fool people.
Santander said there is widespread ignorance around the issue, with new data showing that 53 percent of people have either not heard of the term deepfake or do not correctly understand what it involves.
The same research found that just 17 percent of people are confident they could easily identify a deepfake video.
The bank’s research found that 54 percent are worried about deepfakes being used for fraud, with 51 percent fearful that a family member could fall for a deepfake scam.
Santander is working to raise awareness by creating its own deepfake videos of financial influencer Timi Merriman-Johnson, known as Mr Money Jar, and Santander fraud lead Chris Ainsley, to show just how realistic they can be. The videos are being posted on social media.
A deepfake is a video, sound or image of a real person that has been digitally manipulated through artificial intelligence (AI) to convincingly misrepresent an individual or organisation.
With deepfake generators and software widely available, fraudsters need only authentic footage or audio of the person they want to impersonate – often found online or on social media – to create their deepfake.
Chris Ainsley, head of fraud risk management at Santander, said: “Generative AI is developing at breakneck speed, and we know it’s ‘when’ rather than ‘if’ we start to see an influx of scams with deepfakes lurking behind them.
“We already know fraudsters flood social media with fake investment opportunities and bogus love interests, and unfortunately, it’s highly likely that deepfakes will begin to be used to create even more convincing scams of these types.
“More than ever, be on your guard: just because something might appear legitimate at first sight doesn’t mean it is. Look out for those telltale signs, and if something – or someone – appears too good to be true, it’s probably just that.”
Concern about deepfakes being used to steal people’s money, at 54 percent, outstripped the 46 percent worried the technology could be used to manipulate elections.
Some 78 percent expect fraudsters to use the technology, and 59 percent say they are already more suspicious of things they see or hear because of deepfakes.
Online ‘finfluencer’ Timi Merriman-Johnson (@mrmoneyjar) said: “The rate at which generative AI is developing is equal parts fascinating and terrifying.
“It is already very difficult to spot the difference between deepfake videos and ‘real’ ones, and this technology will only get better from this point forward. This is why it’s very important for users to be aware of the ways in which fraudsters use technology like this to scam people.
“If something sounds too good to be true, it probably is. People don’t tend to broadcast lucrative investment opportunities on the internet.
“If you are ever in doubt as to whether a company or individual is legitimate, you can always search for them on the Financial Conduct Authority Register.”
How to avoid falling victim to a deepfake scam
* Most deepfakes are still imperfect. Look out for the giveaways: blurring around the mouth, less blinking than normal, or odd reflections.
* At some point, deepfakes will become impossible to distinguish from real videos, so context is important. Ask yourself the same common-sense questions you do now. Is this too good to be true? If this is real, why isn’t everybody doing this? If this is legitimate, why are they asking me to lie to my family and/or bank?
* Know what types of scams deepfakes are likely to be used for. Criminals are likely to use deepfakes in investment scams and impersonation fraud, such as romance scams. If you know the telltale signs of these scams, you’ll be able to spot them even if a deepfake has been used.