“I’m in big trouble and I really need your help.” These are surely the words no parent wants to hear.
Many people are aware of the “hi mum” scam texts that started doing the rounds a few years ago and remain prevalent today. The victim receives a text or WhatsApp message from a phone number not in their contacts. It is supposedly their son or daughter who has lost their phone and is messaging from a friend’s phone. They will tell you that they desperately need some money.
“I’ve been locked out of my bank”, “my rent is overdue and I’m going to be kicked out” or even “I’ve been mugged and they took my bank card”. They will then request that you send money into the friend’s account who will take it out in cash for them. The money goes straight to the criminal, of course.
While these scams are still effective, fraudsters are taking it up a notch.
AI-generated images are now being used by criminals to generate panic and have victims act through feelings of fear and urgency rather than the logic they would usually employ.
One of the most disturbing of these is the “kidnap scam”. Through social media it’s very easy to make links between family members, particularly on sites such as Facebook where relatives are often tagged in photos or birthday wishes. Armed with this information, a fraudster can very easily generate a photograph of someone in a life-threatening situation and send it to those who care about them. I had a photo of me being held captive created in a matter of minutes, simply by loading existing photos of me from my social media into a free image generator.
AI-generated voice messages are also becoming more common. In 2020 an attorney in Philadelphia answered the phone and heard his son’s voice telling him he had crashed into a pregnant woman’s car, broken his nose and needed $9,000 for bail. He told his father to call an attorney assigned to his case, and after making that call the father was instructed to pay the money in bitcoin. This made the father uneasy, so he video-called his son, who confirmed that there had been no such incident. Rather worryingly, all it takes to generate a convincing voice message is a few seconds of audio, which can be taken from a social media video of the person talking.
Most frightening of all is that we are now heading into the era of “real time video manipulation” where a criminal can video call their intended victim looking exactly like the person they are pretending to be — essentially “wearing” their face. This can be used in all manner of scams — from sinister kidnap scams to sham investment schemes — and will be very effective in romance fraud. When a victim can speak to their “partner” on a video call, they are likely to form deeper connections more quickly than if simply exchanging written messages.
Aaron Pritz, chief executive of cybersecurity firm Reveal Risk, said: “Scammers will drop someone’s face onto their own and fake scenarios intended to create panic and urgency — such as kidnapping, an accident, or a fake investment opportunity — and call would-be victims directly.
“At the moment the tech is moving faster than our ability to detect it at scale, so the only real defence is awareness and a healthy pause before believing what you see on a screen.”
Protecting ourselves from fraud is increasingly difficult but there are some easy steps we can take to make it harder for criminals to target us.
Making social media accounts private is especially important, particularly those where you may “tag” other people. If you are contacted by someone saying they are a relative, hang up and call that person directly on the number you already have for them. Agree a private “safe word” with your family members that no one else knows, so that if anyone contacts you claiming to be a daughter, nephew, grandson and so on, you can ask them for that word. Lastly, don’t use the same password for all of your social media accounts.
The AI boom is good news in so many ways — advances in science, increases in productivity and improved data handling. However, it is also good news for fraudsters, so we need to be extra vigilant and make it as difficult as we can for them.
Becky Holmes is the author of Keanu Reeves Is Not in Love with You (Wilton Square Books £10.99) and The Future of Fraud (Melville House £9.99), which is out on April 23. Order from the timesbookshop.co.uk. Discount for Times+ members