Steps to help combat fraud in which criminals use AI-generated replicas of a person's voice to deceive victims

The voicemail from your son is alarming. He has just been in a car accident and is highly stressed. He needs money urgently, although it is not clear why, and he gives you some bank details for a transfer.

You consider yourself wise to other scams, and have ignored [texts claiming to be from him and asking for cash](https://www.theguardian.com/money/2025/may/04/hi-mum-whatsapp-text-scam-parents-friends-bank). But you can hear his voice and he is clearly in trouble.

[Continue reading…](https://www.theguardian.com/money/2025/dec/21/ai-cloned-voicemail-scam-criminals-fraud)
First seen on theguardian.com
Jump to article: www.theguardian.com/money/2025/dec/21/ai-cloned-voicemail-scam-criminals-fraud

