AI Voice Clone Scams Surge: How to Protect Your Family
A new scam technique is causing panic across communities: AI-cloned voices used in fake emergency calls to relatives, pressuring victims to send money before they can verify the story.
How It Works
- Data Collection: Scammers collect voice samples from social media videos, voicemail greetings, or public posts.
- AI Clone Creation: Using easily accessible AI tools, they create a synthetic version of your loved one’s voice.
- Emergency Call: You’re called by “your grandson” or “daughter” in panic, claiming to be in trouble—jail, accident, kidnapping—and needing money immediately.
Recent Cases
- Ohio woman lost $6,000 after receiving a call “from her son” claiming to be in a car accident
- Texas family wired $8,500 after “granddaughter” called begging for bail money
- Multiple reports of calls opening with pleas like “Mom, help me” before the line goes dead
Red Flags
🚨 Urgency: “I need the money RIGHT NOW”
🚨 Secrecy: “Don’t tell Dad, he’ll be mad”
🚨 Unusual payment methods: Gift cards, cryptocurrency, wire transfers
🚨 Wrong voice characteristics: Slightly off pitch, unnatural pauses, background noise