Artificial Intelligence has brought incredible advancements, but it has also given scammers a powerful new tool: high-fidelity voice cloning. By using deepfake technology, criminals can now impersonate your loved ones with terrifying accuracy.
How the Scam Works
It usually starts with a phone call. You hear the voice of a grandchild, child, or friend. They sound distressed and claim to be in an emergency—a car accident, legal trouble, or a medical crisis. They ask for money immediately, often via wire transfer or gift cards.
What you're actually hearing is an AI-generated clone of their voice, created from a short clip found on social media or a previous robocall.
How to Recognize a Voice Clone
- Urgency and Secrecy: The "loved one" will insist on immediate payment and tell you not to tell anyone else.
- Unusual Payment Methods: Legitimate emergencies are rarely settled via cryptocurrency, wire transfers, or gift cards.
- Slightly Robotic Cadence: While the technology is improving, some AI voices still have unnatural pauses or a lack of emotional nuance.
Practical Defense Strategies
- Set a Family Passcode: Establish a secret word or phrase that only family members know. If someone calls in an emergency, ask for the passcode.
- Hang Up and Call Back: If you receive a distressing call, hang up and call the person back on their known, trusted phone number.
- Ask Personal Questions: Ask the caller specific questions that only the real person would know, such as the name of a pet or a recent shared event.
Reporting AI Scams
If you encounter a deepfake scam, report it to:
- The Federal Trade Commission (FTC).
- Your local law enforcement.
- The social media platform where the audio might have been sourced.
Stay informed and stay safe in the age of AI.
About this safety guide
Our team at Scam-Watch works tirelessly to document emerging threats. This guide was produced using real-world data and expert analysis to help you stay safe online. If you've encountered something similar, please report it.