AI Voice Cloning Scam: How It Works and How to Protect Yourself
Scammers use AI to clone the voice of a family member or friend, then call you pretending to be that person in distress, claiming they have been kidnapped, arrested, or injured in an accident and need money urgently.
How This Scam Works
Using just a few seconds of audio from social media videos, voicemails, or phone calls, scammers can create a convincing AI clone of someone's voice. They call a family member, and the voice on the line sounds exactly like their loved one, crying and begging for help. A second person then takes the phone, claiming to be a kidnapper, lawyer, or police officer, and demands ransom or bail money. The emotional shock of hearing a loved one's voice in distress overrides critical thinking. Victims are told not to hang up or call anyone else. The real family member is completely safe and unaware.
Red Flags to Watch For
- Call from an unknown number with a loved one's voice in distress
- Caller demands you stay on the line and not call anyone
- Demand for ransom or bail via wire transfer, gift cards, or crypto
- Background noise or short, repeated phrases that limit how long the cloned voice is heard
- Caller cannot answer specific personal questions
What to Do If You Receive This Call
- Hang up and call the person directly on their known number
- Establish a family code word for emergencies
- Ask the caller a question only the real person would know
- Do not send money based on a phone call alone
- Contact other family members to verify the person's safety
What to Do If You Fell For It
- Contact your bank to attempt to reverse the payment
- File a police report immediately
- If you sent gift cards, call the gift card company with the receipt
- Report to the FBI IC3 at ic3.gov
- Alert other family members so they are not targeted
How to Report This Scam
- Report to the FTC at ReportFraud.ftc.gov
- File a complaint with FBI IC3 at ic3.gov
- File a local police report
- Report to the FCC if the call came through phone lines
Last updated: February 10, 2026