Security · April 12, 2026 · 7 min read

How to Protect Yourself From AI Voice Cloning Scams


By Sarah Chen

Head of Privacy Research


A phone call from a family member in distress. A voice you recognize, panicked, asking for money immediately. It sounds exactly like your daughter, your spouse, or your grandchild — but it's not them. It's an AI-generated clone of their voice, and the scammer on the other end needs as little as three seconds of audio to create it. AI voice cloning scams surged over 1,600% in early 2025, and they're only getting more sophisticated in 2026.

How AI Voice Cloning Scams Work

The technology behind these scams is alarmingly simple and accessible. Here's the typical process:

  1. Audio harvesting: Scammers collect voice samples from social media videos, voicemails, YouTube content, podcast appearances, or any publicly available audio. Just three seconds of clear audio is enough to create a convincing clone.
  2. Voice cloning: Using widely available AI tools, they generate a synthetic voice that closely mimics the target's speech patterns, tone, and inflection. Researchers say these clones have crossed the "indistinguishable threshold" — most people cannot tell the difference.
  3. The attack: The scammer calls a family member, usually a parent or grandparent, using the cloned voice. They create a fabricated emergency — a car accident, an arrest, a kidnapping — and demand immediate payment via wire transfer, gift cards, or cash courier.
  4. Urgency and isolation: The caller pressures the victim to stay on the phone, not call anyone else, and send money immediately. The combination of a familiar voice and extreme urgency bypasses rational skepticism.

The Numbers Are Staggering

One in four Americans has received an AI voice clone call in the past year. Of those who lost money, 36% lost between $500 and $3,000, and 7% lost between $5,000 and $15,000. Global deepfake fraud losses exceeded $200 million in the first quarter of 2025 alone.

Real Examples of AI Voice Cloning Scams

These aren't theoretical threats — they're happening right now:

  • The Florida grandmother: Sharon Brightwell sent $15,000 in cash to a courier after receiving a call that sounded exactly like her daughter. The voice claimed she had been in a car accident and needed bail money. It was entirely fabricated by AI.
  • The UK energy company: Employees at a British energy firm transferred €220,000 after receiving phone instructions from what they believed was their CEO. The voice was a deepfake clone.
  • The Arup engineering firm: The multinational lost $25 million in a deepfake scam that combined cloned voices with fabricated video.
  • The Italian defense minister impersonation: Scammers cloned the voice of Italy's Defense Minister to solicit ransom payments from prominent business leaders.

How to Protect Yourself and Your Family

Establish a Family Code Word

This is the single most effective defense. Agree on a secret word or phrase with your family members — something that only your family knows and that an AI would never guess. If someone calls claiming to be a family member in an emergency, ask for the code word. If they can't provide it, hang up.

Always Hang Up and Call Back

If you receive a distressing call from someone who sounds like a loved one, hang up and call them back on a number you already have saved in your phone. Never call back a number the caller provides — it may route to the scammer.

Ask Hyper-Personal Questions

If you can't remember your code word or haven't set one up, ask questions that only the real person would know: a childhood nickname, a shared memory, the name of a pet, or details from a recent conversation. AI clones can mimic a voice but can't access personal memories.

Reduce Your Public Audio Footprint

Scammers can only clone a voice if they have audio to work with. Take steps to limit publicly available recordings:

  • Set social media videos to friends-only or private
  • Limit the number of public videos you post with your voice
  • Review and remove old voicemail greetings that include your full voice
  • Be cautious about podcasts, YouTube videos, or public speaking recordings that are freely accessible

Watch for Red Flags

  • Calls from unknown or blocked numbers
  • Extreme urgency and pressure to act immediately
  • Requests for payment via gift cards, wire transfers, or cryptocurrency
  • Instructions not to tell anyone or not to hang up
  • Slight audio artifacts — robotic undertones, unnatural pauses, or background inconsistencies

Protect Elderly Family Members

Older adults are the most common targets of AI voice cloning scams, particularly through the "grandparent scam" variant. Have an explicit conversation with elderly family members about these scams, establish a code word, and encourage them to always verify by calling back on a known number.

What to Do If You're Targeted

  1. Cut off contact immediately. Hang up the phone. Do not engage further with the scammer.
  2. Verify the real person is safe. Call the family member the scammer impersonated using a number you already have saved.
  3. Contact your bank immediately. If you sent money or shared financial details, call your bank or credit union right away. Speed significantly improves recovery odds for wire transfers.
  4. File a report with the FTC. Report the scam at reportfraud.ftc.gov. This helps authorities track and combat these operations.
  5. File a local police report. Even if recovery seems unlikely, a police report creates an official record and may be needed for insurance claims or bank disputes.
  6. Alert your family. Warn other family members so they're aware the scammer has voice samples and may target them next.

Reduce the Data Scammers Can Use Against You

AI voice cloning is just one way scammers exploit your personal information. The more data that's publicly available about you — your phone number, home address, family relationships, daily routines — the more convincing any scam becomes. Data brokers make this information freely searchable online.

PrivacyOn removes your personal information from over 100 data broker sites, making it harder for scammers to find your phone number, identify your family members, or piece together the personal details that make these scams so effective. With 24/7 monitoring and family plans, you can protect your entire household from the data exposure that fuels these attacks.

Sarah Chen

Head of Privacy Research

CIPP/US Certified · IAPP Member · B.S. Computer Science

CIPP/US-certified privacy researcher with over a decade of experience helping consumers remove their personal information from data brokers.

Ready to Protect Your Privacy?

Let PrivacyOn automatically remove your personal information from data broker sites and keep it removed.