⚠️ AttorneyGeneral.net is an informational resource and is not affiliated with any government agency.

AI Impersonation Scams: Deepfakes & Voice Cloning

Quick Summary

AI impersonation scams use deepfake technology and voice cloning to impersonate real people with stunning accuracy. In 2025, these scams increased 1,400% and caused $14.2 billion in losses, making them the fastest-growing fraud category. Scammers impersonate CEOs, government officials, family members, celebrities, and trusted contacts to trick victims into sending money, sharing sensitive data, or authorizing fraudulent transactions.

Critical Threat: As of January 2026, AI voice cloning requires only 3-5 seconds of audio to create a convincing fake. Deepfake videos can be generated in under an hour. Your voice from public videos, voicemail greetings, or social media can be used against you or your loved ones.

What Are AI Impersonation Scams?

AI impersonation scams use artificial intelligence technology to create fake but highly realistic audio, video, or text communications that appear to come from real, trusted individuals. This technology enables three main types of fraud:

Voice Cloning / Audio Deepfakes

AI software analyzes a person's voice from public sources (videos, podcasts, social media, voicemail) and creates synthetic audio that sounds identical. Scammers use this to impersonate family members in distress, executives authorizing payments, or officials making demands.

Required audio sample: As little as 3-5 seconds

Creation time: Minutes to hours

Accuracy: 95%+ voice match possible

Video Deepfakes

AI manipulates video footage to make it appear someone said or did something they never did. Scammers create fake video calls, celebrity endorsements, government announcements, or executive communications to deceive victims.

Required source material: Publicly available photos/videos

Creation time: Hours to days for high quality

Detectability: Increasingly difficult to detect without technical analysis

Synthetic Identity Fraud

AI creates entirely fake people who don't exist - complete with photos, social media profiles, voice, and video. These synthetic identities are used for romance scams, investment fraud, and business email compromise.

Components: AI-generated photos, fabricated history, fake documents

Detection difficulty: Extremely difficult; appears completely real

Use cases: Long-term relationship building for fraud

2025 FTC Ruling: In December 2025, the FTC made it illegal to use AI-generated impersonations for commercial purposes without clear disclosure. However, enforcement is challenging and scammers operating from overseas ignore these rules.

How AI Impersonation Technology Works

The Scammer's Process

Step 1: Target Selection & Research

Scammers identify targets through:

  • Social media: LinkedIn profiles reveal corporate hierarchies and relationships
  • Public records: Property records, business filings, court documents
  • Data breaches: Stolen contact information and personal details
  • Corporate websites: Executive bios, photos, video interviews
  • News articles: Information about wealthy individuals, business deals

Step 2: Voice/Video Sample Collection

Scammers collect audio and video from:

  • YouTube videos, podcasts, webinars
  • Earnings calls and conference presentations
  • Social media videos (TikTok, Instagram, Facebook)
  • News interviews and TV appearances
  • Voicemail greetings (obtained by calling businesses)
  • Video calls (some scammers join legitimate meetings to record audio)

Amount needed: As little as 3-5 seconds of clear audio for basic voice cloning; more samples improve quality.

Step 3: AI Model Training

Using readily available software (some free, some paid), scammers:

  • Upload voice samples to AI voice cloning platforms
  • Train the model to replicate speech patterns, tone, accent
  • Generate test audio to verify quality
  • For video: Use face-swapping software to map target's face onto video

Cost: Free to $100/month for sophisticated tools

Technical skill required: Minimal; many tools are user-friendly

Step 4: Execution of Scam

Scammers deploy the fake content through:

  • Phone calls: Voice clone calls victim pretending to be someone they know
  • Video calls: Real-time deepfake video on Zoom, Teams, or other platforms
  • Pre-recorded videos: Fake celebrity endorsements, government warnings
  • Social media: Fake profiles using synthetic identity

Common AI Impersonation Scam Schemes

1. CEO Fraud / Business Email Compromise

Loss amount: Average $180,000 per incident

How it works: A scammer uses an AI-cloned voice to call a finance department employee, posing as the CEO or CFO and requesting an urgent wire transfer for a confidential acquisition, vendor payment, or legal settlement.

Example: In March 2025, a Hong Kong company lost $25.6 million when finance employees joined what appeared to be a legitimate video conference with their CFO and other colleagues - all deepfakes.

Red flags:

  • Unusual payment methods or recipients
  • Urgent requests bypassing normal approval processes
  • Demands for secrecy or confidentiality
  • Contact outside normal channels

2. Family Emergency / Grandparent Scam

Loss amount: Average $11,000 per victim

How it works: A scammer calls an elderly person using an AI-cloned voice of their grandchild, claiming to be in trouble (arrested, in an accident, kidnapped) and needing money immediately, wired or sent via gift cards.

Example: In January 2025, an Arizona woman received a call from her "grandson" saying he had been arrested in Mexico and needed $8,500 for bail. The voice was a perfect AI clone; her grandson was safe at home.

Red flags:

  • Request not to tell other family members
  • Demand for immediate payment via untraceable methods
  • Story that creates panic and urgency
  • Reluctance to answer personal questions

3. Celebrity/Influencer Endorsement Scams

Loss amount: Average $3,400 per victim

How it works: Deepfake videos of celebrities, politicians, or financial experts promote fake cryptocurrency investments, trading platforms, or products. Victims believe they're following advice from trusted public figures.

Example: Deepfake videos of Elon Musk and Warren Buffett promoting fake crypto trading platforms cost victims over $900 million in 2025.

Red flags:

  • Videos only appear on suspicious websites, not official channels
  • Promises of guaranteed returns or "get rich quick" schemes
  • Links to unofficial websites or platforms
  • No verification from the celebrity's official social media

4. Government Official Impersonation

Loss amount: Average $7,200 per victim

How it works: An AI-generated voice impersonates an IRS agent, Social Security Administration official, immigration officer, or law enforcement officer, threatening arrest, deportation, or benefit suspension unless immediate payment is made.

Example: In 2025, scammers used AI voice cloning to impersonate real IRS Criminal Investigation agents, using names and badge numbers from public records to add authenticity.

Red flags:

  • Threats of immediate arrest or legal action
  • Demand for payment via gift cards, cryptocurrency, or wire transfer
  • Government agencies communicate primarily via mail, not urgent phone calls
  • Real agents provide call-back numbers and verification procedures

5. Romance Scams with Synthetic Identities

Loss amount: Average $24,000 per victim (higher than traditional romance scams)

How it works: A scammer creates an entirely fake person using AI-generated photos and voice, builds a relationship over weeks or months on dating apps or social media, then requests money for emergencies, travel to meet, or "investment opportunities."

Example: The FBI reported 4,200+ cases in 2025 in which victims fell in love with people who don't exist: synthetic, AI-generated identities backed by consistent photos, voices, and video calls.

Red flags:

  • "Too perfect" profile - model-quality photos, successful career, shared interests
  • Reluctance to meet in person despite proximity
  • Video calls have poor quality or are very brief
  • Quick profession of love
  • Eventually requests money

6. Tech Support Scams with Live Video

Loss amount: Average $2,800 per victim

How it works: A pop-up claims the computer is infected. When the victim calls the listed number, the scammer uses deepfake video to pose as a legitimate Microsoft or Apple tech support rep during a remote session, gaining access to the computer and financial accounts.

Red flags:

  • Unsolicited pop-ups claiming virus/security threat
  • Urgency to call immediately
  • Request for remote access to computer
  • Request for payment for "support services"

Red Flags: How to Detect AI Impersonation

Audio Deepfake Detection

What to Listen For

  • Unnatural breathing patterns: AI voices often lack realistic breathing or pause patterns
  • Robotic cadence: Slight mechanical quality or consistent pacing without natural variation
  • Background noise inconsistency: Sudden changes in ambient sound or complete silence
  • Emotional mismatch: Voice tone doesn't match the emotional content of the message
  • Verbal tics missing: Person's usual speech patterns (um, ah, like, you know) are absent
  • Pronunciation anomalies: Unusual emphasis on certain words or syllables
  • Limited interaction: Caller avoids conversation, sticks to script, doesn't respond naturally to questions
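
For technically inclined readers, the pause cue above can be put into numbers. The Python sketch below is a toy heuristic under stated assumptions, not a reliable detector: it measures how evenly spaced the silent gaps in a recording are, on the assumption that some synthetic speech paces its pauses unnaturally evenly. The file name is a placeholder and the silence threshold is an arbitrary choice.

```python
# Toy heuristic only: measures how evenly spaced the silent gaps in a
# recording are. Real deepfake detection uses trained models; this just
# puts a number on the "unnatural pause pattern" cue.
import numpy as np
import librosa  # third-party audio library

def pause_uniformity(path: str, top_db: int = 30) -> float:
    """Coefficient of variation of the gaps between speech segments.

    Natural speech tends to pause irregularly (higher value); some
    synthetic audio paces its gaps almost evenly (value near zero).
    """
    y, sr = librosa.load(path, sr=None)
    intervals = librosa.effects.split(y, top_db=top_db)  # non-silent spans
    gaps = np.array([(intervals[i + 1][0] - intervals[i][1]) / sr
                     for i in range(len(intervals) - 1)])
    if len(gaps) < 3 or gaps.mean() == 0:
        return float("nan")  # too little speech to judge
    return float(gaps.std() / gaps.mean())

print(pause_uniformity("suspicious_call.wav"))  # placeholder file name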

Video Deepfake Detection

What to Look For

  • Unnatural eye movements: Eyes don't blink naturally or focus doesn't track properly
  • Lip sync issues: Mouth movements don't perfectly match audio
  • Lighting inconsistencies: Shadows or lighting don't match environment
  • Facial boundary blurring: Edges of face, hairline, or ears appear blurred or distorted
  • Skin texture anomalies: Overly smooth skin lacking natural texture or pores
  • Unnatural movements: Head movements appear stiff or glitchy
  • Background inconsistencies: Background doesn't match known location or appears artificial
  • Jewelry/accessories shifting: Earrings, glasses, or necklaces appear to move unnaturally
  • Limited interaction: Person doesn't respond to requests to move, turn head, or perform specific actions
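
The blink cue can likewise be roughed out in code. The sketch below uses OpenCV's stock Haar cascades to count frames where a detected face shows no detectable eyes, a crude proxy for eye closure. It is illustrative only, nothing like a production deepfake detector, and the video file name is a placeholder.

```python
# Crude illustration of the blink cue: count frames where a detected face
# shows no detectable eyes. A rough proxy at best; production deepfake
# detection relies on trained models, not Haar cascades.
import cv2  # OpenCV

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

cap = cv2.VideoCapture("suspicious_call.mp4")  # placeholder file name
face_frames = no_eye_frames = 0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5)[:1]:
        face_frames += 1
        eye_region = gray[y:y + h // 2, x:x + w]  # eyes sit in the upper half
        if len(eye_cascade.detectMultiScale(eye_region, 1.1, 5)) == 0:
            no_eye_frames += 1  # crude proxy for eyes closed / blinking
cap.release()

if face_frames:
    # Humans blink roughly 15-20 times per minute; a long clip with almost
    # no "eyes closed" frames matches the unnatural-blinking cue above.
    print(f"frames without detectable eyes: {no_eye_frames / face_frames:.1%}")
```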

Behavioral Red Flags (Universal)

Suspect AI impersonation when:

  • 🚩 Urgent requests for money, especially via wire transfer, crypto, or gift cards
  • 🚩 Contact through unexpected or unusual channels
  • 🚩 Requests to keep transaction secret from others
  • 🚩 Person won't answer personal questions or verify identity in other ways
  • 🚩 Story creates panic and demands immediate action
  • 🚩 Person refuses to video call, or video quality is suspiciously poor
  • 🚩 Communication bypasses normal corporate approval processes
  • 🚩 Person is unavailable to meet in person despite proximity
  • 🚩 No way to verify through independent, trusted channels

Real-World AI Scam Cases (2024-2025)

Hong Kong Multinational Company - $25.6 Million Loss

Date: March 2025

Method: A finance employee joined what appeared to be a video conference with the company CFO and multiple colleagues. All were deepfakes. The employee authorized 15 wire transfers totaling $25.6 million.

Outcome: Partial recovery ($8.2 million). Six suspects were arrested in Hong Kong and China.

Lesson: Even sophisticated employees can be fooled by high-quality deepfakes. Verify ALL financial requests through independent channels.

Arizona Grandmother - $8,500 Loss

Date: January 2025

Method: A woman received a call from her "grandson," who claimed to have been arrested in Mexico and to need bail money. The voice was a perfect AI clone.

Outcome: The money was not recovered. Her grandson was safe at home and had never traveled to Mexico.

Lesson: Verify family emergencies by calling the person directly at known number or contacting other family members.

European Energy Company CEO - €220,000 Loss

Date: September 2024

Method: The CEO of a UK subsidiary received a call from what sounded exactly like the parent company's CEO (the German accent and voice pattern were perfect), requesting an urgent transfer to a Hungarian supplier. The voice was AI-generated.

Outcome: The money was not recovered. This was one of the first widely publicized AI voice fraud cases.

Lesson: Even perfect voice match isn't sufficient verification for large financial transactions.

Celebrity Crypto Scam - $900+ Million Collective Loss

Date: Throughout 2025

Method: Deepfake videos of Elon Musk, Warren Buffett, and Michael Saylor promoting fake cryptocurrency platforms appeared on social media and YouTube. Thousands invested in non-existent platforms.

Outcome: Some platforms were shut down, with minimal recovery. The scammers operate from jurisdictions with weak enforcement.

Lesson: Verify celebrity endorsements through official channels. If it's real, it will be on their verified social media accounts.

Political Deepfake Robocalls - New Hampshire Primary

Date: January 2024

Method: An AI-generated voice clone of President Biden told voters not to vote in the primary. More than 25,000 robocalls were made.

Outcome: The FCC responded by banning AI-generated voices in robocalls. The perpetrators were identified and charged.

Lesson: AI impersonation isn't just financial fraud - it's also used for political manipulation and disinformation.

How to Protect Yourself and Your Organization

Personal Protection Strategies

Establish Family Verification Codes

  • Create a secret word or phrase known only to family members
  • Use this code to verify identity in emergency calls
  • Don't share the code on social media or with non-family
  • Change it periodically
  • Tell family: "If I ever call asking for money, I'll use our code word"

Verify Through Independent Channels

  • Hang up and call back at known, verified number
  • Contact the person through a different platform (call if they emailed, email if they called)
  • Ask questions only the real person would know
  • Reach out to mutual contacts to verify story
  • Never use contact information provided by suspicious caller

Limit Public Audio/Video

  • Review privacy settings on social media
  • Be cautious about public video posts
  • Use generic voicemail greetings at work
  • Limit speaking roles in recorded or publicly shared conference calls and webinars
  • Be aware that anything public can be used to clone your voice

Be Skeptical of Urgency

  • Legitimate emergencies allow time for verification
  • Resist pressure to act immediately
  • Question why normal processes are being bypassed
  • Ask yourself: "Why can't this wait 30 minutes?"
  • Remember: Urgency is the scammer's most powerful tool

Business Protection Strategies

Implement Multi-Factor Verification for Financial Transactions

  • Dual approval: Require two authorized signatures for transactions over threshold
  • Call-back verification: Finance staff must call requester at known number before processing
  • In-person verification: High-value transactions require face-to-face or verified video conference
  • Time delays: Build mandatory waiting periods into wire transfer processes
  • Code words: Executives and finance staff establish verification phrases for unusual requests
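
For teams that script their payment workflows, here is a minimal, hypothetical sketch of the dual-approval and time-delay controls above. The class name, the four-hour hold, and the officer IDs are illustrative assumptions, not a real banking API; real controls live inside treasury or ERP systems.

```python
# Hypothetical sketch of two controls from the list above: dual approval
# and a mandatory time delay before a wire transfer can be released.
from dataclasses import dataclass, field
from datetime import datetime, timedelta

APPROVALS_REQUIRED = 2            # "dual approval"
HOLD_PERIOD = timedelta(hours=4)  # assumed waiting-period policy

@dataclass
class WireRequest:
    amount: float
    recipient: str
    created: datetime = field(default_factory=datetime.now)
    approvers: set = field(default_factory=set)

    def approve(self, officer_id: str) -> None:
        self.approvers.add(officer_id)  # a set ignores duplicate approvals

    def releasable(self, now: datetime) -> bool:
        # Both controls must pass: enough distinct approvers AND the
        # mandatory hold has elapsed since the request was created.
        return (len(self.approvers) >= APPROVALS_REQUIRED
                and now - self.created >= HOLD_PERIOD)

req = WireRequest(amount=180_000, recipient="New Vendor Ltd")
req.approve("officer_a")
req.approve("officer_a")                          # same person twice: still one
print(req.releasable(datetime.now()))             # False: one approver, no delay
req.approve("officer_b")
print(req.releasable(req.created + HOLD_PERIOD))  # True: two approvers, delay met
```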

Employee Training and Awareness

  • Conduct regular training on AI impersonation threats
  • Share real-world examples and case studies
  • Practice scenarios through simulated phishing/vishing tests
  • Create culture where questioning unusual requests is encouraged
  • Establish clear reporting procedures for suspicious contacts
  • Train employees to recognize deepfake red flags

Technical Safeguards

  • Email authentication: Implement DMARC, SPF, DKIM to prevent email spoofing
  • Video conferencing security: Use meeting passwords, waiting rooms, verified participant lists
  • Communication platform verification: Verify sender identity through multiple indicators
  • AI detection tools: Consider deploying deepfake detection software for high-risk roles
  • Access controls: Limit who can initiate financial transactions
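
As one concrete slice of the email-authentication item, the sketch below checks whether a sender's domain publishes SPF and DMARC records at all. It assumes the third-party dnspython package, and example.com is a placeholder for the domain being verified.

```python
# Checks whether a domain publishes SPF and DMARC TXT records, one piece
# of the email-authentication safeguard above. Requires dnspython.
import dns.resolver

def txt_records(name: str) -> list[str]:
    try:
        return [r.to_text().strip('"')
                for r in dns.resolver.resolve(name, "TXT")]
    except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
        return []

domain = "example.com"  # replace with the sender domain you are verifying
spf = [r for r in txt_records(domain) if r.startswith("v=spf1")]
dmarc = [r for r in txt_records(f"_dmarc.{domain}") if r.startswith("v=DMARC1")]

print("SPF:  ", spf or "none published")
print("DMARC:", dmarc or "none published")
# No SPF/DMARC does not prove fraud, but it makes the domain easier to
# spoof in the phishing emails that often accompany voice/video scams.
```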

What to Do If You've Been Scammed

Immediate Actions (First 24 Hours)

  1. Stop all contact with the scammer immediately
  2. If you sent money:
    • Wire transfer: Contact your bank immediately to request recall
    • Cryptocurrency: Contact exchange (recovery unlikely but report it)
    • Credit card: Call card company to dispute charges
    • Payment app: Report to app and contact your bank
    • Check: Contact bank to stop payment if not yet cashed
  3. Document everything:
    • Save all communications (emails, texts, call logs)
    • Take screenshots of websites, profiles, messages
    • Record dates, times, amounts, and names
    • Save any audio or video recordings if possible
  4. If you shared sensitive information:
    • Change passwords on all accounts immediately
    • Enable two-factor authentication everywhere possible
    • Place fraud alerts with credit bureaus (Equifax, Experian, TransUnion)
    • Consider credit freeze
    • Monitor accounts daily for unauthorized activity

Business Response

If your organization was targeted:

  • Notify executive leadership and legal counsel immediately
  • Contact law enforcement (FBI IC3 for cyber crimes)
  • Preserve all evidence (don't delete emails or communications)
  • Review security procedures and implement additional safeguards
  • Consider whether customer/partner notification is required
  • Contact insurance provider about cyber insurance coverage
  • Conduct internal investigation to understand what happened

How to Report AI Impersonation Scams

Report to all applicable agencies to maximize chances of investigation and help prevent others from becoming victims:

  • Federal Trade Commission (FTC): All consumer fraud, including AI impersonation. Report at ReportFraud.ftc.gov.
  • FBI Internet Crime Complaint Center (IC3): Cyber crimes, business email compromise, and significant financial losses. Report at IC3.gov.
  • State Attorney General: All consumer fraud and scams. Find your state AG's office to file.
  • Local Police: To create an official police report (needed for insurance and legal purposes). Visit your local police station or call the non-emergency number.
  • Social Media Platforms: Fake profiles or deepfake videos hosted on the platform. Use the platform's fraud reporting tools.
  • SEC: Investment fraud, including fake celebrity crypto endorsements. Report through the SEC Complaint Center.
Important: Even if you didn't lose money, report suspected AI impersonation attempts. Your report helps agencies identify patterns and take enforcement action to protect others.


Related Scams

Romance Scams

Synthetic AI identities increasingly used in romance fraud


Cryptocurrency Scams

Deepfake celebrity endorsements promote fake crypto platforms


Government Impersonation

AI voice cloning used to impersonate IRS, SSA officials
