Identity Impersonation

Detect Impersonation Deepfakes

Attackers use AI-generated videos to impersonate executives, coworkers, public figures, and even family members. Before you send money, share sensitive information, or act on a video request, verify that the recording hasn’t been manipulated.

Deepfake Detection
Fraud Prevention
Identity Safety

Supported formats: MP4, MOV, AVI, WebM

Maximum file size: 500MB

Privacy-first: Your videos are never stored

Understanding the Threat

Anatomy of an Impersonation Deepfake Scam

Impersonation deepfakes exploit trust in familiar faces and voices. By recreating a person’s appearance or speech with AI, attackers can convince victims that a video message is real.

  1. Target Research

    Attackers collect publicly available video and audio of a person they want to impersonate. This may include executives, employees, influencers, or family members. The material often comes from interviews, social media posts, or recorded meetings.

  2. Deepfake Video Creation

    Using face-swap and voice-cloning technology, attackers generate AI videos that mimic the target’s appearance and speech patterns.

  3. Urgent Request

    The impersonation deepfake is delivered through a video message or live call. The attacker pressures the victim to act quickly by sending money, sharing credentials, or providing sensitive information.

  4. Rapid Extraction

    Funds or data are transferred before the victim verifies the identity through another trusted communication channel.

Detection Technology

What Our Deepfake Detector Analyzes

Our deepfake detection system analyzes facial behavior, voice alignment, and video metadata to identify AI-generated impersonation videos.

Visual

Facial Expression Fidelity

Evaluates micro-expressions and natural facial movement patterns. Deepfake impersonation models often struggle to replicate these subtle signals accurately.
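
As an illustration of the kind of signal involved (not our production pipeline), here is a minimal sketch of one well-known heuristic from early deepfake research: checking whether the subject blinks at a plausible rate. The `count_blinks` helper, the 0.21 threshold, and the synthetic eye-aspect-ratio (EAR) series are all illustrative assumptions; a real system would derive the EAR series from a face-landmark model.

```python
import numpy as np

def count_blinks(ear_series, threshold=0.21):
    """Count blinks in an eye-aspect-ratio (EAR) time series.

    A blink is counted wherever the EAR drops below the threshold
    (eyes closing) after having been above it.
    """
    closed = np.asarray(ear_series) < threshold
    # Blink onsets: frames where 'closed' flips from False to True.
    onsets = np.flatnonzero(~closed[:-1] & closed[1:])
    return len(onsets)

# Synthetic 10-second clip at 30 fps: eyes open (EAR ~0.3),
# with two brief dips to ~0.1 standing in for blinks.
ear = np.full(300, 0.3)
ear[60:65] = 0.1
ear[200:205] = 0.1

blinks = count_blinks(ear)
rate_per_minute = blinks * 60 / (len(ear) / 30)
print(blinks, rate_per_minute)  # 2 blinks, 12 per minute
```

People typically blink roughly 15–20 times per minute at rest, so clips with near-zero or wildly elevated rates deserve extra scrutiny, though a single heuristic like this is never conclusive on its own.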

Audio

Audio-Visual Synchronization

Checks whether the speech audio matches lip movement. Voice-cloned deepfake videos often show timing mismatches between sound and mouth motion.
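
A toy sketch of the underlying idea, under the assumption that we already have a per-frame audio loudness envelope and a per-frame mouth-opening measurement (both names and the synthetic signals below are illustrative): genuine speech and mouth motion correlate strongly, while a delayed or substituted track does not.

```python
import numpy as np

def sync_score(audio_envelope, mouth_opening):
    """Pearson correlation between per-frame audio loudness and a
    per-frame mouth-opening measurement. Genuine speech tends to
    correlate strongly; swapped or re-voiced video often does not."""
    a = np.asarray(audio_envelope, dtype=float)
    m = np.asarray(mouth_opening, dtype=float)
    a = (a - a.mean()) / a.std()
    m = (m - m.mean()) / m.std()
    return float(np.mean(a * m))

rng = np.random.default_rng(0)
speech = np.abs(np.sin(np.linspace(0, 12, 240)))  # toy loudness curve

# Mouth signal tracking the audio vs. the same signal delayed ~0.4 s.
genuine = speech + 0.05 * rng.normal(size=240)
desynced = np.roll(speech, 12) + 0.05 * rng.normal(size=240)

print(sync_score(speech, genuine) > sync_score(speech, desynced))  # True
```

Production systems use learned audio-visual embeddings rather than raw correlation, but the principle is the same: the two streams should move together.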

Temporal

Scene Transition Analysis

Studies frame-to-frame continuity, camera angles, and environmental consistency to detect artifacts introduced during AI video generation.
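
To make the continuity idea concrete, here is a minimal sketch (the function name, threshold, and tiny synthetic "frames" are all assumptions for illustration) that flags frames whose change from their neighbors is a statistical outlier:

```python
import numpy as np

def continuity_spikes(frames, z_thresh=4.0):
    """Return indices of frames whose change from the previous frame
    is a statistical outlier relative to the clip's typical motion.
    Abrupt spikes can indicate splices or generation artifacts."""
    frames = np.asarray(frames, dtype=float)
    # Mean absolute per-pixel change between consecutive frames.
    diffs = np.abs(np.diff(frames, axis=0)).mean(axis=(1, 2))
    z = (diffs - diffs.mean()) / diffs.std()
    return np.flatnonzero(z > z_thresh) + 1

rng = np.random.default_rng(1)
base = np.full((8, 8), 0.5)
# 60 near-identical toy frames, plus one out-of-place frame.
clip = np.stack([base + rng.normal(0, 0.01, (8, 8)) for _ in range(60)])
clip[30] += 0.5

print(continuity_spikes(clip))  # both boundaries of frame 30 flagged
```

A real analyzer works on decoded video and also accounts for legitimate cuts and camera motion; the point here is only that discontinuities stand out statistically.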

Metadata

Encoding Pattern Analysis

Inspects the underlying video encoding and metadata for patterns associated with synthetic video generation or post-processing manipulation.
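
A simplified sketch of what a metadata check looks like. The field names mirror what a tool such as `ffprobe` reports, but the `SUSPICIOUS_ENCODERS` set and the `metadata_flags` helper are made-up placeholders, not our actual rules:

```python
# Hypothetical indicator strings; a real detector's list would be far
# larger and weighted, not a hard yes/no.
SUSPICIOUS_ENCODERS = {"Lavf", "gen-av-pipeline"}

def metadata_flags(meta):
    """Return human-readable warnings for metadata fields that are
    commonly disturbed when video is re-encoded or synthesized."""
    flags = []
    encoder = meta.get("encoder", "")
    if any(tag in encoder for tag in SUSPICIOUS_ENCODERS):
        flags.append(f"re-encoding tool in encoder tag: {encoder}")
    if "creation_time" not in meta:
        flags.append("missing creation timestamp")
    return flags

# 'Lavf' is the muxer tag FFmpeg writes; it signals re-encoding,
# not fakery by itself, which is why these are warnings, not verdicts.
print(metadata_flags({"encoder": "Lavf60.3.100"}))
```

Metadata evidence is circumstantial on its own, which is why it is combined with the visual, audio, and temporal signals above.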

Why It Matters

Real-World Impact

Impersonation deepfakes are already causing significant financial losses and undermining trust in digital communication. Criminals now use AI-generated video and voice cloning to impersonate executives, coworkers, and family members in scams that can bypass traditional verification methods.

Step-by-Step Guide

How to Verify a Suspicious Impersonation Video

Use this process when a video message or call appears to come from someone you know but asks for something unusual, urgent, or sensitive.

1. Capture the Video or Call Recording

Save the clip or meeting recording that contains the suspicious request or message.

Most messaging and conferencing platforms let you save videos or record calls. Preserve the highest-quality version possible so visual and audio details remain intact.

2. Upload for Deepfake Analysis

Upload the recording to our detector to check for face-swap and voice-clone indicators.

We analyze facial regions, lip-sync alignment, temporal consistency, and audio patterns that reveal impersonation deepfakes.

3. Verify Through a Trusted Channel

Independently contact the person being impersonated through a known, verified method.

Call a verified phone number or their corporate directory number, or meet in person. Never rely solely on the channel where the suspicious video arrived.

4. Escalate if Necessary

If the clip appears synthetic or suspicious, report the incident to the appropriate security or fraud team.

Provide the recording and details of the request so the situation can be reviewed through appropriate channels.

Frequently Asked Questions

How can you tell if a video is an impersonation deepfake?

Signs of impersonation deepfakes include unnatural facial movement, mismatched lip-sync, inconsistent lighting, and unusual audio patterns. Detection tools can analyze these signals, but independent verification through trusted channels is always recommended.

Can your tool detect live impersonation deepfakes?

We focus on analyzing recorded clips and call recordings. For real-time calls, we recommend recording suspicious sessions when allowed and uploading them immediately for analysis.

How accurate is impersonation deepfake detection?

Accuracy depends on the quality of the recording and sophistication of the attacker. Our system performs well against many current face-swap and voice-cloning techniques and is continuously updated.

What kind of impersonation scenarios can you help with?

Common situations include fake executive requests, fraudulent vendor or client messages, and impersonated friends or relatives asking for emergency money or sensitive information.

Do you keep impersonation clips after analysis?

No. We do not permanently store uploaded videos. If you are logged in, we only keep basic analysis information so you can view your results again.

Should I treat a low score as proof that a video is real?

A low score suggests we did not detect strong signs of AI generation, but it does not prove authenticity. Combine our results with common-sense checks and organizational verification policies.