Testimonial Fraud

Detect AI-Generated Testimonial Videos

Synthetic spokesperson videos and fake review testimonials mislead consumers daily. Verify video endorsements before trusting them.

Review Verification
Spokesperson Detection
Consumer Protection

Drag & drop a video here

or click to browse files

Supported formats: MP4, MOV, AVI, WebM

Maximum file size: 500MB

Privacy-first: Your videos are never stored

Understanding the Threat

How Fake Testimonial Videos Are Produced

The fake testimonial pipeline relies on AI-generated faces and voices to create realistic endorsements for products and services that may be fraudulent.

1

Synthetic Persona Creation

AI tools generate a realistic-looking person who does not exist. Some services offer libraries of synthetic "actors" with different ages, ethnicities, and appearances.

2

Script and Voice Generation

A marketing script is written and fed to a text-to-speech engine that produces natural-sounding narration matched to the synthetic face.

3

Video Assembly

The synthetic face is animated to match the generated audio, producing a video testimonial from a person who never existed and never used the product.

4

Distribution at Scale

Dozens of unique-looking "customers" are generated to create the appearance of widespread satisfaction, then distributed across landing pages, social ads, and review platforms.

Detection Technology

What Our Detector Analyzes

Our AI identifies telltale patterns in synthetic testimonial videos that human viewers typically miss.

Visual

Synthetic Face Indicators

Detects faces generated by AI persona tools by flagging unnatural skin texture, static hairlines, and asymmetric features that indicate the person does not exist.

Audio

Text-to-Speech Markers

Identifies characteristics of AI-generated speech, including unnatural cadence, uniform pitch, and missing breath sounds between phrases.
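
For readers who want to experiment on their own, the sketch below shows one simplified way to check for these audio markers with the open-source librosa library, run against the audio track extracted from a testimonial video (for example, a WAV file). The thresholds are illustrative assumptions, not the values our detector uses.

```python
# Simplified sketch of two TTS markers: flat pitch and missing breath pauses.
# Thresholds below are illustrative assumptions, not tuned detector values.
import numpy as np
import librosa

def tts_suspicion_score(audio_path: str) -> dict:
    y, sr = librosa.load(audio_path, sr=16000, mono=True)

    # Fundamental-frequency track; natural narration varies widely in pitch.
    f0, voiced_flag, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
    )
    pitch_std = float(np.nanstd(f0))

    # Fraction of very quiet frames; human speakers pause to breathe.
    rms = librosa.feature.rms(y=y)[0]
    quiet_fraction = float(np.mean(rms < 0.02 * rms.max()))

    return {
        "pitch_std_hz": pitch_std,        # very low suggests flat, TTS-like delivery
        "quiet_fraction": quiet_fraction, # near zero suggests no breath pauses
        "suspicious": pitch_std < 15.0 and quiet_fraction < 0.05,
    }
```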

Temporal

Animation Artifacts

Detects unnatural head movement patterns, limited body motion, and static backgrounds typical of talking-head AI video generators.
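
A rough version of this temporal check can be reproduced with OpenCV: sample frames, track where the face sits, and measure how much the rest of the scene changes between samples. The sketch below is a simplified illustration under those assumptions, not our production pipeline.

```python
# Simplified sketch: how much does the detected face move, and how much does
# the overall frame change? Talking-head generators often show a nearly fixed
# head position on a static background. The sampling rate is an arbitrary choice.
import cv2
import numpy as np

def motion_stats(video_path: str, sample_every: int = 15) -> dict:
    cap = cv2.VideoCapture(video_path)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    centers, frame_diffs, prev_gray, idx = [], [], None, 0

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % sample_every == 0:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = detector.detectMultiScale(gray, 1.1, 5)
            if len(faces):
                x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
                centers.append((x + w / 2, y + h / 2))
            if prev_gray is not None:
                # Mean pixel change between sampled frames, a rough proxy for scene motion.
                frame_diffs.append(float(np.mean(cv2.absdiff(gray, prev_gray))))
            prev_gray = gray
        idx += 1
    cap.release()

    centers = np.array(centers)
    return {
        "head_position_std": float(centers.std(axis=0).mean()) if len(centers) else None,
        "mean_frame_change": float(np.mean(frame_diffs)) if frame_diffs else None,
    }
```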

Metadata

Generation Tool Signatures

Identifies encoding patterns and resolution characteristics specific to popular AI video creation platforms like Synthesia, HeyGen, and D-ID.
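
Some of this metadata is visible to anyone with ffmpeg installed. The sketch below uses ffprobe to pull the encoder tag, codec, resolution, and frame rate from a file; matching those values to specific generation platforms requires reference profiles, which are not included here.

```python
# Minimal sketch: inspect container and stream metadata with ffprobe (ffmpeg
# must be installed and on PATH). Interpreting the values as evidence of a
# particular generation platform is left out; that mapping is an assumption.
import json
import subprocess

def probe_video_metadata(path: str) -> dict:
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_format", "-show_streams", path],
        capture_output=True, text=True, check=True,
    )
    info = json.loads(out.stdout)
    video = next((s for s in info["streams"] if s["codec_type"] == "video"), {})
    return {
        "encoder": info["format"].get("tags", {}).get("encoder"),
        "codec": video.get("codec_name"),
        "resolution": (video.get("width"), video.get("height")),
        "frame_rate": video.get("avg_frame_rate"),
    }
```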

Why It Matters

Real-World Impact

Fake testimonial videos are a growing problem in online advertising and e-commerce. The FTC has taken enforcement action against companies using deceptive endorsements, and AI-generated testimonials represent the latest escalation. Freelance marketplaces such as Fiverr openly list synthetic spokesperson services, making it trivially easy for scammers to produce dozens of fake customer endorsements at low cost.

Step-by-Step Guide

How to Verify a Video Testimonial

Follow these steps to check whether a product review or endorsement video features a real person.

1

Capture the Testimonial

Download or screen-record the testimonial video from the website or social media ad.

Focus on the most prominent testimonial. Scammers typically use the same AI tool for multiple "reviewers," so checking one video is often enough to reveal the pattern.
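
If the testimonial is hosted at a public URL, a short script can handle the capture step. This sketch uses the open-source yt-dlp library; the URL is a placeholder, and you should only save a copy where the platform and local law allow it.

```python
# Minimal sketch, assuming the testimonial sits at a URL that yt-dlp supports
# and that saving a copy for verification is permitted.
from yt_dlp import YoutubeDL

def capture_testimonial(url: str, out_name: str = "testimonial") -> None:
    opts = {"outtmpl": f"{out_name}.%(ext)s"}  # saved as testimonial.<ext>
    with YoutubeDL(opts) as ydl:
        ydl.download([url])

# capture_testimonial("https://example.com/ad-page")  # placeholder URL
```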

2

Run Deepfake Analysis

Upload the video to our detector to check for synthetic face and voice indicators.

Our system specifically checks for talking-head AI generators commonly used in testimonial production.
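
If you prefer to automate this step, the snippet below shows what a programmatic upload could look like. It is purely hypothetical: this page does not document a public API, so the endpoint, field names, and response shape are assumptions rather than a real interface.

```python
# Hypothetical sketch only: the endpoint, auth scheme, and response fields
# below are illustrative assumptions, not a documented API.
import requests

def analyze_testimonial(video_path: str, api_key: str) -> dict:
    with open(video_path, "rb") as f:
        resp = requests.post(
            "https://api.example.com/v1/deepfake/analyze",  # placeholder URL
            headers={"Authorization": f"Bearer {api_key}"},
            files={"video": f},
            timeout=600,
        )
    resp.raise_for_status()
    return resp.json()  # e.g. per-signal scores for face, voice, and metadata
```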

3

Check for Consistent Identity

Reverse image search a screenshot of the "reviewer" to see if the same face appears across unrelated products.

AI-generated faces are often reused across multiple scam campaigns. Finding the same face endorsing unrelated products is a strong indicator of fraud.
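
To get a clean still for the reverse image search, you can export a frame with OpenCV. The sketch below grabs the mid-video frame as a simple heuristic; any frame that shows the reviewer's face clearly works just as well.

```python
# Minimal sketch: save one frame of the "reviewer" for a reverse image search
# (Google Lens, TinEye, etc.). Picking the middle frame is just a heuristic.
import cv2

def export_reviewer_frame(video_path: str, out_path: str = "reviewer.jpg") -> str:
    cap = cv2.VideoCapture(video_path)
    total = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
    cap.set(cv2.CAP_PROP_POS_FRAMES, max(total // 2, 0))  # jump to mid-video
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("Could not read a frame from the video")
    cv2.imwrite(out_path, frame)
    return out_path
```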

4

Report Fraudulent Content

Report fake testimonials to the advertising platform, the FTC, and consumer protection agencies.

Fake endorsements violate FTC guidelines and platform advertising policies. Reporting helps protect other consumers from the same scam.

Frequently Asked Questions

How common are AI-generated testimonial videos?

They are increasingly common, especially for health supplements, crypto schemes, and dropshipping products. AI video creation services make it possible to produce dozens of unique fake testimonials at very low cost.

Can your tool distinguish AI voices from real ones?

Yes. Our audio analysis component detects characteristics of text-to-speech engines, including uniform cadence, synthetic breath patterns, and unnaturally consistent pitch, all of which differ from natural human speech.

What about real people reading scripts for fake reviews?

Our tool specifically detects AI-generated faces and synthetic audio. Real people paid to read scripted reviews require different investigation methods, such as checking whether the reviewer actually purchased the product.

Are AI-generated testimonials illegal?

The FTC considers fake endorsements to be deceptive advertising, which is illegal. Using AI-generated personas to create fake customer testimonials violates FTC endorsement guidelines and can result in enforcement actions.

Which products are most commonly promoted with fake video testimonials?

Synthetic testimonial videos most commonly promote health and wellness products, cryptocurrency investment schemes, weight-loss supplements, get-rich-quick programs, and dropshipping stores.

Can companies legally use AI spokespersons?

Companies can use AI-generated content if they disclose it and do not misrepresent it as real customer experiences. The issue arises when synthetic personas are presented as real customers sharing genuine product experiences.