Synthetic Face Indicators
Detects faces generated by AI persona tools including unnatural skin texture, static hairlines, and asymmetric features that indicate a non-real individual.
Synthetic spokesperson videos and fake review testimonials mislead consumers daily. Verify video endorsements before trusting them.
Understanding the Threat
The fake testimonial pipeline relies on AI-generated faces and voices to create realistic endorsements for products and services that may be fraudulent.
1. AI tools generate a realistic-looking person who does not exist. Some services offer libraries of synthetic "actors" with different ages, ethnicities, and appearances.
2. A marketing script is written and fed to a text-to-speech engine that produces natural-sounding narration matched to the synthetic face.
3. The synthetic face is animated to match the generated audio, producing a video testimonial from a person who never existed and never used the product.
4. Dozens of unique-looking "customers" are generated to create the appearance of widespread satisfaction, then distributed across landing pages, social ads, and review platforms.
Detection Technology
Our AI identifies telltale patterns in synthetic testimonial videos that human viewers typically miss.
Face analysis: detects faces generated by AI persona tools, flagging unnatural skin texture, static hairlines, and asymmetric features that indicate a non-real individual.
Voice analysis: identifies characteristics of AI-generated speech, including unnatural cadence, uniform pitch, and missing breath sounds between phrases.
Motion analysis: detects unnatural head-movement patterns, limited body motion, and static backgrounds typical of talking-head AI video generators.
Encoding fingerprints: identifies encoding patterns and resolution characteristics specific to popular AI video creation platforms such as Synthesia, HeyGen, and D-ID.
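To illustrate the voice-analysis idea, the sketch below estimates pitch per audio frame via autocorrelation and measures how much it varies across the clip. Very flat pitch is one trait of older text-to-speech voices. This is a simplified, hypothetical heuristic for intuition only, not our production pipeline; the function names and the threshold are illustrative assumptions.

```python
import numpy as np

def frame_pitch(frame, sr, fmin=50, fmax=400):
    """Estimate the pitch of one frame from its autocorrelation peak."""
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)   # search plausible voice range
    lag = lo + int(np.argmax(ac[lo:hi]))
    return sr / lag

def pitch_uniformity(audio, sr, frame_ms=50):
    """Coefficient of variation of frame-level pitch estimates.
    Natural speech varies in pitch; flat TTS output varies much less."""
    n = int(sr * frame_ms / 1000)
    pitches = [frame_pitch(audio[i:i + n], sr)
               for i in range(0, len(audio) - n, n)]
    p = np.asarray(pitches)
    return float(p.std() / p.mean())

def looks_flat(audio, sr, threshold=0.02):
    """Flag suspiciously uniform pitch (threshold is illustrative)."""
    return pitch_uniformity(audio, sr) < threshold
```

Real detectors combine many such signals (cadence, breath sounds, spectral artifacts) with learned models; no single heuristic is conclusive on its own.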
Why It Matters
Fake testimonial videos are a growing problem in online advertising and e-commerce. The FTC has taken enforcement action against companies using deceptive endorsements, and AI-generated testimonials represent the latest escalation. Fiverr and similar freelance marketplaces openly sell synthetic spokesperson videos, making it trivially easy for scammers to produce dozens of fake customer endorsements at low cost.
Step-by-Step Guide
Follow these steps to check whether a product review or endorsement video features a real person.
1. Download or screen-record the testimonial video from the website or social media ad.
Focus on the most prominent testimonial. Scammers often use the same AI tool for multiple "reviewers," so checking one is often enough to identify a pattern.
2. Upload the video to our detector to check for synthetic face and voice indicators.
Our system specifically checks for talking-head AI generators commonly used in testimonial production.
3. Reverse image search a screenshot of the "reviewer" to see whether the same face appears across unrelated products.
AI-generated faces are often reused across multiple scam campaigns. Finding the same face endorsing unrelated products is a strong indicator of fraud.
4. Report fake testimonials to the advertising platform, the FTC, and consumer protection agencies.
Fake endorsements violate FTC guidelines and platform advertising policies. Reporting helps protect other consumers from the same scam.
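The reverse-image-search step can also be approximated locally: given frames captured from two different "reviewer" videos, a perceptual hash shows whether they contain near-identical imagery. The sketch below uses a basic average hash over grayscale pixel arrays; it is an illustrative assumption of how such matching works. Robust face-reuse detection would use face embeddings rather than whole-frame hashes.

```python
import numpy as np

def average_hash(gray, size=8):
    """Perceptual 'average hash': downsample a 2-D grayscale array to
    size x size block means, then threshold each block against the
    overall mean to produce a compact bit fingerprint."""
    h, w = gray.shape
    gray = gray[:h - h % size, :w - w % size]          # crop to block multiples
    blocks = gray.reshape(size, gray.shape[0] // size,
                          size, gray.shape[1] // size).mean(axis=(1, 3))
    return (blocks > blocks.mean()).flatten()

def hamming(a, b):
    """Number of differing bits; a small distance means near-duplicates."""
    return int(np.count_nonzero(a != b))
```

In practice you would hash a face crop from each video and treat a low Hamming distance as a signal that the same synthetic "customer" is being reused across campaigns.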
Frequently Asked Questions
How common are fake testimonial videos?
They are increasingly common, especially for health supplements, crypto schemes, and dropshipping products. AI video creation services make it possible to produce dozens of unique fake testimonials at very low cost.
Can the detector identify AI-generated voices?
Yes. Our audio analysis component detects characteristics of text-to-speech engines, including uniform cadence, synthetic breath patterns, and pitch consistency that differ from natural human speech.
What about paid human actors reading a script?
Our tool specifically detects AI-generated faces and synthetic audio. Paid actors who are real people reading scripts require different investigation methods, such as checking whether the reviewer actually purchased the product.
Are fake testimonials illegal?
The FTC considers fake endorsements to be deceptive advertising, which is illegal. Using AI-generated personas to create fake customer testimonials violates FTC endorsement guidelines and can result in enforcement actions.
Which products most often use synthetic testimonials?
Health and wellness products, cryptocurrency investment schemes, weight loss supplements, get-rich-quick programs, and dropshipping stores are the most frequent users of synthetic testimonial videos.
Can companies ever use AI-generated spokespeople legitimately?
Companies can use AI-generated content if they disclose it and do not misrepresent it as real customer experiences. The issue arises when synthetic personas are presented as real customers sharing genuine product experiences.