Face Realness and Authenticity Checks
Analyzes videos for AI-generated face artifacts, including unnatural skin texture, lighting inconsistencies, and subtle symmetry issues.
Catfishers and scammers on dating apps and social media platforms like Hinge, Bumble, Tinder, and Instagram now use AI to generate convincing videos of fake partners.
Drag & drop a video here
or click to browse files
Supported formats: MP4, MOV, AVI, WebM
Maximum file size: 500MB
Understanding the Threat
Modern romance scams on dating apps and social media layer AI-generated content on top of classic emotional manipulation tactics.
Scammers create dating app profiles with stolen photos or AI-generated faces, then add short, polished videos to appear more trustworthy and real.
They move quickly to intense conversations, sharing personal stories and sending more videos to deepen your attachment.
A sudden emergency appears, like medical bills, travel problems, or frozen accounts, and the scammer sends emotional video messages asking for urgent financial help.
Once money or sensitive information is sent, the scammer either disappears or continues the scam with new fabricated crises.
Detection Technology
Our analysis focuses on signs of AI-generated faces and voice cloning in dating app videos and romance scam clips.
Looks for AI-generated face artifacts, including unnatural skin texture, lighting inconsistencies, and subtle symmetry issues.
Analyzes speech patterns, pitch, and breathing to spot cloned voices and text-to-speech commonly used in romance scam videos.
Examines eye contact, blinking, and gesture timing to distinguish real recordings from AI-generated motion.
Inspects metadata and encoding characteristics that can reveal AI-generated videos or reused scam content.
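As a simplified illustration of the blink and motion analysis described above, the widely used eye aspect ratio (EAR) compares vertical and horizontal distances between eye landmarks: an open eye yields a higher ratio than a closed one, and an abnormally low blink count across a clip is a classic sign of generated footage. This is a minimal sketch under assumed inputs, not our production pipeline; the landmark coordinates below are hypothetical, and in a real system they would come from a face-landmark model.

```python
import math

def eye_aspect_ratio(eye):
    """EAR for six (x, y) eye landmarks p1..p6 in the common 68-point
    ordering, where p1 and p4 are the horizontal eye corners."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    # Mean of the two vertical distances (p2-p6, p3-p5)
    # over the horizontal distance (p1-p4).
    return (dist(eye[1], eye[5]) + dist(eye[2], eye[4])) / (2.0 * dist(eye[0], eye[3]))

def count_blinks(ear_per_frame, threshold=0.21):
    """Count eye-closure events: runs of frames where EAR dips below threshold."""
    blinks, closed = 0, False
    for ear in ear_per_frame:
        if ear < threshold and not closed:
            blinks += 1
            closed = True
        elif ear >= threshold:
            closed = False
    return blinks

# Hypothetical landmark sets: an open eye vs. a nearly closed eye.
open_eye   = [(0, 3), (2, 5), (4, 5), (6, 3), (4, 1), (2, 1)]
closed_eye = [(0, 3), (2, 3.3), (4, 3.3), (6, 3), (4, 2.7), (2, 2.7)]
print(eye_aspect_ratio(open_eye))    # higher ratio: eye open
print(eye_aspect_ratio(closed_eye))  # lower ratio: eye closed/blinking
```

Humans typically blink roughly 15–20 times per minute, so a per-clip blink count far below that baseline is one of several signals a detector can weigh; early face-generation models were known for producing unnaturally infrequent blinking.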
Why It Matters
Romance scams on dating apps and social media platforms like Instagram and Facebook are among the most emotionally and financially damaging forms of online fraud. With AI-generated videos, scammers can appear on camera, cry, and beg for help without ever revealing their true identity. Victims often lose significant savings and may feel too ashamed to report what happened.
Step-by-Step Guide
Before you send money, gifts, or personal information, run through this quick verification process.
Download or screen-record the clip from your dating app or messages.
Capture the full video, especially parts where the person speaks directly to the camera.
Upload the clip to check for AI-generated faces and synthetic audio.
We analyze the face, motion, and voice to spot signs that the person on screen may not exist or may not be the person you think they are.
Compare the video to other photos, social profiles, and background details.
Look for mismatched names, inconsistent stories, or stock-image-style photos that appear across many profiles.
If the video feels off, stop communication or move to verified platforms.
Never send money, intimate content, or sensitive personal information to someone whose identity you cannot independently verify.
Frequently Asked Questions
Can this tool detect deepfakes in dating app videos?
Yes. Our tool analyzes short, personal clips shared on dating apps and in messages, looking for signs of manipulated footage, face swaps, or voice cloning.
What are the warning signs of an AI-generated video from a match?
Look for overly polished or generic clips, mismatched details with profile photos, and unnatural movement like stiff expressions or odd eye contact. Be cautious if the conversation shifts toward personal favors or money.
What should I do if I suspect I'm talking to a scammer?
Your safety comes first. If something feels off, pause and don’t send money or personal information. Check for inconsistencies across their profile and messages, ask follow-up questions, or stop communication if needed.
Do you store the videos I upload?
No. Your video is analyzed and then deleted. If you are logged in, we only keep simple details such as the file name and analysis result, not the video itself.
Can I use this to protect friends or family members?
Yes. You can use the detector to review suspicious videos they receive and share steps to help them avoid catfishing, romance scams, and impersonation attempts.
Attackers use AI-generated videos to impersonate executives, coworkers, public figures, and even family members. Before you send money, share sensitive information, or act on a video request, verify that the recording hasn’t been manipulated.
Identity Fraud Protection
AI-generated faces and voices can now pass basic “hold up your ID” or “say this phrase” checks. Add deepfake detection to your identity and KYC workflows.