I Found a Deepfake of Myself: Emergency Response Checklist
April 12, 2026 · Aksahy H

Discovering a deepfake of yourself is a violation. Whether it’s a fake explicit image, a manipulated video, or a synthetic voice recording, your first reaction may be panic, shame, or rage. These emotions are valid—but they can also lead to mistakes that harm your case later.
This emergency checklist provides immediate, actionable steps to preserve evidence, remove content, and protect yourself. We’ve organized this by urgency: what to do in the first hour, the first 24 hours, and the first week.
The First Hour: Preserve Evidence
Your instinct may be to delete everything immediately. Don’t. You need evidence for takedown requests and potential legal action.
Step 1: Screenshot Everything
- Full-screen screenshots of the deepfake content
- URL bar visible in each screenshot
- Usernames, timestamps, and surrounding context
- Any comments, shares, or engagement metrics
Pro tip: Use the Internet Archive’s Wayback Machine (web.archive.org) to capture an independent, timestamped record of the page.
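If you have several URLs to preserve, you can also submit them programmatically. Below is a minimal sketch, assuming Python 3 with the requests library installed; the URLs are placeholders, and anonymous Save Page Now submissions may be throttled or take a few minutes to process.

```python
# Minimal sketch: ask the Wayback Machine's "Save Page Now" endpoint to
# archive each page so an independent, timestamped copy exists.
# Assumes: Python 3, `pip install requests`; the URLs below are placeholders.
import requests

urls_to_preserve = [
    "https://example.com/post/12345",  # replace with the actual page URL(s)
]

for url in urls_to_preserve:
    # Save Page Now is triggered by a GET request on /save/<url>
    resp = requests.get(f"https://web.archive.org/save/{url}", timeout=120)
    print(f"{url} -> HTTP {resp.status_code}")
    # Once processed, archived snapshots of the page are listed at:
    print(f"  https://web.archive.org/web/*/{url}")
```

The archived snapshot carries the Internet Archive’s own timestamp, which is useful if the original post is later deleted or edited.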
Step 2: Document the Original
If the deepfake uses your real photos, locate the original images:
- Find the unmodified source photos on your devices
- Note when and where they were taken (the EXIF sketch after this list can help)
- Check if they were posted publicly (Instagram, LinkedIn, etc.)
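If you are comfortable with a little scripting, the EXIF metadata embedded in the original files can back up the “when and where” record. Here is a minimal sketch, assuming the Pillow library is installed; the filename is a placeholder, and note that many platforms strip EXIF data on upload, so this works best on files taken straight from your camera or phone.

```python
# Minimal sketch: read basic EXIF metadata (capture date, camera model)
# from an original photo to help document its provenance.
# Assumes: Python 3, `pip install Pillow`; the filename is a placeholder.
from PIL import Image, ExifTags

def describe_original(path):
    img = Image.open(path)
    exif = img.getexif()
    # Map numeric EXIF tag IDs to human-readable names
    tags = {ExifTags.TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}
    return {
        "file": path,
        "dimensions": img.size,
        "capture_date": tags.get("DateTime"),
        "camera_model": tags.get("Model"),
    }

print(describe_original("original_photo.jpg"))
```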
Step 3: Run a Free Detection Scan
Use our free deepfake detection tool to:
- Confirm the image is AI-generated (get a risk score)
- Generate a technical report for platforms/law enforcement
- Document the analysis for your records
The First 24 Hours: Takedown & Reporting
Platform-Specific Takedown Procedures
Google Search (De-indexing)
- Visit Google Legal Removals
- Select “Create a new request” → “Non-consensual explicit imagery”
- Provide URLs, screenshots, and confirmation you’re the subject
- Google typically responds within 2-4 days
Meta (Facebook/Instagram)
- Report the post/image directly → “Nudity or sexual activity”
- Select “Sharing private images” or “I’m in this image and don’t consent”
- Upload ID verification if requested
- Meta’s review typically takes 24-48 hours
X (Twitter)
- Report tweet → “It’s abusive or harmful”
- Select “Non-consensual intimate media”
- Requires submitting government ID
Reddit
- Report post → “This is abusive or harassing”
- Select “It’s involuntary pornography” (or “It’s sexual or suggestive content involving a minor” if the person depicted is under 18)
- Reddit has strict policies against non-consensual imagery
Law Enforcement Reporting
In the United States, report to:
- FBI Internet Crime Complaint Center (IC3): ic3.gov
- Local police: File a report for harassment or identity theft
- National Center for Missing & Exploited Children (NCMEC): if the person depicted is under 18, report it at cybertipline.org
The TAKE IT DOWN Act (2025)
This 2025 federal law requires covered platforms to remove non-consensual intimate imagery, including AI-generated deepfakes, within 48 hours of receiving a valid removal request. Key provisions:
- Platforms must establish 24/7 reporting mechanisms
- Failure to remove content results in FTC penalties
- Covered platforms: social media, search engines, cloud storage, messaging apps
When to Upgrade to Professional Forensic Analysis
For legal proceedings or high-stakes situations, our Human Review service (£149) provides:
- Court-admissible technical documentation
- Chain of custody protocols (a basic do-it-yourself version is sketched after this list)
- Analyst testimony availability
- Written assessment within 7 working days
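Whether or not you commission a professional report, one chain-of-custody basic you can apply yourself is recording a cryptographic hash and capture time for every evidence file, so you can later show that nothing was altered. A minimal sketch using only Python’s standard library; the filenames and output path are placeholders.

```python
# Minimal sketch: build a simple evidence manifest with a SHA-256 hash and a
# UTC timestamp for each file. Uses only the Python standard library.
# The filenames and the output path are placeholders.
import hashlib
import json
import pathlib
from datetime import datetime, timezone

evidence_files = ["screenshot_01.png", "screenshot_02.png", "deepfake_copy.mp4"]

manifest = []
for name in evidence_files:
    data = pathlib.Path(name).read_bytes()
    manifest.append({
        "file": name,
        "sha256": hashlib.sha256(data).hexdigest(),
        "recorded_at_utc": datetime.now(timezone.utc).isoformat(),
    })

pathlib.Path("evidence_manifest.json").write_text(json.dumps(manifest, indent=2))
print(json.dumps(manifest, indent=2))
```

Keep the manifest alongside the original files and avoid re-saving or editing them afterwards, since any change produces a different hash.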
Emotional Support Resources
Dealing with deepfake abuse is traumatic. Professional support is available:
- Cyber Civil Rights Initiative (CCRI): 844-878-2274 (crisis helpline)
- Without My Consent: Legal resources for online harassment
- Badass Army: Peer support for image-based abuse survivors
- Therapy: Psychology Today directory for trauma specialists
Legal Options: Civil vs. Criminal
Criminal Charges
Many states now have specific deepfake laws. Potential charges include:
- Harassment or cyberstalking
- Identity theft
- Non-consensual pornography (revenge porn laws)
- Extortion (if money is demanded)
Civil Lawsuits
You may sue for:
- Invasion of privacy
- Defamation (if the deepfake implies false actions)
- Emotional distress
- Copyright infringement (if the deepfake was made from photos you took and therefore own the copyright to)
Prevention: Protecting Yourself Going Forward
- Set Google Alerts for your name
- Use reverse image search monthly
- Watermark personal photos (a quick sketch follows this list)
- Limit public social media photos
- Consider reputation monitoring services
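For the watermarking item above, a visible overlay is easy to add before posting. A minimal sketch, assuming Pillow is installed; the file names and watermark text are placeholders, and a visible watermark deters casual reuse but will not stop a determined attacker.

```python
# Minimal sketch: stamp a semi-transparent text watermark in the lower-right
# corner of a photo before sharing it publicly.
# Assumes: Python 3, `pip install Pillow`; paths and text are placeholders.
from PIL import Image, ImageDraw

def add_watermark(src_path, dst_path, text="© Your Name 2026"):
    img = Image.open(src_path).convert("RGBA")
    overlay = Image.new("RGBA", img.size, (0, 0, 0, 0))
    draw = ImageDraw.Draw(overlay)
    w, h = img.size
    # Semi-transparent white text near the lower-right corner
    draw.text((int(w * 0.55), int(h * 0.92)), text, fill=(255, 255, 255, 140))
    Image.alpha_composite(img, overlay).convert("RGB").save(dst_path, "JPEG")

add_watermark("holiday_photo.jpg", "holiday_photo_watermarked.jpg")
```

Batch-processing a whole folder of photos this way before uploading them takes only a few extra lines.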