I Found a Deepfake of Myself: Emergency Response Checklist

April 12, 2026 · Aksahy H

Discovering a deepfake of yourself is a violation. Whether it’s a fake explicit image, a manipulated video, or a synthetic voice recording, your first reaction may be panic, shame, or rage. These emotions are valid—but they can also lead to mistakes that harm your case later.

This emergency checklist provides immediate, actionable steps to preserve evidence, remove content, and protect yourself. We’ve organized this by urgency: what to do in the first hour, the first 24 hours, and the first week.

🚨 CRISIS SUPPORT: If you’re experiencing suicidal thoughts, call 988 (Suicide & Crisis Lifeline) or text HOME to 741741. You are not alone.

The First Hour: Preserve Evidence

Your instinct may be to delete everything immediately. Don’t. You need evidence for takedown requests and potential legal action.

Step 1: Screenshot Everything

  • Full-screen screenshots of the deepfake content
  • URL bar visible in each screenshot
  • Usernames, timestamps, and surrounding context
  • Any comments, shares, or engagement metrics

Pro tip: Use the Internet Archive’s Wayback Machine (web.archive.org) to capture a permanent, timestamped record of the page.
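
If you’re comfortable running a short script, you can request the Wayback Machine capture and fingerprint your screenshot files in one pass; a SHA-256 digest recorded at save time helps show the files were not altered later. This is a minimal sketch in Python, assuming the `requests` package is installed; the URL and folder path are placeholders.

```python
# Evidence-preservation sketch. The Wayback Machine's "Save Page Now"
# endpoint (web.archive.org/save/<url>) may rate-limit or change, so
# verify the snapshot in your browser afterwards.
import datetime
import hashlib
import pathlib

import requests

def archive_page(url: str) -> str:
    """Ask the Wayback Machine to capture `url`; return the snapshot location if reported."""
    resp = requests.get(f"https://web.archive.org/save/{url}", timeout=120)
    resp.raise_for_status()
    # The snapshot path is often echoed in the Content-Location header.
    return resp.headers.get("Content-Location", resp.url)

def log_screenshots(folder: str) -> None:
    """Print a SHA-256 digest and UTC timestamp for every screenshot on file."""
    for path in sorted(pathlib.Path(folder).glob("*.png")):
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        stamp = datetime.datetime.now(datetime.timezone.utc).isoformat()
        print(f"{stamp}  {digest}  {path.name}")

print(archive_page("https://example.com/offending-post"))  # hypothetical URL
log_screenshots("evidence/screenshots")                    # hypothetical folder
```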

Step 2: Document the Original

If the deepfake uses your real photos, locate the original images:

  • Find the unmodified source photos on your devices
  • Note when and where they were taken
  • Check if they were posted publicly (Instagram, LinkedIn, etc.)
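
If the originals are still on your device, their EXIF metadata usually records when (and on what camera) they were taken, which supports the “note when and where” step above. Here is a minimal sketch using the Pillow library; the file path is a placeholder, and since most social platforms strip EXIF on upload, run it against the copies on your own device.

```python
# EXIF-documentation sketch, assuming a recent Pillow (`pip install Pillow`).
from PIL import Image
from PIL.ExifTags import TAGS

def document_photo(path: str) -> dict:
    """Return a tag-name -> value mapping of the photo's EXIF metadata."""
    exif = Image.open(path).getexif()
    info = {TAGS.get(tag_id, str(tag_id)): value for tag_id, value in exif.items()}
    # DateTimeOriginal lives in the Exif sub-IFD (pointer tag 0x8769).
    info.update({TAGS.get(t, str(t)): v for t, v in exif.get_ifd(0x8769).items()})
    return info

metadata = document_photo("originals/profile_photo.jpg")  # hypothetical path
print(metadata.get("DateTimeOriginal"), metadata.get("Model"))
```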

Step 3: Run a Free Detection Scan

Use our free deepfake detection tool to:

  • Assess whether the image is likely AI-generated (you get a risk score)
  • Generate a technical report for platforms/law enforcement
  • Document the analysis for your records
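
If the detection service you use exposes an HTTP API, the scan can be scripted so the report lands straight in your evidence folder. The endpoint, field names, and response shape below are illustrative placeholders, not a documented API; check the documentation of whichever tool you use.

```python
# Illustrative only: every name here (URL, "image" field, JSON shape) is a
# placeholder for your detection service's real API.
import json

import requests

def scan_image(path: str) -> dict:
    with open(path, "rb") as f:
        resp = requests.post(
            "https://api.example-detector.com/v1/scan",  # hypothetical endpoint
            files={"image": f},
            timeout=60,
        )
    resp.raise_for_status()
    return resp.json()

report = scan_image("evidence/screenshots/fake_post.png")  # hypothetical path
print(json.dumps(report, indent=2))  # keep this output with your records
```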

The First 24 Hours: Takedown & Reporting

Platform-Specific Takedown Procedures

Google Search (De-indexing)

  1. Visit Google Legal Removals
  2. Select “Create a new request” → “Non-consensual explicit imagery”
  3. Provide URLs, screenshots, and confirmation you’re the subject
  4. Google typically responds within 2-4 days

Meta (Facebook/Instagram)

  1. Report the post/image directly → “Nudity or sexual activity”
  2. Select “Sharing private images” or “I’m in this image and don’t consent”
  3. Upload ID verification if requested
  4. Meta’s review typically takes 24-48 hours

X (Twitter)

  1. Report tweet → “It’s abusive or harmful”
  2. Select “Non-consensual intimate media”
  3. Requires submitting government ID

Reddit

  1. Report post → “This is abusive or harassing”
  2. Select “It’s sexual or suggestive content involving a minor” (if applicable) or “It’s involuntary pornography”
  3. Reddit has strict policies against non-consensual imagery

Law Enforcement Reporting

In the United States, report to:

  • FBI Internet Crime Complaint Center (IC3): ic3.gov
  • Local police: File a report for harassment or identity theft
  • National Center for Missing & Exploited Children (NCMEC): If under 18, report at cybertipline.org

The TAKE IT DOWN Act (2025)

New federal law requires platforms to remove non-consensual intimate imagery within 48 hours of a valid request. Key provisions:

  • Platforms must establish 24/7 reporting mechanisms
  • Failure to remove content results in FTC penalties
  • Covered platforms: social media, search engines, cloud storage, messaging apps
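
Because that 48-hour clock matters, log every report you file: the platform, the URL, when you submitted, and when the response window closes. Below is a minimal sketch that appends each request to a CSV; the file name and column layout are a suggested convention of ours, not a legal requirement.

```python
# Takedown-log sketch: one CSV row per report, with the 48-hour deadline
# computed from the submission time (per the removal window described above).
import csv
import datetime

LOG_FILE = "takedown_log.csv"  # hypothetical file name

def log_request(platform: str, url: str) -> None:
    submitted = datetime.datetime.now(datetime.timezone.utc)
    deadline = submitted + datetime.timedelta(hours=48)
    with open(LOG_FILE, "a", newline="") as f:
        csv.writer(f).writerow([platform, url, submitted.isoformat(), deadline.isoformat()])

log_request("Google Search", "https://example.com/offending-post")  # placeholders
log_request("Meta", "https://instagram.com/p/placeholder")
```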

When to Upgrade to Professional Forensic Analysis

For legal proceedings or high-stakes situations, our Human Review service (£149) provides:

  • Court-admissible technical documentation
  • Chain of custody protocols
  • Analyst testimony availability
  • Written assessment within 7 working days

Emotional Support Resources

Dealing with deepfake abuse is traumatic. Professional support is available:

  • Cyber Civil Rights Initiative (CCRI): 844-878-2274 (crisis helpline)
  • Without My Consent: Legal resources for online harassment
  • Badass Army: Peer support for image-based abuse survivors
  • Therapy: Psychology Today directory for trauma specialists

Legal Options: Civil vs. Criminal

Criminal Charges

Many states now have specific deepfake laws. Potential charges include:

  • Harassment or cyberstalking
  • Identity theft
  • Non-consensual pornography (revenge porn laws)
  • Extortion (if money is demanded)

Civil Lawsuits

You may sue for:

  • Invasion of privacy
  • Defamation (if the deepfake implies false actions)
  • Emotional distress
  • Copyright infringement (if they used photos you own the rights to, such as ones you took yourself)

Prevention: Protecting Yourself Going Forward

  • Set Google Alerts for your name
  • Use reverse image search monthly (see the sketch after this list)
  • Watermark personal photos
  • Limit public social media photos
  • Consider reputation monitoring services
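
Reverse image search itself is manual, but you can keep a fingerprint file of your public photos so a suspect image can be compared in seconds. Here is a sketch using the third-party `imagehash` package (perceptual hashes survive resizing and recompression); the folder and file names are placeholders, and the distance threshold is a rough rule of thumb.

```python
# Monitoring sketch, assuming `pip install Pillow imagehash`.
import pathlib

import imagehash
from PIL import Image

def fingerprint(folder: str) -> dict:
    """Map each photo's file name to its perceptual hash."""
    return {p.name: imagehash.phash(Image.open(p))
            for p in pathlib.Path(folder).glob("*.jpg")}

baseline = fingerprint("my_public_photos")            # hypothetical folder
suspect = imagehash.phash(Image.open("suspect.jpg"))  # hypothetical file

for name, h in baseline.items():
    # Hamming distance <= 10 usually signals closely related images.
    if h - suspect <= 10:
        print(f"suspect.jpg may derive from {name} (distance {h - suspect})")
```
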
Remember: You are the victim. The shame belongs to the creator and distributor, not you. Taking action is empowering—and you have legal rights.