
Understanding HR1941: Preventing Deepfakes of Intimate Images Act

The Preventing Deepfakes of Intimate Images Act is a proposed law aimed at stopping the sharing of intimate images, whether real or AI-generated, without the subject's consent. By covering both kinds of images, the bill seeks to protect individuals from the harmful effects of nonconsensual image sharing.

What This Bill Does

The Preventing Deepfakes of Intimate Images Act, introduced in March 2025, would make it illegal to share intimate images of someone without their permission. This covers both real photos and images altered or created by artificial intelligence, known as deepfakes, and is designed to fill gaps in current laws that don't specifically address AI-generated content.

The bill defines "intimate digital depictions" as images that show nudity or sexual acts, whether real or altered by AI. Sharing these images without the subject's consent could carry civil or even criminal penalties, meaning victims could take legal action against those who distribute their images without permission.

Unlike some other laws, the bill does not require platforms to remove the images within a specific timeframe. This sets it apart from the Take It Down Act, which mandates that platforms take down nonconsensual intimate images within 48 hours; this bill instead focuses on the act of sharing itself.

By addressing the issue at the federal level, the bill aims to provide consistent protection across the United States. It complements existing state laws, which all offer some form of protection against nonconsensual intimate images but may not cover AI-generated content.

Why It Matters

This bill is important because it addresses the growing problem of deepfakes and nonconsensual image sharing, which can have devastating effects on people's lives. Victims, often women and public figures, can suffer psychological trauma, reputational damage, and even job loss because of these images.

For everyday Americans, the bill provides a layer of protection against the misuse of their images online: if someone shares a fake or real intimate image of them without permission, there would be legal consequences. This is especially important in a digital age where images can be easily manipulated and spread across the internet. By making it a federal issue, the bill gives victims a consistent way to seek justice regardless of where they live, and it may also increase pressure on platforms over the content shared on their sites, potentially reducing the spread of harmful images.

Key Facts

  • Cost/Budget Impact: No specific budget estimate is available, but it is expected to rely on existing FTC resources.
  • Timeline for Implementation: No timeline is set, but provisions would likely take effect immediately upon passage.
  • Number of People Affected: Potentially impacts all U.S. residents, particularly victims of nonconsensual image sharing.
  • Key Dates: Introduced on March 6, 2025; related Take It Down Act signed on May 19, 2025.
  • Bipartisan Origins: Introduced by Rep. Joseph Morelle (D-NY) and cosponsored by Rep. Tom Kean (R-NJ).
  • Precedents: Builds on the Take It Down Act and state laws banning nonconsensual intimate imagery.
  • Real-World Examples: Inspired by incidents like the 2024 deepfake scandals involving celebrities and students.

Arguments in Support

- Protects Victims: The bill aims to protect individuals from the psychological and reputational harm caused by nonconsensual deepfake pornography.
- Nationwide Consistency: It provides uniform protection across the U.S., reducing jurisdictional issues that arise with state laws.
- Addresses AI Gaps: The bill specifically targets AI-generated content, filling a gap in current legislation.
- Federal Oversight: Empowers the Federal Trade Commission (FTC) to oversee and enforce the law, similar to the successful Take It Down Act.
- Bipartisan Support: The bill has backing from both Democrats and Republicans, indicating broad political support.

Arguments in Opposition

- Free Speech Concerns: Critics worry that the bill could lead to over-censorship, as platforms might remove lawful content to avoid penalties.
- Risk of Abuse: Without penalties for false claims, there is a concern that the system could be misused to target disliked content.
- Lack of Procedural Protections: The bill does not include mechanisms for challenging removals, unlike the DMCA, which could lead to unjust takedowns.
- Broad Definitions: The term "digital forgeries" might be too broad, potentially affecting satire or artistic expression.
- Mission Creep: There is a fear that the bill could eventually be used to target non-intimate content.
Last updated 1/7/2026

Make Your Voice Heard

Take action on this bill and let your representatives know where you stand.
