Awaiting Floor Vote

STOP CSAM Act of 2025

S.1829 – STOP CSAM Act of 2025: Online child exploitation reporting, liability, and victim remedies

119th Congress

This bill changes federal criminal and civil laws to address online sexual exploitation of children. It adds new duties and penalties for online service providers and expands rights and payments for child victims. It has been reported out of the Senate Judiciary Committee with an amendment.

Bill Number
S.1829
Chamber
Senate
Introduced
5/21/2025

What This Bill Does

The bill updates federal court procedures to better protect children and former children who are victims or witnesses in abuse, exploitation, or kidnapping cases. It defines terms like “psychological abuse,” “exploitation,” and “covered person,” and makes it easier to keep their personal information private in court records and hearings. It also lets courts record child testimony on video, requires victim impact information in presentence reports, and authorizes $25 million per year for guardians ad litem to help child victims.

The bill changes restitution rules so that courts must order restitution for a wider set of child exploitation and certain obscene visual depiction offenses. It refines what counts as “child pornography production” and “trafficking in child pornography” for restitution purposes. Courts may appoint a trustee or other fiduciary to hold and manage restitution money for victims who are minors, incapacitated, or living abroad, and it authorizes $15 million per year to support this system.

For online service providers, the bill rewrites the duty to report apparent child sexual exploitation to the National Center for Missing and Exploited Children’s (NCMEC) CyberTipline. Providers must send more detailed reports within 60 days of gaining actual knowledge, including copies of apparent child pornography, related messages, technical data, and indicators of whether images are AI- or computer-generated, when that information is available. NCMEC must make reports available to specified U.S. and foreign law enforcement agencies. The bill creates criminal fines and civil penalties for providers that knowingly fail to report, fail to preserve material, or file false or incomplete reports, with higher penalties when individuals are harmed; these amounts go into a reserve fund for child pornography victims.
The bill requires large providers (over 1 million monthly users and $50 million in yearly revenue) to file yearly public reports to the Attorney General and Federal Trade Commission about their child safety policies, reporting systems, tools, and trends. These reports must describe measures to protect children, support parents, design safer products, and track the prevalence and patterns of online child sexual exploitation. The agencies must publish these reports with required redactions to avoid revealing how to evade safety tools or commit crimes.

The bill creates a new federal crime for interactive computer services that intentionally host or store child pornography or knowingly promote or facilitate certain child sexual exploitation offenses, with fines up to $1 million, or up to $5 million if the conduct risks serious injury or causes harm. It also clarifies and expands civil “private right of action” remedies for victims of trafficking and child exploitation offenses, including those depicted in child pornography or certain obscene visual depictions. A new civil cause of action (section 2255A) allows victims to sue interactive computer services and app stores that intentionally, knowingly, or recklessly promote, aid, or abet certain crimes, or host, store, or make child pornography available, with minimum liquidated damages of $300,000, possible punitive damages, and no statute of limitations.

The bill states that nothing in section 230 of the Communications Act limits these new civil claims, though certain good faith compliance actions and use of encryption are not, by themselves, a basis for liability. It sets out defenses where providers remove content quickly, act in good faith but cannot remove it, or where removal would break encryption. It also provides for sanctions when parties or attorneys bring repeated bad-faith lawsuits or defenses in these civil cases.
Finally, it includes a severability clause and states that existing federal, state, and Tribal child-exploitation laws and victim remedies remain in force and are not preempted if they are at least as protective of victims.

Why It Matters

The bill affects children and adults who were harmed as children by sexual exploitation, trafficking, or related crimes, especially when those harms happen or spread online. By expanding definitions, privacy protections, restitution, and civil lawsuit options, it aims to change how courts recognize harm and how money and support reach victims. This could influence how victims participate in cases and what financial recovery they may receive.

The bill places new duties, reporting standards, and liability risks on online platforms, app stores, and other interactive computer services. Large technology companies would need to maintain more detailed safety practices, reporting pipelines to NCMEC, and public transparency about child safety. These changes could shape how platforms design products, moderate content, handle encryption, and interact with law enforcement, but the full impact on their operations and on overall online safety is not yet clear.

The bill also clarifies the relationship between these new rules and existing federal, state, and Tribal laws. By providing that other victim rights and remedies are not reduced, it may lead to more parallel or overlapping claims in different courts. How judges interpret key terms such as “promotion,” “aiding and abetting,” and “reckless” in the online context will likely affect how often victims sue and how companies respond to child safety risks on their services.

External Categories and Tags

Categories

technology, civil-rights, economy

Tags

  • online-child-exploitation (100%)
  • reporting-requirement (90%)
  • interactive-computer-service (85%)
  • civil-remedy (80%)
  • restitution (70%)
  • section-230 (65%)
  • child-victim-protection (60%)
  • cybertipline (55%)
  • penalties-and-fines (50%)
  • annual-reporting (45%)

Arguments

Arguments in support

  • The bill strengthens protections for children and former children in court by clarifying key terms, expanding who is covered, and tightening privacy safeguards around sensitive information.
  • Mandatory restitution, clearer definitions of qualifying offenses, and the ability to appoint trustees can help ensure that financial compensation reaches victims and is managed in their best interests.
  • More detailed and timely reporting to NCMEC, combined with penalties for noncompliance, may lead to faster identification and disruption of child exploitation networks online.
  • Requiring large technology companies to publish child safety transparency reports could increase public awareness and encourage stronger safety-by-design practices across the industry.
  • New criminal and civil liability for interactive computer services and app stores may create stronger incentives for platforms to prevent, detect, and remove child sexual exploitation content and activity.
  • Clarifying that section 230 does not block certain victim lawsuits may allow individuals harmed by online facilitation of abuse to seek redress directly from platforms that played a role.
  • Provisions that preserve state, Tribal, and other federal remedies help avoid weakening existing protections and allow multiple legal tools to be used against child sexual exploitation.

Arguments against

  • Expanded reporting content requirements and strict timelines may be challenging for smaller or resource-limited providers to meet, potentially increasing compliance costs and legal risk.
  • New civil and criminal liabilities for platforms and app stores could lead companies to take very cautious or broad actions, which some may view as encouraging over-removal of user content or limiting certain online services.
  • Allowing victims to sue interactive computer services and limiting section 230 protections may increase litigation against technology companies, possibly raising costs that are passed on to users or stifling innovation.
  • The requirement to include copies of apparent child pornography and extensive account data in reports, even with safeguards, may raise concerns about data security and secondary exposure to harmful material.
  • Because there is no statute of limitations for some new civil claims, companies may face long-term uncertainty and potential liability for past conduct and legacy systems.
  • Critics may worry that the law’s interaction with encryption—including evidentiary use of encryption design choices—could indirectly pressure companies about how they implement privacy and security features, even if encryption is not an independent basis for liability.

Key Facts

  • Broadens who is protected in federal court from just current child victims to any “covered person” who was under 18 when the crime occurred, including now-adult victims and child witnesses.
  • Creates a presumption that releasing a covered person’s protected information (such as name, address, or school and medical records) is harmful, and tightens standards for when courts may allow public disclosure.
  • Requires probation officers and guardians ad litem to gather and present detailed victim impact information in sentencing for child abuse and exploitation cases.
  • Mandates restitution for offenses under chapter 110 of title 18 and certain section 1466A offenses, and adjusts minimum per-victim restitution, including a $3,000 floor or 10% of total losses if total losses are under $3,000.
  • Authorizes federal courts to appoint trustees or other fiduciaries to hold and manage restitution for minors, incapacitated victims, and some foreign victims, with court-set duties, fees, and payment schedules.
  • Significantly expands the content that providers must include in mandatory CyberTipline reports, including technical identifiers, AI-generation flags, and contextual chats or messages when available.
  • Imposes criminal fines on providers that knowingly fail to file required reports or preserve material, with higher maximum fines for repeat violations and when individuals are harmed; civil penalties range from $50,000 to $1,000,000 per violation.
  • Directs that all fines and civil penalties collected under the reporting section be deposited into the Child Pornography Victims Reserve fund.
  • Requires certain large providers to submit annual child safety transparency reports; DOJ and FTC must publish them with mandatory redactions to protect safety measures and law-enforcement-sensitive details.
  • Creates a new federal offense (18 U.S.C. 2260B) for interactive computer services that intentionally host or store child pornography or knowingly promote or facilitate specific child exploitation crimes, with fines up to $5,000,000 depending on harm and risk.
  • Establishes a new civil cause of action (18 U.S.C. 2255A) allowing victims to sue interactive computer services and app stores for intentional, knowing, or reckless promotion, aiding and abetting, or hosting of child pornography and related crimes, with no statute of limitations and minimum liquidated damages of $300,000 per victim.
  • Specifies that section 230 of the Communications Act does not limit claims brought under the new civil cause of action, but preserves safe treatment for certain good-faith legal compliance and encryption practices.
  • Includes a severability clause so that if one part is found unconstitutional, the rest of the Act and amendments remain in effect.
  • Explicitly states that federal, state, and Tribal victim remedies and child-exploitation laws are not reduced or preempted by this Act, as long as they are at least as protective of victims.

Gotchas

  • The definition of “covered person” includes adults who were under 18 when the crime occurred, so confidentiality and attendant-right protections extend beyond current minors into adulthood.
  • Courts can classify additional information as “protected information” on a case-by-case basis, which may broaden what is sealed and limit public access to certain case details.
  • The bill expressly states that providers are not required to affirmatively search, screen, or scan for child pornography or related facts; duties arise when they have actual knowledge, but civil and criminal penalties attach once those duties are triggered.
  • Encryption-related design choices cannot, by themselves, create civil liability under the new section 2255A, but those choices can still be used as evidence on issues like motive or intent if they meet standard evidentiary rules.
  • The Act adds sanctions for repeated bad-faith lawsuits or defenses in the new civil actions, tying those sanctions to the existing Rule 11 framework, which may affect how both plaintiffs’ and defendants’ attorneys approach these cases.
  • Federal, state, and Tribal laws that are at least as protective of victims remain fully enforceable, so online services may need to comply with overlapping and potentially varying legal standards across jurisdictions.
