In House Committee

SCREEN Act

H.R. 1623 (SCREEN Act): Age checks on pornographic websites to block minors

119th Congress

H.R. 1623 would require certain online platforms that host pornographic or other harmful visual content to use technology to verify that users are adults. It directs the Federal Trade Commission (FTC) to enforce these rules and to audit companies. The bill also sets rules for protecting age-verification data and orders a government study of the impacts.

Bill Number
H.R. 1623
Chamber
House

What This Bill Does

The bill applies to "covered platforms," which are interactive computer services that do business in or target the U.S. and regularly create, host, or make available, for profit, visual content considered harmful to minors. "Harmful to minors" is defined using existing legal concepts covering sexual and obscene material, including child pornography and certain explicit sexual images that lack serious value for minors.

Starting one year after the bill becomes law, these platforms must put in place a technology-based age verification system. This system must make it more likely than not that a user is not a minor before the user can access the platform's harmful content; simply asking a user to click a box saying they are an adult would not be enough. Platforms must publicly explain their verification process, apply it to U.S.-based users' IP addresses (including known VPN IP addresses, unless they determine the user is not in the U.S.), and ensure that minors cannot access any harmful content.

Platforms may choose which specific age-verification technology to use and may hire third-party companies to run the checks, but the original platforms remain legally responsible for complying. They must protect any age-verification data they or their vendors collect, using reasonable security measures, and may keep the data only as long as needed to run the verification or to show they are following the law. The bill clarifies that platforms do not have to send identifying user data to the FTC to comply.

The FTC must consult with experts in computer science, child online safety, privacy, age-verification technology, and data security as it enforces the law and sets standards. The FTC is required to conduct regular audits of covered platforms, publish its audit terms and processes, define what documents or materials companies must provide, and issue nonbinding guidance within 180 days to help platforms comply.

Violations of the age-verification rules are treated as unfair or deceptive practices under the Federal Trade Commission Act, giving the FTC its normal enforcement tools and penalties. The bill also requires the Government Accountability Office (GAO) to submit a report to Congress two years after platforms must start complying. The report must examine how effective the verification measures are, how well companies are following the rules, how they protect data, and what behavioral, economic, psychological, and social effects the measures are having. The GAO may also recommend changes to enforcement and to the law itself. A severability clause says that if any part of the Act is ruled unconstitutional, the rest remains in force.

Why It Matters

This bill could change how adults and minors access pornographic and other explicit visual content online in the United States. If the systems work as intended, minors may find it harder to view this type of material on major adult-content platforms, which could affect their online experiences and exposure to such content. Online platforms that regularly host explicit content would face new technical, legal, and financial duties to verify ages and protect verification data. This may influence how these businesses operate, what technologies they adopt, and how they manage user privacy and security. The GAO report and FTC oversight could shape future laws or rules on age verification and online safety, but the exact long-term effects on privacy, free expression, and business models are not yet clear. Because similar past efforts to restrict minors’ access to online pornography have faced constitutional challenges, especially under the First Amendment, the bill’s legal durability and how courts interpret its provisions could also have broader effects on internet regulation and digital rights.

External Categories and Tags

Categories

  • technology
  • civil-rights

Tags

  • age-verification (100%)
  • online-pornography (90%)
  • interactive-computer-services (80%)
  • ftc-enforcement (70%)
  • data-security (60%)
  • minors-online-safety (60%)
  • reporting-requirement (40%)
  • third-party-vendors (30%)
  • gao-report (30%)

Arguments

Arguments in support

  • May reduce minors’ access to online pornography and other explicit visual content by replacing simple age "click-through" gates with stronger technology-based checks.
  • Gives platforms flexibility to choose among different age-verification technologies and use third-party providers, which may allow innovation and reduce implementation burdens.
  • Builds in data-protection limits by requiring reasonable security measures and minimal retention of age-verification data, which could help reduce the risk of misuse or breaches.
  • Uses the existing FTC enforcement structure for unfair or deceptive practices, which may provide a clear and familiar framework for oversight and penalties.
  • Requires consultation with experts in child safety, privacy, and technology, which could help ensure that standards are technically sound and reflect both safety and privacy concerns.
  • Mandates a GAO study on effectiveness and broader impacts, giving Congress data to adjust or improve the law in the future if needed.

Arguments against

  • Could require users to share more personal or identifying information with platforms or third-party verification services, raising privacy and data-security concerns even with protections in place.
  • Compliance costs and legal risks may be high for covered platforms, which could affect smaller or newer businesses more than large, established companies.
  • Some may see the definitions of "harmful to minors" and the blocking requirements as potentially affecting lawful adult access to content, raising free-expression and constitutional questions.
  • Treating violations as unfair or deceptive practices may lead platforms to adopt overly strict measures to avoid liability, possibly blocking more content or users than necessary.
  • Mandatory IP-based checks, including for known VPN addresses, may encourage more aggressive tracking of user locations or technical methods that some view as intrusive.
  • The effectiveness of age-verification technologies in actually preventing determined minors from accessing explicit content remains uncertain, so the benefits may be limited compared with the burdens.

Key Facts

  • Only platforms that are interactive computer services, engaged in or targeting U.S. commerce, and regularly creating, hosting, or making available harmful visual content for profit are covered.
  • "Harmful to minors" includes certain explicit depictions that appeal to prurient interest, are patently offensive for minors, and lack serious value for minors, as well as obscene material and child pornography.
  • Covered platforms must, within one year of enactment, implement a technological age-verification system that does more than simple self-attestation (for example, more than a "click to confirm" age gate).
  • Age verification must be applied to users' IP addresses, including known VPN IPs, unless a platform determines the user is not located in the United States.
  • Platforms may select their own verification technologies and may use third-party vendors, but remain fully responsible for compliance and liability.
  • Platforms must publish information describing their age verification process.
  • Any data collected for verification must be secured with reasonable measures, protected from unauthorized access, and retained only as long as reasonably needed for verification or to show compliance.
  • The FTC must issue compliance guidance within 180 days, conduct regular audits of covered platforms, and publicly describe its audit terms and processes.
  • Violations of the age-verification requirements are treated as unfair or deceptive acts or practices under the Federal Trade Commission Act, giving the FTC its standard enforcement powers and penalties.
  • The GAO must report to Congress two years after the compliance date on effectiveness, compliance rates, data security practices, broader impacts, and possible enforcement or legislative improvements.
  • A severability clause preserves the rest of the Act if any specific provision or its application is found unconstitutional.

Gotchas

  • The law covers platforms that regularly make harmful content available for profit, even if that content is not their sole or main business, which could pull in mixed-content sites or services with user-generated material.
  • Platforms are not required to send user-identifying data to the FTC, but they must still keep enough records to prove compliance in audits, which may shape how they design their systems and logs.
  • The FTC’s guidance is explicitly nonbinding and cannot itself be the basis for enforcement; only violations of the statute’s text can be charged, which may create a gap between recommended and legally required practices.
  • Age verification must cover all harmful content on the platform, not just clearly labeled adult sections, which might push some platforms to redesign or segregate content to simplify compliance.
  • The requirement to consider known VPN IP addresses could affect users who rely on VPNs for general privacy or security reasons, even when accessing lawful content.
