In House Committee

Children and Teens’ Online Privacy Protection Act

H.R. 6291 – Children and Teens’ Online Privacy Protection Act to expand COPPA and ban targeted ads to minors

119th Congress

H.R. 6291 updates the Children’s Online Privacy Protection Act (COPPA) to cover both children under 13 and teens ages 13–16. It adds new limits on how websites, apps, and online services can collect, use, store, and share young people’s personal data and bars targeted advertising to them. The bill gives the Federal Trade Commission (FTC) new reporting duties and clarifies that federal rules override conflicting state laws in this area.

Bill Number: H.R. 6291
Chamber: House

What This Bill Does

The bill widens who is covered by COPPA to include “teens,” defined as ages 13 to 16, not just children under 13. It updates the definition of “operator” so that most commercial websites, online services, online apps, and mobile apps that collect young users’ data are covered, but it excludes tax‑exempt charities. It also broadens the list of what counts as personal information, adding items such as precise location data, biometric data (such as fingerprints or face templates), photos, videos, audio containing a child’s or teen’s voice, and data that can be linked back to a child, teen, or their parent.

Operators may not collect, use, disclose, or keep children’s or teens’ personal information for the purpose of individual‑specific (targeted) advertising to minors. They also may not collect personal information from a child or teen unless it fits the context of the service, is needed to complete a transaction or provide a requested service, or is required or allowed by law. Operators must delete personal information once it is no longer reasonably needed for the requested service, unless law requires them to keep it. If an operator stores, transfers, or gives access to children’s or teens’ data in certain foreign “covered nations,” it must notify the parent or teen.

The bill replaces “parental consent” with a broader “verifiable consent” standard that can come from a parent (for children) or from the teen. Operators must provide clear, easy‑to‑find privacy notices and obtain verifiable consent before collecting personal information in many cases, and again before any material change in how they use or share that information. Parents of children, and teens themselves, gain rights to see what information has been collected, delete it, refuse further use or collection, and challenge and correct inaccurate data. Operators must also maintain reasonable security practices to protect children’s and teens’ data against unauthorized access.
For services used in schools, the bill allows operators to rely on agreements with educational agencies or schools instead of direct consent from each parent or teen, as long as data are used only for educational purposes and certain notice, access, and deletion rights are provided through the school.

The bill directs the FTC to write or update regulations to carry out these requirements, including rules on when operators can end service if consent is refused and when they must continue service even if data are deleted. It also requires the FTC to study and, if feasible, allow a “common verifiable consent mechanism” that could cover multiple related services, and it gives the FTC ongoing reporting duties to Congress about enforcement and about how large social media companies comply.

Finally, the bill changes preemption so that states cannot enforce or create laws that conflict with this federal act in the same subject area. It clarifies that state attorneys general can bring actions to enforce violations of the core requirements. It also requires that any FTC rulemaking under this law include an analysis of the impact on small businesses, and provides that if one part of the act is struck down, the rest can stay in effect.

Why It Matters

This bill would significantly expand federal online privacy protections beyond young children to include most high‑school‑age teens. Many popular websites, apps, and social media platforms currently rely on collecting and analyzing user data, including for targeted advertising. Under this bill, using personal data of children and teens for individual‑specific advertising would be banned, which could change how these services are designed and funded for younger users. Families could gain clearer rights to see, delete, and correct information that services hold about their children and teens. Teens would also gain more control over their own data, including the ability to refuse further collection or use in many cases.

Schools and educational technology providers would face more detailed rules on the use of student data for educational purposes, which could affect how digital tools are chosen and managed in classrooms. For companies, especially large social media platforms, the bill could create new compliance duties, including stronger age‑awareness systems, data‑security measures, and record‑keeping to show how they treat minors’ data. Because the bill preempts state laws in this area, it could replace a patchwork of different state requirements with a single national standard, though the exact effects on existing state privacy laws would depend on how those laws overlap and how the FTC writes its rules.

External Categories and Tags

Categories

technology, civil-rights

Tags

online-privacy (100%), children-and-teens (95%), data-collection (90%), social-media (80%), parental-consent (75%), targeted-advertising (70%), ftc-enforcement (65%), education-technology (55%), data-deletion (50%), data-security (45%)

Arguments

Arguments in support

  • Expanding coverage to teens reflects how older minors widely use the internet and social media and face similar privacy risks as younger children.
  • Banning individual-specific targeted advertising to minors could reduce data-driven profiling and marketing practices that rely on tracking and predicting young people’s behavior.
  • Stronger rights to access, delete, and correct data may give families and teens more meaningful control over their digital footprints.
  • Clear national standards and federal preemption could simplify compliance for companies that currently face differing state rules, while still providing a baseline of protections for all U.S. minors.
  • Allowing school-based agreements for educational technology could support the use of digital learning tools while keeping student data limited to educational purposes.
  • The requirement for security practices and limits on foreign storage or access may reduce risks of data breaches or exposure of minors’ data in certain foreign jurisdictions.
  • Regular FTC reporting and potential common consent mechanisms could improve transparency and reduce friction for parents and teens managing privacy across multiple services.

Arguments against

  • The ban on targeted advertising to minors may reduce ad revenue for services that rely on it, possibly leading to fewer free or low-cost online offerings for young users.
  • Compliance with new consent, notice, access, deletion, and security requirements could be complex and costly, particularly for smaller companies and app developers.
  • The broad preemption clause might limit states’ ability to adopt stronger or more tailored privacy protections for children and teens.
  • Defining and enforcing who counts as a child or teen online may require more age estimation or verification, which could increase data collection or raise other privacy concerns.
  • Exceptions for school agreements could shift important consent and oversight decisions from individual parents and students to educational agencies, which may not always reflect family preferences.
  • The expanded definition of personal information and new obligations could create legal uncertainty until the FTC issues regulations and courts interpret key terms.
  • Notice requirements for data stored in certain foreign countries may not fully address broader concerns about cross-border data flows involving minors.

Key Facts

  • Extends COPPA protections to “teens,” defined as individuals over 12 and under 17 (i.e., ages 13–16), in addition to children under 13.
  • Broadens the definition of personal information to include geolocation, biometric identifiers, persistent identifiers across sites/apps, and data linkable to a child or teen or their parent.
  • Prohibits operators from collecting, using, disclosing, or maintaining children’s or teens’ personal information for the purpose of individual-specific (targeted) advertising to minors.
  • Limits data collection from children and teens to what is consistent with the context of the service or required/authorized by law, and requires deletion once information is no longer reasonably needed for the requested transaction or service.
  • Requires notice to parents or teens if their personal information is stored in, transferred to, or accessed by a “covered nation” as defined in federal law.
  • Gives parents and teens rights to access, delete, correct, and limit further use or collection of personal information, subject to certain exceptions (such as legal retention requirements or law enforcement needs).
  • Requires operators to implement reasonable security practices to protect the confidentiality, integrity, and accessibility of children’s and teens’ personal information.
  • Allows certain school or district agreements to substitute for direct verifiable consent, if student data are used only for educational purposes and parents/teens can access and request deletion through the educational agency.
  • Instructs the FTC to assess and report on the feasibility of a “common verifiable consent mechanism” and, if feasible, to adopt rules allowing its use across multiple related services.
  • Requires the FTC to report to Congress on its enforcement of COPPA each year and on high-impact social media companies’ compliance within three years.
  • Strengthens federal preemption by barring states from enforcing or adopting laws that relate to the provisions of this act, while still allowing state enforcement of the federal standard.
  • Mandates that all FTC regulations under this title include analysis of impacts on small entities under the Regulatory Flexibility Act.

Gotchas

  • High-impact social media companies are held to a broader “knowledge” standard that includes willful disregard, which may subject them to greater enforcement risk than smaller services.
  • The bill allows operators to terminate service to a child or teen who refuses to allow further data collection, but separately bars discontinuing service solely because a user requests deletion, if service can be provided without that data.
  • Audio files containing a child’s or teen’s voice are excluded from “personal information” only under narrow conditions, including strict limits on use and retention and clear notice in the privacy policy.
  • Operators may retain limited records of deletion or correction requests and certain security-related information, even after other personal data are deleted, for purposes such as fraud prevention and ensuring deletions remain effective.
  • State attorneys general retain a role in enforcing the federal standard, but they must coordinate with the FTC, which can affect how quickly or aggressively state-level actions proceed.
