Kids Online Safety Act
H.R. 6484 – Kids Online Safety Act: safety protections for minors on internet platforms
119th Congress
H.R. 6484 sets national rules for how large online platforms must protect users under age 17. It focuses on safety tools, limits on harmful content and addictive features, audits, and clear notices to kids and parents. It would be enforced mainly by the Federal Trade Commission and State attorneys general and would take effect 18 months after enactment.
- Bill Number: H.R. 6484
- Chamber: House
What This Bill Does
The bill creates new duties for certain online platforms, called “covered platforms,” that are open to the public, use accounts and user names, rely on user‑generated content, use engagement‑boosting design features, and use personal data for ads or recommendations. These platforms must create and enforce reasonable policies to address harms to minors, including threats of physical violence, sexual exploitation and abuse, illegal drugs, tobacco, cannabis, gambling, alcohol, and financial harm from deceptive practices. The bill states that minors can still search for information and get resources about these topics, and it bars government enforcement that targets speech based on viewpoint.
Covered platforms must offer special safeguards for users they know are minors. These include tools to limit who can contact the minor, default limits on design features that cause compulsive use, and an easy option for minors to limit their own time on the platform. By default, minors must get the most protective privacy and safety settings the platform offers. Platforms must also give parents tools to see and, in the case of children under 13, control privacy and account settings, restrict purchases, and view and limit time spent on the platform. Platforms must notify minors when parental tools are active and, for known children, turn these tools on by default unless parents previously opted out under similar tools.
The bill requires platforms to set up easy reporting systems for harms to minors, including a dedicated electronic contact point, confirmation of receipt, and a substantive response within set time frames. Platforms may not advertise narcotic drugs, cannabis, tobacco, gambling, or alcohol to users they know are minors. The bill sets rules so that safeguards and parental tools are clear, age‑appropriate, and in the same language and format as the service, and it allows use of third‑party services to help provide these tools. It also states that platforms are not required to share minors’ browsing history, messages, or contact lists, and are not forced to collect more age data than they already do in the normal course of business.
Before a known minor registers or makes a purchase, covered platforms must clearly explain their minor‑safety policies and how to use the safeguards and parental tools. For known children under 13, they must inform a parent about these tools and get verifiable parental consent, which can be combined with existing notice and consent steps under the Children’s Online Privacy Protection Act. Platforms must also clearly label when users’ endorsements of products or services are paid or made for commercial consideration.
The bill requires each covered platform to undergo an independent third‑party safety audit every year. Auditors must review the platform’s safeguards and methods to reduce harms, and must consult with parents, nonprofits, health experts, and free expression experts. The audit must cover usage by minors, the number of minor users, time spent on the platform, use of safeguards and parental tools, reports of harm, how the platform handles those reports, how it collects and uses minors’ personal information, and how it designs and evaluates engagement features used by minors. Platforms must cooperate fully and submit audit results to the Federal Trade Commission.
Enforcement is mainly through the Federal Trade Commission, which can treat violations as unfair or deceptive acts and use its usual powers and penalties. State attorneys general and certain State officials can also sue in State or Federal court to stop violations, seek money for residents, and obtain other relief, though they may not bring overlapping cases while a federal case is pending.
The bill also creates a Kids Online Safety Council within the Department of Commerce to study online risks and benefits for minors, recommend methods to reduce harms, suggest research topics, and propose best practices for audits and reports. The Council includes experts, parents, minors, educators, platforms, civil liberties experts, and State officials, and it must issue a final report within three years, after which it terminates.
Finally, the bill states that it does not change the Children’s Online Privacy Protection Act, the scope or meaning of Section 230 of the Communications Act, or certain Federal Trade Commission rules. It allows platforms to cooperate with law enforcement and respond to legal demands and security threats. It preempts State and local laws that relate to the provisions of this Act, sets an 18‑month delayed effective date, and includes a standard severability clause so that if one part is struck down, the rest can remain in force.
Why It Matters
This bill would set nationwide rules for how a wide range of popular online platforms must treat users under 17, which could change how social media, apps, and websites look and work for kids and teens. It aims to reduce specific harms, like exposure to illegal drugs, sexual exploitation, gambling, and financial scams, and to give minors and parents more control over time spent online and over who can contact young users. For parents and guardians, the bill would create clearer rights and tools to manage children’s accounts, spending, and screen time, especially for kids under 13. For platforms, it would add ongoing duties such as annual third‑party audits, record‑keeping, and responding quickly to reports of harm, which may affect business practices, product design, and compliance costs. The law would also set a single federal standard that could replace a patchwork of differing State rules, while explicitly trying to avoid conflicts with existing privacy and speech protections.
The real‑world effects on youth mental health, online behavior, and speech are not fully known and would likely depend on how platforms implement the safeguards, how strongly regulators enforce the rules, and how families choose to use the new tools. The Council’s reports and future research could lead to later changes or additions to online safety policy.
Arguments
Arguments in support
- Sets consistent nationwide safety standards for minors on major platforms, reducing confusion from differing State rules and giving families clearer expectations.
- Increases control for minors and parents through default‑on safeguards, time limits, and parental tools, which may help manage screen time and unwanted contacts.
- Targets specific, concrete harms—such as sexual exploitation, illegal drugs, gambling, and deceptive financial practices—rather than attempting to regulate all online content.
- Uses regular independent audits and detailed reporting to increase transparency and accountability for how platforms treat young users and design engagement features.
- Preserves First Amendment protections and clarifies that the law does not change the meaning of Section 230 or key federal privacy laws, which some see as protecting free expression and existing legal frameworks.
- Establishes an expert council with diverse viewpoints, including parents, minors, platforms, and civil liberties experts, to give ongoing, research‑based guidance on online safety for kids.
- Allows platforms flexibility to tailor safety measures based on their size and technical capacity, which may help smaller services comply without having to build the same systems as large platforms.
Arguments against
- Could encourage more age estimation or age verification practices by platforms, even though the bill says they do not have to collect new age data, raising privacy and identification concerns for all users.
- Broad duties to prevent harms and to limit “design features” that drive engagement might lead platforms to over‑restrict content or features for minors, potentially affecting access to lawful information and expression.
- The annual independent audit and detailed data and reporting requirements may be costly and complex for platforms, especially medium‑sized or fast‑growing services, which could affect innovation or market entry.
- Federal preemption of State laws may limit States’ ability to adopt stricter or more tailored protections for minors based on local needs or faster policy responses.
- Giving parents strong control over accounts, especially for older teens, may raise concerns for some families about teen privacy, autonomy, and safety in households with conflict or abuse.
- The definitions of covered platforms and harms, and the requirement to respond to reports within tight deadlines, may create legal uncertainty and litigation risk while platforms and regulators work out how to apply the standards.
Key Facts
- Applies to “covered platforms” that are public internet services with user accounts, user‑generated content, engagement‑boosting design features, and data‑driven ads or recommendations.
- Requires covered platforms to create and enforce reasonable policies to address specified harms to minors, including physical threats, sexual exploitation, certain illegal or age‑restricted products, and deceptive financial practices.
- Mandates default high‑protection privacy and safety settings for users the platform knows are minors, and default activation of parental tools for users the platform knows are children under 13.
- Requires platforms to offer minors tools to limit who can contact them, to limit compulsive design features, and to cap time spent on the platform.
- Prohibits platforms from facilitating advertising of narcotic drugs, cannabis, tobacco, gambling, or alcohol to users they know are minors.
- Requires clear notices about safeguards, parental tools, and paid endorsements, and ties parental consent duties to the existing Children’s Online Privacy Protection Act framework.
- Establishes a mandatory independent third‑party audit for each covered platform within 1 year of enactment and annually thereafter, with detailed reporting to the Federal Trade Commission.
- Treats violations as unfair or deceptive acts under the Federal Trade Commission Act and allows enforcement by both the FTC and State attorneys general or State officials, subject to limits when a federal case is pending.
- Creates a Kids Online Safety Council within the Department of Commerce to study risks and benefits, recommend best practices, and issue a final report within 3 years, after which the Council terminates.
- Expressly preempts State and local laws that relate to the provisions of this Act and states that it does not alter COPPA or the scope or meaning of Section 230 of the Communications Act.
- Becomes effective 18 months after enactment, with existing parental opt‑out choices for some tools respected when the new rules take effect.
Gotchas
- The bill’s definition of “covered platform” excludes some online services that do not meet all criteria (for example, sites without user‑to‑user following or engagement‑driven design features), so not every service used by minors would be covered.
- While the bill imposes strong duties tied to age, it also states that platforms are not required to collect additional personal information to determine age, leaving a tension between knowing who is a minor and limiting data collection.
- The Kids Online Safety Council is exempt from the usual federal advisory committee rules (FACA), which affects transparency and procedural requirements for its meetings and operations.
- State enforcement is allowed, but if the FTC or U.S. Attorney General brings a case first, State officials cannot bring overlapping actions against the same defendant for the same alleged violations while the federal case is pending.
- The bill explicitly protects platforms’ ability to cooperate with law enforcement and respond to legal demands and security incidents, which may involve sharing user information in certain situations even while aiming to protect minors’ privacy in other areas.
