Algorithmic Transparency and Choice Act
H.R. 6253 – Algorithmic Transparency and Choice Act for Minors on Online Platforms
119th Congress
H.R. 6253 would require online platforms that use personalized recommendation systems to give minors clear notices and choices about how those systems work. It sets rules for what information must be shared and gives minors options to change or avoid personalized feeds. The Federal Trade Commission (FTC) would enforce the law, and it would override similar state rules on these specific requirements.
- Bill Number: HR6253
- Chamber: House
What This Bill Does
The bill applies to public‑facing online platforms, like social networks and video‑sharing services, that use automated systems to choose and rank content for users under age 18. These minors are called “covered users,” and the platforms are “covered online platforms.” One year after the law takes effect, these platforms must follow new rules when minors use their services.

Platforms would have to show a clear notice the first time a minor interacts with a personalized recommendation system, telling them that such a system is being used to select what they see. Platforms must also include, in their terms and conditions, a plain‑language explanation of how the recommendation system works. This includes what features, inputs, and key settings it uses, what kinds of user‑specific data it collects or infers, how that data is gathered, and what choices the minor has to opt out, change their profile, or influence the system. The platform must also explain what kinds of engagement or other quantities the system is designed to optimize and how much weight each carries in ranking content.

The bill requires platforms to give minors an easy way to switch between a personalized recommendation system and an “input‑transparent algorithm,” and to limit the types or categories of recommendations they receive. For minors, the default setting must be the input‑transparent algorithm, which cannot use a minor’s past behavior or inferred traits to rank content unless the minor clearly provides specific data for that purpose, like a search term or a saved preference. The bill spells out what counts as data the user expressly provides and what does not, and defines related terms like minor, user‑specific data, and geolocation information.

The Federal Trade Commission would treat violations of these rules as unfair or deceptive acts under existing law, with the same powers, penalties, and procedures it already has.
The bill clarifies that platforms do not have to reveal trade secrets, confidential business information, or privileged information. It also makes clear that platforms may still allow users to block or restrict other users. Finally, the bill prevents states and localities from enforcing their own laws that cover the same specific notice and choice requirements in subsection (a), giving the federal rules priority in that area.
Why It Matters
This bill focuses on how online services present content to minors and how much control those minors have over recommendation systems. Many platforms use personalized algorithms based on a user’s behavior and data to decide what posts, videos, or messages to show next. The bill would shift the default for minors to a less personalized, more transparent feed and require clearer explanations of how personalization works. For families and young users, this could affect what minors see online, how easily they can reduce personalization, and how their data is used for content ranking. For online platforms, it could require design changes to feeds, settings, and terms of service, as well as new tools to let minors opt out or limit recommendations. The exact effects on user experience, safety, or time spent online are not specified in the bill and may depend on how platforms implement these requirements.

Because the bill gives the FTC enforcement authority and limits overlapping state rules on these specific requirements, it would create a single federal standard for notices and algorithm choices for minors on covered platforms. This may simplify compliance for large services that operate nationwide, while also preventing states from adopting different or stricter rules on the same narrow topic of how these notices and options are provided.
Arguments
Arguments in support
- Could increase transparency by giving minors clearer information about how recommendation algorithms shape what they see online.
- May enhance user control by making a less personalized, input-transparent feed the default for minors and offering easy switches and limits on recommendation categories.
- Provides a uniform federal standard for notices and algorithm options, which may make compliance clearer for platforms that operate nationwide.
- Uses existing FTC enforcement structures, which may allow for quicker implementation and consistent enforcement compared with creating a new agency.
- Limits disclosure obligations to avoid exposing trade secrets or confidential business information, which may reduce security and competitiveness concerns for platforms.
Arguments against
- Could impose technical and design burdens on platforms, especially smaller services, which may find it costly to build and maintain separate default algorithms and controls for minors.
- Preemption of state laws on these specific requirements may prevent states from adopting different or stricter rules they see as better tailored to local needs.
- The definitions of input-transparent algorithms and user-specific data may be complex, making it harder for platforms to know if they are fully compliant.
- Some may be concerned that providing more settings and choices to minors will not necessarily lead to meaningful use of those options, limiting the practical impact.
- Others may worry that the bill focuses on transparency and choice rather than directly addressing harmful or sensitive content that minors might still access, even with these tools in place.
Key Facts
- Applies to “covered online platforms” that use personalized recommendation systems and provide user-generated content, excluding broadband providers and email.
- Covers “covered users,” defined as minors (under 18) who register an account or create a profile on the platform.
- Requires a clear, conspicuous notice the first time a minor interacts with a personalized recommendation system stating that such a system selects the content they see.
- Requires detailed, plain-language disclosures in the platform’s terms and conditions about how the recommendation system works, what data it uses, how that data is collected or inferred, and what options the minor has to opt out or adjust it.
- Mandates an easy option for minors to switch between a personalized recommendation system and an input-transparent algorithm.
- Requires an option for minors to limit the types or categories of recommendations they receive from personalized systems.
- Sets the input-transparent algorithm as the default setting for minors, meaning it generally cannot use behavioral or inferred data unless the minor expressly provides it for ranking.
- Defines what counts as user-supplied data for ranking (such as search terms, filters, saved preferences, and current location) and what is excluded (such as browsing history, past locations, activity, and inferences).
- Treats violations as unfair or deceptive acts under the Federal Trade Commission Act, giving the FTC full enforcement powers and penalties.
- Protects proprietary information by stating that platforms do not have to disclose trade secrets, confidential business information, or privileged material.
- Preempts state and local laws that impose their own legal requirements covering the same notice and algorithm-choice obligations in subsection (a).
Gotchas
- The default for minors is not a simple chronological feed; it is an “input-transparent algorithm,” which can still use some user-supplied data like search terms and current location.
- The bill allows use of age information for systems that restrict access to content based on age-appropriateness without counting those systems as personalized recommendation systems.
- Security, spam-filtering, and fraud-prevention systems are excluded from the definition of personalized recommendation systems, so they are not subject to these notice and choice rules.
- Platforms remain free to offer block and restrict tools that let users control who can see or interact with their accounts and content.
- The preemption clause is narrow: it only blocks state and local laws that “cover the requirements of subsection (a),” leaving room for other types of state regulation of online platforms that do not duplicate these specific notice and choice duties.
