Standards and guidelines to ensure AI systems in healthcare do not perpetuate racial or ethnic biases.
Answer what matters
Skip any question. Your message uses only the topics and provisions you choose to answer.
Related legislation
5 related bills are tracked for context, but none have a time-sensitive action window right now.
Track what happens
After sending, you can opt in to updates as votes, cosponsorships, and related bills move.
Official status
Enter your ZIP to see recorded votes, cosponsorships, and petition signatures tied to this subject.
We show sourced records and reviewed statements. We do not infer a broader issue stance from party, ideology, or a single unrelated vote.
Tell us where you stand
Answer the policy questions below or skip any that do not fit your view. We'll map only your answers to the bills in Congress and draft your message.
1 bill on this topic
“Companies and agencies that build or use important algorithms should have clear duties, honest marketing, and written rules for how data and responsibility are handled.”
1 bill on this topic
“Companies should have to check powerful computer systems before they use them to help make major decisions about people's lives.”
1 bill on this topic
“Federal app-worker rules should clearly say which platforms must follow them and should not be used as an excuse to take away flexible work access.”
1 bill on this topic
“Companies should have to check automated tools for serious risks before those tools affect people's lives.”
1 bill on this topic
“Licensed health workers should be able to use their own judgment and overrule artificial intelligence when a patient or the law requires it.”
1 bill on this topic
“The Federal Trade Commission should have the staff, expertise, and legal power it needs to police high-stakes AI systems.”
1 bill on this topic
“Health care employers that use artificial intelligence should have clear rules, train workers, and put guardrails on how the technology is used.”
2 bills on this topic
“People should be able to learn when major decisions about them are helped by computer systems and where to go if they need to challenge a decision.”
1 bill on this topic
“States and the federal government should both be able to protect people from harmful automated decision systems.”
Optional, but recommended. Your selections come from relevant bills in Congress; if something is missing or you want a specific point included, add it here.
Example: My daughter's school closed twice last fall because of wildfire smoke.
Step 2 of 3 · Add your info next
Your message will cover 5 bills in Congress
A Yale field experiment found that legislators shown actual district opinion shifted their votes to match it. The ones kept in the dark? No relationship between constituent views and how they voted.
Offices log, sort, tag, and tally incoming contact, then brief the member. Constituent communications eat roughly a third of House staff resources. Your message gets counted.
92% of staff say individualized messages influence undecided lawmakers — versus 56% for form letters. Naming a specific bill with your own reasoning puts you in a different category entirely.
When offices don’t hear from constituents, they ask lobbyists instead. Not contacting your rep doesn’t leave the scale empty — it hands the weight to someone else.
These are related bills tracked for context. None have a time-sensitive action window on this subject right now.