As screen time among children and teens reaches record highs, lawmakers in Washington are grappling with how to make the internet a safer place for young users. At the center of that debate is the Kids Online Safety Act (KOSA)—a sweeping piece of bipartisan legislation that aims to hold social media companies and tech platforms accountable for the risks their products pose to minors.
First introduced in 2022, the bill has evolved significantly over the last three years. While it has seen strong support in the U.S. Senate, it remains stalled in the House of Representatives, where concerns over free speech and parental rights continue to slow its progress. In the meantime, states are moving forward with their own versions of online safety laws, creating a patchwork of rules across the country.
What Is the Kids Online Safety Act?
The Kids Online Safety Act is designed to address the growing mental health crisis among children and teens, which experts and parents alike have increasingly linked to the rise of social media. The bill’s core goal is to create a legal obligation for tech companies to prioritize the safety of minors in the design and operation of their platforms.
At the heart of KOSA is a new “duty of care” requirement. This would compel companies to prevent and mitigate the exposure of minors to harmful content—including material that promotes self-harm, eating disorders, bullying, sexual exploitation, or substance abuse. To comply, platforms would need to re-engineer their algorithms and moderation systems to reduce the visibility and spread of such content.
The bill also includes several user-facing protections:
- Privacy settings would default to the most protective level for users under 17.
- Minors and parents would gain access to tools that let them disable addictive features, restrict interactions, or limit algorithmic recommendations.
- Data transparency and access requirements would allow independent researchers to study platform impacts on youth well-being.
- Annual safety audits would be mandated to ensure compliance and hold platforms accountable.
The Legislative Journey
KOSA has enjoyed broad bipartisan support, thanks in part to the growing recognition that existing regulations have not kept pace with the rapidly evolving digital landscape. In 2024, the bill passed the Senate by a wide margin, bundled with an update to the Children’s Online Privacy Protection Act (commonly known as COPPA 2.0), which would extend privacy protections to teens under 17.

The bill was reintroduced in 2025 with revised language aimed at addressing concerns from civil liberties advocates. The updated version includes specific provisions that prevent enforcement based on the viewpoint of any particular content, seeking to eliminate fears that the law could be used to censor protected speech.
Despite these revisions, the bill has not yet advanced in the House. A planned markup earlier this summer was postponed, and it’s unclear whether House leadership will prioritize the bill before the end of the legislative session.
A Divisive Debate
KOSA has drawn support from a wide range of voices, including parents, pediatricians, education groups, and even some large tech companies. Supporters argue that without regulation, platforms will continue to prioritize profit and engagement over user well-being. They point to rising rates of anxiety, depression, and suicide among teenagers—especially girls—as evidence that self-regulation by the tech industry has failed.
But the bill has also sparked strong opposition. Civil rights organizations warn that the “duty of care” language is too vague and could be interpreted in ways that suppress lawful content. LGBTQ+ advocacy groups, in particular, fear the law could be used to restrict access to identity-affirming content under the guise of protecting minors from harm.
Conservative groups have voiced different concerns. They argue that the bill could lead to increased government surveillance, reduce parental control, and place too much power in the hands of federal agencies to determine what is and isn’t harmful content.
The Federal Trade Commission, which would be tasked with enforcement, has also become a political flashpoint, with critics questioning its neutrality under shifting administrations.

State-Level Action
With KOSA stalled at the federal level, several states have taken matters into their own hands. Tennessee, Mississippi, Texas, and Utah are among the states that have passed laws requiring social media companies to verify the ages of users and obtain parental consent for minors. Many of these laws have been challenged in court on constitutional grounds, and their long-term viability remains uncertain.
Still, the growing momentum at the state level suggests a rising tide of public concern. For parents, educators, and child welfare advocates, the urgency to act has only increased as stories of online exploitation and mental health struggles continue to make headlines.
What Comes Next?
KOSA’s supporters in Congress are exploring several paths forward. One option is to attach the bill to a larger legislative package later this year—perhaps a government funding bill or broader tech regulation effort. Others are lobbying for renewed committee hearings or pushing for a standalone vote in the House.
Whether the bill moves forward as-is, gets rewritten, or remains indefinitely stalled, one thing is clear: the debate over how to protect kids online is far from over. Lawmakers on both sides of the aisle appear committed to finding a solution, but striking the right balance between safety, privacy, and free expression remains a formidable challenge.
As the internet continues to shape nearly every aspect of young people’s lives, the pressure on Congress to act—carefully and decisively—will only continue to grow.