By Morgan Wilsmann
December 8, 2025
The House Committee on Energy and Commerce’s Subcommittee on Commerce, Manufacturing, and Trade held a hearing on December 2 to discuss a package of 19 bills, dubbed “Legislative Solutions to Protect Children and Teens Online” (“the Kids Package”). Some of these bills have been making the rounds for years and may be familiar even to those outside the tech policy world – namely the Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act (known as COPPA 2.0). Others are newer attempts to address very real harms kids face online, including concerns around AI chatbots.
The political pressure to act is immense, with nearly every week bringing another devastating story of a child harmed after using an online platform – yet few new federal laws have passed to regulate these platforms. A previous version of KOSA passed with an astonishing 91 votes in the Senate in July 2024, only to stall in the House when Speaker Mike Johnson blocked the floor vote over free speech concerns. Many kids-focused online safety bills meet a similar fate: these are speech platforms, after all, and Congress shall make no law abridging the freedom of speech. There are fundamental tradeoffs in most of the bills in the Kids Package, which makes consensus elusive.
Complicating things, reporting surfaced that lawmakers were considering attaching the (so far failed) AI regulation moratorium to the Kids Package and shoehorning both into the must-pass National Defense Authorization Act (NDAA). Proponents theorized that the first attempt at an AI moratorium failed because of bipartisan concern that it could hamstring states’ ability to enact laws regulating AI and protecting kids. Under this theory, pushing through some of the bills in the Kids Package would assuage concerns about kids’ online safety. This theory is a mess for a number of reasons, even in a world where many of these consistently controversial kids’ bills miraculously gained broad support.
For one, opposition to the AI moratorium is not limited to concerns about kids’ online safety laws. Lawmakers and advocates worry about AI’s effects on employment, discrimination, surveillance, the integrity of information, and about a million other issues. Since states are more agile than Congress, they need the flexibility to experiment with new regulations for emerging technologies to safeguard their citizens and maintain economic competitiveness – a point reiterated in the December 2 hearing by opponents of preemption language in the Kids Package bills.
After years of hearings about kids’ safety online, Congress has little to show for its efforts. Beyond KOSA and COPPA 2.0, the latest package takes a scattershot approach, throwing spaghetti at the wall to see what sticks. Proposals range from user safeguards to privacy-invasive surveillance requirements to outright bans on minors using certain platforms. Nearly every bill has problematic elements, with preemption language and Federal Trade Commission enforcement among the most common issues.
State Preemption as the Poison Pill
The broad state regulation preemption language in several bills is especially concerning. Although uniform federal standards offer benefits, they also hinder states from experimenting with various approaches to children’s online safety – a critical point given our decidedly unproductive Congress. It would be unfortunate if preemption clauses in Kids Package bills set an upper limit on online safety legislation: federal law should establish minimum standards for digital regulation, setting a baseline rather than a ceiling. Many lawmakers at the December 2 hearing voiced this view as well, suggesting that preemption language further complicates already difficult negotiations – with KOSA perhaps the most contentious bill of all.
We’ve seen many versions of KOSA since it was first introduced in 2022. The latest version defangs the bill’s most controversial element, the “Duty of Care,” which would have required platforms to mitigate certain harms or face lawsuits. People across the political spectrum argued that the duty of care could lead platforms to overmoderate, removing broad categories of content that could be harmful to kids but also useful to them. For example, a commonly discussed online harm, especially to teenage girls, is content glorifying eating disorders. Yet the duty of care provision could sweep up not only content that encourages eating disorders, but also content offering resources for those recovering from them.
The current House version instead requires platforms to establish “reasonable policies, practices, and procedures” to address specific harms, including threats of violence, sexual exploitation, drug sales, gambling, and financial fraud. This change, in our view, makes the bill more workable. KOSA also includes other provisions we can get behind, such as requiring default settings for high-risk features like going live or messaging with strangers, with the option for users to opt in to those features with parental consent or age confirmation. However, the House bill also includes language preempting state laws related to KOSA’s provisions, which could hamstring more comprehensive kids’ online safety regulation in the states – making the House version of KOSA one we cannot support. One step forward, two steps back.
The Problem With FTC Enforcement
Many of these bills rely on FTC enforcement, supplemented by state attorney general actions. However, these bills would not only add substantial new responsibilities to an agency already stretched thin; President Trump’s FTC has also made no secret of its willingness to pursue culture-war issues and political vendettas rather than act in accordance with its statutory mission. This is especially true given that the FTC still lacks Democratic commissioners and that Commissioner Rebecca Slaughter remains barred from performing her appointed duties. As we asserted when the President first terminated Commissioners Slaughter and Alvaro Bedoya, “the FTC was designed explicitly as a bipartisan, independent agency to protect consumers from industry abuses.”
This FTC has repeatedly acted on ideological grounds rather than fulfilling its congressionally mandated duties. For instance, it issued a civil investigative demand to Media Matters for America, a liberal media watchdog nonprofit, over its research on X’s placement of advertising next to pro-Nazi and antisemitic content. A federal court found that the FTC acted with “retaliatory animus” and that Media Matters suffered irreparable First Amendment injury.
As Public Knowledge wrote in an amicus brief to the Supreme Court in Trump v. Slaughter, “A commission that can be directed to investigate critics, or shield allies, no longer functions as a neutral enforcer of law.” It is not hard to imagine this FTC going after social media companies for failing to act in ways that please the President, using the enforcement authority granted by some of these kids’ safety bills as justification.
What Next
Fortunately, the idea of tying the AI moratorium to the Kids Package and shoehorning both into the must-pass NDAA was quickly abandoned. Only a few of the bills in the Kids Package have bipartisan support, while many face bipartisan opposition. Meanwhile, the AI moratorium itself remains unpopular, even among Republicans. Two unpopular ideas do not make a popular one.
The debate over kids’ online safety will persist, with three notable issues emerging from this most recent hearing: 1) how to handle KOSA’s duty of care, 2) whether the preemption clauses in these bills are excessive, and 3) whether the FTC enforcement provisions in these bills are suitable given the current priorities and structure of the agency.
In the coming weeks, we will wade further into this conversation by providing a more in-depth analysis of the bills considered in this Kids Package, identifying which ideas we can get behind and which elements of these bills give us pause, through the lens of policy principles outlined in our paper “Kids Aren’t Alright: How To Build a Safer, Better Internet for Everyone.”
