Last week, the U.S. House Committee on Energy and Commerce advanced the Kids Online Safety Act (KOSA) for a floor vote. KOSA first emerged two years ago and has undergone significant changes since then. Just hours before the Committee debated the bill, a new amendment was introduced in an effort to address some of the pushback. The amendment narrowed the “duty of care” clause but did not ultimately solve the biggest problems with the legislation.

KOSA establishes a duty of care for media platforms to “prevent and mitigate” harms to minors. In other words, platforms can be held liable for negligence if they do not actively prevent any of the vague harms listed in KOSA. Critics of the bill have been quick to point out the breadth of the harm categories that companies would have to monitor and block.

The most recent version of KOSA includes an amendment reframing the duty of care. It makes media platforms responsible for preventing and mitigating “promotion of inherently dangerous acts that are likely to cause serious bodily harm, serious emotional disturbance, or death,” replacing the previously listed harms of anxiety, depression, eating disorders, substance use disorders, and suicidal behaviors.

The new duty of care provision does not make the role and responsibility of platforms any less ambiguous. Congressman Jay Obernolte (R-CA) was quick to point out that the amendment is still “vaguely and poorly defined.” Anticipating what will cause “serious emotional disturbance” is a difficult, if not impossible, task. He went on to note that the inevitable result would be endless litigation between social media companies and the Federal Trade Commission (FTC), with both sides spending millions of dollars to distinguish what qualifies as serious emotional disturbance or serious bodily harm.

Even advocates of KOSA noted that the legislation would need work before it could be passed into law. They are correct that KOSA cannot solve every peril children and youth may face on social media or in society generally. But they miss an overriding fact that has become glaringly apparent after years of debate over the details of this bill: the federal government will never be able to mitigate and prevent all possible harms inherent to the online age any more than it can do so offline. These are issues of personal decision-making and behavior, in this case that of families, who bear the real duty of care in raising children.

Instead of relying on ineffective federal law targeting social media companies, parents and caregivers can respond to the needs of their children in real time. Education and resources for families navigating technology abound, and the market continues to evolve in response to concerns about youth safety. Just last week, Meta unveiled “teen accounts” for its popular app, Instagram. Teen accounts emphasize a parent-guided experience, encouraging communication between children and their parents on matters of content, contact, and screen time.

To remain competitive, private companies will always respond to the needs of their consumers in new, better, and more innovative ways, unless the government gets in the way. By contrast, KOSA’s ambiguous “duty of care” promises confusion for innovators, costly litigation for taxpayers, and a false sense of security for families.